Ground Vibration Test Planning and Pre-Test Analysis for the X-33 Vehicle
NASA Technical Reports Server (NTRS)
Bedrossian, Herand; Tinker, Michael L.; Hidalgo, Homero
2000-01-01
This paper describes the results of the modal test planning and the pre-test analysis for the X-33 vehicle. The pre-test analysis included the selection of the target modes, the selection of the sensor and shaker locations, and the development of an accurate Test Analysis Model (TAM). For target mode selection, four techniques were considered: one based on the Modal Cost technique, one based on the Balanced Singular Value technique, a technique known as the Root Sum Squared (RSS) method, and a Modal Kinetic Energy (MKE) approach. For selecting sensor locations, four techniques were also considered: one based on the Weighted Average Kinetic Energy (WAKE), one based on Guyan Reduction (GR), one emphasizing engineering judgment, and one based on an optimum sensor selection technique using a Genetic Algorithm (GA) search combined with a criterion based on Hankel Singular Values (HSVs). For selecting shaker locations, four techniques were likewise considered: one based on the Weighted Average Driving Point Residue (WADPR), one based on engineering judgment and accessibility considerations, a frequency response method, and an optimum shaker location selection based on a GA search combined with a criterion based on HSVs. To evaluate the effectiveness of the proposed sensor and shaker locations for exciting the target modes, extensive numerical simulations were performed. The Multivariate Mode Indicator Function (MMIF) was used to evaluate the effectiveness of each sensor and shaker set with respect to modal parameter identification. Several TAM reduction techniques were considered, including Guyan, IRS, Modal, and Hybrid. Based on pre-test cross-orthogonality checks using the various reduction techniques, a Hybrid TAM reduction technique was selected and used for all three vehicle fuel level configurations.
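The cross-orthogonality check mentioned above compares test and finite-element mode shapes through the reduced (TAM) mass matrix; a result close to the identity matrix indicates a good reduction. A minimal sketch of that check (the function name and toy mode shapes are illustrative, not from the paper):

```python
import numpy as np

def cross_orthogonality(phi_test, phi_fem, mass):
    """Mass-weighted cross-orthogonality matrix between two mode sets.

    Each column of phi_test / phi_fem is a mode shape; `mass` is the
    reduced (TAM) mass matrix.  Ideally the result is close to the
    identity: diagonal terms near 1, off-diagonal terms near 0.
    """
    c = phi_test.T @ mass @ phi_fem
    # Normalize so each mode has unit modal mass.
    d_test = np.sqrt(np.diag(phi_test.T @ mass @ phi_test))
    d_fem = np.sqrt(np.diag(phi_fem.T @ mass @ phi_fem))
    return c / np.outer(d_test, d_fem)

# Toy example: identical, mass-orthonormal mode sets give the identity.
phi = np.array([[1.0, 0.0], [0.0, 1.0], [0.0, 0.0]])
m = np.eye(3)
xor = cross_orthogonality(phi, phi, m)
```

In a real pre-test study, `phi_test` would hold the simulated "test" modes at the sensor degrees of freedom and `mass` the TAM mass matrix from the selected reduction.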
Electrical termination techniques
NASA Technical Reports Server (NTRS)
Oakey, W. E.; Schleicher, R. R.
1976-01-01
A technical review of high reliability electrical terminations for electronic equipment was made. Seven techniques were selected from this review for further investigation, experimental work, and preliminary testing. From the preliminary test results, four techniques were selected for final testing and evaluation. These four were: (1) induction soldering, (2) wire wrap, (3) percussive arc welding, and (4) resistance welding. Of these four, induction soldering was selected as the best technique in terms of minimizing operator errors, controlling temperature and time, minimizing joint contamination, and ultimately producing a reliable, uniform, and reusable electrical termination.
Study to determine cloud motion from meteorological satellite data
NASA Technical Reports Server (NTRS)
Clark, B. B.
1972-01-01
Processing techniques were tested for deducing cloud motion vectors from overlapped portions of pairs of pictures made from meteorological satellites. This was accomplished by programming and testing techniques for estimating pattern motion by means of cross correlation analysis with emphasis placed upon identifying and reducing errors resulting from various factors. Techniques were then selected and incorporated into a cloud motion determination program which included a routine which would select and prepare sample array pairs from the preprocessed test data. The program was then subjected to limited testing with data samples selected from the Nimbus 4 THIR data provided by the 11.5 micron channel.
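The cross-correlation approach to pattern-motion estimation described above can be sketched as follows (an FFT-based illustration, not the program used in the study; patch sizes and data are synthetic):

```python
import numpy as np

def estimate_shift(earlier, later):
    """Estimate the (row, col) displacement of the pattern in `later`
    relative to `earlier` by locating the peak of their circular
    cross-correlation surface, computed via FFT for speed."""
    a = later - later.mean()
    b = earlier - earlier.mean()
    corr = np.fft.ifft2(np.fft.fft2(a) * np.conj(np.fft.fft2(b))).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Wrap indices above the midpoint around to negative shifts.
    shift = [p if p <= s // 2 else p - s for p, s in zip(peak, corr.shape)]
    return tuple(shift)

# Toy "cloud" pattern displaced by (2, 3) pixels between two frames.
rng = np.random.default_rng(0)
cloud = rng.random((32, 32))
moved = np.roll(cloud, (2, 3), axis=(0, 1))
```

The real problem also involves navigation/registration errors between the overlapped picture pairs, which this sketch ignores.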
Flight selection at United Airlines
NASA Technical Reports Server (NTRS)
Traub, W.
1980-01-01
Airline pilot selection procedures are discussed, including psychological and personality tests, psychomotor performance requirements, and flight skills evaluation. Necessary attitude and personality traits are described, and an outline of computer selection, testing, and training techniques is given.
Evaluation of methods for rapid determination of freezing point of aviation fuels
NASA Technical Reports Server (NTRS)
Mathiprakasam, B.
1982-01-01
Methods for identification of the more promising concepts for the development of a portable instrument to rapidly determine the freezing point of aviation fuels are described. The evaluation process consisted of: (1) collection of information on techniques previously used for the determination of the freezing point, (2) screening and selection of these techniques for further evaluation of their suitability in a portable unit for rapid measurement, and (3) an extensive experimental evaluation of the selected techniques and a final selection of the most promising technique. Test apparatuses employing differential thermal analysis and the change in optical transparency during phase change were evaluated and tested. A technique similar to differential thermal analysis using no reference fuel was investigated. In this method, the freezing point was obtained by digitizing the data and locating the point of inflection. Results obtained using this technique compare well with those obtained elsewhere using different techniques. A conceptual design of a portable instrument incorporating this technique is presented.
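The inflection-point approach described above can be sketched numerically (a toy cooling curve stands in for digitized fuel-temperature data; the numbers are illustrative, not from the study):

```python
import numpy as np

def inflection_point(time, temperature):
    """Locate the inflection point of a digitized cooling curve: the
    sample where the first derivative is most extreme (i.e. where the
    second derivative crosses zero).  Returns the corresponding time."""
    d1 = np.gradient(temperature, time)
    return time[np.argmax(np.abs(d1))]

# Toy cooling curve: a smooth tanh step centred at t = 5.0 standing in
# for the thermal-arrest region around the freezing point.
t = np.linspace(0.0, 10.0, 501)
temp = -40.0 - 5.0 * np.tanh(t - 5.0)
t_freeze = inflection_point(t, temp)
```

Real data would be noisier, so some smoothing before differentiation would typically be needed.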
ERIC Educational Resources Information Center
Groseclose, Richard
This fourth in a series of six modules for a course titled Nondestructive Examination (NDE) Techniques II describes the specific technique variables and options which are available to the test technician, provides instructions for selecting and operating the appropriate test equipment, describes physical criteria for detectable discontinuities,…
Nondestructive testing techniques
NASA Astrophysics Data System (ADS)
Bray, Don E.; McBride, Don
A comprehensive reference covering a broad range of techniques in nondestructive testing is presented. Based on years of extensive research and application at NASA and other government research facilities, the book provides practical guidelines for selecting the appropriate testing methods and equipment. Topics discussed include visual inspection, penetrant and chemical testing, nuclear radiation, sonic and ultrasonic, thermal and microwave, magnetic and electromagnetic techniques, and training and human factors. (No individual items are abstracted in this volume)
Good, Andrew C; Hermsmeier, Mark A
2007-01-01
Research into the advancement of computer-aided molecular design (CAMD) has a tendency to focus on the discipline of algorithm development. Such efforts often come at the expense of the data set selection and analysis used in said algorithm validation. Here we highlight the potential problems this can cause in the context of druglikeness classification. More rigorous efforts are applied to the selection of decoy (nondruglike) molecules from the ACD. Comparisons are made between model performance using the standard technique of random test set creation and test sets derived from explicit ontological separation by drug class. The dangers of viewing druglike space as sufficiently coherent to permit simple classification are highlighted. In addition, the issues inherent in applying unfiltered data and random test set selection to (Q)SAR models utilizing large and supposedly heterogeneous databases are discussed.
Application of nomographs for analysis and prediction of receiver spurious response EMI
NASA Astrophysics Data System (ADS)
Heather, F. W.
1985-07-01
Spurious response EMI for the front end of a superheterodyne receiver follows a simple mathematical formula; however, the application of the formula to predict test frequencies produces more data than can be evaluated. An analysis technique has been developed to graphically depict all receiver spurious responses using a nomograph and to permit selection of optimum test frequencies. The discussion includes the math model used to simulate a superheterodyne receiver, the implementation of the model in the computer program, the approach to test frequency selection, interpretation of the nomographs, analysis and prediction of receiver spurious response EMI from the nomographs, and application of the nomographs. In addition, figures are provided of sample applications. This EMI analysis and prediction technique greatly improves the Electromagnetic Compatibility (EMC) test engineer's ability to visualize the scope of receiver spurious response EMI testing and optimize test frequency selection.
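The underlying mixer relation can be enumerated directly; a minimal sketch assuming the classical spurious-response formula f_spur = (m·f_LO ± f_IF)/n for small harmonic indices (the function and the LO/IF values are illustrative, not taken from the paper):

```python
def spurious_frequencies(f_lo, f_if, max_m=3, max_n=3):
    """Enumerate candidate spurious-response frequencies of a
    superheterodyne front end using the mixer relation
    f_spur = (m * f_lo +/- f_if) / n for small harmonic indices m, n.
    Frequencies are in the same units as the inputs."""
    spurs = set()
    for m in range(1, max_m + 1):
        for n in range(1, max_n + 1):
            for sign in (+1, -1):
                f = (m * f_lo + sign * f_if) / n
                if f > 0:
                    spurs.add(round(f, 6))
    return sorted(spurs)

# Example: 100 MHz LO, 10.7 MHz IF.  The two fundamental (m = n = 1)
# responses at 89.3 MHz and 110.7 MHz appear among the candidates.
spurs = spurious_frequencies(100.0, 10.7)
```

The value of the nomograph in the paper is precisely that this list grows quickly with m and n, so a graphical view helps pick the few test frequencies worth running.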
Finishing Techniques for Silicon Nitride Bearings
1976-03-01
…finishing procedures. Rolling contact fatigue lives of silicon nitride with selected smoother finishes tested at 800 ksi Hertz stress were an order of magnitude longer than those…
A Comparison of a Bayesian and a Maximum Likelihood Tailored Testing Procedure.
ERIC Educational Resources Information Center
McKinley, Robert L.; Reckase, Mark D.
A study was conducted to compare tailored testing procedures based on a Bayesian ability estimation technique and on a maximum likelihood ability estimation technique. The Bayesian tailored testing procedure selected items so as to minimize the posterior variance of the ability estimate distribution, while the maximum likelihood tailored testing…
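The Bayesian selection rule described above (pick the item that minimizes the posterior variance of the ability estimate) can be sketched with a grid approximation; this assumes a one-parameter Rasch response model and illustrative item difficulties, neither of which is specified in the abstract:

```python
import numpy as np

def p_correct(theta, b):
    """Rasch-model probability of a correct response at ability theta
    for an item of difficulty b."""
    return 1.0 / (1.0 + np.exp(-(theta - b)))

def select_item(prior, theta_grid, difficulties):
    """Pick the index of the item whose response is expected to
    minimize the posterior variance of ability (grid approximation)."""
    prior = prior / prior.sum()
    best, best_ev = None, np.inf
    for j, b in enumerate(difficulties):
        p = p_correct(theta_grid, b)
        ev = 0.0
        for resp_p in (p, 1.0 - p):          # correct / incorrect
            post = prior * resp_p
            mass = post.sum()                # marginal response probability
            post = post / mass
            mean = (theta_grid * post).sum()
            ev += mass * ((theta_grid - mean) ** 2 * post).sum()
        if ev < best_ev:
            best, best_ev = j, ev
    return best

# With a flat, symmetric prior, the item matched to the prior centre
# (difficulty 0) is the most informative first choice.
theta = np.linspace(-4.0, 4.0, 161)
prior = np.ones_like(theta)
best = select_item(prior, theta, [-2.0, 0.0, 2.0, 3.0])
```

After each observed response, the posterior would replace the prior and the rule would be applied again over the remaining items.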
Reducing Sweeping Frequencies in Microwave NDT Employing Machine Learning Feature Selection
Moomen, Abdelniser; Ali, Abdulbaset; Ramahi, Omar M.
2016-01-01
Nondestructive Testing (NDT) assessment of materials’ health condition is useful for distinguishing healthy from unhealthy structures or detecting flaws in metallic or dielectric structures. Performing structural health testing for coated/uncoated metallic or dielectric materials with the same testing equipment requires a testing method that can work on both metallics and dielectrics, such as microwave testing. Reducing the complexity and expenses associated with current diagnostic practices of microwave NDT of structural health requires an effective and intelligent approach based on the feature selection and classification techniques of machine learning. Current microwave NDT methods are in general based on measuring variation in the S-matrix over the entire operating frequency ranges of the sensors. For instance, assessing the health of metallic structures using a microwave sensor depends on the reflection and/or transmission coefficient measurements as a function of the sweeping frequencies of the operating band. The aim of this work is to reduce the sweeping frequencies using machine learning feature selection techniques. By treating sweeping frequencies as features, the most important features can be identified, and then only the most influential features (frequencies) are considered when building the microwave NDT equipment. The proposed method of reducing sweeping frequencies was validated experimentally using a waveguide sensor and a metallic plate with different cracks. Among the investigated feature selection techniques are information gain, gain ratio, relief, and chi-squared. The effectiveness of the selected features was validated through performance evaluations of various classification models, namely Nearest Neighbor, Neural Networks, Random Forest, and Support Vector Machine. Results showed good crack classification accuracy rates after employing the feature selection algorithms. PMID:27104533
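The frequencies-as-features idea can be sketched with one of the named criteria, information gain (the data here are synthetic, and the discretization is an assumption; the paper does not specify one):

```python
import numpy as np

def entropy(labels):
    """Shannon entropy (bits) of a label vector."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -(p * np.log2(p)).sum()

def information_gain(feature, labels, bins=4):
    """Information gain of one discretized feature, e.g. |S11| at a
    single sweep frequency, with respect to the class labels."""
    binned = np.digitize(feature, np.histogram_bin_edges(feature, bins)[1:-1])
    cond = 0.0
    for v in np.unique(binned):
        mask = binned == v
        cond += mask.mean() * entropy(labels[mask])
    return entropy(labels) - cond

def top_frequencies(s_matrix, labels, k=3):
    """Rank sweep frequencies (columns of s_matrix) by information gain
    and return the indices of the k most informative ones."""
    gains = [information_gain(s_matrix[:, j], labels)
             for j in range(s_matrix.shape[1])]
    return np.argsort(gains)[::-1][:k]

# Synthetic sweep: 5 frequencies, 40 specimens; only frequency index 2
# actually separates cracked (1) from healthy (0) plates.
rng = np.random.default_rng(1)
y = np.repeat([0, 1], 20)
X = rng.random((40, 5))
X[:, 2] = y + 0.05 * rng.standard_normal(40)
best = top_frequencies(X, y, k=1)
```

Only the selected frequencies would then need to be swept by the fielded NDT equipment.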
Development of a technique for inflight jet noise simulation. I, II
NASA Technical Reports Server (NTRS)
Clapper, W. S.; Stringas, E. J.; Mani, R.; Banerian, G.
1976-01-01
Several possible noise simulation techniques were evaluated, including closed circuit wind tunnels, free jets, rocket sleds and high speed trains. The free jet technique was selected for demonstration and verification. The first paper describes the selection and development of the technique and presents results for simulation and in-flight tests of the Learjet, F106, and Bertin Aerotrain. The second presents a theoretical study relating the two sets of noise signatures. It is concluded that the free jet simulation technique provides a satisfactory assessment of in-flight noise.
The report describes a new technique for sulfur forms analysis based on low-temperature oxygen plasma ashing. The technique involves analyzing the low-temperature plasma ash by modified ASTM techniques after selectively removing the organic material. The procedure has been tested...
Parameter estimation techniques and application in aircraft flight testing
NASA Technical Reports Server (NTRS)
1974-01-01
Technical papers presented at the symposium by selected representatives from industry, universities, and various Air Force, Navy, and NASA installations are given. The topics covered include the newest developments in identification techniques, the most recent flight-test experience, and the projected potential for the near future.
Data re-arranging techniques leading to proper variable selections in high energy physics
NASA Astrophysics Data System (ADS)
Kůs, Václav; Bouř, Petr
2017-12-01
We introduce a new data based approach to homogeneity testing and variable selection carried out in high energy physics experiments, where one of the basic tasks is to test the homogeneity of weighted samples, mainly the Monte Carlo simulations (weighted) and real data measurements (unweighted). This technique is called ’data re-arranging’ and it enables variable selection performed by means of the classical statistical homogeneity tests such as Kolmogorov-Smirnov, Anderson-Darling, or Pearson’s chi-square divergence test. P-values of our variants of homogeneity tests are investigated, and the empirical verification through 46-dimensional high energy particle physics data sets is accomplished under newly proposed (equiprobable) quantile binning. In particular, the procedure of homogeneity testing is applied to re-arranged Monte Carlo samples and real data sets measured at the particle accelerator Tevatron in Fermilab at the DØ experiment, originating from top-antitop quark pair production in two decay channels (electron, muon) with 2, 3, or 4+ jets detected. Finally, the variable selections in the electron and muon channels induced by the re-arranging procedure for homogeneity testing are provided for the Tevatron top-antitop quark data sets.
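A two-sample chi-square homogeneity test under equiprobable quantile binning, one of the ingredients above, can be sketched as follows (unweighted samples only; the weighted-sample treatment and the re-arranging step from the paper are omitted):

```python
import numpy as np

def chi2_homogeneity(sample_a, sample_b, n_bins=10):
    """Pearson chi-square homogeneity statistic between two samples
    using equiprobable binning: bin edges are quantiles of the pooled
    sample, so every bin holds roughly the same pooled count.  The
    p-value follows from a chi-square distribution with n_bins - 1
    degrees of freedom."""
    pooled = np.concatenate([sample_a, sample_b])
    edges = np.quantile(pooled, np.linspace(0.0, 1.0, n_bins + 1))
    edges[0], edges[-1] = -np.inf, np.inf       # make bins exhaustive
    obs_a, _ = np.histogram(sample_a, edges)
    obs_b, _ = np.histogram(sample_b, edges)
    n_a, n_b = obs_a.sum(), obs_b.sum()
    exp_a = (obs_a + obs_b) * n_a / (n_a + n_b)
    exp_b = (obs_a + obs_b) * n_b / (n_a + n_b)
    chi2 = ((obs_a - exp_a) ** 2 / exp_a).sum() \
         + ((obs_b - exp_b) ** 2 / exp_b).sum()
    return chi2, n_bins - 1

# Stand-ins: "Monte Carlo" vs. data drawn from the same distribution,
# and vs. data shifted by one standard deviation.
rng = np.random.default_rng(2)
mc = rng.normal(0.0, 1.0, 2000)
data_same = rng.normal(0.0, 1.0, 2000)
data_shift = rng.normal(1.0, 1.0, 2000)
```

Quantile binning avoids the empty or sparsely populated bins that fixed-width binning produces in the tails, which is what makes the chi-square approximation reliable.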
Gunavathi, Chellamuthu; Premalatha, Kandasamy
2014-01-01
Feature selection in cancer classification is a central area of research in the field of bioinformatics and used to select the informative genes from thousands of genes of the microarray. The genes are ranked based on T-statistics, signal-to-noise ratio (SNR), and F-test values. The swarm intelligence (SI) technique finds the informative genes from the top-m ranked genes. These selected genes are used for classification. In this paper the shuffled frog leaping with Lévy flight (SFLLF) is proposed for feature selection. In SFLLF, the Lévy flight is included to avoid premature convergence of shuffled frog leaping (SFL) algorithm. The SI techniques such as particle swarm optimization (PSO), cuckoo search (CS), SFL, and SFLLF are used for feature selection which identifies informative genes for classification. The k-nearest neighbour (k-NN) technique is used to classify the samples. The proposed work is applied on 10 different benchmark datasets and examined with SI techniques. The experimental results show that the results obtained from k-NN classifier through SFLLF feature selection method outperform PSO, CS, and SFL.
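The rank-then-classify pipeline above (rank genes, keep the top-m, classify with k-NN) can be sketched with the SNR ranking; the swarm-intelligence search stage is omitted and the data are synthetic:

```python
import numpy as np

def snr_rank(X, y):
    """Rank features (columns of X) by the signal-to-noise ratio
    |mu0 - mu1| / (sigma0 + sigma1) between the two classes."""
    a, b = X[y == 0], X[y == 1]
    snr = np.abs(a.mean(0) - b.mean(0)) / (a.std(0) + b.std(0) + 1e-12)
    return np.argsort(snr)[::-1]

def knn_predict(X_train, y_train, X_test, k=3):
    """Minimal k-nearest-neighbour classifier (Euclidean distance,
    majority vote)."""
    preds = []
    for x in X_test:
        d = np.linalg.norm(X_train - x, axis=1)
        votes = y_train[np.argsort(d)[:k]]
        preds.append(np.bincount(votes).argmax())
    return np.array(preds)

# Toy "microarray": 60 samples, 50 genes; only the first two genes
# carry class signal.
rng = np.random.default_rng(3)
y = np.repeat([0, 1], 30)
X = rng.standard_normal((60, 50))
X[:, :2] += 3.0 * y[:, None]
top = snr_rank(X, y)[:2]
preds = knn_predict(X[:, top], y, X[:, top])
```

In the paper, the SI search (PSO, CS, SFL, or SFLLF) would explore subsets of the top-m ranked genes rather than taking them directly.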
Development of assembly techniques for fire resistant aircraft interior panels
NASA Technical Reports Server (NTRS)
Lee, S. C. S.
1978-01-01
Ten NASA Type A fire resistant aircraft interior panels were fabricated and tested to develop assembly techniques. These techniques were used in the construction of a full scale lavatory test structure for flame propagation testing. The Type A panel is of sandwich construction consisting of Nomex honeycomb filled with quinone dioxime foam, and bismaleimide/glass face sheets bonded to the core with polyimide film adhesive. The materials selected and the assembly techniques developed for the lavatory test structure were designed for obtaining maximum fire containment with minimum smoke and toxic emission.
Design, fabrication and testing of a thermal diode
NASA Technical Reports Server (NTRS)
Swerdling, B.; Kosson, R.
1972-01-01
Heat pipe diode types are discussed. The design, fabrication and test of a flight qualified diode for the Advanced Thermal Control Flight Experiment (ATFE) are described. The review covers the use of non-condensable gas, freezing, liquid trap, and liquid blockage techniques. Test data and parametric performance are presented for the liquid trap and liquid blockage techniques. The liquid blockage technique was selected for the ATFE diode on the basis of small reservoir size, low reverse mode heat transfer, and apparent rapid shut-off.
DOT National Transportation Integrated Search
1982-04-01
A comprehensive review of existing basic diagnostic techniques applicable to the railcar roller bearing defect and failure problem was made. Of the potentially feasible diagnostic techniques identified, high frequency vibration was selected for exper...
Technology development of fabrication techniques for advanced solar dynamic concentrators
NASA Technical Reports Server (NTRS)
Richter, Scott W.
1991-01-01
The objective of the advanced concentrator program is to develop the technology that will lead to lightweight, highly reflective, accurate, scaleable, and long lived space solar dynamic concentrators. The advanced concentrator program encompasses new and innovative concepts, fabrication techniques, materials selection, and simulated space environmental testing. Fabrication techniques include methods of fabricating the substrates and coating substrate surfaces to produce high quality optical surfaces, acceptable for further coating with vapor deposited optical films. The selected materials to obtain a high quality optical surface include microsheet glass and Eccocoat EP-3 epoxy, with DC-93-500 selected as a candidate silicone adhesive and levelizing layer. The following procedures are defined: cutting, cleaning, forming, and bonding microsheet glass. Procedures are also defined for surface cleaning, and EP-3 epoxy application. The results and analyses from atomic oxygen and thermal cycling tests are used to determine the effects of orbital conditions in a space environment.
Relative Utility of Selected Software Requirement Metrics
1991-12-01
testing . They can also help in deciding if and how to use complexity reduction techniques. In summary, requirement metrics can be useful because they...answer items in a test instrument. In order to differentiate between misinterpretation and comprehension, the measurement technique must be able to...effectively test a requirement, it is verifiable. Ramamoorthy and others have proposed requirements complexity metrics that can be used to infer the
Measurement techniques and instruments suitable for life-prediction testing of photovoltaic arrays
NASA Technical Reports Server (NTRS)
Noel, G. T.; Sliemers, F. A.; Deringer, G. C.; Wood, V. E.; Wilkes, K. E.; Gaines, G. B.; Carmichael, D. C.
1978-01-01
Array failure modes, relevant materials property changes, and primary degradation mechanisms are discussed as a prerequisite to identifying suitable measurement techniques and instruments. Candidate techniques and instruments are identified on the basis of extensive reviews of published and unpublished information. These methods are organized in six measurement categories - chemical, electrical, optical, thermal, mechanical, and other physical methods. Using specified evaluation criteria, the most promising techniques and instruments for use in life prediction tests of arrays were selected.
Using Spare Logic Resources To Create Dynamic Test Points
NASA Technical Reports Server (NTRS)
Katz, Richard; Kleyner, Igor
2011-01-01
A technique has been devised to enable creation of a dynamic set of test points in an embedded digital electronic system. As a result, electronics contained in an application specific circuit [e.g., gate array, field programmable gate array (FPGA)] can be internally probed, even when contained in a closed housing during all phases of test. In the present technique, the test points are not fixed and limited to a small number; the number of test points can vastly exceed the number of buffers or pins, resulting in a compact footprint. Test points are selected by means of spare logic resources within the ASIC(s) and/or FPGA(s). A register is programmed with a command, which is used to select the signals that are sent off-chip and out of the housing for monitoring by test engineers and external test equipment. The register can be commanded by any suitable means: for example, it could be commanded through a command port that would normally be used in the operation of the system. In the original application of the technique, commanding of the register is performed via a MIL-STD-1553B communication subsystem.
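The register-commanded multiplexing idea above can be modeled in a few lines; this is an illustrative software model, not the flight design, and the signal names are hypothetical:

```python
class TestPointMux:
    """Toy model of a register-commanded test-point multiplexer: many
    internal signals, few output pins.  Writing the select register
    routes a chosen subset of signals to the pins."""

    def __init__(self, signals, n_pins=2):
        self.signals = signals          # name -> callable returning the level
        self.n_pins = n_pins
        self.select = []                # currently programmed signal names

    def write_register(self, names):
        """Program the select register, as a command port would."""
        if len(names) > self.n_pins:
            raise ValueError("more test points requested than output pins")
        self.select = list(names)

    def sample_pins(self):
        """Read the levels currently routed to the output pins."""
        return {name: self.signals[name]() for name in self.select}

# Hypothetical internal nodes of a closed housing.
internal = {"fifo_full": lambda: 0, "pll_locked": lambda: 1,
            "wdt_kick": lambda: 0}
mux = TestPointMux(internal, n_pins=2)
mux.write_register(["pll_locked", "fifo_full"])
```

The point the abstract makes is visible here: the number of observable signals is bounded by the dictionary, not by `n_pins`, because the selection can be reprogrammed at any time during test.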
A study for development of aerothermodynamic test model materials and fabrication technique
NASA Technical Reports Server (NTRS)
Dean, W. G.; Connor, L. E.
1972-01-01
A literature survey, materials reformulation and tailoring, fabrication problems, and materials selection and evaluation for fabricating models to be used with the phase-change technique for obtaining quantitative aerodynamic heat transfer data are presented. The study resulted in the selection of the two best materials, Stycast 2762 FT and an alumina ceramic. Characteristics of these materials and detailed fabrication methods are presented.
Batta, Yacoub A
2016-01-01
The present article describes the technique used for preparing the invert emulsion (water-in-oil type) then, selecting the most proper formulation of invert emulsion for being used as a carrier formulation of entomopathogenic fungi. It also describes the method used for testing the efficacy of the formulated fungi as biocontrol agents of targeted insects. Detailed examples demonstrating the efficacy of formulated strains of entomopathogenic fungi against certain species of insect pests were included in the present article. The techniques and methods described in this article are reproducible and helpful in enhancing the effectiveness of formulated fungi against a wide range of targeted insects in comparison with the unformulated form of these fungi. Also, these techniques and methods can be used effectively in crop protection and in integrated pest management programs. Finally, it is important to indicate that the ingredients used for preparation of the invert emulsion have no environmental side-effects or health risks since these ingredients are safe to use and can be used in manufacturing of cosmetics or as food additives.
• Description of the method used for preparation of the invert emulsion (water-in-oil type) and selecting the most stable and non-viscous emulsion.
• Description of the technique used for introducing the entomopathogenic fungi into the selected stable and non-viscous invert emulsion.
• Description of the method for testing the efficacy of the introduced entomopathogenic fungus into the selected invert emulsion against targeted insects, with detailed examples on the efficacy testing.
16 CFR 1107.21 - Periodic testing.
Code of Federal Regulations, 2012 CFR
2012-01-01
... samples selected for testing pass the test, there is a high degree of assurance that the other untested... determining the testing interval include, but are not limited to, the following: (i) High variability in test... process management techniques and tests provide a high degree of assurance of compliance if they are not...
16 CFR § 1107.21 - Periodic testing.
Code of Federal Regulations, 2013 CFR
2013-01-01
... samples selected for testing pass the test, there is a high degree of assurance that the other untested... determining the testing interval include, but are not limited to, the following: (i) High variability in test... process management techniques and tests provide a high degree of assurance of compliance if they are not...
16 CFR 1107.21 - Periodic testing.
Code of Federal Regulations, 2014 CFR
2014-01-01
... samples selected for testing pass the test, there is a high degree of assurance that the other untested... determining the testing interval include, but are not limited to, the following: (i) High variability in test... process management techniques and tests provide a high degree of assurance of compliance if they are not...
New Technique for Cryogenically Cooling Small Test Articles
NASA Technical Reports Server (NTRS)
Rodriquez, Karen M.; Henderson, Donald J.
2011-01-01
Convective heat removal techniques to rapidly cool small test articles to Earth-Moon L2 temperatures of 77 K were accomplished through the use of liquid nitrogen (LN2). By maintaining a selected pressure range on the saturation curve, test articles were cooled below the LN2 boiling point at ambient pressure in less than 30 min. Difficulties in achieving test pressures while maintaining the temperature tolerance necessitated a modification to the original system to include a closed loop conductive cold plate and cryogenic shroud.
Kuiper, Rebecca M; Nederhoff, Tim; Klugkist, Irene
2015-05-01
In this paper, the performance of six types of techniques for comparisons of means is examined. These six emerge from the distinction between the method employed (hypothesis testing, model selection using information criteria, or Bayesian model selection) and the set of hypotheses that is investigated (a classical, exploration-based set of hypotheses containing equality constraints on the means, or a theory-based limited set of hypotheses with equality and/or order restrictions). A simulation study is conducted to examine the performance of these techniques. We demonstrate that, if one has specific, a priori specified hypotheses, confirmation (i.e., investigating theory-based hypotheses) has advantages over exploration (i.e., examining all possible equality-constrained hypotheses). Furthermore, examining reasonable order-restricted hypotheses has more power to detect the true effect/non-null hypothesis than evaluating only equality restrictions. Additionally, when investigating more than one theory-based hypothesis, model selection is preferred over hypothesis testing. Because of the first two results, we further examine the techniques that are able to evaluate order restrictions in a confirmatory fashion by examining their performance when the homogeneity of variance assumption is violated. Results show that the techniques are robust to heterogeneity when the sample sizes are equal. When the sample sizes are unequal, the performance is affected by heterogeneity. The size and direction of the deviations from the baseline, where there is no heterogeneity, depend on the effect size (of the means) and on the trend in the group variances with respect to the ordering of the group sizes. Importantly, the deviations are less pronounced when the group variances and sizes exhibit the same trend (e.g., are both increasing with group number). © 2014 The British Psychological Society.
Survey Of High Speed Test Techniques
NASA Astrophysics Data System (ADS)
Gheewala, Tushar
1988-02-01
The emerging technologies for the characterization and production testing of high-speed devices and integrated circuits are reviewed. The continuing progress in the field of semiconductor technologies will, in the near future, demand test techniques to test 10 ps to 100 ps gate delays, 10 GHz to 100 GHz analog functions, and 10,000 to 100,000 gates on a single chip. Clearly, no single test technique would provide a cost-effective answer to all the above demands. A divide-and-conquer approach based on a judicious selection of parametric, functional and high-speed tests will be required. In addition, design-for-test methods need to be pursued which will include on-chip test electronics as well as circuit techniques that minimize the circuit performance sensitivity to allowable process variations. The electron and laser beam based test technologies look very promising and may provide the much needed solutions to not only the high-speed test problem but also to the need for high levels of fault coverage during functional testing.
The efficacy of the 'mind map' study technique.
Farrand, Paul; Hussain, Fearzana; Hennessy, Enid
2002-05-01
To examine the effectiveness of using the 'mind map' study technique to improve factual recall from written information. To obtain baseline data, subjects completed a short test based on a 600-word passage of text prior to being randomly allocated to form two groups: 'self-selected study technique' and 'mind map'. After a 30-minute interval the self-selected study technique group were exposed to the same passage of text previously seen and told to apply existing study techniques. Subjects in the mind map group were trained in the mind map technique and told to apply it to the passage of text. Recall was measured after an interfering task and a week later. Measures of motivation were taken. Barts and the London School of Medicine and Dentistry, University of London. 50 second- and third-year medical students. Recall of factual material improved for both the mind map and self-selected study technique groups at immediate test compared with baseline. However this improvement was only robust after a week for those in the mind map group. At 1 week, the factual knowledge in the mind map group was greater by 10% (adjusting for baseline) (95% CI -1% to 22%). However motivation for the technique used was lower in the mind map group; if motivation could have been made equal in the groups, the improvement with mind mapping would have been 15% (95% CI 3% to 27%). Mind maps provide an effective study technique when applied to written material. However before mind maps are generally adopted as a study technique, consideration has to be given towards ways of improving motivation amongst users.
NASA Technical Reports Server (NTRS)
Coggeshall, M. E.; Hoffer, R. M.
1973-01-01
Remote sensing equipment and automatic data processing techniques were employed as aids in the institution of improved forest resource management methods. On the basis of automatically calculated statistics derived from manually selected training samples, the feature selection processor of LARSYS selected, upon consideration of various groups of the four available spectral regions, a series of channel combinations whose automatic classification performances (for six cover types, including both deciduous and coniferous forest) were tested, analyzed, and further compared with automatic classification results obtained from digitized color infrared photography.
Measurement Techniques and Instruments Suitable for Life-prediction Testing of Photovoltaic Arrays
NASA Technical Reports Server (NTRS)
Noel, G. T.; Wood, V. E.; Mcginniss, V. D.; Hassell, J. A.; Richard, N. A.; Gaines, G. B.; Carmichael, D. C.
1979-01-01
The validation of a 20-year service life for low-cost photovoltaic arrays is a critical requirement in the Low-Cost Solar Array (LSA) Project. The validation is accomplished through accelerated life-prediction tests. A two-phase study was conducted to address the needs before such tests are carried out. The results and recommended techniques from the Phase 1 investigation are summarized in the appendix. Phase 2 of the study is covered in this report and consisted of experimental evaluations of three techniques selected from those recommended as a result of the Phase 1 findings. The three techniques evaluated were specular and nonspecular optical reflectometry, chemiluminescence measurements, and electric current noise measurements.
Impacts of Vocabulary Acquisition Techniques Instruction on Students' Learning
ERIC Educational Resources Information Center
Orawiwatnakul, Wiwat
2011-01-01
The objectives of this study were to determine how the selected vocabulary acquisition techniques affected the vocabulary ability of 35 students who took EN 111 and investigate their attitudes towards the techniques instruction. The research study was one-group pretest and post-test design. The instruments employed were in-class exercises…
Black-Box System Testing of Real-Time Embedded Systems Using Random and Search-Based Testing
NASA Astrophysics Data System (ADS)
Arcuri, Andrea; Iqbal, Muhammad Zohaib; Briand, Lionel
Testing real-time embedded systems (RTES) is in many ways challenging. Thousands of test cases can be potentially executed on an industrial RTES. Given the magnitude of testing at the system level, only a fully automated approach can really scale up to test industrial RTES. In this paper we take a black-box approach and model the RTES environment using the UML/MARTE international standard. Our main motivation is to provide a more practical approach to the model-based testing of RTES by allowing system testers, who are often not familiar with the system design but know the application domain well enough, to model the environment to enable test automation. Environment models can support the automation of three tasks: the code generation of an environment simulator, the selection of test cases, and the evaluation of their expected results (oracles). In this paper, we focus on the second task (test case selection) and investigate three test automation strategies using inputs from UML/MARTE environment models: Random Testing (baseline), Adaptive Random Testing, and Search-Based Testing (using Genetic Algorithms). Based on one industrial case study and three artificial systems, we show how, in general, no technique is better than the others. Which test selection technique to use is determined by the failure rate (testing stage) and the execution time of test cases. Finally, we propose a practical process to combine the use of all three test strategies.
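The Adaptive Random Testing strategy named above can be illustrated with a minimal sketch of the common farthest-candidate rule; the function name, the candidate-pool parameter k, and the toy distance are illustrative assumptions, not the paper's implementation:

```python
import random

def adaptive_random_select(candidates, executed, distance, k=10):
    """Pick the candidate test case farthest from all already-executed ones.

    candidates: pool of not-yet-run test cases
    executed:   test cases already run
    distance:   domain-specific distance between two test cases
    k:          number of random candidates sampled per round
    """
    if not executed:
        return random.choice(candidates)
    sample = random.sample(candidates, min(k, len(candidates)))
    # Farthest-candidate rule: maximise the minimum distance to executed tests.
    return max(sample, key=lambda c: min(distance(c, e) for e in executed))

# Toy domain: test cases are 1-D numeric inputs between 0 and 99.
pool = list(range(100))
run = [0, 99]
nxt = adaptive_random_select(pool, run, lambda a, b: abs(a - b), k=len(pool))
```

With the two extremes already executed, the rule picks an input near the middle of the range, spreading test cases across the input domain instead of clustering them as pure random testing can.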
Wing Twist Measurements at the National Transonic Facility
NASA Technical Reports Server (NTRS)
Burner, Alpheus W.; Wahls, Richard A.; Goad, William K.
1996-01-01
A technique for measuring wing twist currently in use at the National Transonic Facility is described. The technique is based upon a single camera photogrammetric determination of two dimensional coordinates with a fixed (and known) third dimensional coordinate. The wing twist is found from a conformal transformation between wind-on and wind-off 2-D coordinates in the plane of rotation. The advantages and limitations of the technique as well as the rationale for selection of this particular technique are discussed. Examples are presented to illustrate run-to-run and test-to-test repeatability of the technique in air mode. Examples of wing twist in cryogenic nitrogen mode are also presented.
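The conformal-transformation step lends itself to a short sketch. The abstract does not give the algorithm, so the following is a minimal least-squares recovery of the rotation between wind-off and wind-on 2-D target coordinates; the function name and the synthetic data are illustrative assumptions:

```python
from math import atan2, cos, sin, radians, degrees

def twist_angle(wind_off, wind_on):
    """Least-squares rotation angle (in degrees) of the 2-D similarity
    (conformal) transform mapping wind-off coordinates onto wind-on
    coordinates; translation is removed by centering both point sets."""
    n = len(wind_off)
    cx0 = sum(p[0] for p in wind_off) / n
    cy0 = sum(p[1] for p in wind_off) / n
    cx1 = sum(p[0] for p in wind_on) / n
    cy1 = sum(p[1] for p in wind_on) / n
    num = den = 0.0
    for (x0, y0), (x1, y1) in zip(wind_off, wind_on):
        x0, y0 = x0 - cx0, y0 - cy0
        x1, y1 = x1 - cx1, y1 - cy1
        num += x0 * y1 - y0 * x1   # cross products accumulate sin(theta)
        den += x0 * x1 + y0 * y1   # dot products accumulate cos(theta)
    return degrees(atan2(num, den))

# Synthetic check: rotate three target points by 2 degrees, then translate.
off = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
t = radians(2.0)
on = [(cos(t) * x - sin(t) * y + 5.0, sin(t) * x + cos(t) * y - 3.0)
      for x, y in off]
twist = twist_angle(off, on)   # recovers the 2-degree rotation
```

Centering both point sets makes the recovered angle insensitive to the translation between wind-off and wind-on images, which is the point of using a conformal (rotation plus translation plus scale) fit in the plane of rotation.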
Guidance on Nanomaterial Hazards and Risks
2015-05-21
and at room temperature and 37 °C – solid separation by centrifugation, filtration, or chemical techniques (more experimental techniques combining... members in this potency sequence using bolus in vivo testing, verify the bolus results with selective inhalation testing. The potency of members of... measures in in vitro and limited in vivo experimental systems would facilitate the characterization of dose-response relationships across a set of ENMs
Dense and dynamic 3D selection for game-based virtual environments.
Cashion, Jeffrey; Wingrave, Chadwick; LaViola, Joseph J
2012-04-01
3D object selection is more demanding when (1) objects densely surround the target object, (2) the target object is significantly occluded, and (3) the target object is dynamically changing location. Most 3D selection techniques and guidelines were developed and tested on static or mostly sparse environments. In contrast, games tend to incorporate densely packed and dynamic objects as part of their typical interaction. With the increasing popularity of 3D selection in games using hand gestures or motion controllers, our current understanding of 3D selection needs revision. We present a study that compared four different selection techniques under five different scenarios based on varying object density and motion dynamics. We utilized two existing techniques, Raycasting and SQUAD, and developed two variations of them, Zoom and Expand, using iterative design. Our results indicate that while Raycasting and SQUAD both have weaknesses in terms of speed and accuracy in dense and dynamic environments, by making small modifications to them (i.e., flavoring), we can achieve significant performance increases.
Demonstration of landfill gas enhancement techniques in landfill simulators
NASA Astrophysics Data System (ADS)
Walsh, J. J.; Vogt, W. G.
1982-02-01
Various techniques to enhance gas production in sanitary landfills were applied to landfill simulators. These techniques include (1) accelerated moisture addition, (2) leachate recycling, (3) buffer addition, (4) nutrient addition, and (5) combinations of the above. Results are compiled through ongoing operation and monitoring of sixteen landfill simulators. These test cells contain about 380 kg of municipal solid waste. Quantities of buffer and nutrient materials were placed in selected cells at the time of loading. Water is added to all test cells on a monthly basis; leachate is withdrawn from all cells (and recycled on selected cells) also on a monthly basis. Daily monitoring of gas volumes and refuse temperatures is performed. Gas and leachate samples are collected and analyzed on a monthly basis. Leachate and gas quality and quantity results are presented for the first 18 months of operation.
Cost effectiveness as applied to the Viking Lander systems-level thermal development test program
NASA Technical Reports Server (NTRS)
Buna, T.; Shupert, T. C.
1974-01-01
The economic aspects of thermal testing at the systems-level as applied to the Viking Lander Capsule thermal development program are reviewed. The unique mission profile and pioneering scientific goals of Viking imposed novel requirements on testing, including the development of a simulation technique for the Martian thermal environment. The selected approach included modifications of an existing conventional thermal vacuum facility, and improved test-operational techniques that are applicable to the simulation of the other mission phases as well, thereby contributing significantly to the cost effectiveness of the overall thermal test program.
Analysis of Weibull Grading Test for Solid Tantalum Capacitors
NASA Technical Reports Server (NTRS)
Teverovsky, Alexander
2010-01-01
Weibull grading test is a powerful technique that allows selection and reliability rating of solid tantalum capacitors for military and space applications. However, inaccuracies in the existing method and non-adequate acceleration factors can result in significant, up to three orders of magnitude, errors in the calculated failure rate of capacitors. This paper analyzes deficiencies of the existing technique and recommends a more accurate method of calculations. A physical model presenting failures of tantalum capacitors as time-dependent-dielectric-breakdown is used to determine voltage and temperature acceleration factors and select adequate Weibull grading test conditions. This model is verified by highly accelerated life testing (HALT) at different temperature and voltage conditions for three types of solid chip tantalum capacitors. It is shown that parameters of the model and acceleration factors can be calculated using a general log-linear relationship for the characteristic life with two stress levels.
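A general log-linear relationship for the characteristic life is commonly written as eta proportional to V^-n * exp(Ea/kT). Assuming that form (the paper's exact parameterization is not given), the acceleration factor between use and test conditions can be sketched as follows; the voltage exponent and activation energy values are illustrative, not the paper's fitted results:

```python
from math import exp

BOLTZMANN_EV = 8.617e-5  # Boltzmann constant, eV/K

def acceleration_factor(v_use, t_use, v_test, t_test, n, ea):
    """Acceleration factor AF = eta_use / eta_test for a characteristic
    life eta proportional to V^-n * exp(Ea / kT), a common log-linear form.

    n  : voltage exponent (here an assumed value, fitted in practice
         from life tests at two stress levels)
    ea : activation energy in eV (also an assumed value)
    Temperatures are in kelvin.
    """
    voltage_af = (v_test / v_use) ** n
    thermal_af = exp((ea / BOLTZMANN_EV) * (1.0 / t_use - 1.0 / t_test))
    return voltage_af * thermal_af

# Example: rated use at 10 V / 300 K, accelerated test at 15 V / 358 K.
af = acceleration_factor(10.0, 300.0, 15.0, 358.0, n=5.0, ea=0.8)
```

With two stress levels tested per stress type, n and Ea can each be solved from the ratio of measured characteristic lives, which is how a two-stress-level HALT matrix pins down the model parameters.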
NASA Technical Reports Server (NTRS)
Sprowls, D. O.
1984-01-01
A review of the literature is presented with the objectives of identifying relationships between various accelerated stress corrosion testing techniques, and for determining the combination of test methods best suited to selection and design of high strength aluminum alloys. The following areas are reviewed: status of stress-corrosion test standards, the influence of mechanical and environmental factors on stress corrosion testing, correlation of accelerated test data with in-service experience, and procedures used to avoid stress corrosion problems in service. Promising areas for further work are identified.
NASA Technical Reports Server (NTRS)
Jennings, W. P.; Olsen, N. L.; Walter, M. J.
1976-01-01
The development of testing techniques useful in airplane ground resonance testing, wind tunnel aeroelastic model testing, and airplane flight flutter testing is presented. Included is the consideration of impulsive excitation, steady-state sinusoidal excitation, and random and pseudorandom excitation. Reasons for the selection of fast sine sweeps for transient excitation are given. The use of the fast fourier transform dynamic analyzer (HP-5451B) is presented, together with a curve fitting data process in the Laplace domain to experimentally evaluate values of generalized mass, model frequencies, dampings, and mode shapes. The effects of poor signal to noise ratios due to turbulence creating data variance are discussed. Data manipulation techniques used to overcome variance problems are also included. The experience is described that was gained by using these techniques since the early stages of the SST program. Data measured during 747 flight flutter tests, and SST, YC-14, and 727 empennage flutter model tests are included.
NASA Astrophysics Data System (ADS)
Weeks, Brian E.
College students often come to the study of evolutionary biology with many misconceptions of how the processes of natural selection and speciation occur. How to help learners relinquish these misconceptions is a question that many educators face in introductory biology courses. Constructivism as a theoretical framework has become an accepted and promoted model within the epistemology of science instruction. However, constructivism is not without its skeptics, who see problems in its application, notably a lack of the guidance that novice learners need. This study, within a quantitative, quasi-experimental format, tested whether guided online instruction, in the form of a video addressing common misconceptions in evolutionary biology, produced higher performance on a survey of knowledge of natural selection than more constructivist-style learning in the form of student exploration of computer simulations of the evolutionary process. Survey performance was also examined for a combination of constructivist and guided techniques, to determine whether consolidating the approaches produced higher test scores. Of the 94 participants, 95% displayed at least one misconception of natural selection in the pre-test, while the study treatments produced no statistically significant improvements in post-test scores except within the video (guided learning) treatment. These results demonstrate the stubbornness of misconceptions involving natural selection among adult learners and the difficulty of helping such learners overcome them. They also bolster the idea that some misconceptions of natural selection and evolution may be hardwired in a neurological sense and that new, longer-term teaching techniques may be warranted. Such long-term strategies may not be best implemented with constructivist techniques alone, and it is likely that some level of guidance will be necessary for novice adult learners.
A more substantial, nuanced approach for undergraduates is needed, one that consolidates the teaching strategies that current research shows to be successful with adult students.
Thermal sensing of cryogenic wind tunnel model surfaces Evaluation of silicon diodes
NASA Technical Reports Server (NTRS)
Daryabeigi, K.; Ash, R. L.; Dillon-Townes, L. A.
1986-01-01
Different sensors and installation techniques for surface temperature measurement of cryogenic wind tunnel models were investigated. Silicon diodes were selected for further consideration because of their good inherent accuracy. Their average absolute temperature deviation in comparison tests with standard platinum resistance thermometers was found to be 0.2 K in the range from 125 to 273 K. Subsurface temperature measurement was selected as the installation technique in order to minimize aerodynamic interference. Temperature distortion caused by an embedded silicon diode was studied numerically.
Conducting field studies for testing pesticide leaching models
Smith, Charles N.; Parrish, Rudolph S.; Brown, David S.
1990-01-01
A variety of predictive models are being applied to evaluate the transport and transformation of pesticides in the environment. These include well known models such as the Pesticide Root Zone Model (PRZM), the Risk of Unsaturated-Saturated Transport and Transformation Interactions for Chemical Concentrations Model (RUSTIC) and the Groundwater Loading Effects of Agricultural Management Systems Model (GLEAMS). The potentially large impacts of using these models as tools for developing pesticide management strategies and regulatory decisions necessitates development of sound model validation protocols. This paper offers guidance on many of the theoretical and practical problems encountered in the design and implementation of field-scale model validation studies. Recommendations are provided for site selection and characterization, test compound selection, data needs, measurement techniques, statistical design considerations and sampling techniques. A strategy is provided for quantitatively testing models using field measurements.
Polyphasic approach for differentiating Penicillium nordicum from Penicillium verrucosum.
Berni, E; Degola, F; Cacchioli, C; Restivo, F M; Spotti, E
2011-04-01
The aim of this research was to use a polyphasic approach to differentiate Penicillium verrucosum from Penicillium nordicum, to compare different techniques, and to select the most suitable for industrial use. In particular, (1) a cultural technique with two substrates selective for these species; (2) a molecular diagnostic test recently set up and a RAPD procedure derived from this assay; (3) an RP-HPLC analysis to quantify ochratoxin A (OTA) production and (4) an automated system based on fungal carbon source utilisation (Biolog Microstation™) were used. Thirty strains isolated from meat products and originally identified as P. verrucosum by morphological methods were re-examined by newer cultural tests and by PCR methods. All were found to belong to P. nordicum. Their biochemical and chemical characterisation supported the results obtained by cultural and molecular techniques and showed the varied ability in P. verrucosum and P. nordicum to metabolise carbon-based sources and to produce OTA at different concentrations, respectively.
Space Shuttle Main Engine Liquid Air Insulation Redesign Lessons Learned
NASA Technical Reports Server (NTRS)
Gaddy, Darrell; Carroll, Paul; Head, Kenneth; Fasheh, John; Stuart, Jessica
2010-01-01
The Space Shuttle Main Engine Liquid Air Insulation redesign was required to prevent recurrence of the STS-111 High Pressure Speed Sensor In-Flight Anomaly. The STS-111 In-Flight Anomaly Failure Investigation Team's initial redesign of the High Pressure Fuel Turbopump Pump End Ball Bearing Liquid Air Insulation failed the certification test by producing liquid air. The certification test failure implicated not only the High Pressure Fuel Turbopump Liquid Air Insulation but all other Space Shuttle Main Engine Liquid Air Insulation. This paper documents the original Space Shuttle Main Engine Liquid Air STS-111 In-Flight Anomaly investigation; the faults in the heritage Space Shuttle Main Engine insulation certification testing; the techniques and instrumentation used to accurately test the Liquid Air Insulation systems on the Stennis Space Center SSME test stand; the analysis techniques used to identify the Liquid Air Insulation problem areas and the analytical verification of the redesign before entering certification testing; the trade study that down-selected three potential design solutions; and the results of the development testing that down-selected the final Liquid Air redesign.
Ground test challenges in the development of the Space Shuttle orbiter auxiliary power unit
NASA Technical Reports Server (NTRS)
Chaffee, N. H.; Lance, R. J.; Weary, D. P.
1984-01-01
A conventional aircraft hydraulic system design approach was selected to provide fluid power for the Space Shuttle Orbiter. Developing the power unit, known as the Auxiliary Power Unit (APU), to drive the hydraulic pumps presented a major technological challenge. A small, high speed turbine drive unit powered by catalytically decomposed hydrazine and operating in the pulse mode was selected to meet the requirement. Because of limitations of vendor test facilities, significant portions of the development, flight qualification, and postflight anomaly testing of the Orbiter APU were accomplished at the Johnson Space Center (JSC) test facilities. This paper discusses the unique requirements of attitude, gravity forces, pressure profiles, and thermal environments which had to be satisfied by the APU, and presents the unique test facility and simulation techniques employed to meet the ground test requirements. In particular, the development of the zero-g lubrication system, the development of necessary APU thermal control techniques, the accomplishment of integrated systems tests, and the postflight investigation of the APU lube oil cooler behavior are discussed.
NASA Astrophysics Data System (ADS)
Gruzin, A. V.; Gruzin, V. V.; Shalay, V. V.
2017-08-01
The development of a technology for directional soil compaction of tank foundations for the storage of oil and oil products is a relevant problem, whose solution would simultaneously provide the required operational characteristics of a soil foundation and reduce the time and material costs of preparing it. The impact dynamics of the rammers' operating elements on the soil foundation will be specified in the course of laboratory studies. A specialized technique was developed to justify the parameters and select the equipment for the laboratory research. Using this technique, we calculated the dimensions of the models and the test bench and the specifications of the recording equipment and lighting system. The necessary equipment for the laboratory studies was selected, preliminary laboratory tests were carried out, and an estimate of the accuracy of the planned laboratory studies was given.
Frequentist Model Averaging in Structural Equation Modelling.
Jin, Shaobo; Ankargren, Sebastian
2018-06-04
Model selection from a set of candidate models plays an important role in many structural equation modelling applications. However, traditional model selection methods introduce extra randomness that is not accounted for by post-model selection inference. In the current study, we propose a model averaging technique within the frequentist statistical framework. Instead of selecting an optimal model, the contributions of all candidate models are acknowledged. Valid confidence intervals and a [Formula: see text] test statistic are proposed. A simulation study shows that the proposed method is able to produce a robust mean-squared error, a better coverage probability, and a better goodness-of-fit test compared to model selection. It is an interesting compromise between model selection and the full model.
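As a rough illustration of frequentist model averaging (the paper proposes its own weighting scheme and confidence intervals, which are not reproduced here), one common variant weights candidate-model estimates by Akaike weights instead of committing to a single selected model:

```python
from math import exp

def akaike_weights(aics):
    """Akaike weights: a common frequentist model-averaging weight,
    used here purely as an illustration of the averaging idea."""
    best = min(aics)
    raw = [exp(-0.5 * (a - best)) for a in aics]
    total = sum(raw)
    return [r / total for r in raw]

def averaged_estimate(estimates, aics):
    """Average a parameter estimate across candidate models, so every
    model contributes in proportion to its relative support."""
    return sum(w * e for w, e in zip(akaike_weights(aics), estimates))

# Three candidate models with the same parameter estimated as 1.2, 1.5, 0.9,
# and increasing AIC (decreasing support).
combined = averaged_estimate([1.2, 1.5, 0.9], [100.0, 102.0, 104.0])
```

Because no single model is discarded, the extra randomness of a hard select-then-infer step is avoided, which is the motivation the abstract gives for averaging over selection.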
ITEM SELECTION TECHNIQUES AND EVALUATION OF INSTRUCTIONAL OBJECTIVES.
ERIC Educational Resources Information Center
COX, RICHARD C.
The validity of an educational achievement test depends upon the correspondence between specified educational objectives and the extent to which these objectives are measured by the evaluation instrument. This study is designed to evaluate the effect of statistical item selection on the structure of the final evaluation instrument as compared with…
Use of color-coded sleeve shutters accelerates oscillograph channel selection
NASA Technical Reports Server (NTRS)
Bouchlas, T.; Bowden, F. W.
1967-01-01
Sleeve-type shutters mechanically adjust individual galvanometer light beams onto or away from selected channels on oscillograph papers. In complex test setups, the sleeve-type shutters are color coded to separately identify each oscillograph channel. This technique could be used on any equipment using tubular galvanometer light sources.
A Comparison of Item Selection Techniques for Testlets
ERIC Educational Resources Information Center
Murphy, Daniel L.; Dodd, Barbara G.; Vaughn, Brandon K.
2010-01-01
This study examined the performance of the maximum Fisher's information, the maximum posterior weighted information, and the minimum expected posterior variance methods for selecting items in a computerized adaptive testing system when the items were grouped in testlets. A simulation study compared the efficiency of ability estimation among the…
Lin, Chun-Yuan; Wang, Yen-Ling
2014-01-01
Checkpoint kinase 2 (Chk2) has a great effect on DNA-damage and plays an important role in response to DNA double-strand breaks and related lesions. In this study, we concentrate on Chk2; the purpose is to find potential inhibitors using pharmacophore hypotheses (PhModels), combinatorial fusion, and virtual screening techniques. Applying combinatorial fusion to PhModels and virtual screening techniques is a novel design strategy for drug design. We used combinatorial fusion to analyze the prediction results and obtained the best correlation coefficient of the testing set (r test), with a value of 0.816, by combining the Best(train)Best(test) and Fast(train)Fast(test) prediction results. The potential inhibitors were selected from the NCI database by screening according to the Best(train)Best(test) + Fast(train)Fast(test) prediction results and by molecular docking with the CDOCKER docking program. Finally, the selected compounds have high interaction energy between ligand and receptor. Through these approaches, 23 potential inhibitors of Chk2 were retrieved for further study.
Reliability of High-Voltage Tantalum Capacitors (Parts 3 and 4)
NASA Technical Reports Server (NTRS)
Teverovsky, Alexander
2010-01-01
Weibull grading test is a powerful technique that allows selection and reliability rating of solid tantalum capacitors for military and space applications. However, inaccuracies in the existing method and non-adequate acceleration factors can result in significant, up to three orders of magnitude, errors in the calculated failure rate of capacitors. This paper analyzes deficiencies of the existing technique and recommends a more accurate method of calculations. A physical model presenting failures of tantalum capacitors as time-dependent-dielectric-breakdown is used to determine voltage and temperature acceleration factors and select adequate Weibull grading test conditions. This model is verified by highly accelerated life testing (HALT) at different temperature and voltage conditions for three types of solid chip tantalum capacitors. It is shown that parameters of the model and acceleration factors can be calculated using a general log-linear relationship for the characteristic life with two stress levels.
ERIC Educational Resources Information Center
Grand, James A.; Golubovich, Juliya; Ryan, Ann Marie; Schmitt, Neal
2013-01-01
In organizational and educational practices, sensitivity reviews are commonly advocated techniques for reducing test bias and enhancing fairness. In the present paper, results from two studies are reported which investigate how effective individuals are at detecting problematic test content and the influence such content has on important testing…
Eyeball Measurement of Dexterity: Tests as Alternatives to Interviews.
ERIC Educational Resources Information Center
Guion, Robert M.; Imada, Andrew S.
1981-01-01
Reports a study conducted for litigation in a sex discrimination case dealing with misuse of an employment interview. Results show that dexterity could not be determined in an interview and a more appropriate selection technique such as a test was required. (Author/JAC)
Li, Jiangeng; Su, Lei; Pang, Zenan
2015-12-01
Feature selection techniques have been widely applied to tumor gene expression data analysis in recent years. A filter feature selection method named marginal Fisher analysis score (MFA score), which is based on graph embedding, has been proposed and widely used, mainly because it is superior to Fisher score. Considering the heavy redundancy in gene expression data, we proposed a new filter feature selection technique in this paper. It is named MFA score+ and is based on MFA score and redundancy excluding. We applied it to an artificial dataset and eight tumor gene expression datasets to select important features, then used a support vector machine as the classifier to classify the samples. Compared with MFA score, the t test, and Fisher score, it achieved higher classification accuracy.
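The score-plus-redundancy-excluding idea can be sketched generically. The exact MFA score+ criterion is not given in the abstract, so the Pearson-correlation redundancy test and the threshold below are assumptions, not the paper's method:

```python
def pearson(a, b):
    """Pearson correlation between two equal-length numeric sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a)
    vb = sum((y - mb) ** 2 for y in b)
    return cov / (va * vb) ** 0.5 if va and vb else 0.0

def select_with_redundancy_excluding(features, scores, max_corr=0.9):
    """Greedy filter selection: rank features by their filter score, then
    drop any candidate too correlated with an already-selected feature.
    features: list of per-feature value sequences (one per gene)
    scores:   per-feature relevance scores (e.g. an MFA-style score)
    Returns the indices of the selected features."""
    order = sorted(range(len(features)), key=lambda i: scores[i], reverse=True)
    chosen = []
    for i in order:
        if all(abs(pearson(features[i], features[j])) < max_corr
               for j in chosen):
            chosen.append(i)
    return chosen

# Toy data: feature 1 duplicates feature 0 (perfectly redundant).
feats = [[1, 2, 3, 4], [2, 4, 6, 8], [1, 0, 1, 0]]
picked = select_with_redundancy_excluding(feats, [3.0, 2.0, 1.0])
```

In the toy data the second-ranked feature is a scaled copy of the first, so it is excluded and the lower-scoring but non-redundant third feature is kept instead.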
NASA Astrophysics Data System (ADS)
Tong, Xin; Winney, Alexander H.; Willitsch, Stefan
2010-10-01
We present a new method for the generation of rotationally and vibrationally state-selected, translationally cold molecular ions in ion traps. Our technique is based on the state-selective threshold photoionization of neutral molecules followed by sympathetic cooling of the resulting ions with laser-cooled calcium ions. Using N2+ ions as a test system, we achieve >90% selectivity in the preparation of the ground rovibrational level and state lifetimes on the order of 15 minutes limited by collisions with background-gas molecules. The technique can be employed to produce a wide range of apolar and polar molecular ions in the ground and excited rovibrational states. Our approach opens up new perspectives for cold quantum-controlled ion-molecule-collision studies, frequency-metrology experiments with state-selected molecular ions and molecular-ion qubits.
Mac A. Callaham; Arthur J. Stewart; Clara Alarcon; Sara J. McMillen
2002-01-01
Current bioremediation techniques for petroleum-contaminated soils are designed to remove contaminants as quickly and efficiently as possible, but not necessarily with postremediation soil biological quality as a primary objective. To test a simple postbioremediation technique, we added earthworms (Eisenia fetida) or wheat (Triticum aestivum...
Commeau, R.F.; Reynolds, Leslie A.; Poag, C.W.
1985-01-01
The composition of agglutinated foraminiferal tests varies remarkably in response to local substrate characteristics, physicochemical properties of the water column, and species-dependent selectivity of test components. We have employed a technique that combines a scanning electron microscope with an energy-dispersive X-ray spectrometer system to identify major and minor elemental constituents of agglutinated foraminiferal walls. As a sample is bombarded with a beam of high-energy electrons, X-rays are generated that are characteristic of the elements present. As a result, X-ray density maps can be produced for each of several elements present in the tests of agglutinated foraminifers.
ERIC Educational Resources Information Center
Weisgerber, Robert A.; deHaas, Carla
The report describes an effort to develop and test instructional materials, techniques and procedures - ESSETS (environmental sensing, selection, evaluation and training system) - for teaching functionally blind young adults to use electronic travel aids (ETAs). Considered are development of training guidelines, field site selection and instructor…
A Simple Test Identifies Selection on Complex Traits.
Beissinger, Tim; Kruppa, Jochen; Cavero, David; Ha, Ngoc-Thuy; Erbe, Malena; Simianer, Henner
2018-05-01
Important traits in agricultural, natural, and human populations are increasingly being shown to be under the control of many genes that individually contribute only a small proportion of genetic variation. However, the majority of modern tools in quantitative and population genetics, including genome-wide association studies and selection-mapping protocols, are designed to identify individual genes with large effects. We have developed an approach to identify traits that have been under selection and are controlled by large numbers of loci. In contrast to existing methods, our technique uses additive-effects estimates from all available markers, and relates these estimates to allele-frequency change over time. Using this information, we generate a composite statistic, denoted [Formula: see text] which can be used to test for significant evidence of selection on a trait. Our test requires pre- and postselection genotypic data but only a single time point with phenotypic information. Simulations demonstrate that [Formula: see text] is powerful for identifying selection, particularly in situations where the trait being tested is controlled by many genes, which is precisely the scenario where classical approaches for selection mapping are least powerful. We apply this test to breeding populations of maize and chickens, where we demonstrate the successful identification of selection on traits that are documented to have been under selection. Copyright © 2018 Beissinger et al.
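The core idea, relating marker additive-effect estimates to allele-frequency change over time, can be sketched in a few lines. This sketch omits the variance standardization that the published composite statistic includes, and the function and variable names are illustrative:

```python
def selection_statistic(effects, p_before, p_after):
    """Unstandardized composite statistic: sum over markers of the
    estimated additive effect times the allele-frequency change.
    Under drift alone, changes are uncorrelated with effects and the
    sum hovers near zero; directional selection on the trait pushes
    frequency changes in the direction of the effects, inflating it.

    effects:  per-marker additive-effect estimates
    p_before: pre-selection allele frequencies
    p_after:  post-selection allele frequencies
    """
    return sum(a * (p1 - p0)
               for a, p0, p1 in zip(effects, p_before, p_after))

# Two markers: both frequency shifts align with the sign of the effect,
# as expected under selection for a higher trait value.
stat = selection_statistic([0.5, -0.2], [0.5, 0.5], [0.6, 0.4])
```

Significance testing in practice would compare the observed value against a null distribution generated by drift simulations or permutation, which is where the omitted standardization matters.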
Accelerated testing of space mechanisms
NASA Technical Reports Server (NTRS)
Murray, S. Frank; Heshmat, Hooshang
1995-01-01
This report contains a review of various existing life prediction techniques used for a wide range of space mechanisms. Life prediction techniques utilized in other, non-space fields such as turbine engine design are also reviewed for applicability to many space mechanism issues. The development of new concepts on how various tribological processes are involved in the life of the complex mechanisms used for space applications is examined. A 'roadmap' for the complete implementation of a tribological prediction approach for complex mechanical systems, including standard procedures for test planning, analytical models for life prediction, and experimental verification of the life prediction and accelerated testing techniques, is discussed. A plan is presented to demonstrate a method for predicting the life and/or performance of a selected space mechanism mechanical component.
Refurbishment cost study of the thermal protection system of a space shuttle vehicle, phase 2
NASA Technical Reports Server (NTRS)
Haas, D. W.
1972-01-01
The labor costs and techniques associated with the refurbishment and maintenance of representative thermal protection system (TPS) components and their attachment concepts suitable for space shuttle application are defined, characterized, and evaluated from the results of an experimental test program. This program consisted of designing selected TPS concepts, fabricating and assembling test hardware, and performing a time and motion study of specific maintenance functions of the test hardware on a full-scale mockup. Labor requirements and refurbishment techniques, as they relate to the maintenance functions of inspection, repair, removal, and replacement, were identified.
Pretest information for a test to validate plume simulation procedures (FA-17)
NASA Technical Reports Server (NTRS)
Hair, L. M.
1978-01-01
The results of an effort to plan a final verification wind tunnel test to validate the recommended correlation parameters and application techniques were presented. The test planning effort was complete except for test site finalization and the associated coordination. Two suitable test sites were identified. Desired test conditions were shown. Subsequent sections of this report present the selected model and test site, instrumentation of this model, planned test operations, and some concluding remarks.
Techniques for Embedding Instrumentation in Pressure Vessel Test Articles
NASA Technical Reports Server (NTRS)
Cornelius, Michael
2006-01-01
Many interesting structural and thermal events occur in materials that are housed within a surrounding pressure vessel. In order to measure the environment during these events and explore their causes, instrumentation must be installed on or in the material. Transducers can be selected that are small enough to be embedded within the test material, but these instruments must interface with an external system in order to apply excitation voltages and output the desired data. The methods for installing the instrumentation and creating an interface are complicated when the material is located in a case or housing containing high pressures and hot gases. Installation techniques for overcoming some of these difficulties were developed while testing a series of small-scale solid propellant and hybrid rocket motors at Marshall Space Flight Center. These techniques have potential applications in other test articles where data are acquired from materials that require containment due to the severe environment encountered during the test process. This severe environment could include high pressure, hot gases, or ionized atmospheres. The development of these techniques, problems encountered, and the lessons learned from the ongoing testing process are summarized.
Development of low cost custom hybrid microcircuit technology
NASA Technical Reports Server (NTRS)
Perkins, K. L.; Licari, J. J.
1981-01-01
Selected potentially low cost, alternate packaging and interconnection techniques were developed and implemented in the manufacture of specific NASA/MSFC hardware, and the actual cost savings achieved by their use were evaluated. The hardware chosen as the test bed for this evaluation was the hybrids and modules manufactured by Rockwell International for the MSFC Flight Accelerometer Safety Cut-Off System (FASCOS). Three potentially low cost packaging and interconnection alternates were selected for evaluation. This study was performed in three phases: hardware fabrication and testing, cost comparison, and reliability evaluation.
Pohlert, Thorsten; Hillebrand, Gudrun; Breitung, Vera
2011-06-01
This study focuses on the effect of sampling techniques for suspended matter in stream water on subsequent particle-size distribution and concentrations of total organic carbon and selected persistent organic pollutants. The key questions are whether differences between the sampling techniques are due to the separation principle of the devices or due to the difference between time-proportional versus integral sampling. Several multivariate homogeneity tests were conducted on an extensive set of field data that covers the period from 2002 to 2007, when up to three different sampling techniques were deployed in parallel at four monitoring stations of the River Rhine. The results indicate homogeneity for polychlorinated biphenyls, but significant effects due to the sampling techniques on particle size, organic carbon, and hexachlorobenzene. The effects can be amplified depending on the site characteristics of the monitoring stations.
NASA Technical Reports Server (NTRS)
Fear, J. S.
1983-01-01
An assessment is made of the results of Phase 1 screening testing of current and advanced combustion system concepts using several broadened-properties fuels. The severity of each of several fuels-properties effects on combustor performance or liner life is discussed, as well as design techniques with the potential to offset these adverse effects. The selection of concepts to be pursued in Phase 2 refinement testing is described. This selection takes into account the relative costs and complexities of the concepts, the current outlook on pollutant emissions control, and practical operational problems.
NASA Technical Reports Server (NTRS)
Smedes, H. W.; Linnerud, H. J.; Woolaver, L. B.; Su, M. Y.; Jayroe, R. R.
1972-01-01
Two clustering techniques were used for terrain mapping by computer of test sites in Yellowstone National Park. One test was made with multispectral scanner data using a composite technique which consists of (1) a strictly sequential statistical clustering, which is a sequential variance analysis, and (2) a generalized K-means clustering. In this composite technique, the output of (1) is a first approximation of the cluster centers. This is the input to (2), which consists of steps to improve the determination of cluster centers by iterative procedures. Another test was made using the three emulsion layers of color-infrared aerial film as a three-band spectrometer. Relative film densities were analyzed using a simple clustering technique in three-color space. Important advantages of the clustering technique over conventional supervised computer programs are that (1) human intervention, preparation time, and manipulation of data are reduced; (2) the computer map gives an unbiased indication of where best to select the reference ground control data; (3) easy-to-obtain, inexpensive film can be used; and (4) geometric distortions can be easily rectified by simple standard photogrammetric techniques.
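Step (2) of the composite technique, refining first-approximation cluster centers by iterative reassignment, is the classic K-means update. A minimal sketch (function name and data are illustrative, not from the report):

```python
import numpy as np

def kmeans_refine(X, centers, n_iter=20):
    """Generalized K-means refinement: starting from the first-approximation
    centers produced by the sequential clustering step, repeatedly assign
    each sample to its nearest center and recompute centers as cluster means."""
    centers = np.asarray(centers, dtype=float).copy()
    for _ in range(n_iter):
        # distance of every sample to every center, then nearest-center labels
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        for k in range(len(centers)):
            if np.any(labels == k):
                centers[k] = X[labels == k].mean(axis=0)
    return centers, labels
```

For terrain mapping, each row of `X` would be one pixel's multispectral (or three-band film density) vector, and the final labels form the thematic map.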
40 CFR 158.2110 - Microbial pesticides data requirements.
Code of Federal Regulations, 2013 CFR
2013-07-01
...: genetic engineering techniques used; the identity of the inserted or deleted gene segment (base sequence... evaluate genetic stability and exchange; and selected Tier II environmental expression and toxicology tests. ...
Ensemble of sparse classifiers for high-dimensional biological data.
Kim, Sunghan; Scalzo, Fabien; Telesca, Donatello; Hu, Xiao
2015-01-01
Biological data are often high in dimension while the number of samples is small. In such cases, the performance of classification can be improved by reducing the dimension of the data, which is referred to as feature selection. Recently, a novel feature selection method was proposed utilising the sparsity of high-dimensional biological data, where a small subset of features accounts for most of the variance of the dataset. In this study we propose a new classification method for high-dimensional biological data, which performs both feature selection and classification within a single framework. Our proposed method utilises a sparse linear solution technique and the bootstrap aggregating algorithm. We tested its performance on four public mass spectrometry cancer datasets against two conventional classification techniques, Support Vector Machines and Adaptive Boosting. The results demonstrate that our proposed method performs more accurate classification across various cancer datasets than those conventional classification techniques.
Fukunaga-Koontz transform based dimensionality reduction for hyperspectral imagery
NASA Astrophysics Data System (ADS)
Ochilov, S.; Alam, M. S.; Bal, A.
2006-05-01
The Fukunaga-Koontz Transform (FKT) based technique offers some attractive properties for desired-class-oriented dimensionality reduction in hyperspectral imagery. In FKT, feature selection is performed by transforming into a new space in which the two classes, desired class and background clutter, have complementary eigenvectors: each basis function best represents one class while carrying the least amount of information about the other. By selecting a few eigenvectors which are most relevant to the desired class, one can reduce the dimension of the hyperspectral cube. Since the FKT based technique reduces data size, it provides significant advantages for near real time detection applications in hyperspectral imagery. Furthermore, the eigenvector selection approach significantly reduces the computational burden of the dimensionality reduction process. The performance of the proposed dimensionality reduction algorithm has been tested using a real-world hyperspectral dataset.
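The complementary-eigenvector property can be sketched directly: after whitening the summed class scatter, the two whitened scatters add to the identity, so an eigenvector with eigenvalue near 1 for the desired class has eigenvalue near 0 for the clutter class. A minimal numpy sketch (not the paper's implementation; function name and selection rule are illustrative):

```python
import numpy as np

def fkt_basis(X1, X2, n_keep=1):
    """Fukunaga-Koontz sketch: whiten S1+S2, eigendecompose the whitened
    class-1 scatter, and keep the eigenvectors most dominant for class 1.
    Because the whitened scatters sum to I, these same eigenvectors carry
    the least energy of class 2 (the complementary property)."""
    S1 = np.cov(X1, rowvar=False)
    S2 = np.cov(X2, rowvar=False)
    w, V = np.linalg.eigh(S1 + S2)
    P = V @ np.diag(1.0 / np.sqrt(w)) @ V.T      # whitening transform
    lam, U = np.linalg.eigh(P @ S1 @ P)          # eigenvalues lie in [0, 1]
    order = np.argsort(lam)[::-1]                # most class-1-dominant first
    return (P @ U[:, order[:n_keep]]), lam[order]
```

Projecting hyperspectral pixels onto the few kept basis vectors reduces the cube's dimension while favoring the desired class.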
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-05
... testing periodically and when there has been a material change in the product's design or manufacturing... control data during product manufacture; and using manufacturing techniques with intrinsic manufacturing... sample in the production population an equal probability of being selected (75 FR at 28349 through 28350...
DOT National Transportation Integrated Search
1971-06-01
A study was conducted in which performance on a non-verbal problem-solving task was correlated with the Otis Quick Scoring Mental Ability Test and the Raven Progressive Matrices Test. The problem-solving task, called 'code-lock', required the subjec...
Real time automatic detection of bearing fault in induction machine using kurtogram analysis.
Tafinine, Farid; Mokrani, Karim
2012-11-01
A proposed signal processing technique for incipient real time bearing fault detection based on kurtogram analysis is presented in this paper. The kurtogram is a fourth-order spectral analysis tool introduced for detecting and characterizing non-stationarities in a signal. This technique starts by investigating the resonance signatures over selected frequency bands to extract the representative features. Traditional spectral analysis is not appropriate for non-stationary vibration signals or for real time diagnosis. The performance of the proposed technique is examined by a series of experimental tests corresponding to different bearing conditions. Test results show that this signal processing technique is an effective method for automatic bearing fault detection and gives a good basis for an integrated induction machine condition monitor.
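The kurtogram idea, scoring candidate frequency bands by the kurtosis of the band-filtered signal so that impulsive fault resonances stand out, can be sketched with FFT band masking. This is a simplification of the full kurtogram (which scans a dyadic grid of band/bandwidth combinations); the function name and test signal are illustrative.

```python
import numpy as np

def band_kurtosis(x, fs, bands):
    """Score each candidate (lo, hi) band by the kurtosis of the
    band-pass-filtered signal; impulsive bearing-fault resonances
    give high kurtosis, smooth tonal content stays near 1.5-3."""
    X = np.fft.rfft(x)
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    scores = []
    for lo, hi in bands:
        mask = (freqs >= lo) & (freqs < hi)
        xb = np.fft.irfft(X * mask, n=len(x))    # crude band-pass via FFT mask
        xb = xb - xb.mean()
        m2 = np.mean(xb ** 2)
        scores.append(np.mean(xb ** 4) / m2 ** 2 if m2 > 0 else 0.0)
    return scores
```

The band with the highest score is the candidate carrier band for envelope analysis of the fault signature.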
Flori, Pierre; Delaunay, Edouard; Guillerme, Cécile; Charaoui, Sana; Raberin, Hélène; Hafid, Jamal; L'Ollivier, Coralie
2017-01-01
ABSTRACT A study comparing the ICT (immunochromatography technology) Toxoplasma IgG and IgM rapid diagnostic test (LDBio Diagnostics, France) with a fully automated system, Architect, was performed on samples from university hospitals of Marseille and Saint-Etienne. A total of 767 prospective sera and 235 selected sera were collected. The panels were selected to test various IgG and IgM parameters. The reference technique, Toxoplasma IgGII Western blot analysis (LDBio Diagnostics), was used to confirm the IgG results, and commercial kits Platelia Toxo IgM (Bio-Rad) and Toxo-ISAgA (bioMérieux) were used in Saint-Etienne and Marseille, respectively, as the IgM reference techniques. Sensitivity and specificity of the ICT and the Architect IgG assays were compared using a prospective panel. Sensitivity was 100% for the ICT test and 92.1% for Architect (cutoff at 1.6 IU/ml). The low-IgG-titer serum results confirmed that ICT sensitivity was superior to that of Architect. Specificity was 98.7% (ICT) and 99.8% (Architect IgG). The ICT test is also useful for detecting IgM without IgG and is both sensitive (100%) and specific (100%), as it can distinguish nonspecific IgM from specific Toxoplasma IgM. In comparison, IgM sensitivity and specificity on Architect are 96.1% and 99.6%, respectively (cutoff at 0.5 arbitrary units [AU]/ml). To conclude, this new test overcomes the limitations of automated screening techniques, which are not sensitive enough for IgG and lack specificity for IgM (rare IgM false-positive cases). PMID:28954897
Novel magnetically separable TiO2-guanidine-(Ni,Co)Fe2O4 nanomaterials were prepared and characterised by a series of techniques including XRD, SEM, TEM, N2 physisorption as well as XPS and subsequently tested for their photocatalytic activities in the selective transformation of...
The NASA B-757 HIRF Test Series: Flight Test Results
NASA Technical Reports Server (NTRS)
Moeller, Karl J.; Dudley, Kenneth L.
1997-01-01
In 1995, the NASA Langley Research Center conducted a series of aircraft tests aimed at characterizing the electromagnetic environment (EME) in and around a Boeing 757 airliner. Measurements were made of the electromagnetic energy coupled into the aircraft and the signals induced on select structures as the aircraft was flown past known RF transmitters. These measurements were conducted to provide data for the validation of computational techniques for the assessment of electromagnetic effects in commercial transport aircraft. This paper reports on the results of flight tests using RF radiators in the HF, VHF, and UHF ranges and on efforts to use computational and analytical techniques to predict RF field levels inside the airliner at these frequencies.
Hamula, Camille L A; Peng, Hanyong; Wang, Zhixin; Tyrrell, Gregory J; Li, Xing-Fang; Le, X Chris
2016-03-15
Streptococcus pyogenes is a clinically important pathogen consisting of various serotypes determined by different M proteins expressed on the cell surface. The M type is therefore a useful marker to monitor the spread of invasive S. pyogenes in a population. Serotyping and nucleic acid amplification/sequencing methods for the identification of M types are laborious, inconsistent, and usually confined to reference laboratories. The primary objective of this work is to develop a technique that enables generation of aptamers binding to specific M-types of S. pyogenes. We describe here an in vitro technique that directly used live bacterial cells and the Systematic Evolution of Ligands by Exponential Enrichment (SELEX) strategy. Live S. pyogenes cells were incubated with DNA libraries consisting of 40-nucleotide randomized sequences. Those sequences that bound to the cells were separated, amplified using polymerase chain reaction (PCR), purified using gel electrophoresis, and served as the input DNA pool for the next round of SELEX selection. A specially designed forward primer containing extended polyA20/5Sp9 facilitated gel electrophoresis purification of ssDNA after PCR amplification. A counter-selection step using non-target cells was introduced to improve selectivity. DNA libraries of different starting sequence diversity (10^16 and 10^14) were compared. Aptamer pools from each round of selection were tested for their binding to the target and non-target cells using flow cytometry. Selected aptamer pools were then cloned and sequenced. Individual aptamer sequences were screened on the basis of their binding to the 10 M-types that were used as targets. Aptamer pools obtained from SELEX rounds 5-8 showed high affinity to the target S. pyogenes cells. Tests against non-target Streptococcus bovis, Streptococcus pneumoniae, and Enterococcus species demonstrated selectivity of these aptamers for binding to S. pyogenes. 
Several aptamer sequences were found to bind preferentially to the M11 M-type of S. pyogenes. Estimated binding dissociation constants (Kd) were in the low nanomolar range for the M11 specific sequences; for example, sequence E-CA20 had a Kd of 7±1 nM. These affinities are comparable to those of a monoclonal antibody. The improved bacterial cell-SELEX technique is successful in generating aptamers selective for S. pyogenes and some of its M-types. These aptamers are potentially useful for detecting S. pyogenes, achieving binding profiles of the various M-types, and developing new M-typing technologies for non-specialized laboratories or point-of-care testing. Copyright © 2015 Elsevier Inc. All rights reserved.
Flight control system design factors for applying automated testing techniques
NASA Technical Reports Server (NTRS)
Sitz, Joel R.; Vernon, Todd H.
1990-01-01
The principal design features and operational experiences of the X-29 forward-swept-wing aircraft and F-18 high alpha research vehicle (HARV) automated test systems are discussed. It is noted that operational experiences in developing and using these automated testing techniques have highlighted the need for incorporating target system features to improve testability. Improved target system testability can be accomplished with the addition of nonreal-time and real-time features. Online access to target system implementation details, unobtrusive real-time access to internal user-selectable variables, and proper software instrumentation are all desirable features of the target system. Also, test system and target system design issues must be addressed during the early stages of the target system development. Processing speeds of up to 20 million instructions/s and the development of high-bandwidth reflective memory systems have improved the ability to integrate the target system and test system for the application of automated testing techniques. It is concluded that new methods of designing testability into the target systems are required.
Automatic welding detection by an intelligent tool pipe inspection
NASA Astrophysics Data System (ADS)
Arizmendi, C. J.; Garcia, W. L.; Quintero, M. A.
2015-07-01
This work provides a model based on machine learning techniques for weld recognition, using signals obtained through an in-line inspection tool called a "smart pig" in oil and gas pipelines. The model uses a signal noise reduction phase by means of pre-processing algorithms and attribute-selection techniques. The noise reduction techniques were selected after a literature review and testing with survey data. Subsequently, the model was trained using recognition and classification algorithms, specifically artificial neural networks and support vector machines. Finally, the trained model was validated with different data sets, and the performance was measured with cross validation and ROC analysis. The results show that it is possible to identify welds automatically with an efficiency between 90 and 98 percent.
HIV-1 protease cleavage site prediction based on two-stage feature selection method.
Niu, Bing; Yuan, Xiao-Cheng; Roeper, Preston; Su, Qiang; Peng, Chun-Rong; Yin, Jing-Yuan; Ding, Juan; Li, HaiPeng; Lu, Wen-Cong
2013-03-01
Knowledge of the mechanism of HIV protease cleavage specificity is critical to the design of specific and effective HIV inhibitors. Searching for an accurate, robust, and rapid method to correctly predict the cleavage sites in proteins is crucial when searching for possible HIV inhibitors. In this article, HIV-1 protease specificity was studied using the correlation-based feature subset (CfsSubset) selection method combined with Genetic Algorithms method. Thirty important biochemical features were found based on a jackknife test from the original data set containing 4,248 features. By using the AdaBoost method with the thirty selected features the prediction model yields an accuracy of 96.7% for the jackknife test and 92.1% for an independent set test, with increased accuracy over the original dataset by 6.7% and 77.4%, respectively. Our feature selection scheme could be a useful technique for finding effective competitive inhibitors of HIV protease.
NASA Astrophysics Data System (ADS)
Al-Jumaili, Safaa Kh.; Pearson, Matthew R.; Holford, Karen M.; Eaton, Mark J.; Pullin, Rhys
2016-05-01
An easy to use, fast to apply, cost-effective, and very accurate non-destructive testing (NDT) technique for damage localisation in complex structures is key for the uptake of structural health monitoring systems (SHM). Acoustic emission (AE) is a viable technique that can be used for SHM and one of the most attractive features is the ability to locate AE sources. The time of arrival (TOA) technique is traditionally used to locate AE sources, and relies on the assumption of constant wave speed within the material and uninterrupted propagation path between the source and the sensor. In complex structural geometries and complex materials such as composites, this assumption is no longer valid. Delta T mapping was developed in Cardiff in order to overcome these limitations; this technique uses artificial sources on an area of interest to create training maps. These are used to locate subsequent AE sources. However operator expertise is required to select the best data from the training maps and to choose the correct parameter to locate the sources, which can be a time consuming process. This paper presents a new and improved fully automatic delta T mapping technique where a clustering algorithm is used to automatically identify and select the highly correlated events at each grid point whilst the "Minimum Difference" approach is used to determine the source location. This removes the requirement for operator expertise, saving time and preventing human errors. A thorough assessment is conducted to evaluate the performance and the robustness of the new technique. In the initial test, the results showed excellent reduction in running time as well as improved accuracy of locating AE sources, as a result of the automatic selection of the training data. Furthermore, because the process is performed automatically, this is now a very simple and reliable technique due to the prevention of the potential source of error related to manual manipulation.
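The "Minimum Difference" lookup at the heart of delta T mapping is simple to state: each grid point stores the arrival-time differences (delta Ts) between sensor pairs learned from artificial sources, and a real AE event is located at the grid point whose stored delta Ts differ least from the measured ones. A minimal sketch (function name, array layout, and L1 difference are illustrative assumptions, not the paper's exact formulation):

```python
import numpy as np

def locate_source(training_maps, measured_dt):
    """Delta T mapping lookup: training_maps is an (n_grid, n_pairs) array of
    stored sensor-pair delta Ts from artificial (e.g. pencil-lead-break)
    sources; measured_dt holds the (n_pairs,) delta Ts of the real AE event.
    Return the index of the grid point with the minimum summed difference."""
    diffs = np.abs(training_maps - measured_dt).sum(axis=1)
    return int(diffs.argmin())
```

Because the map is learned empirically, no constant-wave-speed or straight-path assumption is needed; the clustering step described in the abstract would clean the training delta Ts at each grid point before they enter this lookup.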
Quality assessment of the TLS data in conservation of monuments
NASA Astrophysics Data System (ADS)
Markiewicz, Jakub S.; Zawieska, Dorota
2015-06-01
Laser scanning has been recently confirming its high potential in the field of acquiring 3D data for architectural and engineering objects. The objective of this paper is to analyse the quality of the TLS data acquired for different surfaces of monumental objects, with consideration of distances and the scanning angles. Tests concerning the quality of the survey data and shapes of architectural objects, characterised by diversified curvature, structure and the uniformity of the surface, were performed. The obtained results proved that utilisation of terrestrial laser scanning techniques does not achieve the expected accuracy for some historical surfaces and should be substituted by alternative, photogrammetric techniques. Therefore, the typology of constructions of historical objects is important not only for selection of the optimum survey technique, but also for its appropriate utilisation. The test objects were architectural details of the Main Hall of the Warsaw University of Technology. Scans were acquired using the 5006h scanner. Diversified geometry of scans was tested, and the relations between the distance and obtained accuracy were specified. In the case of numerous conservational works the precise surface reconstruction is often important, in order to specify damages. Therefore, the repeatability of obtained TLS results for selected surfaces was also tested. Different surfaces were analysed, composed of different materials having glittery elements and inhomogeneous structure. The obtained results and performed analyses revealed considerable limitations of the TLS technique when applied to measuring surfaces of historical objects. The presented accuracy of TLS-based measurements of historical surfaces may be used by art conservators, museum professionals, archaeologists and other specialists to perform wide analyses of historical heritage objects.
NASA Technical Reports Server (NTRS)
1976-01-01
Data base management techniques and applicable equipment are described. Recommendations which will assist potential NASA data users in selecting and using appropriate data base management tools and techniques are presented. Classes of currently available data processing equipment ranging from basic terminals to large minicomputer systems were surveyed as they apply to the needs of potential SEASAT data users. Cost and capabilities projections for this equipment through 1985 were presented. A test of a typical data base management system was described, as well as the results of this test and recommendations to assist potential users in determining when such a system is appropriate for their needs. The representative system tested was UNIVAC's DMS 1100.
Comparison of Three Optical Methods for Measuring Model Deformation
NASA Technical Reports Server (NTRS)
Burner, A. W.; Fleming, G. A.; Hoppe, J. C.
2000-01-01
The objective of this paper is to compare the current state-of-the-art of the following three optical techniques under study by NASA for measuring model deformation in wind tunnels: (1) video photogrammetry, (2) projection moire interferometry, and (3) the commercially available Optotrak system. An objective comparison of these three techniques should enable the selection of the best technique for a particular test undertaken at various NASA facilities. As might be expected, no one technique is best for all applications. The techniques are also not necessarily mutually exclusive and in some cases can be complementary to one another.
ERIC Educational Resources Information Center
Davis, Laurie Laughlin; Pastor, Dena A.; Dodd, Barbara G.; Chiang, Claire; Fitzpatrick, Steven J.
2003-01-01
Examined the effectiveness of the Sympson-Hetter technique and rotated content balancing relative to no exposure control and no content rotation conditions in a computerized adaptive testing system based on the partial credit model. Simulation results show the Sympson-Hetter technique can be used with minimal impact on measurement precision,…
1988-09-01
tested. To measure the adequacy of the sample, the Kaiser-Meyer-Olkin measure of sampling adequacy was used. This technique is described in Factor... ...the relatively large number of variables, there was concern about the adequacy of the sample size. A Kaiser-Meyer-Olkin
Metacognitive Control and Strategy Selection: Deciding to Practice Retrieval during Learning
ERIC Educational Resources Information Center
Karpicke, Jeffrey D.
2009-01-01
Retrieval practice is a potent technique for enhancing learning, but how often do students practice retrieval when they regulate their own learning? In 4 experiments the subjects learned foreign-language items across multiple study and test periods. When items were assigned to be repeatedly tested, repeatedly studied, or removed after they were…
A Survey of UML Based Regression Testing
NASA Astrophysics Data System (ADS)
Fahad, Muhammad; Nadeem, Aamer
Regression testing is the process of ensuring software quality by analyzing whether changed parts behave as intended and unchanged parts are not affected by the modifications. Since it is a costly process, many techniques have been proposed in the research literature to help testers build a regression test suite from an existing test suite at minimum cost. In this paper, we discuss the advantages and drawbacks of using UML diagrams for regression testing and show that UML models help in effectively identifying changes for regression test selection. We survey the existing UML based regression testing techniques and provide an analysis matrix to give a quick insight into prominent features of the literature. We discuss open research issues that remain to be addressed for UML based regression testing, such as managing and reducing the size of the regression test suite and prioritizing test cases under strict schedule and resource constraints.
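The core selection step shared by model-based regression testing techniques, keeping only the tests that exercise changed model elements, can be sketched in a few lines. The function name and the dictionary-based traceability mapping are illustrative assumptions, not a technique from the survey:

```python
def select_regression_tests(test_coverage, changed_elements):
    """Keep a test if it exercises at least one changed model element,
    e.g. a modified class or transition identified by diffing UML diagrams.
    test_coverage maps test name -> iterable of covered element names."""
    changed = set(changed_elements)
    return [t for t, covered in test_coverage.items() if changed & set(covered)]
```

Real techniques differ mainly in how the traceability mapping and the change set are derived from the UML models; the selection itself reduces to this intersection test.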
Applications Of Measurement Techniques To Develop Small-Diameter, Undersea Fiber Optic Cables
NASA Astrophysics Data System (ADS)
Kamikawa, Neil T.; Nakagawa, Arthur T.
1984-12-01
Attenuation, strain, and optical time domain reflectometer (OTDR) measurement techniques were applied successfully in the development of a minimum-diameter, electro-optic sea floor cable. Temperature and pressure models for excess attenuation in polymer coated, graded-index fibers were investigated analytically and experimentally using these techniques in the laboratory. The results were used to select a suitable fiber for the cable. Measurements also were performed on these cables during predeployment and sea-trial testing to verify laboratory results. Application of the measurement techniques and results are summarized in this paper.
ERIC Educational Resources Information Center
Luijten, Anton J. M., Ed.
This collection of 18 papers (selected from a total of 57 presented at a conference of the International Association for Educational Assessment) represents efforts by examining bodies and institutes to: improve the examination system and testing techniques; develop reliable instruments; and establish standards for public examinations. The papers…
Design evolution of the orbiter reaction control subsystem
NASA Technical Reports Server (NTRS)
Taeber, R. J.; Karakulko, W.; Belvins, D.; Hohmann, C.; Henderson, J.
1985-01-01
The challenges of space shuttle orbiter reaction control subsystem development began with selection of the propellant for the subsystem. Various concepts were evaluated before the current Earth storable, bipropellant combination was selected. Once that task was accomplished, additional challenges of designing the system to satisfy the wide range of requirements dictated by operating environments, reusability, and long life were met. Verification of system adequacy was achieved by means of a combination of analysis and test. The studies, the design efforts, and the test and analysis techniques employed in meeting the challenges are described.
Tests of selection in pooled case-control data: an empirical study.
Udpa, Nitin; Zhou, Dan; Haddad, Gabriel G; Bafna, Vineet
2011-01-01
For smaller organisms with faster breeding cycles, artificial selection can be used to create sub-populations with different phenotypic traits. Genetic tests can be employed to identify the causal markers for the phenotypes, as a precursor to engineering strains with a combination of traits. Traditional approaches involve analyzing crosses of inbred strains to test for co-segregation with genetic markers. Here we take advantage of cheaper next generation sequencing techniques to identify genetic signatures of adaptation to the selection constraints. Obtaining individual sequencing data is often unrealistic due to cost and sample issues, so we focus on pooled genomic data. We explore a series of statistical tests for selection using pooled case (under selection) and control populations. The tests generally capture skews in the scaled frequency spectrum of alleles in a region, which are indicative of a selective sweep. Extensive simulations are used to show that these approaches work well for a wide range of population divergence times and strong selective pressures. Control vs control simulations are used to determine an empirical False Positive Rate, and regions under selection are determined using a 1% FPR level. We show that pooling does not have a significant impact on statistical power. The tests are also robust to reasonable variations in several different parameters, including window size, base-calling error rate, and sequencing coverage. We then demonstrate the viability (and the challenges) of one of these methods in two independent Drosophila populations (Drosophila melanogaster) bred under selection for hypoxia and accelerated development, respectively. Testing for extreme hypoxia tolerance showed clear signals of selection, pointing to loci that are important for hypoxia adaptation. Overall, we outline a strategy for finding regions under selection using pooled sequences, then devise optimal tests for that strategy. 
The approaches show promise for detecting selection, even several generations after fixation of the beneficial allele has occurred.
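The window-scan strategy described above — a per-window statistic that captures reduced diversity, thresholded at an empirical 1% FPR derived from control-vs-control comparisons — can be sketched roughly as follows. This is a minimal illustration: the heterozygosity statistic and all function names are stand-ins, not the paper's actual tests.

```python
# Sketch of window-based selection scanning on pooled allele frequencies.
# The statistic (mean expected heterozygosity) and the names here are
# illustrative, not the tests developed in the paper.

def window_heterozygosity(freqs):
    """Mean expected heterozygosity 2p(1-p) over sites in one window.
    A selective sweep depresses this statistic."""
    if not freqs:
        return 0.0
    return sum(2 * p * (1 - p) for p in freqs) / len(freqs)

def empirical_threshold(control_stats, fpr=0.01):
    """Pick the cutoff so that only a fraction `fpr` of control-vs-control
    windows fall below it (sweeps lower the statistic)."""
    ordered = sorted(control_stats)
    k = max(0, int(fpr * len(ordered)) - 1)
    return ordered[k]

def scan(case_windows, threshold):
    """Return indices of candidate windows under selection."""
    return [i for i, w in enumerate(case_windows)
            if window_heterozygosity(w) < threshold]
```

A swept region, where alleles are near fixation, scores well below the empirical cutoff, while windows of intermediate-frequency sites do not.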
NASA Astrophysics Data System (ADS)
Viboonratanasri, Duangkamon; Pabchanda, Suwat; Prompinit, Panida
2018-05-01
In this study, a simple, rapid, and relatively low-toxicity method based on rhodamine 6G dye adsorption on hydrogen-form Y-type zeolite was demonstrated for highly selective nitrite detection. The adsorption behavior was described by the Langmuir isotherm, and the adsorption process reached equilibrium within a minute. The developed test papers, characterized by a fluorescence technique, display high sensing performance with a wide working range (0.04-20.0 mg L-1) and high selectivity. The test papers show good reproducibility, with a relative standard deviation (RSD) of 7% for five replicated determinations of 3 mg L-1 nitrite. The nitrite concentration determined using the test paper agreed with ion chromatography at the 95% confidence level. The test papers offer advantages in terms of low cost and practical usage, making them a promising candidate for nitrite sensing in environmental samples, food, and fertilizers.
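The Langmuir analysis mentioned above can be sketched as a fit of the isotherm q_e = q_max K_L C_e / (1 + K_L C_e) in its linearized form C_e/q_e = C_e/q_max + 1/(K_L q_max). The data and parameter values below are synthetic illustrations, not the paper's measurements.

```python
# Hedged sketch of a linearized Langmuir isotherm fit by ordinary least
# squares; numbers used with it are illustrative, not the dye/zeolite data.

def linear_fit(xs, ys):
    """Ordinary least squares fit; returns (slope, intercept)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return slope, my - slope * mx

def langmuir_params(ce, qe):
    """Recover (q_max, K_L) from equilibrium concentrations ce and
    adsorbed amounts qe via Ce/qe = Ce/qmax + 1/(KL*qmax)."""
    slope, intercept = linear_fit(ce, [c / q for c, q in zip(ce, qe)])
    return 1.0 / slope, slope / intercept
```

Fitting synthetic data generated from known parameters recovers them exactly, which is a quick self-check on the linearization.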
NASA Technical Reports Server (NTRS)
Rader, W. P.; Barrett, S.; Payne, K. R.
1975-01-01
Data measurement and interpretation techniques were defined for application to the first few space shuttle flights, so that the dynamic environment could be sufficiently well established to be used to reduce the cost of future payloads through more efficient design and environmental test techniques. It was concluded that: (1) initial payloads must be given comprehensive instrumentation coverage to obtain detailed definition of acoustics, vibration, and interface loads, (2) analytical models of selected initial payloads must be developed and verified by modal surveys and flight measurements, (3) acoustic tests should be performed on initial payloads to establish realistic test criteria for components and experiments in order to minimize unrealistic failures and retest requirements, (4) permanent data banks should be set up to establish statistical confidence in the data to be used, (5) a more unified design/test specification philosophy is needed, (6) additional work is needed to establish a practical testing technique for simulation of vehicle transients.
Research on Hartmann test for progressive addition lenses
NASA Astrophysics Data System (ADS)
Qin, Lin-ling; Yu, Jing-chi
2009-05-01
Recently, in the world some growing-up measurements for Progressive addition lenses and relevant equipments have been developed. They are single point measurement, moiré deflectometry, Ronchi test techniques. Hartmann test for Progressive addition lenses is proposed in the article. The measurement principle of Hartmann test for ophthalmic lenses and the power compensation of off-axis rays are introduced. The experimental setup used to test lenses is put forward. For experimental test, a spatial filter is used for selecting a clean Gaussian beam; a collimating lens with focal distance f =300 mm is used to produce collimated beam. The Hartmann plate with a square array of holes separated at 2 mm is selected. The selection of laser and CCD camera is critical to the accuracy of experiment and the image processing algorithm. The spot patterns from CCD are obtained from the experimental tests. The power distribution map for lenses can be obtained by image processing in theory. The results indicate that Hartmann test for Progressive addition lenses is convenient and feasible; also its structure is simple.
2010-01-01
Background: The order Carnivora is well represented in India, with 58 of the 250 species found globally occurring here. However, small carnivores figure very poorly in research and conservation policies in India. This is mainly due to the dearth of tested and standardized techniques that are both cost-effective and conducive to small carnivore studies in the field. In this paper we present a non-invasive genetic technique standardized for the study of Indian felids and canids, using PCR amplification and restriction enzyme digestion of scat collected in the field. Findings: Using existing sequences of felids and canids from GenBank, we designed primers from the 16S rRNA region of the mitochondrial genome and tested these on ten species of felids and five canids. We selected restriction enzymes that would cut the selected region differentially for various species within each family, and produced a restriction digestion profile for the potential differentiation of species based on fragment patterns. To test our technique, we used felid PCR primers on scats collected from various habitats in India, representing varied environmental conditions. Amplification success with field-collected scats was 52%, while 86% of the products used for restriction digestion could be accurately assigned to species; we verified this through sequencing. A comparison of costs across the various techniques currently used for scat assignment showed that this technique was the most practical and cost-effective. Conclusions: The species-specific key developed in this paper provides a means for detailed future investigations of elusive carnivores in India, and this approach provides a model for other studies in areas of Asia where many small carnivores co-occur. PMID:20525407
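The fragment-pattern idea behind such a restriction-digest key can be sketched in silico: cut an amplicon at each occurrence of a recognition site and match the resulting fragment lengths against a species key. The toy sequences, the GAATTC site, and the species labels below are illustrative only, not the paper's primers or enzymes.

```python
# Illustrative in-silico restriction digest and species assignment.
# Sequences, recognition site, and key entries are made up.

def digest(seq, site):
    """Fragment lengths after cutting `seq` at the start of every
    occurrence of the recognition `site`."""
    frags, start = [], 0
    i = seq.find(site)
    while i != -1:
        if i > start:
            frags.append(i - start)
        start = i
        i = seq.find(site, i + 1)
    frags.append(len(seq) - start)
    return frags

def assign_species(seq, site, key):
    """Match a digest pattern against a {species: pattern} key;
    returns the species name, or None if no pattern matches."""
    pattern = tuple(digest(seq, site))
    for species, pat in key.items():
        if pat == pattern:
            return species
    return None
```

In practice the key would hold the expected fragment patterns per species per enzyme; an unmatched pattern (None) would be sent for sequencing, as the authors did for verification.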
NASA Technical Reports Server (NTRS)
Baumann, P. R. (Principal Investigator)
1979-01-01
Three computer quantitative techniques for determining urban land cover patterns are evaluated. The techniques examined deal with the selection of training samples by an automated process, the overlaying of two scenes from different seasons of the year, and the use of individual pixels as training points. Evaluation is based on the number and type of land cover classes generated and the scores obtained from an accuracy test. New Orleans, Louisiana and its environs form the study area.
QFD analysis of RSRM aqueous cleaners
NASA Technical Reports Server (NTRS)
Marrs, Roy D.; Jones, Randy K.
1995-01-01
This paper presents a Quality Function Deployment (QFD) analysis of the final down-selected aqueous cleaners to be used on the Redesigned Solid Rocket Motor (RSRM) program. The new cleaner will replace solvent vapor degreasing. The RSRM Ozone Depleting Compound Elimination program is discontinuing the methyl chloroform vapor degreasing process and replacing it with a spray-in-air aqueous cleaning process. Previously, 15 cleaners were down-selected to two candidates by passing screening tests involving toxicity, flammability, cleaning efficiency, contaminant solubility, corrosion potential, cost, and bond strength. The two down-selected cleaners were further evaluated with more intensive testing and evaluated using QFD techniques to assess suitability for cleaning RSRM case and nozzle surfaces in preparation for adhesive bonding.
Measurement of fracture toughness by nanoindentation methods: Recent advances and future challenges
Sebastiani, Marco; Johanns, K. E.; Herbert, Erik G.; ...
2015-04-30
In this study, we describe recent advances and developments for the measurement of fracture toughness at small scales by the use of nanoindentation-based methods including techniques based on micro-cantilever beam bending and micro-pillar splitting. A critical comparison of the techniques is made by testing a selected group of bulk and thin film materials. For pillar splitting, cohesive zone finite element simulations are used to validate a simple relationship between the critical load at failure, the pillar radius, and the fracture toughness for a range of material properties and coating/substrate combinations. The minimum pillar diameter required for nucleation and growth of a crack during indentation is also estimated. An analysis of pillar splitting for a film on a dissimilar substrate material shows that the critical load for splitting is relatively insensitive to the substrate compliance for a large range of material properties. Experimental results from a selected group of materials show good agreement between single cantilever and pillar splitting methods, while a discrepancy of ~25% is found between the pillar splitting technique and double-cantilever testing. It is concluded that both the micro-cantilever and pillar splitting techniques are valuable methods for micro-scale assessment of fracture toughness of brittle ceramics, provided the underlying assumptions can be validated. Although the pillar splitting method has some advantages because of the simplicity of sample preparation and testing, it is not applicable to most metals because their higher toughness prevents splitting, and in this case, micro-cantilever bend testing is preferred.
NASA Astrophysics Data System (ADS)
Wentworth, Mami Tonoe
Uncertainty quantification plays an important role when making predictive estimates of model responses. In this context, uncertainty quantification is defined as quantifying and reducing uncertainties, and the objective is to quantify uncertainties in parameters, models, and measurements, and to propagate the uncertainties through the model, so that one can make a predictive estimate with quantified uncertainties. Two of the aspects of uncertainty quantification that must be performed prior to propagating uncertainties are model calibration and parameter selection. There are several efficient techniques for these processes; however, the accuracy of these methods is often not verified. This is the motivation for our work, and in this dissertation, we present and illustrate verification frameworks for model calibration and parameter selection in the context of biological and physical models. First, HIV models, developed and improved by [2, 3, 8], describe the viral infection dynamics of HIV disease. These are also used to make predictive estimates of viral loads and T-cell counts and to construct an optimal control for drug therapy. Estimating input parameters is an essential step prior to uncertainty quantification. However, not all the parameters are identifiable, implying that they cannot be uniquely determined by the observations. These unidentifiable parameters can be partially removed by performing parameter selection, a process in which parameters that have minimal impact on the model response are determined. We provide verification techniques for Bayesian model calibration and parameter selection for an HIV model. As an example of a physical model, we employ a heat model with experimental measurements presented in [10]. A steady-state heat model represents prototypical behavior for the heat conduction and diffusion processes involved in a thermal-hydraulic model, which is a part of nuclear reactor models.
We employ this simple heat model to illustrate verification techniques for model calibration. For Bayesian model calibration, we employ adaptive Metropolis algorithms to construct densities for input parameters in the heat model and the HIV model. To quantify the uncertainty in the parameters, we employ two MCMC algorithms: Delayed Rejection Adaptive Metropolis (DRAM) [33] and Differential Evolution Adaptive Metropolis (DREAM) [66, 68]. The densities obtained using these methods are compared to those obtained through the direct numerical evaluation of Bayes' formula. We also combine uncertainties in input parameters and measurement errors to construct predictive estimates for a model response. A significant emphasis is on the development and illustration of techniques to verify the accuracy of sampling-based Metropolis algorithms. We verify the accuracy of DRAM and DREAM by comparing chains, densities, and correlations obtained using DRAM, DREAM, and the direct evaluation of Bayes' formula. We also perform a similar analysis for credible and prediction intervals for responses. Once the parameters are estimated, we employ the energy statistics test [63, 64] to compare the densities obtained by different methods for the HIV model. The energy statistics are used to test the equality of distributions. We also consider parameter selection and verification techniques for models having one or more parameters that are noninfluential in the sense that they minimally impact model outputs. We illustrate these techniques for a dynamic HIV model but note that the parameter selection and verification framework is applicable to a wide range of biological and physical models.
To accommodate the nonlinear input to output relations, which are typical for such models, we focus on global sensitivity analysis techniques, including those based on partial correlations, Sobol indices based on second-order model representations, and Morris indices, as well as a parameter selection technique based on standard errors. A significant objective is to provide verification strategies to assess the accuracy of those techniques, which we illustrate in the context of the HIV model. Finally, we examine active subspace methods as an alternative to parameter subset selection techniques. The objective of active subspace methods is to determine the subspace of inputs that most strongly affect the model response, and to reduce the dimension of the input space. The major difference between active subspace methods and parameter selection techniques is that parameter selection identifies influential parameters whereas subspace selection identifies a linear combination of parameters that impacts the model responses significantly. We employ active subspace methods discussed in [22] for the HIV model and present a verification that the active subspace successfully reduces the input dimensions.
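As a minimal illustration of the sampling machinery being verified in this work, here is a plain random-walk Metropolis sampler, the non-adaptive core that DRAM and DREAM build on. The standard-normal target, step size, and chain length are arbitrary choices for the sketch, not the dissertation's HIV or heat models.

```python
import math
import random

# Plain random-walk Metropolis: propose a Gaussian step, accept with
# probability min(1, posterior ratio). DRAM adds delayed rejection and
# covariance adaptation on top of this core; DREAM adapts via multiple
# chains. Target and tuning values here are illustrative.

def metropolis(logpost, theta0, steps, scale, rng):
    """Sample from the density proportional to exp(logpost), starting
    at theta0, with proposal standard deviation `scale`."""
    chain = [theta0]
    lp = logpost(theta0)
    for _ in range(steps):
        prop = chain[-1] + rng.gauss(0.0, scale)
        lp_prop = logpost(prop)
        # Accept with probability min(1, exp(lp_prop - lp)).
        if rng.random() < math.exp(min(0.0, lp_prop - lp)):
            chain.append(prop)
            lp = lp_prop
        else:
            chain.append(chain[-1])
    return chain
```

A direct verification in the spirit of the dissertation would compare the chain's histogram against the analytically known target density.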
Sensor failure detection system. [for the F100 turbofan engine
NASA Technical Reports Server (NTRS)
Beattie, E. C.; Laprad, R. F.; Mcglone, M. E.; Rock, S. M.; Akhter, M. M.
1981-01-01
Advanced concepts for detecting, isolating, and accommodating sensor failures were studied to determine their applicability to the gas turbine control problem. Five concepts were formulated based upon such techniques as Kalman filters and a screening process led to the selection of one advanced concept for further evaluation. The selected advanced concept uses a Kalman filter to generate residuals, a weighted sum square residuals technique to detect soft failures, likelihood ratio testing of a bank of Kalman filters for isolation, and reconfiguring of the normal mode Kalman filter by eliminating the failed input to accommodate the failure. The advanced concept was compared to a baseline parameter synthesis technique. The advanced concept was shown to be a viable concept for detecting, isolating, and accommodating sensor failures for the gas turbine applications.
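The soft-failure detection step described above — a weighted sum of squared Kalman-filter residuals compared against a threshold — can be sketched as below. The window length and threshold are illustrative placeholders, not the study's tuned values, and the residuals are assumed to come from an upstream filter.

```python
# Hedged sketch of a weighted sum of squared residuals (WSSR) test for
# soft sensor failures: residuals are normalized by their expected
# variance and summed over a sliding window; a sustained bias pushes
# the sum over the threshold. Tuning values are illustrative.

def wssr(residuals, variances):
    """Weighted sum of squared residuals over one window."""
    return sum(r * r / v for r, v in zip(residuals, variances))

def detect(residual_stream, variance, window, threshold):
    """Return the first index at which the windowed WSSR exceeds the
    threshold, or None if no soft failure is declared."""
    for i in range(window, len(residual_stream) + 1):
        w = residual_stream[i - window:i]
        if wssr(w, [variance] * window) > threshold:
            return i - 1
    return None
```

In the advanced concept, a detection like this would trigger the bank of Kalman filters and likelihood-ratio tests for isolating which sensor failed.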
Design and additive manufacture for flow chemistry.
Capel, Andrew J; Edmondson, Steve; Christie, Steven D R; Goodridge, Ruth D; Bibb, Richard J; Thurstans, Matthew
2013-12-07
We review the use of additive manufacturing (AM) as a novel manufacturing technique for the production of milli-scale reactor systems. Five well-developed additive manufacturing techniques: stereolithography (SL), multi-jet modelling (MJM), selective laser melting (SLM), laser sintering (LS) and fused deposition modelling (FDM) were used to manufacture a number of miniaturised reactors which were tested using a range of organic and inorganic reactions.
Abdel-Kareem, Omar
2010-01-01
Fungal deterioration is one of the highest risk factors for damage to historical textile objects in Egypt. This paper presents both a case study of the fungal microflora deteriorating historical textiles in the Egyptian Museum and the Coptic Museum in Cairo, and an evaluation of the efficacy of several combinations of polymers with fungicides for the reinforcement of textiles and their protection against fungal deterioration. Both the cotton swab technique and the biodeteriorated textile part technique were used for isolation of fungi from historical textile objects. The plate method with a manual key was used for identification of fungi. The results show that the most dominant fungi isolated from the tested textile samples belong to Alternaria, Aspergillus, Chaetomium, Penicillium and Trichoderma species. Microbiological testing was used to evaluate the usefulness of the suggested conservation materials (polymers combined with fungicides) in preventing fungal deterioration of ancient Egyptian textiles. Textile samples were treated with 4 selected polymers combined with two selected fungicides. Untreated and treated textile samples were then exposed to 3 selected active fungal strains isolated from ancient Egyptian textiles. This study reports that most of the tested polymers combined with the tested fungicides prevented the fungal deterioration of textiles. Treatment of ancient textiles with the suggested polymers combined with the suggested fungicides not only reinforces these textiles, but also prevents fungal deterioration and increases their durability. The tested polymers without fungicides reduce the fungal deterioration of textiles but do not prevent it completely.
Authentication of Electromagnetic Interference Removal in Johnson Noise Thermometry
DOE Office of Scientific and Technical Information (OSTI.GOV)
Britton Jr, Charles L.; Roberts, Michael
This report summarizes the testing performed offsite at the TVA Kingston Fossil Plant (KFP). This location was selected as a valid offsite test facility because its environment is very similar to the expected industrial nuclear power plant environment. The report discusses the EMI discovered in the environment, the validity of the removal technique, and results from the measurements.
Fixed Wing Stability and Control Theory and Flight Test Techniques. Revision
1981-11-01
(Excerpt garbled in the source scan; recoverable content: instructions for fairing airplane lift-coefficient data curves using the rules shown on a figure, and a caution that attempting takeoff or landing in a crosswind exceeding the airplane's capabilities may result in the airplane departing the runway with catastrophic consequences.)
ERIC Educational Resources Information Center
Nisonger, Thomas E.
1983-01-01
Using random selection of citations from journal articles, two specific permutations of the citation checking approach to university library collection evaluation are tested on political science collections in five university libraries in the Washington, D.C. area. The history of the citation checking approach is reviewed. Forty-three references…
Thomas, Minta; De Brabanter, Kris; De Moor, Bart
2014-05-10
DNA microarrays are a potentially powerful technology for improving diagnostic classification, treatment selection, and prognostic assessment. The use of this technology to predict cancer outcome has a history of almost a decade. Disease class predictors can be designed for known disease cases and provide diagnostic confirmation or clarify abnormal cases. The main input to these class predictors is high-dimensional data with many variables and few observations. Dimensionality reduction of these feature sets significantly speeds up the prediction task. Feature selection and feature transformation methods are well-known preprocessing steps in the field of bioinformatics. Several prediction tools are available based on these techniques. Studies show that a well-tuned Kernel PCA (KPCA) is an efficient preprocessing step for dimensionality reduction, but the available bandwidth selection method for KPCA was computationally expensive. In this paper, we propose a new data-driven bandwidth selection criterion for KPCA, which is related to least squares cross-validation for kernel density estimation. We propose a new prediction model with a well-tuned KPCA and Least Squares Support Vector Machine (LS-SVM). We estimate the accuracy of the newly proposed model based on 9 case studies. Then, we compare its performance (in terms of test set Area Under the ROC Curve (AUC) and computational time) with other well-known techniques such as whole data set + LS-SVM, PCA + LS-SVM, t-test + LS-SVM, Prediction Analysis of Microarrays (PAM) and Least Absolute Shrinkage and Selection Operator (Lasso). Finally, we assess the performance of the proposed strategy against an existing KPCA parameter tuning algorithm by means of two additional case studies. We propose, evaluate, and compare several mathematical/statistical techniques, which apply feature transformation/selection for subsequent classification, and consider their application in medical diagnostics.
Both feature selection and feature transformation perform well on classification tasks. Due to the dynamic selection property of feature selection, it is hard to define significant features for the classifier that predicts the classes of future samples. Moreover, the proposed strategy enjoys a distinctive advantage in its relatively lower time complexity.
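The least-squares cross-validation idea that the paper's KPCA bandwidth criterion is related to can be sketched for a one-dimensional Gaussian kernel density estimate: minimize an unbiased estimate of the integrated squared error over a bandwidth grid. The data points and grid below are illustrative, not the microarray setting.

```python
import math

# Least-squares cross-validation (LSCV) for a 1-D Gaussian KDE:
# LSCV(h) = integral of the KDE squared (computable in closed form with
# the kernel at bandwidth h*sqrt(2)) minus twice the average
# leave-one-out density at the data points. Grid and data are toy values.

def gauss_kernel(u, h):
    return math.exp(-u * u / (2 * h * h)) / (h * math.sqrt(2 * math.pi))

def lscv(data, h):
    """LSCV score; the bandwidth minimizing it is selected."""
    n = len(data)
    term1 = sum(gauss_kernel(x - y, h * math.sqrt(2))
                for x in data for y in data) / (n * n)
    term2 = 2.0 * sum(gauss_kernel(x - y, h)
                      for i, x in enumerate(data)
                      for j, y in enumerate(data) if i != j) / (n * (n - 1))
    return term1 - term2
```

Very small bandwidths are penalized because the diagonal of the first term blows up while the leave-one-out term vanishes, so the criterion steers away from over-fitting.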
An Axial-Torsional, Thermomechanical Fatigue Testing Technique
NASA Technical Reports Server (NTRS)
Kalluri, Sreeramesh; Bonacuse, Peter J.
1995-01-01
A technique for conducting strain-controlled, thermomechanical, axial-torsional fatigue tests on thin-walled tubular specimens was developed. Three waveforms of loading, namely, the axial strain waveform, the engineering shear strain waveform, and the temperature waveform were required in these tests. The phasing relationships between the mechanical strain waveforms and the temperature and axial strain waveforms were used to define a set of four axial-torsional, thermomechanical fatigue (AT-TMF) tests. Real-time test control (3 channels) and data acquisition (a minimum of 7 channels) were performed with a software program written in C language and executed on a personal computer. The AT-TMF testing technique was used to investigate the axial-torsional thermomechanical fatigue behavior of a cobalt-base superalloy, Haynes 188. The maximum and minimum temperatures selected for the AT-TMF tests were 760 and 316 C, respectively. Details of the testing system, calibration of the dynamic temperature profile of the thin-walled tubular specimen, thermal strain compensation technique, and test control and data acquisition schemes, are reported. The isothermal, axial, torsional, and in- and out-of-phase axial-torsional fatigue behaviors of Haynes 188 at 316 and 760 C were characterized in previous investigations. The cyclic deformation and fatigue behaviors of Haynes 188 in AT-TMF tests are compared to the previously reported isothermal axial-torsional behavior of this superalloy at the maximum and minimum temperatures.
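The three phased command waveforms described above can be sketched as sinusoids sharing one cycle frequency, with the phase angles defining the in-phase and out-of-phase test types. The 760/316 C temperature limits follow the abstract; the amplitudes, sinusoidal shape, and cycle period below are illustrative choices, not the actual test profiles.

```python
import math

# Illustrative generation of one AT-TMF command point: axial strain,
# engineering shear strain, and temperature as phased sinusoids.
# Amplitudes and period are placeholders; phase angles are in radians.

def tmf_waveforms(t, period, eps_amp, gam_amp, t_max=760.0, t_min=316.0,
                  shear_phase=0.0, thermal_phase=0.0):
    """Return (axial strain, engineering shear strain, temperature C)
    at time t for the given phase relationships."""
    w = 2 * math.pi / period
    eps = eps_amp * math.sin(w * t)
    gam = gam_amp * math.sin(w * t + shear_phase)
    mean_t = 0.5 * (t_max + t_min)
    amp_t = 0.5 * (t_max - t_min)
    temp = mean_t + amp_t * math.sin(w * t + thermal_phase)
    return eps, gam, temp
```

Setting shear_phase and thermal_phase to 0 or pi generates the four combinations of in-phase and out-of-phase mechanical and thermal loading that define the test matrix.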
Selected environmental plutonium research reports of the NAEG
DOE Office of Scientific and Technical Information (OSTI.GOV)
White, M.G.; Dunaway, P.B.
Twenty-one papers were presented on various aspects of plutonium and radioisotope ecology at the Nevada Test Site. This includes studies of wildlife, microorganisms, and the plant-soil system. Analysis and sampling techniques are also included.
A Guideline to Local Anesthetic Allergy Testing
Canfield, David W.; Gage, Tommy W.
1987-01-01
Patients with a history of adverse reactions to a local anesthetic may often be incorrectly labeled as “allergic.” Determining if a patient is allergic to a local anesthetic is essential in the selection of appropriate pain control techniques. Local anesthetic allergy testing may be performed safely and with reasonable accuracy by a knowledgeable practitioner. This paper presents guidelines for an allergy testing method. PMID:3318567
Simultaneous Estimation of Regression Functions for Marine Corps Technical Training Specialties.
1985-01-03
(Excerpt garbled in the source scan; recoverable content: the report applies Bayesian techniques for simultaneous estimation to the specification of regression weights for selection tests used in various Marine Corps technical training courses.)
Local density measurement of additive manufactured copper parts by instrumented indentation
NASA Astrophysics Data System (ADS)
Santo, Loredana; Quadrini, Fabrizio; Bellisario, Denise; Tedde, Giovanni Matteo; Zarcone, Mariano; Di Domenico, Gildo; D'Angelo, Pierpaolo; Corona, Diego
2018-05-01
Instrumented flat indentation has been used to evaluate local density of additive manufactured (AM) copper samples with different relative density. Indentations were made by using tungsten carbide (WC) flat pins with 1 mm diameter. Pure copper powders were used in a selective laser melting (SLM) machine to produce samples to test. By changing process parameters, samples density was changed from the relative density of 63% to 71%. Indentation tests were performed on the xy surface of the AM samples. In order to make a correlation between indentation test results and sample density, the indentation pressure at fixed displacement was selected. Results show that instrumented indentation is a valid technique to measure density distribution along the geometry of an SLM part. In fact, a linear trend between indentation pressure and sample density was found for the selected density range.
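The linear pressure-density trend reported above suggests a simple calibration: fit indentation pressure at fixed displacement against samples of known density, then invert the line to estimate local density from a measured pressure. The slope, intercept, and data values below are invented for illustration, not the measured SLM copper results.

```python
# Sketch of a linear indentation-pressure vs. relative-density
# calibration and its inversion. All numbers used with these functions
# are illustrative placeholders.

def fit_line(xs, ys):
    """Least-squares line through (xs, ys); returns (slope, intercept)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return slope, my - slope * mx

def density_from_pressure(pressure, slope, intercept):
    """Invert the calibration line to estimate relative density (%)."""
    return (pressure - intercept) / slope
```

With such a calibration in hand, a grid of indentations over a part surface maps its local density distribution.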
Chantler, C T; Islam, M T; Rae, N A; Tran, C Q; Glover, J L; Barnea, Z
2012-03-01
An extension of the X-ray extended-range technique is described for measuring X-ray mass attenuation coefficients by introducing absolute measurement of a number of foils - the multiple independent foil technique. Illustrating the technique with the results of measurements for gold in the 38-50 keV energy range, it is shown that its use enables selection of the most uniform and well defined of available foils, leading to more accurate measurements; it allows one to test the consistency of independently measured absolute values of the mass attenuation coefficient with those obtained by the thickness transfer method; and it tests the linearity of the response of the counter and counting chain throughout the range of X-ray intensities encountered in a given experiment. In light of the results for gold, the strategy to be ideally employed in measuring absolute X-ray mass attenuation coefficients, X-ray absorption fine structure and related quantities is discussed.
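The Beer-Lambert reduction behind the multiple independent foil technique can be sketched as follows: each foil of known mass per unit area yields an absolute mass attenuation coefficient, and agreement across independent foils serves as the consistency and linearity check. The coefficient, areal densities, and tolerance below are illustrative numbers, not the gold measurements.

```python
import math

# Hedged sketch of absolute mass-attenuation determination from foil
# transmission: I = I0 * exp(-(mu/rho) * rho * t), where rho*t is the
# areal density of the foil. Values used with these functions are
# illustrative placeholders.

def mass_attenuation(i0, i, mass_per_area):
    """[mu/rho] = ln(I0/I) / (rho * t), with rho*t the areal density."""
    return math.log(i0 / i) / mass_per_area

def consistent(coeffs, tol):
    """Check that independent-foil values agree within a fractional
    tolerance of their mean."""
    mean = sum(coeffs) / len(coeffs)
    return all(abs(c - mean) / mean <= tol for c in coeffs)
```

A foil whose coefficient disagrees with the others flags either a non-uniform foil or a nonlinearity in the counting chain, which is the selection power the technique provides.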
NASA Technical Reports Server (NTRS)
Landmann, A. E.; Tillema, H. F.; Marshall, S. E.
1989-01-01
The application of selected analysis techniques to low frequency cabin noise associated with advanced propeller engine installations is evaluated. Three design analysis techniques were chosen for evaluation: finite element analysis, statistical energy analysis (SEA), and a power flow method using elements of SEA (the computer program Propeller Aircraft Interior Noise). An overview of the three procedures is provided. Data from tests of a 727 airplane (modified to accept a propeller engine) were used for comparison with predictions. Comparisons of predicted and measured levels at the end of the first year's effort showed reasonable agreement, leading to the conclusion that each technique had value for propeller engine noise predictions on large commercial transports. However, variations in agreement were large enough to warrant caution and to lead to recommendations for further work with each technique. Assessment of the second year's results leads to the conclusion that the selected techniques can accurately predict trends and can be useful to a designer, but that absolute level predictions remain unreliable due to the complexity of the aircraft structure and low modal densities.
NASA Technical Reports Server (NTRS)
1972-01-01
Guidelines for the selection of equipment to be used for manned spacecraft in order to assure a five year maintenance-free service life were developed. A special study was conducted to determine the adequacy of the procedures used to determine the quality and effectiveness of various components. The subjects examined are: (1) temperature cycling for acceptance of electronic assemblies; (2) accelerated testing techniques; (3) electronic part screening techniques; (4) electronic part derating practices; (5) vibration life extension of printed circuit board assemblies; and (6) tolerance funnelling and test requirements.
Vegetation and other parameters in the Brevard County bar-built estuaries
NASA Technical Reports Server (NTRS)
Down, C.; Withrow, R. (Editor)
1978-01-01
It is shown that low-altitude aerial photography, using specific interpretive techniques, can effectively delineate sea-grass beds, oyster beds, and other underwater features. Various techniques were used on several sets of aerial imagery. Imagery was tested using several data analysis methods, ground truth, and biological testing. Approximately 45,000 acres of grass beds, 2,500 acres of oyster beds, and 4,200 acres of dredged canals were mapped. This data represents selected sites only. Areas chosen have the highest quality water in Brevard County and are among the most highly recognized biologically productive waters in Florida.
Wipe-rinse technique for quantitating microbial contamination on large surfaces.
Kirschner, L E; Puleo, J R
1979-01-01
The evaluation of an improved wipe-rinse technique for the bioassay of large areas was undertaken due to inherent inadequacies in the cotton swab-rinse technique to which assay of spacecraft is currently restricted. Four types of contamination control cloths were initially tested. A polyester-bonded cloth (PBC) was selected for further evaluation because of its superior efficiency and handling characteristics. Results from comparative tests with PBC and cotton swabs on simulated spacecraft surfaces indicated a significantly higher recovery efficiency for the PBC than for the cotton (90.4 versus 75.2%). Of the sampling sites studied, PBC was found to be most effective on surface areas not exceeding 0.74 sq m (8.0 sq ft). PMID:394682
Wipe-rinse technique for quantitating microbial contamination on large surfaces
NASA Technical Reports Server (NTRS)
Kirschner, L. E.; Puleo, J. R.
1979-01-01
The evaluation of an improved wipe-rinse technique for the bioassay of large areas was undertaken due to inherent inadequacies in the cotton swab-rinse technique to which assay of spacecraft is currently restricted. Four types of contamination control cloths were initially tested. A polyester-bonded cloth (PBC) was selected for further evaluation because of its superior efficiency and handling characteristics. Results from comparative tests with PBC and cotton swabs on simulated spacecraft surfaces indicated a significantly higher recovery efficiency for the PBC than for the cotton (90.4 versus 75.2%). Of the sampling sites studied, PBC was found to be most effective on surface areas not exceeding 0.74 sq m (8.0 sq ft).
A Module Experimental Process System Development Unit (MEPSDU)
NASA Technical Reports Server (NTRS)
1981-01-01
Subsequent to the design review, a series of tests was conducted on simulated modules to demonstrate that all environmental specifications (wind loading, hailstone impact, thermal cycling, and humidity cycling) are satisfied by the design. All tests, except hailstone impact, were successfully completed. The assembly sequence was simplified by virtue of eliminating the frame components and assembly steps. Performance was improved by reducing the module edge border required to accommodate the frame of the preliminary design module. An ultrasonic rolling spot bonding technique was selected for use in the machine to perform the aluminum interconnect to cell metallization electrical joints required in the MEPSDU module configuration. This selection was based on extensive experimental tests and economic analyses.
Larenas-Linnemann, Désirée; Luna-Pech, Jorge A; Mösges, Ralph
2017-01-01
Percutaneous skin prick tests (SPT) have been considered the preferred method for confirming IgE-mediated sensitization. This reliable and minimally invasive technique correlates with in vivo challenges, has good reproducibility, is easily quantified, and allows analyzing multiple allergens simultaneously. Potent extracts and a proficient tester improve its accuracy. Molecular-based allergy diagnostics (MA-Dx) quantifies allergenic components, obtained either from purification of natural sources or from recombinant technology, to identify the patient's reactivity to those specific allergenic protein components. For a correct allergy diagnosis, patient selection is crucial. MA-Dx has been shown to have a high specificity; however, as MA-Dx testing can be ordered by any physician, the pre-selection of patients might not always be optimal, reducing test specificity. Also, MA-Dx is less sensitive than in vitro testing with the whole allergen or SPT. Secondly, no allergen-specific immunotherapy (AIT) trial has yet shown efficacy with patients selected on the basis of their MA-Dx results. Thirdly, why would we need molecular diagnosis when no molecular treatment can yet be offered? Then there are the practical arguments of cost (SPT is highly cost-efficient) and of test availability, with MA-Dx still lacking in wide areas of the world and scarce in others. As such, it is hard for physicians to build confidence in the tests and in their interpretation of MA-Dx results. As of now, these techniques should be reserved for situations of complex allergies and polysensitization; in the future, MA-Dx might help reduce the number of allergens needed for AIT, but trials are needed to prove this concept.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Freitez, Juan A.; Sanchez, Morella; Ruette, Fernando
Application of simulated annealing (SA) and simplified GSA (SGSA) techniques for parameter optimization of a parametric quantum chemistry method (CATIVIC) was performed. A set of organic molecules was selected to test these techniques. Comparison of the algorithms was carried out for error-function minimization with respect to experimental values. Results show that SGSA is more efficient than SA with respect to computer time. Accuracy is similar in both methods; however, there are important differences in the final set of parameters.
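The annealing loop behind this kind of parameter fit can be sketched in a few lines. The following is a minimal, generic simulated-annealing routine under assumed settings (a toy quadratic error against made-up "experimental" targets), not the CATIVIC/SA/SGSA code from the study:

```python
import math
import random

def simulated_annealing(error, x0, step=0.1, t0=1.0, cooling=0.95, iters=2000, seed=42):
    """Minimize error(x) over a parameter vector by simulated annealing.

    A worse candidate is accepted with probability exp(-delta/T), letting the
    search escape local minima; the temperature T decays geometrically.
    """
    rng = random.Random(seed)
    x = list(x0)
    e = error(x)
    best, e_best = list(x), e
    t = t0
    for _ in range(iters):
        # Propose a random perturbation of every parameter
        cand = [xi + rng.uniform(-step, step) for xi in x]
        e_cand = error(cand)
        delta = e_cand - e
        if delta <= 0 or rng.random() < math.exp(-delta / t):
            x, e = cand, e_cand
            if e < e_best:
                best, e_best = list(x), e
        t *= cooling  # geometric cooling schedule
    return best, e_best

# Toy error function: squared deviation from assumed "experimental" targets.
targets = (1.0, -2.0)
err = lambda p: sum((pi - ti) ** 2 for pi, ti in zip(p, targets))
params, residual = simulated_annealing(err, [0.0, 0.0])
```

SGSA-style variants mainly change the candidate-generation and acceptance distributions; the geometric cooling above is just one common choice.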
Sexual selection and mate choice.
Andersson, Malte; Simmons, Leigh W
2006-06-01
The past two decades have seen extensive growth of sexual selection research. Theoretical and empirical work has clarified many components of pre- and postcopulatory sexual selection, such as aggressive competition, mate choice, sperm utilization and sexual conflict. Genetic mechanisms of mate choice evolution have been less amenable to empirical testing, but molecular genetic analyses can now be used for incisive experimentation. Here, we highlight some of the currently debated areas in pre- and postcopulatory sexual selection. We identify where new techniques can help estimate the relative roles of the various selection mechanisms that might work together in the evolution of mating preferences and attractive traits, and in sperm-egg interactions.
Trends in Materials' Outgassing Technology
NASA Technical Reports Server (NTRS)
Colony, J. A.
1979-01-01
Test sample acquisition and chemical analysis techniques for outgassing products from spacecraft, experiment modules, and support equipment are described. The reduction of test data to a computer-compatible format to implement materials selection policies is also described. A list of the most troublesome outgassing species is given, and several materials correlations are discussed. Outgassing from solar panels, thermal blankets, and wire insulation is examined individually.
Tseng, Shao-Chin; Yu, Chen-Chieh; Wan, Dehui; Chen, Hsuen-Li; Wang, Lon Alex; Wu, Ming-Chung; Su, Wei-Fang; Han, Hsieh-Cheng; Chen, Li-Chyong
2012-06-05
Convenient, rapid, and accurate detection of chemicals and biomolecules would be of great benefit to the medical, pharmaceutical, and environmental sciences. Many chemical and biosensors based on metal nanoparticles (NPs) have been developed. However, as a result of the inconvenience and complexity of most current preparation techniques, surface plasmon-based test papers are not as common as, for example, litmus paper, which finds daily use. In this paper, we propose a convenient and practical technique, based on the photothermal effect, to fabricate plasmonic test paper. This technique is superior to other reported methods for its rapid fabrication time (a few seconds), large-area throughput, selectivity in the positioning of the NPs, and capability of preparing high-density NP arrays on various paper substrates. In addition to its low cost, portability, flexibility, and biodegradability, plasmonic test paper can be burned after detecting contagious biomolecules, making it safe and eco-friendly.
Applications of Computational Methods for Dynamic Stability and Control Derivatives
NASA Technical Reports Server (NTRS)
Green, Lawrence L.; Spence, Angela M.
2004-01-01
Initial steps in the application of a low-order panel method computational fluid dynamic (CFD) code to the calculation of aircraft dynamic stability and control (S&C) derivatives are documented. Several capabilities, unique to CFD but not unique to this particular demonstration, are identified and demonstrated in this paper. These unique capabilities complement conventional S&C techniques and they include the ability to: 1) perform maneuvers without the flow-kinematic restrictions and support interference commonly associated with experimental S&C facilities, 2) easily simulate advanced S&C testing techniques, 3) compute exact S&C derivatives with uncertainty propagation bounds, and 4) alter the flow physics associated with a particular testing technique from those observed in a wind or water tunnel test in order to isolate effects. Also presented are discussions about some computational issues associated with the simulation of S&C tests and selected results from numerous surface grid resolution studies performed during the course of the study.
Comparison of Feature Selection Techniques in Machine Learning for Anatomical Brain MRI in Dementia.
Tohka, Jussi; Moradi, Elaheh; Huttunen, Heikki
2016-07-01
We present a comparative split-half resampling analysis of various data-driven feature selection and classification methods for whole-brain voxel-based classification analysis of anatomical magnetic resonance images. We compared support vector machines (SVMs), with or without filter-based feature selection, several embedded feature selection methods, and stability selection. While comparisons of the accuracy of various classification methods have been reported previously, the variability of the out-of-training-sample classification accuracy and of the set of selected features due to independent training and test sets has not been previously addressed in a brain imaging context. We studied two classification problems: 1) Alzheimer's disease (AD) vs. normal control (NC) and 2) mild cognitive impairment (MCI) vs. NC classification. In AD vs. NC classification, the variability in the test accuracy due to the subject sample did not vary between different methods and exceeded the variability due to different classifiers. In MCI vs. NC classification, particularly with a large training set, embedded feature selection methods outperformed SVM-based ones, with the difference in test accuracy exceeding the test accuracy variability due to the subject sample. The filter and embedded methods produced divergent feature patterns for MCI vs. NC classification, suggesting the utility of embedded feature selection for this problem given its good generalization performance. The stability of the feature sets was strongly correlated with the number of features selected, weakly correlated with the stability of classification accuracy, and uncorrelated with the average classification accuracy.
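The split-half resampling protocol described above can be illustrated schematically. The sketch below uses plain Python, synthetic data, and a simple correlation filter standing in for the paper's SVM/embedded methods; it measures how stable a selected feature set is across independent halves of the sample (all data and parameters are invented for illustration):

```python
import random

def correlation_score(xs, ys):
    # Absolute Pearson correlation between one feature column and the labels
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return abs(cov / (sx * sy)) if sx and sy else 0.0

def select_features(data, labels, k):
    # Filter selection: rank features by |correlation| with the label, keep top k
    n_feat = len(data[0])
    scores = [correlation_score([row[j] for row in data], labels)
              for j in range(n_feat)]
    return set(sorted(range(n_feat), key=lambda j: -scores[j])[:k])

def split_half_stability(data, labels, k, n_splits=20, seed=0):
    # Repeatedly split the sample in half and measure the Jaccard overlap of
    # the feature sets selected on each half (a simple stability index)
    rng = random.Random(seed)
    idx = list(range(len(data)))
    overlaps = []
    for _ in range(n_splits):
        rng.shuffle(idx)
        half = len(idx) // 2
        a, b = idx[:half], idx[half:]
        fa = select_features([data[i] for i in a], [labels[i] for i in a], k)
        fb = select_features([data[i] for i in b], [labels[i] for i in b], k)
        overlaps.append(len(fa & fb) / len(fa | fb))
    return sum(overlaps) / len(overlaps)

# Synthetic sample: feature 0 carries the label signal, features 1-4 are noise
rng = random.Random(1)
labels = [i % 2 for i in range(100)]
data = [[labels[i] + rng.gauss(0, 0.3)] + [rng.gauss(0, 1) for _ in range(4)]
        for i in range(100)]
stability = split_half_stability(data, labels, k=1)
```

With a strong, sparse signal the same feature is selected on both halves, so the stability index approaches 1; with pure noise it collapses toward 0.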
Model for spectral and chromatographic data
Jarman, Kristin [Richland, WA; Willse, Alan [Richland, WA; Wahl, Karen [Richland, WA; Wahl, Jon [Richland, WA
2002-11-26
A method and apparatus using a spectral analysis technique are disclosed. In one form of the invention, probabilities are selected to characterize the presence (and in another form, also a quantification of a characteristic) of peaks in an indexed data set for samples that match a reference species, and other probabilities are selected for samples that do not match the reference species. An indexed data set is acquired for a sample, and a determination is made according to techniques exemplified herein as to whether the sample matches or does not match the reference species. When quantification of peak characteristics is undertaken, the model is appropriately expanded, and the analysis accounts for the characteristic model and data. Further techniques are provided to apply the methods and apparatuses to process control, cluster analysis, hypothesis testing, analysis of variance, and other procedures involving multiple comparisons of indexed data.
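The match/no-match decision described above has the flavor of a likelihood-ratio test over peak-presence probabilities. A minimal sketch under assumed Bernoulli peak models follows; the probabilities are illustrative, not taken from the patent:

```python
import math

def log_likelihood_ratio(peaks_present, p_match, p_nonmatch):
    """Log-likelihood ratio that an observed peak pattern came from the
    reference species rather than from a non-matching sample.

    peaks_present: booleans, one per indexed position (peak seen or not)
    p_match[i]:    probability peak i is present when the sample matches
    p_nonmatch[i]: probability peak i is present when it does not match
    """
    llr = 0.0
    for seen, pm, pn in zip(peaks_present, p_match, p_nonmatch):
        if seen:
            llr += math.log(pm / pn)
        else:
            llr += math.log((1 - pm) / (1 - pn))
    return llr

# Reference model: three characteristic peaks, likely present under a match
# and unlikely otherwise (illustrative values)
p_match = [0.95, 0.90, 0.85]
p_nonmatch = [0.10, 0.15, 0.20]

sample = [True, True, True]   # all characteristic peaks observed
score = log_likelihood_ratio(sample, p_match, p_nonmatch)
decision = "match" if score > 0 else "no match"
```

A positive log-ratio favors the reference species; in practice the decision threshold would be tuned to the desired false-match rate rather than fixed at zero.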
The Development of Selective Copying: Children's Learning from an Expert versus Their Mother
ERIC Educational Resources Information Center
Lucas, Amanda J.; Burdett, Emily R. R.; Burgess, Vanessa; Wood, Lara A.; McGuigan, Nicola; Harris, Paul L.; Whiten, Andrew
2017-01-01
This study tested the prediction that, with age, children should rely less on familiarity and more on expertise in their selective social learning. Experiment 1 (N = 50) found that 5- to 6-year-olds copied the technique their mother used to extract a prize from a novel puzzle box, in preference to both a stranger and an established expert. This…
Neural net classification of x-ray pistachio nut data
NASA Astrophysics Data System (ADS)
Casasent, David P.; Sipe, Michael A.; Schatzki, Thomas F.; Keagy, Pamela M.; Le, Lan Chau
1996-12-01
Classification results for agricultural products are presented using a new neural network. This neural network inherently produces higher-order decision surfaces. It achieves this with fewer hidden layer neurons than other classifiers require. This gives better generalization. It uses new techniques to select the number of hidden layer neurons and adaptive algorithms that avoid other such ad hoc parameter selection problems; it allows selection of the best classifier parameters without the need to analyze the test set results. The agriculture case study considered is the inspection and classification of pistachio nuts using x-ray imagery. Present inspection techniques cannot provide good rejection of worm damaged nuts without rejecting too many good nuts. X-ray imagery has the potential to provide 100% inspection of such agricultural products in real time. Only preliminary results are presented, but these indicate the potential to reduce major defects to 2% of the crop with 1% of good nuts rejected. Future image processing techniques that should provide better features to improve performance and allow inspection of a larger variety of nuts are noted. These techniques and variations of them have uses in a number of other agricultural product inspection problems.
NASA Astrophysics Data System (ADS)
Mrozek, T.; Perlicki, K.; Tajmajer, T.; Wasilewski, P.
2017-08-01
The article presents an image analysis method, based on the asynchronous delay tap sampling (ADTS) technique, which is used for simultaneous monitoring of various impairments occurring in the physical layer of an optical network. The ADTS method enables visualization of the optical signal in the form of characteristics (so-called phase portraits) that change their shape under the influence of impairments such as chromatic dispersion, polarization mode dispersion, and ASE noise. Using this method, a simulation model was built with OptSim 4.0. After the simulation study, data were obtained in the form of images that were further analyzed using a convolutional neural network algorithm. The main goal of the study was to train a convolutional neural network to recognize a selected impairment (distortion), then to test its accuracy and estimate the impairment for a selected set of test images. The input data consisted of processed binary images in the form of two-dimensional matrices indexed by pixel position. This article focuses only on the analysis of images containing chromatic dispersion.
Banerjee, Satarupa; Pal, Mousumi; Chakrabarty, Jitamanyu; Petibois, Cyril; Paul, Ranjan Rashmi; Giri, Amita; Chatterjee, Jyotirmoy
2015-10-01
In search of specific label-free biomarkers for differentiation of two oral lesions, namely oral leukoplakia (OLK) and oral squamous-cell carcinoma (OSCC), Fourier-transform infrared (FTIR) spectroscopy was performed on paraffin-embedded tissue sections from 47 human subjects (eight normal (NOM), 16 OLK, and 23 OSCC). Difference between mean spectra (DBMS), Mann-Whitney's U test, and forward feature selection (FFS) techniques were used for optimising spectral-marker selection. Classification of diseases was performed with linear and quadratic support vector machines (SVM) at 10-fold cross-validation, using different combinations of spectral features. It was observed that the six features obtained through FFS (1782, 1713, 1665, 1545, 1409, and 1161 cm(-1)) enabled differentiation of NOM and OSCC tissue and were the most significant, able to classify OLK and OSCC with 81.3% sensitivity, 95.7% specificity, and 89.7% overall accuracy. The 43 spectral markers extracted through Mann-Whitney's U test were the least significant when quadratic SVM was used. Considering the high sensitivity and specificity of the FFS technique, extracting only six spectral biomarkers was thus most useful for diagnosis of OLK and OSCC, and for overcoming the inter- and intra-observer variability experienced in best-practice diagnostic histopathological procedure. By considering the biochemical assignment of these six spectral signatures, this work also revealed altered glycogen and keratin content in histological sections that could discriminate OLK and OSCC. The method was validated through spectral selection by the DBMS technique. It thus has potential to minimise diagnostic costs for oral lesions through label-free biomarker identification.
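Forward feature selection of the kind used above is a greedy wrapper: start with no features and repeatedly add the single feature that most improves held-out accuracy. A self-contained sketch follows, with synthetic data and a nearest-centroid classifier standing in for the paper's SVM (all names, data, and parameters are illustrative):

```python
import random

def nearest_centroid_accuracy(train, test, feats):
    # Classify by distance to per-class centroids, using only features `feats`
    classes = sorted({y for _, y in train})
    cents = {}
    for c in classes:
        rows = [x for x, y in train if y == c]
        cents[c] = [sum(x[j] for x in rows) / len(rows) for j in feats]
    correct = 0
    for x, y in test:
        pred = min(classes, key=lambda c: sum((x[j] - cj) ** 2
                                              for j, cj in zip(feats, cents[c])))
        correct += pred == y
    return correct / len(test)

def forward_feature_selection(train, test, n_features, k):
    # Greedy FFS: at each step add the feature that most improves held-out
    # accuracy, until k features have been chosen
    chosen = []
    for _ in range(k):
        best = max((f for f in range(n_features) if f not in chosen),
                   key=lambda f: nearest_centroid_accuracy(train, test, chosen + [f]))
        chosen.append(best)
    return chosen

# Synthetic two-class data: feature 2 separates the classes, the rest is noise
rng = random.Random(3)
def make(n, label):
    return [([rng.gauss(0, 1), rng.gauss(0, 1),
              rng.gauss(3 * label, 0.5), rng.gauss(0, 1)], label)
            for _ in range(n)]

train = make(30, 0) + make(30, 1)
test = make(10, 0) + make(10, 1)
selected = forward_feature_selection(train, test, n_features=4, k=2)
```

Because the selection criterion is held-out accuracy, FFS tends to pick the discriminative feature first and ignore redundant ones, which is why only six of the FTIR bands survived in the study.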
NASA Astrophysics Data System (ADS)
Goeters, Klaus-Martin; Fassbender, Christoph
A unique composition of personality assessment methods was applied to a group of 97 ESA scientists and engineers. This group is highly comparable to real astronaut candidates with respect to age and education. The tests used include personality questionnaires, group problem-solving exercises, and a projective technique. The study goals were: 1. Verification of the psychometric qualities and applicability of the tests to the target group; 2. Search for culture-fair tests by which multi-national European groups can be examined; 3. Identification of test methods by which the adaptability of candidates to the psycho-social stress of long-duration space flights can be assessed. Based on the empirical findings, a test battery was defined which can be used in the selection of ESA space personnel.
Cider fermentation process monitoring by Vis-NIR sensor system and chemometrics.
Villar, Alberto; Vadillo, Julen; Santos, Jose I; Gorritxategi, Eneko; Mabe, Jon; Arnaiz, Aitor; Fernández, Luis A
2017-04-15
Optimization of a multivariate calibration process has been undertaken for a Visible-Near Infrared (400-1100 nm) sensor system, applied in the monitoring of the fermentation process of the cider produced in the Basque Country (Spain). The main parameters monitored included alcoholic proof, l-lactic acid content, glucose+fructose, and acetic acid content. The multivariate calibration was carried out using a combination of different variable selection techniques, and the most suitable pre-processing strategies were selected based on the spectral characteristics obtained by the sensor system. The variable selection techniques studied in this work include the Martens Uncertainty test, interval Partial Least Squares regression (iPLS), and a Genetic Algorithm (GA). This procedure arises from the need to improve the prediction ability of the calibration models for cider monitoring.
TREATMENT OF AMMONIA PLANT PROCESS CONDENSATE EFFLUENT
The report gives results of an examination of contaminant content and selected treatment techniques for process condensate from seven different ammonia plants. Field tests were performed and data collected on an in-plant steam stripping column with vapor injection into the reform...
47 CFR 22.321 - Equal employment opportunities.
Code of Federal Regulations, 2010 CFR
2010-10-01
... employment advertisements in media which have significant circulation among minority groups in the recruiting area. (D) Recruiting through schools and colleges with significant minority group enrollments. (E... selection techniques or tests that have the effect of discriminating against minority groups or females...
Buck, Ursula; Buße, Kirsten; Campana, Lorenzo; Schyma, Christian
2018-03-01
Three-dimensional (3D) measurement techniques are gaining importance in many areas. The latest developments have brought more cost-effective, user-friendly, and faster technologies onto the market. Which 3D techniques are suitable in the field of forensic medicine, and what are their advantages and disadvantages? This wide-ranging study evaluated and validated various 3D measurement techniques against forensic requirements. High-tech methods as well as low-budget systems were tested and compared in terms of accuracy, ease of use, expenditure of time, mobility, cost, necessary know-how, and their limitations. Within this study, various commercial measuring systems of the different techniques were tested. Based on the first results, one measuring system was selected for each technique, which appeared to be the most suitable for the forensic application or is already established in forensic medicine. The body of a deceased person, the face and an injury of a living person, and a shoe sole were recorded by 11 people with different professions and previous knowledge using the selected systems. The results were assessed and the personal experiences were evaluated using a questionnaire. In addition, precision investigations were carried out using test objects. The study shows that the hand-held scanner and photogrammetry are very suitable for the 3D documentation of forensic medical findings. Their moderate acquisition costs and easy operation could lead to more frequent application in forensic medicine in the future. For special applications, the stripe-light scanner still has its justification due to its high precision, flexible application area, and high reliability. The results show that, thanks to technological advances, 3D measurement technology will have an increasing impact on routine forensic medical examination.
Damage evaluation and repair methods for prestressed concrete bridge members
NASA Astrophysics Data System (ADS)
Shanafelt, G. O.; Horn, W. B.
1980-11-01
The types of accidental damage occurring and the severity and frequency of their occurrence are summarized. Practices and equipment used for assessing damage and making repairs are presented and evaluated. Guidelines for inspection, assessing damage, and selection of repair methods are given. Methods of repair include adding external prestress, a metal sleeve splice, and splicing broken strands or rods. The findings of this study suggest that in some instances better repair techniques should be used; they also indicate that proper selection of repair methods may reduce the number of damaged girders presently being replaced. Plausible methods of repair requiring additional research are identified, and techniques for testing are outlined.
Large Field Photogrammetry Techniques in Aircraft and Spacecraft Impact Testing
NASA Technical Reports Server (NTRS)
Littell, Justin D.
2010-01-01
The Landing and Impact Research Facility (LandIR) at NASA Langley Research Center is a 240 ft. high A-frame structure which is used for full-scale crash testing of aircraft and rotorcraft vehicles. Because the LandIR provides a unique capability to introduce impact velocities in the forward and vertical directions, it is also serving as the facility for landing tests on full-scale and sub-scale Orion spacecraft mass simulators. Recently, a three-dimensional photogrammetry system was acquired to assist with the gathering of vehicle flight data before, throughout and after the impact. This data provides the basis for the post-test analysis and data reduction. Experimental setups for pendulum swing tests on vehicles having both forward and vertical velocities can extend to 50 x 50 x 50 foot cubes, while weather, vehicle geometry, and other constraints make each experimental setup unique to each test. This paper will discuss the specific calibration techniques for large fields of views, camera and lens selection, data processing, as well as best practice techniques learned from using the large field of view photogrammetry on a multitude of crash and landing test scenarios unique to the LandIR.
Advanced techniques for determining long term compatibility of materials with propellants
NASA Technical Reports Server (NTRS)
Green, R. L.
1972-01-01
The search for advanced measurement techniques for determining long term compatibility of materials with propellants was conducted in several parts. A comprehensive survey of the existing measurement and testing technology for determining material-propellant interactions was performed. Selections were made from those existing techniques which were determined could meet or be made to meet the requirements. Areas of refinement or changes were recommended for improvement of others. Investigations were also performed to determine the feasibility and advantages of developing and using new techniques to achieve significant improvements over existing ones. The most interesting demonstration was that of the new technique, the volatile metal chelate analysis. Rivaling the neutron activation analysis in terms of sensitivity and specificity, the volatile metal chelate technique was fully demonstrated.
High-Grading Lunar Samples for Return to Earth
NASA Technical Reports Server (NTRS)
Allen, Carlton; Sellar, Glenn; Nunez, Jorge; Winterhalter, Daniel; Farmer, Jack
2009-01-01
Astronauts on long-duration lunar missions will need the capability to "high-grade" their samples to select the highest value samples for transport to Earth and to leave others on the Moon. We are supporting studies to define the "necessary and sufficient" measurements and techniques for high-grading samples at a lunar outpost. A glovebox, dedicated to testing instruments and techniques for high-grading samples, is in operation at the JSC Lunar Experiment Laboratory.
Software component quality evaluation
NASA Technical Reports Server (NTRS)
Clough, A. J.
1991-01-01
The paper describes a software inspection process that can be used to evaluate the quality of software components. Quality criteria, process application, independent testing of the process and proposed associated tool support are covered. Early results indicate that this technique is well suited for assessing software component quality in a standardized fashion. With automated machine assistance to facilitate both the evaluation and selection of software components, such a technique should promote effective reuse of software components.
Social insects and selfish genes.
Bourke, A F
2001-10-01
Sometimes science advances because of a new idea. Sometimes, it's because of a new technique. When both occur together, exciting times result. In the study of social insects, DNA-based methods for measuring relatedness now allow increasingly detailed tests of Hamilton's theory of kin selection.
ERIC Educational Resources Information Center
Balim, Ali Günay
2013-01-01
This study aims at identifying the effects of the mind-mapping technique upon students' perceptions of inquiry-learning skills, academic achievement, and retention of knowledge. The study was carried out in the Science and Technology course. A quasi-experimental research design with a pre-test and post-test control group, which was selected from…
Development and Validation of Measures for Selecting Soldiers for the Officer Candidate School
2011-08-01
SJT, there has been a debate about what SJTs actually measure and why they work (cf. Moss & Hunt, 1926; Thorndike, 1936), a debate that continues... meta-analytic review and integration. Psychological Bulletin, 129, 914-945. Thorndike, R. L. (1936). Factor analysis of social and abstract... intelligence. The Journal of Educational Psychology, XXVII, 231-233. Thorndike, R. L. (1949). Personnel selection: Test and measurement techniques. New York
A New Catalog of Contact Binary Stars from ROTSE-I Sky Patrols
NASA Astrophysics Data System (ADS)
Gettel, S. J.; McKay, T. A.; Geske, M. T.
2005-05-01
Over 65,000 variable stars have been detected in the data from the ROTSE-I Sky Patrols. Using period-color and light curve selection techniques, about 5000 objects have been identified as contact binaries. This selection is tested for completeness against EW objects in the GCVS. By utilizing infrared color data from 2MASS, we fit a period-color-luminosity relation to these stars and estimate their distances.
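A period-color cut of the kind described can be sketched as a simple catalog filter: keep short-period stars whose color lies within a tolerance band around an approximate period-color relation. The linear relation and tolerances below are illustrative placeholders, not the coefficients fitted in the study:

```python
def select_contact_binaries(catalog, max_period_days=1.0):
    """Toy period-color cut in the spirit of the ROTSE-I selection.

    Contact binaries (EW type) have short periods and follow an approximate
    period-color relation; keep stars whose color sits within a tolerance
    band around an assumed linear relation.
    """
    chosen = []
    for star in catalog:
        p, color = star["period"], star["v_minus_i"]
        if p >= max_period_days:
            continue  # too long a period for a contact binary
        predicted = 1.5 - 1.2 * p   # illustrative period-color relation
        if abs(color - predicted) < 0.3:
            chosen.append(star["id"])
    return chosen

# Toy catalog: A passes both cuts, B has too long a period, C is off-relation
catalog = [
    {"id": "A", "period": 0.35, "v_minus_i": 1.1},
    {"id": "B", "period": 2.50, "v_minus_i": 1.1},
    {"id": "C", "period": 0.40, "v_minus_i": 0.2},
]
ew_candidates = select_contact_binaries(catalog)
```

In the actual study the cut would be combined with light-curve shape criteria before testing completeness against GCVS EW objects.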
Toledo-Machado, Christina Monerat; Machado de Avila, Ricardo Andrez; NGuyen, Christophe; Granier, Claude; Bueno, Lilian Lacerda; Carneiro, Claudia Martins; Menezes-Souza, Daniel; Carneiro, Rubens Antonio; Chávez-Olórtegui, Carlos; Fujiwara, Ricardo Toshio
2015-01-01
ELISA and RIFI are currently used for serodiagnosis of canine visceral leishmaniasis (CVL). The accuracy of these tests is controversial in endemic areas where canine infections by Trypanosoma cruzi may occur. We evaluated the usefulness of synthetic peptides, selected through the phage display technique, in the serodiagnosis of CVL. Peptides were chosen based on their ability to bind to IgGs purified from pooled sera of infected dogs. We selected three phage clones that reacted only with those IgGs. Peptides were synthesized, polymerized with glutaraldehyde, and used as antigens in ELISA assays. Each individual peptide, or a mix of them, was reactive with sera from infected dogs. The assay was highly sensitive and specific when compared to soluble Leishmania antigen, which showed cross-reactivity with anti-T. cruzi IgGs. Our results demonstrate that the phage display technique is useful for the selection of peptides that may represent valuable synthetic antigens for improved serodiagnosis of CVL. PMID:25710003
Fuzzy System-Based Target Selection for a NIR Camera-Based Gaze Tracker
Naqvi, Rizwan Ali; Arsalan, Muhammad; Park, Kang Ryoung
2017-01-01
Gaze-based interaction (GBI) techniques have been a popular subject of research in the last few decades. Among other applications, GBI can be used by persons with disabilities to perform everyday tasks, serve as a game interface, and play a pivotal role in the human-computer interface (HCI) field. While gaze tracking systems have shown high accuracy in GBI, detecting a user's gaze for target selection is a challenging problem that needs to be considered while using a gaze detection system. Past research has used the blinking of the eyes for this purpose as well as dwell-time-based methods, but these techniques are either inconvenient for the user or require a long time for target selection. Therefore, in this paper, we propose a method for fuzzy system-based target selection for near-infrared (NIR) camera-based gaze trackers. The results of experiments performed, in addition to tests of the usability and on-screen keyboard use of the proposed method, show that it is better than previous methods. PMID:28420114
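A fuzzy selection decision of this kind can be sketched with triangular membership functions and a single Mamdani-style rule. The membership breakpoints and threshold below are invented for illustration; the paper's actual fuzzy system for the NIR gaze tracker is more elaborate:

```python
def tri(x, a, b, c):
    # Triangular membership function rising from a, peaking at b, falling to c
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def selection_score(dwell_ms, dispersion_px):
    # Fuzzy inputs: how "long" the dwell is and how "steady" the gaze is
    long_dwell = tri(dwell_ms, 200, 800, 1400)
    steady = tri(dispersion_px, -1, 0, 40)
    # One Mamdani-style rule: IF dwell is long AND gaze is steady THEN select.
    # min() implements the fuzzy AND; the score is the rule's firing strength.
    return min(long_dwell, steady)

def is_target_selected(dwell_ms, dispersion_px, threshold=0.5):
    return selection_score(dwell_ms, dispersion_px) >= threshold
```

Compared with a hard dwell-time cutoff, the fuzzy score degrades gracefully: a slightly shorter dwell can still trigger selection if the gaze is very steady, which is the usability advantage such systems aim for.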
NASA Technical Reports Server (NTRS)
Birch, J. N.; Getzin, N.
1971-01-01
Analog and digital voice coding techniques for application to an L-band satellite-based air traffic control (ATC) system for over-ocean deployment are examined. In addition to performance, the techniques are compared on the basis of cost, size, weight, power consumption, availability, reliability, and multiplexing features. Candidate systems are chosen on the basis of minimum required RF bandwidth and received carrier-to-noise density ratios. A detailed survey of automated and nonautomated intelligibility testing methods and devices is presented and comparisons are given. Subjective evaluation of speech systems by preference tests is considered. Conclusions and recommendations are developed regarding the selection of the voice system. Likewise, conclusions and recommendations are developed for the appropriate use of intelligibility tests, speech quality measurements, and preference tests within the framework of the proposed ATC system.
Local defect resonance for sensitive non-destructive testing
NASA Astrophysics Data System (ADS)
Adebahr, W.; Solodov, I.; Rahammer, M.; Gulnizkij, N.; Kreutzbruck, M.
2016-02-01
Ultrasonic wave-defect interaction is the background of ultrasound-activated techniques for imaging and non-destructive testing (NDT) of materials and industrial components. The interaction primarily results in an acoustic response of a defect, which produces the attenuation and scattering of ultrasound used as an indicator of defects in conventional ultrasonic NDT. Derivative ultrasound-induced effects include, e.g., nonlinear, thermal, and acousto-optic responses, which are also applied for NDT and defect imaging. These secondary effects are normally relatively inefficient, so the corresponding NDT techniques require elevated acoustic power and stand out from conventional ultrasonic NDT counterparts for their specific instrumentation particularly adapted to high-power ultrasonics. In this paper, a consistent way to enhance ultrasonic, optical, and thermal defect responses, and thus to reduce the ultrasonic power required, is suggested by using selective ultrasonic activation of defects based on the concept of local defect resonance (LDR). A strong increase in vibration amplitude at LDR makes it possible to reliably detect and visualize the defect as soon as the driving ultrasonic frequency is matched to the LDR frequency. This also provides a high frequency selectivity of LDR-based imaging, i.e., the ability to detect a certain defect among a multitude of other defects in a material. Some examples are shown of how to use LDR in non-destructive testing techniques, such as vibrometry, ultrasonic thermography, and shearography, in order to enhance the sensitivity of defect visualization.
Evaluation of Three Different Processing Techniques in the Fabrication of Complete Dentures
Chintalacheruvu, Vamsi Krishna; Balraj, Rajasekaran Uttukuli; Putchala, Lavanya Sireesha; Pachalla, Sreelekha
2017-01-01
Aims and Objectives: The objective of the present study was to compare the effectiveness of three different processing techniques and to determine the accuracy of the processing techniques through the number of occlusal interferences and the increase in vertical dimension after denture processing. Materials and Methods: A cross-sectional study was conducted on a sample of 18 patients indicated for complete denture fabrication; they were divided into three subgroups. Three processing techniques, compression molding and injection molding using prepolymerized resin and unpolymerized resin, were used to fabricate dentures for each of the groups. After processing, laboratory-remounted dentures were evaluated for the number of occlusal interferences in centric and eccentric relations and for change in vertical dimension through vertical pin rise in the articulator. Data were analyzed using one-way ANOVA in SPSS software version 19.0 (IBM). Results: Data obtained from the three groups were subjected to a one-way ANOVA test; results with significant variations were then subjected to a post hoc test. The number of occlusal interferences with the compression molding technique was reported to be higher in both centric and eccentric positions compared to the two injection molding techniques, with statistical significance in centric, protrusive, right lateral nonworking, and left lateral working positions (P < 0.05). Mean vertical pin rise (0.52 mm) was reported to be greater with the compression molding technique than with the injection molding techniques, which is statistically significant (P < 0.001). Conclusions: Within the limitations of this study, injection molding techniques exhibited fewer processing errors than the compression molding technique, with statistical significance. There was no statistically significant difference in processing errors reported between the two injection molding systems. PMID:28713763
Selection of optimal sensors for predicting performance of polymer electrolyte membrane fuel cell
NASA Astrophysics Data System (ADS)
Mao, Lei; Jackson, Lisa
2016-10-01
In this paper, sensor selection algorithms are investigated based on a sensitivity analysis, and the capability of optimal sensors in predicting PEM fuel cell performance is also studied using test data. The fuel cell model is developed for generating the sensitivity matrix relating sensor measurements and fuel cell health parameters. From the sensitivity matrix, two sensor selection approaches, including the largest gap method, and exhaustive brute force searching technique, are applied to find the optimal sensors providing reliable predictions. Based on the results, a sensor selection approach considering both sensor sensitivity and noise resistance is proposed to find the optimal sensor set with minimum size. Furthermore, the performance of the optimal sensor set is studied to predict fuel cell performance using test data from a PEM fuel cell system. Results demonstrate that with optimal sensors, the performance of PEM fuel cell can be predicted with good quality.
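The exhaustive (brute-force) subset search can be sketched as below. The sensitivity values are invented, and the D-optimality-style criterion (determinant of SᵀS) is an assumed stand-in for the paper's exact metric.

```python
# Hedged sketch of brute-force sensor subset selection: pick the sensor
# set whose rows of the sensitivity matrix jointly carry the most
# information about the health parameters. Sensitivities are made up;
# det(S^T S) is one common (D-optimality-style) criterion, not
# necessarily the authors' exact metric.
from itertools import combinations

# Rows: candidate sensors; columns: two fuel-cell health parameters.
S = {
    "voltage":     [0.9, 0.1],
    "current":     [0.8, 0.2],
    "temperature": [0.1, 0.9],
    "pressure":    [0.2, 0.7],
}

def det_StS(rows):
    """det(S^T S) for a list of 2-column sensitivity rows."""
    a = sum(r[0] * r[0] for r in rows)
    b = sum(r[0] * r[1] for r in rows)
    d = sum(r[1] * r[1] for r in rows)
    return a * d - b * b

def best_sensor_set(S, k):
    return max(combinations(S, k),
               key=lambda names: det_StS([S[n] for n in names]))

best = best_sensor_set(S, 2)
```

Note how the criterion rewards complementary sensors (one sensitive to each parameter) over two sensors that duplicate the same information.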
Inferring the Mode of Selection from the Transient Response to Demographic Perturbations
NASA Astrophysics Data System (ADS)
Balick, Daniel; Do, Ron; Reich, David; Sunyaev, Shamil
2014-03-01
Despite substantial recent progress in theoretical population genetics, most models work under the assumption of a constant population size. Deviations from fixed population sizes are ubiquitous in natural populations, many of which experience population bottlenecks and re-expansions. The non-equilibrium dynamics introduced by a large perturbation in population size are generally viewed as a confounding factor. In the present work, we take advantage of the transient response to a population bottleneck to infer features of the mode of selection and the distribution of selective effects. We develop an analytic framework and a corresponding statistical test that qualitatively differentiates between alleles under additive and those under recessive or more general epistatic selection. This statistic can be used to bound the joint distribution of selective effects and dominance effects in any diploid sexual organism. We apply this technique to human population genetic data, and severely restrict the space of allowed selective coefficients in humans. Additionally, one can test a set of functionally or medically relevant alleles for the primary mode of selection, or determine the local regional variation in dominance coefficients along the genome.
Westenberger, Benjamin J; Ellison, Christopher D; Fussner, Andrew S; Jenney, Susan; Kolinski, Richard E; Lipe, Terra G; Lyon, Robbe C; Moore, Terry W; Revelle, Larry K; Smith, Anjanette P; Spencer, John A; Story, Kimberly D; Toler, Duckhee Y; Wokovich, Anna M; Buhse, Lucinda F
2005-12-08
This work investigated the use of non-traditional analytical methods to evaluate the quality of a variety of pharmaceutical products purchased via internet sites from foreign sources and compared the results with those obtained from conventional quality assurance methods. Traditional analytical techniques employing HPLC for potency, content uniformity, chromatographic purity and drug release profiles were used to evaluate the quality of five selected drug products (fluoxetine hydrochloride, levothyroxine sodium, metformin hydrochloride, phenytoin sodium, and warfarin sodium). Non-traditional techniques, such as near infrared spectroscopy (NIR), NIR imaging and thermogravimetric analysis (TGA), were employed to verify the results and investigate their potential as alternative testing methods. Two of 20 samples failed USP monographs for quality attributes. The additional analytical methods found 11 of 20 samples had different formulations when compared to the U.S. product. Seven of the 20 samples arrived in questionable containers, and 19 of 20 had incomplete labeling. Only 1 of the 20 samples had final packaging similar to the U.S. products. The non-traditional techniques complemented the traditional techniques used and highlighted additional quality issues for the products tested. For example, these methods detected suspect manufacturing issues (such as blending), which were not evident from traditional testing alone.
NASA Astrophysics Data System (ADS)
Zeilik, M.; Garvin-Doxas, K.
2003-12-01
FLAG, the Field-tested Learning Assessment Guide (http://www.flaguide.org/), is an NSF-funded website that offers broadly applicable, self-contained modular classroom assessment techniques (CATs) and discipline-specific tools for STEM instructors creating new approaches to evaluate student learning, attitudes, and performance. In particular, the FLAG contains proven techniques for alternative assessments, those needed for reformed, innovative STEM courses. Each tool has been developed, tested, and refined in real classrooms at colleges and universities. The FLAG also contains an assessment primer, a section to help you select the most appropriate assessment technique(s) for your course goals, and other resources. In addition to references on instrument development and field-tested instruments on attitudes towards science, the FLAG also includes discipline-specific tools in Physics, Astronomy, Biology, and Mathematics. Building of the Geoscience collection is currently under way with the development of an instrument for detecting misconceptions of incoming freshmen about space science, which is being developed with the help of the Committee on Space Science and Astronomy of the American Association of Physics Teachers. Additional field-tested resources from the Geosciences are solicited from the community. Contributions should be sent to Michael Zeilik, zeilik@la.unm.edu. This work has been supported in part by NSF grant DUE 99-81155.
On selecting satellite conjunction filter parameters
NASA Astrophysics Data System (ADS)
Alfano, Salvatore; Finkleman, David
2014-06-01
This paper extends concepts of signal detection theory to predict the performance of conjunction screening techniques and to guide the selection of keepout and screening thresholds. The most efficient way to identify satellites likely to collide is to employ filters to identify orbiting pairs that should not come close enough over a prescribed time period to be considered hazardous. Such pairings can then be eliminated from further computation to accelerate overall processing. Approximations inherent in filtering techniques include screening using only unperturbed Newtonian two-body astrodynamics and uncertainties in orbit elements. Therefore, every filtering process is vulnerable to including objects that are not threats and excluding some that are threats (Type I and Type II errors, respectively). The approach in this paper guides selection of the best operating point for the filters, suited to a user's tolerance for false alarms and unwarned threats. We demonstrate the approach using three archetypal filters with an initial three-day span, select filter parameters based on performance, and then test those parameters using eight historical snapshots of the space catalog. This work provides a mechanism for selecting filter parameters, but the choices depend on the circumstances.
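The Type I / Type II trade-off that drives the choice of operating point can be illustrated with a toy distance-threshold screen; the miss distances, labels, and thresholds below are synthetic and do not represent the paper's archetypal filters.

```python
# Toy screening filter: flag pairs whose predicted miss distance falls
# below a keepout threshold, then tabulate Type I errors (false alarms:
# flagged non-threats) and Type II errors (missed threats) as the
# threshold varies. All data are synthetic.

# (miss_distance_km, is_actual_threat)
pairs = [(0.5, True), (1.2, True), (3.0, False), (4.5, False),
         (0.8, True), (6.0, False), (2.5, True), (9.0, False)]

def screen_errors(pairs, threshold_km):
    """Return (false_alarms, missed_threats) for a given threshold."""
    false_alarms = sum(1 for d, threat in pairs if d < threshold_km and not threat)
    missed = sum(1 for d, threat in pairs if d >= threshold_km and threat)
    return false_alarms, missed

# Sweep thresholds to expose the Type I / Type II trade-off.
tradeoff = {t: screen_errors(pairs, t) for t in (1.0, 3.0, 5.0, 10.0)}
```

Tightening the threshold trades missed threats for processing savings; loosening it trades false alarms for safety, which is exactly the operating-point choice the paper formalizes.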
Derrac, Joaquín; Triguero, Isaac; Garcia, Salvador; Herrera, Francisco
2012-10-01
Cooperative coevolution is a successful trend of evolutionary computation which allows us to define partitions of the domain of a given problem, or to integrate several related techniques into one, by the use of evolutionary algorithms. It is possible to apply it to the development of advanced classification methods, which integrate several machine learning techniques into a single proposal. A novel approach integrating instance selection, instance weighting, and feature weighting into the framework of a coevolutionary model is presented in this paper. We compare it with a wide range of evolutionary and nonevolutionary related methods, in order to show the benefits of the employment of coevolution to apply the techniques considered simultaneously. The results obtained, contrasted through nonparametric statistical tests, show that our proposal outperforms other methods in the comparison, thus becoming a suitable tool in the task of enhancing the nearest neighbor classifier.
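A minimal sketch of the classifier being enhanced, a nearest neighbor rule with instance and feature weights, is shown below; the coevolutionary search that tunes the weights is omitted, and the weights and data are hypothetical.

```python
# Sketch of 1-NN with per-feature and per-instance weights, the two
# components the coevolutionary model tunes. Weights here are fixed,
# invented values; the evolutionary optimization itself is omitted.

def weighted_nn_predict(query, instances, feature_w, instance_w):
    """1-NN where instance weights scale the distance: down-weighted
    (noisy) instances look farther away and are selected less often."""
    def dist(i):
        x, _label = instances[i]
        d2 = sum(w * (a - b) ** 2 for w, a, b in zip(feature_w, x, query))
        return d2 / max(instance_w[i], 1e-9)
    best = min(range(len(instances)), key=dist)
    return instances[best][1]

instances = [((0.0, 0.0), "A"), ((1.0, 1.0), "B"), ((0.1, 0.9), "A")]
feature_w = [1.0, 0.1]        # second feature deemed nearly irrelevant
instance_w = [1.0, 1.0, 0.2]  # third instance deemed noisy
label = weighted_nn_predict((0.2, 1.0), instances, feature_w, instance_w)
```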
NASA Technical Reports Server (NTRS)
Kashangaki, Thomas A. L.
1992-01-01
This paper describes a series of modal tests that were performed on a cantilevered truss structure. The goal of the tests was to assemble a large database of high-quality modal test data for use in verification of proposed methods for on-orbit model verification and damage detection in flexible truss structures. A description of the hardware is provided along with details of the experimental setup and procedures for 16 damage cases. Results from selected cases are presented and discussed. Differences between ground vibration testing and on-orbit modal testing are also described.
High-Pressure Oxygen Test Evaluations
NASA Technical Reports Server (NTRS)
Schwinghamer, R. J.; Key, C. F.
1974-01-01
The relevance of impact sensitivity testing to the development of the space shuttle main engine is discussed in the light of the special requirements for the engine. The background and history of the evolution of liquid and gaseous oxygen testing techniques and philosophy are also discussed. The parameters critical to reliable testing are treated in considerable detail, and test apparatus and procedures are described and discussed. Materials threshold sensitivity determination procedures are considered, and a decision logic diagram for sensitivity threshold determination is presented. Finally, high-pressure materials sensitivity test data are given for selected metallic and nonmetallic materials.
'Enzyme Test Bench': A biochemical application of the multi-rate modeling
NASA Astrophysics Data System (ADS)
Rachinskiy, K.; Schultze, H.; Boy, M.; Büchs, J.
2008-11-01
In the expanding field of 'white biotechnology', enzymes are frequently applied to catalyze the biochemical reaction from a resource material to a valuable product. Evolutionarily designed to catalyze the metabolism of every life form, they selectively accelerate complex reactions under physiological conditions. Modern techniques, such as directed evolution, have been developed to satisfy the increasing demand for enzymes. Applying these techniques together with rational protein design, we aim at improving enzyme activity, selectivity, and stability. To tap the full potential of these techniques, it is essential to combine them with adequate screening methods. Nowadays a great number of high-throughput colorimetric and fluorescent enzyme assays are applied to measure initial enzyme activity. However, predicting long-term enzyme stability from short experiments is still a challenge. A new high-throughput technique for enzyme characterization with specific attention to long-term stability, called the 'Enzyme Test Bench', is presented. The concept of the Enzyme Test Bench consists of short-term enzyme tests conducted under partly extreme conditions to predict long-term enzyme stability under moderate conditions. The technique is based on mathematical modeling of temperature-dependent enzyme activation and deactivation. By adapting the temperature profiles in sequential experiments through optimum nonlinear experimental design, long-term deactivation effects can be purposefully accelerated and detected within hours. During the experiment the enzyme activity is measured online to estimate the model parameters from the obtained data. Thus, enzyme activity and long-term stability can be calculated as functions of temperature. The results of the characterization, based on microliter-format experiments lasting hours, are in good agreement with the results of long-term experiments in 1 L format.
Thus, the new technique allows for both enzyme screening with regard to long-term stability and the choice of the optimal process temperature. This article gives a successful example of the application of multi-rate modeling, experimental design, and parameter estimation within biochemical engineering. At the same time, it shows the limitations of these methods at the state of the art and addresses the current problems to the applied mathematics community.
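The core modeling idea, first-order enzyme deactivation with an Arrhenius temperature dependence, can be sketched as follows; the reference rate constant and activation energy are invented for illustration, and the real Test Bench estimates such parameters from online activity data.

```python
# Sketch of temperature-dependent deactivation: parameters fitted from
# short, hot experiments predict slow activity loss at a moderate
# storage temperature. Rate constants below are invented.
import math

R = 8.314  # gas constant, J/(mol*K)

def deactivation_rate(T_kelvin, k_ref, Ea, T_ref=298.15):
    """Arrhenius-scaled first-order deactivation rate constant (1/h)."""
    return k_ref * math.exp(-Ea / R * (1.0 / T_kelvin - 1.0 / T_ref))

def residual_activity(t_hours, T_kelvin, k_ref, Ea):
    """Fraction of initial activity after t hours at temperature T."""
    return math.exp(-deactivation_rate(T_kelvin, k_ref, Ea) * t_hours)

# Hypothetical parameters: k_ref = 1e-4 / h at 25 C, Ea = 80 kJ/mol.
k_ref, Ea = 1e-4, 80e3
hot_4h = residual_activity(4, 333.15, k_ref, Ea)         # 4 h at 60 C
cold_1000h = residual_activity(1000, 298.15, k_ref, Ea)  # ~6 weeks at 25 C
```

The point of the accelerated protocol is visible in the numbers: a few hours at elevated temperature probes the same deactivation kinetics that take weeks to unfold at the moderate temperature.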
Wang, Yen-Ling
2014-01-01
Checkpoint kinase 2 (Chk2) plays an important role in the response to DNA damage, particularly DNA double-strand breaks and related lesions. In this study, we concentrate on Chk2; the purpose is to find potential inhibitors using pharmacophore hypotheses (PhModels), combinatorial fusion, and virtual screening techniques. Applying combinatorial fusion to PhModels and virtual screening techniques is a novel strategy for drug design. We used combinatorial fusion to analyze the prediction results and obtained the best correlation coefficient for the testing set (r test = 0.816) by combining the BesttrainBesttest and FasttrainFasttest prediction results. Potential inhibitors were selected from the NCI database by screening according to the BesttrainBesttest + FasttrainFasttest prediction results and by molecular docking with the CDOCKER docking program. Finally, the selected compounds have high interaction energy between ligand and receptor. Through these approaches, 23 potential inhibitors of Chk2 were retrieved for further study. PMID:24864236
Solving the Swath Segment Selection Problem
NASA Technical Reports Server (NTRS)
Knight, Russell; Smith, Benjamin
2006-01-01
Several artificial-intelligence search techniques have been tested as means of solving the swath segment selection problem (SSSP), a real-world problem that is not only of interest in its own right, but is also useful as a test bed for search techniques in general. In simplest terms, the SSSP is the problem of scheduling the observation times of an airborne or spaceborne synthetic-aperture radar (SAR) system to effect the maximum coverage of a specified area (denoted the target), given a schedule of downlinks (opportunities for radio transmission of SAR scan data to a ground station), given the limit on the quantity of SAR scan data that can be stored in an onboard memory between downlink opportunities, and given the limit on the achievable downlink data rate. The SSSP is NP-complete (short for "nondeterministic polynomial time complete," characteristic of a class of intractable problems that could be solved in polynomial time only by a hypothetical computer capable of guessing a solution and then checking the guess in polynomial time).
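A toy instance of the SSSP makes the constraints concrete. The greedy value-density heuristic below is only for illustration and does not reproduce the AI search techniques the paper evaluates; all segment data are made up.

```python
# Toy swath segment selection: choose SAR scan segments to maximize
# target coverage subject to an onboard memory cap between downlinks.
# Greedy "coverage per stored bit" rule, shown purely to make the
# problem concrete; synthetic data.

# (segment_id, coverage_km2, data_gbits)
segments = [("s1", 40.0, 2.0), ("s2", 25.0, 1.0),
            ("s3", 30.0, 3.0), ("s4", 10.0, 0.5)]
MEMORY_GBITS = 3.5  # capacity until the next downlink opportunity

def greedy_schedule(segments, capacity):
    chosen, used, coverage = [], 0.0, 0.0
    # Prefer segments with low stored data per unit of coverage.
    for sid, cov, data in sorted(segments, key=lambda s: s[2] / s[1]):
        if used + data <= capacity:
            chosen.append(sid)
            used += data
            coverage += cov
    return chosen, coverage

plan, total = greedy_schedule(segments, MEMORY_GBITS)
```

Because the problem is NP-complete, such a greedy rule can be arbitrarily suboptimal on adversarial instances, which is why heuristic search techniques are of interest.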
Mallineni, S K; Anthonappa, R P; King, N M
2016-12-01
To assess the reliability of the vertical tube shift technique (VTST) and horizontal tube shift technique (HTST) for the localisation of unerupted supernumerary teeth (ST) in the anterior region of the maxilla. A convenience sample of 83 patients who attended a major teaching hospital because of unerupted ST was selected. Only non-syndromic patients with ST and who had complete clinical and radiographic and surgical records were included in the study. Ten examiners independently rated the paired set of radiographs for each technique. Chi-square test, paired t test and kappa statistics were employed to assess the intra- and inter-examiner reliability. Paired sets of 1660 radiographs (830 pairs for each technique) were available for the analysis. The overall sensitivity for VTST and HTST was 80.6 and 72.1% respectively, with slight inter-examiner and good intra-examiner reliability. Statistically significant differences were evident between the two localisation techniques (p < 0.05). Localisation of unerupted ST using VTST was more successful than HTST in the anterior region of the maxilla.
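The kappa statistic used for examiner agreement can be sketched for two hypothetical binary raters; the ratings below are invented and are not the study's data.

```python
# Cohen's kappa for two raters scoring the same radiograph pairs as
# correctly localised (1) or not (0). Ratings are hypothetical.

def cohens_kappa(r1, r2):
    """Chance-corrected agreement between two binary raters."""
    n = len(r1)
    observed = sum(a == b for a, b in zip(r1, r2)) / n
    p1_yes = sum(r1) / n
    p2_yes = sum(r2) / n
    # Agreement expected by chance from each rater's marginal rates.
    expected = p1_yes * p2_yes + (1 - p1_yes) * (1 - p2_yes)
    return (observed - expected) / (1 - expected)

rater1 = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]
rater2 = [1, 1, 0, 1, 1, 1, 1, 0, 0, 1]
kappa = cohens_kappa(rater1, rater2)
```

Values near 0 indicate chance-level agreement and values near 1 near-perfect agreement, which is the scale behind the abstract's "slight" inter-examiner and "good" intra-examiner reliability.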
Seminal quality prediction using data mining methods.
Sahoo, Anoop J; Kumar, Yugal
2014-01-01
Nowadays, some new classes of diseases have come into existence, known as lifestyle diseases. The main reasons behind these diseases are changes in the lifestyle of people, such as alcohol drinking, smoking, and food habits. A review of the various lifestyle diseases shows that fertility rates (sperm quantity) in men have decreased considerably over the last two decades. Lifestyle factors as well as environmental factors are mainly responsible for the change in semen quality. The objective of this paper is to identify the lifestyle and environmental features that affect seminal quality, and hence the fertility rate in men, using data mining methods. Five artificial intelligence techniques, multilayer perceptron (MLP), decision tree (DT), Naive Bayes (kernel), support vector machine plus particle swarm optimization (SVM+PSO), and support vector machine (SVM), were applied to a fertility dataset to evaluate seminal quality and to predict whether a person is normal or has an altered fertility rate. Eight feature selection techniques, support vector machine (SVM), neural network (NN), evolutionary logistic regression (LR), support vector machine plus particle swarm optimization (SVM+PSO), principal component analysis (PCA), the chi-square test, correlation, and the t-test, were used to identify the features most relevant to seminal quality. These techniques were applied to a fertility dataset containing 100 instances with nine attributes and two classes. The experimental results show that SVM+PSO provides higher accuracy and area under the curve (AUC) (94% and 0.932) than multilayer perceptron (MLP) (92% and 0.728), support vector machine (91% and 0.758), Naive Bayes (kernel) (89% and 0.850), and decision tree (89% and 0.735) for some of the seminal parameters. This paper also focuses on the feature selection process, i.e., how to select the features that are most important for predicting the fertility rate. Eight feature selection methods were applied to the fertility dataset to find a set of good features. The results show that the childish-diseases (0.079) and high-fever (0.057) features have little impact on the fertility rate, while age (0.8685), season (0.843), surgical intervention (0.7683), alcohol consumption (0.5992), smoking habit (0.575), number of hours spent sitting (0.4366), and accident (0.5973) have more impact. It is also observed that feature selection increases the accuracy of the above-mentioned techniques (multilayer perceptron 92%, support vector machine 91%, SVM+PSO 94%, Naive Bayes (kernel) 89%, and decision tree 89%) compared with no feature selection (multilayer perceptron 86%, support vector machine 86%, SVM+PSO 85%, Naive Bayes (kernel) 83%, and decision tree 84%), which shows the applicability of feature selection methods in prediction. This paper highlights the application of artificial intelligence techniques in the medical domain. It can be concluded that data mining methods can be used to predict whether a person has a disease based on environmental and lifestyle parameters/features, rather than undergoing various medical tests. Among the five data mining techniques used to predict the fertility rate, SVM+PSO provides more accurate results than the support vector machine and decision tree.
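One of the filter-style feature selection ideas the paper uses (a simple correlation ranking) can be sketched as below; the tiny dataset is synthetic and merely stands in for the fertility data.

```python
# Correlation-filter sketch: score each feature by the absolute
# Pearson correlation with the diagnosis label, keep the top-k.
# The dataset below is synthetic, not the fertility dataset.

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

def rank_features(rows, labels, names, k):
    scores = {name: abs(pearson([r[i] for r in rows], labels))
              for i, name in enumerate(names)}
    return sorted(scores, key=scores.get, reverse=True)[:k]

names = ["age", "smoking", "fever"]
rows = [(30, 1, 0), (35, 1, 0), (25, 0, 1), (40, 1, 0), (28, 0, 1), (45, 1, 0)]
labels = [1, 1, 0, 1, 0, 1]  # 1 = altered fertility (synthetic)
top2 = rank_features(rows, labels, names, 2)
```

The ranked subset would then be fed to a classifier (SVM, MLP, etc.), which is the filter-then-classify pipeline whose accuracy gains the abstract reports.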
NASA Astrophysics Data System (ADS)
Liu, Y.; Gao, J. R.; Lou, H. P.; Zhang, J. R.; Rauch, H. P.
2010-05-01
Assessing the potential of soil bioengineering techniques is important for river ecological restoration works in Beijing, which have attracted wide attention. First, basic knowledge of the technical and biological properties of plants is essential for the development of such techniques. Species for each chosen plant material type should be selected with an emphasis on the following: suitability for the anticipated environmental conditions, reasonable availability in the desired quantity, and probability of successful establishment. Based on these criteria, four species used in live staking and rooted cutting techniques were selected: Salix X aureo-pendula, Salix cheilophila, Vitex negundo var. heterophylla, and Amorpha fruticosa L. Monitoring was performed on three construction sites in Beijing, and data on survival rates and various morphological parameters were collected. Concerning the hydraulic and hydrological behavior of the plants, bending tests were used to analyze the flexibility of each species. The survival rate and morphological monitoring showed that Salix cheilophila performed best; the other three plants behaved satisfactorily in shoot or root development, respectively. In the bending tests, Salix cheilophila branches had the fewest breaks, followed by Salix X aureo-pendula and Amorpha fruticosa L.; branches of Vitex negundo var. heterophylla broke most often but tolerated the highest stress. All four species should be considered in future scientific research and construction works in Beijing. Keywords: river bank stabilization, live staking, rooted cutting
Texture Modification of the Shuttle Landing Facility Runway at Kennedy Space Center
NASA Technical Reports Server (NTRS)
Daugherty, Robert H.; Yager, Thomas J.
1997-01-01
This paper describes the test procedures and the criteria used in selecting an effective runway-surface-texture modification at the Kennedy Space Center (KSC) Shuttle Landing Facility (SLF) to reduce Orbiter tire wear. The new runway surface may ultimately result in an increase of allowable crosswinds for launch and landing operations. The modification allows launch and landing operations in 20-knot crosswinds, if desired. This 5-knot increase over the previous 15-knot limit drastically increases landing safety and the ability to make on-time launches to support missions in which Space Station rendezvous are planned. The paper presents the results of an initial (1988) texture modification to reduce tire spin-up wear and then describes a series of tests that use an instrumented ground-test vehicle to compare tire friction and wear characteristics, at small scale, of proposed texture modifications placed into the SLF runway surface itself. Based on these tests, three candidate surfaces were chosen to be tested at full-scale by using a highly modified and instrumented transport aircraft capable of duplicating full Orbiter landing profiles. The full-scale Orbiter tire testing revealed that tire wear could be reduced approximately by half with either of two candidates. The texture-modification technique using a Humble Equipment Company Skidabrader(trademark) shotpeening machine proved to be highly effective, and the entire SLF runway surface was modified in September 1994. The extensive testing and evaluation effort that preceded the selection of this particular surface-texture-modification technique is described herein.
Alshamlan, Hala; Badr, Ghada; Alohali, Yousef
2015-01-01
An artificial bee colony (ABC) is a relatively recent swarm intelligence optimization approach. In this paper, we propose the first attempt at applying the ABC algorithm to analyzing a microarray gene expression profile. In addition, we combine the minimum redundancy maximum relevance (mRMR) feature selection algorithm with an ABC algorithm, mRMR-ABC, to select informative genes from a microarray profile. The new approach uses a support vector machine (SVM) algorithm to measure the classification accuracy of the selected genes. We evaluate the performance of the proposed mRMR-ABC algorithm by conducting extensive experiments on six binary and multiclass gene expression microarray datasets. Furthermore, we compare our proposed mRMR-ABC algorithm with previously known techniques. We reimplemented two of these techniques, mRMR combined with a genetic algorithm (mRMR-GA) and mRMR combined with a particle swarm optimization algorithm (mRMR-PSO), with the same parameters for the sake of a fair comparison. The experimental results show that the proposed mRMR-ABC algorithm achieves accurate classification performance using a small number of predictive genes on the tested datasets compared with previously suggested methods. This shows that mRMR-ABC is a promising approach for solving gene selection and cancer classification problems. PMID:25961028
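The mRMR criterion itself (without the ABC search wrapper) can be sketched as a greedy mutual-information rule: add the feature most relevant to the class label minus its mean redundancy with features already selected. The discrete toy data below are not a microarray profile.

```python
# Greedy mRMR sketch over discrete features; the ABC/GA/PSO wrappers
# the paper compares are omitted. Toy data, not microarray values.
from math import log2
from collections import Counter

def mutual_info(xs, ys):
    """Mutual information (bits) between two discrete sequences."""
    n = len(xs)
    px, py, pxy = Counter(xs), Counter(ys), Counter(zip(xs, ys))
    return sum(c / n * log2((c / n) / (px[a] / n * py[b] / n))
               for (a, b), c in pxy.items())

def mrmr(features, labels, k):
    selected, remaining = [], list(features)
    while remaining and len(selected) < k:
        def score(name):
            relevance = mutual_info(features[name], labels)
            redundancy = (sum(mutual_info(features[name], features[s])
                              for s in selected) / len(selected)
                          if selected else 0.0)
            return relevance - redundancy
        best = max(remaining, key=score)
        selected.append(best)
        remaining.remove(best)
    return selected

labels = [1, 1, 0, 0, 1, 0]
features = {
    "g1": [1, 1, 0, 0, 1, 0],  # perfectly relevant
    "g2": [1, 1, 0, 0, 1, 1],  # relevant but partly redundant with g1
    "g3": [0, 1, 0, 1, 0, 1],  # mostly irrelevant
}
picked = mrmr(features, labels, 2)
```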
NASA Technical Reports Server (NTRS)
Thomas, J. M.; Hanagud, S.
1974-01-01
The design criteria and test options for aerospace structural reliability were investigated. A decision methodology was developed for selecting a combination of structural tests and structural design factors. The decision method involves the use of Bayesian statistics and statistical decision theory. Procedures are discussed for obtaining and updating data-based probabilistic strength distributions for aerospace structures when test information is available and for obtaining subjective distributions when data are not available. The techniques used in developing the distributions are explained.
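The Bayesian updating of a strength distribution can be sketched with a conjugate normal model (unknown mean, known test-scatter variance). All numbers are illustrative, and the conjugate-normal form is an assumption for the sketch, not necessarily the report's exact formulation.

```python
# Conjugate normal update: a subjective (prior) normal distribution on
# structural strength is combined with test data to give an updated
# (posterior) distribution. Illustrative numbers only.

def update_normal(prior_mean, prior_var, data, data_var):
    """Posterior mean/variance of a normal mean, known data variance."""
    n = len(data)
    sample_mean = sum(data) / n
    # Precisions (inverse variances) add; means combine precision-weighted.
    post_var = 1.0 / (1.0 / prior_var + n / data_var)
    post_mean = post_var * (prior_mean / prior_var + n * sample_mean / data_var)
    return post_mean, post_var

# Subjective prior from engineering judgment; strengths from tests (ksi).
prior_mean, prior_var = 60.0, 25.0
tests = [66.0, 64.0, 68.0, 62.0]
post_mean, post_var = update_normal(prior_mean, prior_var, tests, 16.0)
```

The posterior mean lands between the subjective prior and the test average, and the posterior variance shrinks as tests accumulate, which is the mechanism for trading structural tests against design factors.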
Multiconductor Short/Open Cable Tester
NASA Technical Reports Server (NTRS)
Eichenberg, Dennis
1994-01-01
Frequent or regular testing of multiconductor cables terminated in multipin connectors is a tedious, if not impossible, task. This inexpensive circuit simplifies open/short testing and is amenable to automation. In operation, a pair of connectors is selected to match the pair of connectors installed on each of the cables to be tested. As many connectors can be accommodated as required, and each can have as many conductors as required. The testing technique implemented with this circuit is easily automated with electronic controls and a computer interface. A printout provides the status of each conductor in the cable, indicating which, if any, of the conductors has an open or short circuit.
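The open/short bookkeeping such a tester automates can be sketched as follows; the pin names and continuity measurements are hypothetical.

```python
# Open/short report sketch: given an expected straight-through wiring
# and a measured continuity map, classify each conductor. Pin names
# and measurements are hypothetical.

def check_cable(pins, measured):
    """measured[a] = set of far-end pins showing continuity from pin a."""
    report = {}
    for pin in pins:
        seen = measured.get(pin, set())
        if pin not in seen:
            report[pin] = "open"            # no continuity on its own conductor
        elif seen - {pin}:
            report[pin] = "short to " + ", ".join(sorted(seen - {pin}))
        else:
            report[pin] = "ok"
    return report

pins = ["P1", "P2", "P3", "P4"]
measured = {
    "P1": {"P1"},
    "P2": {"P2", "P3"},  # P2 shorted to P3
    "P3": {"P3", "P2"},
    "P4": set(),         # broken conductor
}
status = check_cable(pins, measured)
```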
Containment of composite fan blades
NASA Technical Reports Server (NTRS)
Stotler, C. L.; Coppa, A. P.
1979-01-01
A lightweight containment system was developed for turbofan engine fan blades. Subscale ballistic-type tests were first run on a number of concepts. The most promising configuration was selected and further evaluated by larger scale tests in a rotating test rig. Weight savings made possible by the use of this new containment system were determined and extrapolated to a CF6-size engine. An analytical technique was also developed to predict the motion of released blades during the blade/casing interaction process. Initial checkout of this procedure was accomplished using several of the tests run during the program.
Quiet Clean Short-haul Experimental Engine (QCSEE) clean combustor test report
NASA Technical Reports Server (NTRS)
1975-01-01
A component pressure test was conducted on an F101 PFRT combustor to evaluate the emissions levels of this combustor design at selected under-the-wing and over-the-wing operating conditions for the quiet clean short-haul experimental engine (QCSEE). Emissions reduction techniques were evaluated, including compressor discharge bleed and sector burning in the combustor. The results of this test were used to compare the expected QCSEE emissions levels with the emissions goals of the QCSEE engine program.
Robust Statistics and Regularization for Feature Extraction and UXO Discrimination
2011-07-01
July 11, 2011 … real data we find that this technique has an improved probability of finding all ordnance in a test data set, relative to previously … many sites. Tests on larger data sets should still be carried out. In previous work we considered a bootstrapping approach to selecting the operating … Marginalizing over x we obtain the probability that the ith order statistic in the test data belongs to the T class: (55) P(T | x_(i)) = ∫_{-∞}^{∞} P(T | x) p(x …
Retention of denture bases fabricated by three different processing techniques – An in vivo study
Chalapathi Kumar, V. H.; Surapaneni, Hemchand; Ravikiran, V.; Chandra, B. Sarat; Balusu, Srilatha; Reddy, V. Naveen
2016-01-01
Aim: Distortion due to polymerization shrinkage compromises retention. The aim was to evaluate the retention of denture bases fabricated by conventional, anchorized, and injection molding polymerization techniques. Materials and Methods: Ten completely edentulous patients were selected, impressions were made, and the master cast obtained was duplicated to fabricate denture bases by the three polymerization techniques. A loop was attached to the finished denture bases to estimate, with a retention apparatus, the force required to dislodge them. Readings were subjected to the nonparametric Friedman two-way analysis of variance followed by Bonferroni correction and the Wilcoxon matched-pairs signed-ranks test. Results: Denture bases fabricated by the injection molding (3740 g) and anchorized (2913 g) techniques recorded greater retention values than the conventional technique (2468 g), with significant differences between the techniques. Conclusions: Denture bases obtained by the injection molding polymerization technique exhibited maximum retention, followed by the anchorized technique, with the least retention seen in the conventional molding technique. PMID:27382542
Hydrodynamics-induced variability in the USP apparatus II dissolution test.
Baxter, Jennifer L; Kukura, Joseph; Muzzio, Fernando J
2005-03-23
The USP tablet dissolution test is an analytical tool used for the verification of drug release processes and formulation selection within the pharmaceutical industry. Given the strong impact of this test, it is surprising that operating conditions and testing devices have been selected empirically. In fact, the flow phenomena in the USP test have received little attention in the past. An examination of the hydrodynamics in the USP apparatus II shows that the device is highly vulnerable to mixing problems that can affect testing performance and consistency. Experimental and computational techniques reveal that the flow field within the device is not uniform, and dissolution results can vary dramatically with the position of the tablet within the vessel. Specifically, computations predict sharp variations in the shear along the bottom of the vessel where the tablet is most likely to settle. Experiments in which the tablet location was carefully controlled reveal that the variation of shear within the testing device can affect the measured dissolution rate.
Journal of Engineering Thermophysics (Selected Articles),
1983-05-20
A Surge Test of a Twin-Shaft Turbojet Engine on Ground Test Bed, by Chiang Feng (Shengyang Aeroengine Company). Abstract: Instrument technique for … oscillogram for the static pressure behind the two compressors. This noise was analyzed and believed to have arisen from the vibrations of the rotating blades … booms are heard. The vibrational energy of the surge is enormous, especially in the range of 85-90% of rotational speed. One can feel the vibrations
Stressed Stability Techniques for Adjuvant Formulations.
Hasija, Manvi; Sheung, Anthony; Rahman, Nausheen; Ausar, Salvador F
2017-01-01
Stressed stability testing is crucial to the understanding of mechanisms of degradation and the effects of external stress factors on adjuvant stability. These studies vastly help the development of stability indicating tests and the selection of stabilizing conditions for long term storage. In this chapter, we provide detailed protocols for the execution of forced degradation experiments that evaluate the robustness of adjuvant formulations against thermal, mechanical, freeze-thawing, and photo stresses.
Operational Based Vision Assessment Cone Contrast Test: Description and Operation
2016-06-02
Jun 2016. Report contains color. The work detailed in this report was conducted by the Operational Based Vision Assessment (OBVA...currently used by the Air Force for aircrew color vision screening. The new OBVA CCT is differentiated from the Rabin device primarily by hardware...test procedures, and analysis techniques. Like the Rabin CCT, the OBVA CCT uses colors that selectively stimulate the cone photoreceptors of the
Production and characterization of large-area sputtered selective solar absorber coatings
NASA Astrophysics Data System (ADS)
Graf, Wolfgang; Koehl, Michael; Wittwer, Volker
1992-11-01
Most of the commercially available selective solar absorber coatings are produced by electroplating. Often the reproducibility or the durability of their optical properties is not very satisfactory. Good reproducibility can be achieved by sputtering, the technique used for the production of low-ε coatings for windows. The suitability of this kind of deposition technique for flat-plate solar absorber coatings based on the principle of ceramic/metal composites was investigated for different material combinations, and prototype collectors were manufactured. The optical characterization of the coatings is based on spectral measurements of the near-normal/hemispherical and the angle-dependent reflectance in the wavelength range 0.38-17 micrometers. The durability assessment was carried out by temperature tests in ovens and climatic chambers.
Advanced aerodynamics. Selected NASA research
NASA Technical Reports Server (NTRS)
1981-01-01
This Conference Publication contains selected NASA papers that were presented at the Fifth Annual Status Review of the NASA Aircraft Energy Efficiency (ACEE) Energy Efficient Transport (EET) Program held at Dryden Flight Research Center in Edwards, California on September 14 to 15, 1981. These papers describe the status of several NASA in-house research activities in the areas of advanced turboprops, natural laminar flow, oscillating control surfaces, high-Reynolds-number airfoil tests, high-lift technology, and theoretical design techniques.
Does Angling Technique Selectively Target Fishes Based on Their Behavioural Type?
Wilson, Alexander D. M.; Brownscombe, Jacob W.; Sullivan, Brittany; Jain-Schlaepfer, Sofia; Cooke, Steven J.
2015-01-01
Recently, there has been growing recognition that fish harvesting practices can have important impacts on the phenotypic distributions and diversity of natural populations through a phenomenon known as fisheries-induced evolution. Here we experimentally show that two common recreational angling techniques (active crank baits versus passive soft plastics) differentially target wild largemouth bass (Micropterus salmoides) and rock bass (Ambloplites rupestris) based on variation in their behavioural tendencies. Fish were first angled in the wild using both techniques and then brought back to the laboratory and tested for individual-level differences in common estimates of personality (refuge emergence, flight-initiation-distance, latency-to-recapture with a net, and general activity) in an in-lake experimental arena. We found that different angling techniques appear to selectively target these species based on their boldness (as characterized by refuge emergence, a standard measure of boldness in fishes) but not other assays of personality. We also observed that body size was independently a significant predictor of personality in both species, though this varied between traits and species. Our results suggest a context-dependency for vulnerability to capture relative to behaviour in these fish species. Ascertaining the selective pressures angling practices exert on natural populations is an important area of fisheries research with significant implications for ecology, evolution, and resource management. PMID:26284779
Evaluation of AISI 4140 Steel Repair Without Post-Weld Heat Treatment
NASA Astrophysics Data System (ADS)
Silva, Cleiton C.; de Albuquerque, Victor H. C.; Moura, Cícero R. O.; Aguiar, Willys M.; Farias, Jesualdo P.
2009-04-01
The present work evaluates the two-layer technique on the heat affected zone (HAZ) of AISI 4140 steel welded with different heat input levels between the first and second layer. The weld heat input levels selected by the Higuchi test were 5/5, 5/10, and 15/5 kJ/cm. The evaluation of the refining and/or tempering of the coarsened grain HAZ of the first layer was carried out using metallographic tests, microhardness measurements, and the Charpy-V impact test. The tempering of the first layer was only reached when the weld heat input ratio was 5/5 kJ/cm. The results of the Charpy-V impact test showed that the two-layer technique was efficient, from the point of view of toughness, since the toughness values reached were greater than the base metal for all weld heat input ratios applied. The results obtained indicate that the best performance of the two-layer deposition technique was for the weld heat input ratio 5/5 kJ/cm employing low heat input.
User Selection Criteria of Airspace Designs in Flexible Airspace Management
NASA Technical Reports Server (NTRS)
Lee, Hwasoo E.; Lee, Paul U.; Jung, Jaewoo; Lai, Chok Fung
2011-01-01
A method for identifying global aerodynamic models from flight data in an efficient manner is explained and demonstrated. A novel experiment design technique was used to obtain dynamic flight data over a range of flight conditions with a single flight maneuver. Multivariate polynomials and polynomial splines were used with orthogonalization techniques and statistical modeling metrics to synthesize global nonlinear aerodynamic models directly and completely from flight data alone. Simulation data and flight data from a subscale twin-engine jet transport aircraft were used to demonstrate the techniques. Results showed that global multivariate nonlinear aerodynamic dependencies could be accurately identified using flight data from a single maneuver. Flight-derived global aerodynamic model structures, model parameter estimates, and associated uncertainties were provided for all six nondimensional force and moment coefficients for the test aircraft. These models were combined with a propulsion model identified from engine ground test data to produce a high-fidelity nonlinear flight simulation very efficiently. Prediction testing using a multi-axis maneuver showed that the identified global model accurately predicted aircraft responses.
NASA Technical Reports Server (NTRS)
Hartman, A. S.; Nutt, K. W.
1982-01-01
Tests of the space shuttle external tank foam insulation were conducted in the von Karman Gas Dynamics Facility Tunnel C. For these tests, Tunnel C was run at Mach 4 with a total temperature of 1440 F and a total pressure which varied from 30-100 psia. Cold wall heating rates were changed by varying the test article support wedge angle and by adding and removing a shock generator or a cylindrical protuberance. Selected results are presented to illustrate the test techniques and typical data obtained.
Flight test techniques for validating simulated nuclear electromagnetic pulse aircraft responses
NASA Technical Reports Server (NTRS)
Winebarger, R. M.; Neely, W. R., Jr.
1984-01-01
An attempt has been made to determine the effects of nuclear EM pulses (NEMPs) on aircraft systems, using a highly instrumented NASA F-106B to document the simulated NEMP environment at the Kirtland Air Force Base's Vertically Polarized Dipole test facility. Several test positions were selected so that aircraft orientation relative to the test facility would be the same in flight as when on the stationary dielectric stand, in order to validate the dielectric stand's use in flight configuration simulations. Attention is given to the flight test portions of the documentation program.
A survey of variable selection methods in two Chinese epidemiology journals
2010-01-01
Background Although much has been written on developing better procedures for variable selection, there is little research on how it is practiced in actual studies. This review surveys the variable selection methods reported in two high-ranking Chinese epidemiology journals. Methods Articles published in 2004, 2006, and 2008 in the Chinese Journal of Epidemiology and the Chinese Journal of Preventive Medicine were reviewed. Five categories of methods were identified whereby variables were selected using: A - bivariate analyses; B - multivariable analysis; e.g. stepwise or individual significance testing of model coefficients; C - first bivariate analyses, followed by multivariable analysis; D - bivariate analyses or multivariable analysis; and E - other criteria like prior knowledge or personal judgment. Results Among the 287 articles that reported using variable selection methods, 6%, 26%, 30%, 21%, and 17% were in categories A through E, respectively. One hundred sixty-three studies selected variables using bivariate analyses, 80% (130/163) via multiple significance testing at the 5% alpha-level. Of the 219 multivariable analyses, 97 (44%) used stepwise procedures, 89 (41%) tested individual regression coefficients, but 33 (15%) did not mention how variables were selected. Sixty percent (58/97) of the stepwise routines also did not specify the algorithm and/or significance levels. Conclusions The variable selection methods reported in the two journals were limited in variety, and details were often missing. Many studies still relied on problematic techniques like stepwise procedures and/or multiple testing of bivariate associations at the 0.05 alpha-level. These deficiencies should be rectified to safeguard the scientific validity of articles published in Chinese epidemiology journals. PMID:20920252
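The survey's "category C" workflow (bivariate screening first, then a multivariable model) can be sketched in a few lines. This is a minimal illustration, not a reproduction of any surveyed study's method: the screening statistic is a plain Pearson correlation, and the cutoff `r_cut` is a hypothetical stand-in for significance testing at the 5% alpha-level.

```python
def pearson_r(x, y):
    # Pearson correlation between two equal-length numeric sequences.
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return sum((a - mx) * (b - my) for a, b in zip(x, y)) / (sx * sy)

def bivariate_screen(columns, y, r_cut=0.3):
    # Category C, step 1: keep only variables whose bivariate
    # association with the outcome clears the screening cutoff.
    return [name for name, col in columns.items()
            if abs(pearson_r(col, y)) >= r_cut]

# Toy data: x0 tracks the outcome; x1 is an alternating nuisance variable.
y = list(range(1, 11))
cols = {"x0": list(range(1, 11)), "x1": [1, -1] * 5}
kept = bivariate_screen(cols, y)
```

In category C the surviving variables would then enter a multivariable regression; the survey's criticism is precisely that this kind of multiple bivariate testing at the 0.05 level is a problematic basis for variable selection.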
Automated Inference of Chemical Discriminants of Biological Activity.
Raschka, Sebastian; Scott, Anne M; Huertas, Mar; Li, Weiming; Kuhn, Leslie A
2018-01-01
Ligand-based virtual screening has become a standard technique for the efficient discovery of bioactive small molecules. Following assays to determine the activity of compounds selected by virtual screening, or other approaches in which dozens to thousands of molecules have been tested, machine learning techniques make it straightforward to discover the patterns of chemical groups that correlate with the desired biological activity. Defining the chemical features that generate activity can be used to guide the selection of molecules for subsequent rounds of screening and assaying, as well as help design new, more active molecules for organic synthesis.The quantitative structure-activity relationship machine learning protocols we describe here, using decision trees, random forests, and sequential feature selection, take as input the chemical structure of a single, known active small molecule (e.g., an inhibitor, agonist, or substrate) for comparison with the structure of each tested molecule. Knowledge of the atomic structure of the protein target and its interactions with the active compound are not required. These protocols can be modified and applied to any data set that consists of a series of measured structural, chemical, or other features for each tested molecule, along with the experimentally measured value of the response variable you would like to predict or optimize for your project, for instance, inhibitory activity in a biological assay or ΔG binding . To illustrate the use of different machine learning algorithms, we step through the analysis of a dataset of inhibitor candidates from virtual screening that were tested recently for their ability to inhibit GPCR-mediated signaling in a vertebrate.
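The sequential feature selection mentioned above is a greedy wrapper method; a minimal sketch follows. The descriptor names and the scoring function are invented for illustration; in the protocols described, the score would be a cross-validated measure of how well the feature subset predicts assay activity.

```python
def sequential_forward_selection(features, score_fn, k):
    # Greedy wrapper: repeatedly add the feature that most improves the
    # score; stop when no candidate improves it or k features are chosen.
    selected, remaining = [], list(features)
    while remaining and len(selected) < k:
        best = max(remaining, key=lambda f: score_fn(selected + [f]))
        if score_fn(selected + [best]) <= score_fn(selected):
            break
        selected.append(best)
        remaining.remove(best)
    return selected

# Hypothetical stand-in for a cross-validated activity-prediction score:
# two descriptors are "informative"; the rest only add a small penalty.
INFORMATIVE = {"logP", "hbond_donors"}

def toy_score(subset):
    s = set(subset)
    return len(s & INFORMATIVE) - 0.1 * len(s - INFORMATIVE)

chosen = sequential_forward_selection(
    ["mol_weight", "logP", "tpsa", "hbond_donors"], toy_score, k=4)
```

The early-stopping check is what keeps uninformative descriptors out even when `k` allows more features than the data support.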
Energy efficient engine shroudless, hollow fan blade technology report
NASA Technical Reports Server (NTRS)
Michael, C. J.
1981-01-01
The Shroudless, Hollow Fan Blade Technology program was structured to support the design, fabrication, and subsequent evaluation of advanced hollow and shroudless blades for the Energy Efficient Engine fan component. Rockwell International was initially selected to produce hollow airfoil specimens employing the superplastic forming/diffusion bonding (SPF/DB) fabrication technique. Rockwell demonstrated that a titanium hollow structure could be fabricated utilizing SPF/DB manufacturing methods. However, some problems such as sharp internal cavity radii and unsatisfactory secondary bonding of the edge and root details prevented production of the required quantity of fatigue test specimens. Subsequently, TRW was selected to (1) produce hollow airfoil test specimens utilizing a laminate-core/hot isostatic press/diffusion bond approach, and (2) manufacture full-size hollow prototype fan blades utilizing the technology that evolved from the specimen fabrication effort. TRW established elements of blade design and defined laminate-core/hot isostatic press/diffusion bonding fabrication techniques to produce test specimens. This fabrication technology was utilized to produce full size hollow fan blades in which the HIP'ed parts were cambered/twisted/isothermally forged, finish machined, and delivered to Pratt & Whitney Aircraft and NASA for further evaluation.
NASA Technical Reports Server (NTRS)
Feinstein, S. P.; Girard, M. A.
1979-01-01
An automated technique for measuring particle diameters and their spatial coordinates from holographic reconstructions is being developed. Preliminary tests on actual cold-flow holograms of impinging jets indicate that a suitable discriminant algorithm consists of a Fourier-Gaussian noise filter and a contour thresholding technique. This process identifies circular as well as noncircular objects. The desired objects (in this case, circular or possibly ellipsoidal) are then selected automatically from the above set and stored with their parametric representations. From this data, dropsize distributions as a function of spatial coordinates can be generated and combustion effects due to hardware and/or physical variables studied.
Improving neuromodulation technique for refractory voiding dysfunctions: two-stage implant.
Janknegt, R A; Weil, E H; Eerdmans, P H
1997-03-01
Neuromodulation is a new technique that uses electrical stimulation of the sacral nerves for patients with refractory urinary urge/frequency or urge-incontinence, and some forms of urinary retention. The limiting factor for receiving an implant is often a failure of the percutaneous nerve evaluation (PNE) test. Published reports mention only about a 50% success rate for PNE across all patients, although micturition diaries and urodynamic parameters are similar. We wanted to investigate whether PNE results improved by using a permanent electrode as a PNE test, which would show that improvement of the PNE technique is feasible. In 10 patients in whom the original PNE had failed to improve the micturition diary parameters by more than 50%, a permanent electrode was surgically implanted and connected to an external stimulator. In those cases where the patients improved according to their micturition diary by more than 50% during a period of 4 days, the external stimulator was replaced by a permanent subcutaneous neurostimulator. Eight of the 10 patients had a good to very good result (60% to 90% improvement) during the testing period and received their implant 5 to 14 days after the first stage. The good results of the two-stage implant technique we used indicate that the development of better PNE electrodes may lead to an improvement of the testing technique and better selection between nonresponders and technical failures.
Rethinking developmental toxicity testing: Evolution or revolution?
Scialli, Anthony R; Daston, George; Chen, Connie; Coder, Prägati S; Euling, Susan Y; Foreman, Jennifer; Hoberman, Alan M; Hui, Julia; Knudsen, Thomas; Makris, Susan L; Morford, LaRonda; Piersma, Aldert H; Stanislaus, Dinesh; Thompson, Kary E
2018-06-01
Current developmental toxicity testing adheres largely to protocols suggested in 1966 involving the administration of test compound to pregnant laboratory animals. After more than 50 years of embryo-fetal development testing, are we ready to consider a different approach to human developmental toxicity testing? A workshop was held under the auspices of the Developmental and Reproductive Toxicology Technical Committee of the ILSI Health and Environmental Sciences Institute to consider how we might design developmental toxicity testing if we started over with 21st century knowledge and techniques (revolution). We first consider what changes to the current protocols might be recommended to make them more predictive for human risk (evolution). The evolutionary approach includes modifications of existing protocols and can include humanized models, disease models, more accurate assessment and testing of metabolites, and informed approaches to dose selection. The revolution could start with hypothesis-driven testing where we take what we know about a compound or close analog and answer specific questions using targeted experimental techniques rather than a one-protocol-fits-all approach. Central to the idea of hypothesis-driven testing is the concept that testing can be done at the level of mode of action. It might be feasible to identify a small number of key events at a molecular or cellular level that predict an adverse outcome and for which testing could be performed in vitro or in silico or, rarely, using limited in vivo models. Techniques for evaluating these key events exist today or are in development. Opportunities exist for refining and then replacing current developmental toxicity testing protocols using techniques that have already been developed or are within reach. © 2018 The Authors. Birth Defects Research Published by Wiley Periodicals, Inc.
Design and grayscale fabrication of beamfanners in a silicon substrate
NASA Astrophysics Data System (ADS)
Ellis, Arthur Cecil
2001-11-01
This dissertation addresses important first steps in the development of a grayscale fabrication process for multiple-phase diffractive optical elements (DOEs) in silicon. Specifically, this process was developed through the design, fabrication, and testing of 1-2 and 1-4 beamfanner arrays for 5-micron illumination. The 1-2 beamfanner arrays serve as a test-of-concept and basic developmental step toward the construction of the 1-4 beamfanners. The beamfanners are 50 microns wide and have features with dimensions of between 2 and 10 microns. The Iterative Annular Spectrum Approach (IASA) method, developed by Steve Mellin of UAH, and the Boundary Element Method (BEM) are the design and testing tools used to create the beamfanner profiles and predict their performance. Fabrication of the beamfanners required the techniques of grayscale photolithography and reactive ion etching (RIE). A 2-3 micron feature size 1-4 silicon beamfanner array was fabricated, but the small features and the contact photolithographic techniques available prevented its construction to specifications. A second and more successful attempt was made in which both 1-4 and 1-2 beamfanner arrays were fabricated with a 5-micron minimum feature size. Photolithography for the UAH array was contracted to MEMS-Optical of Huntsville, Alabama. A repeatability study was performed, using statistical techniques, of 14 photoresist arrays and the subsequent RIE process used to etch the arrays in silicon. The variance in selectivity between the 14 processes was far greater than the variance between the individual etched features within each process. Specifically, the ratio of the variance of the selectivities averaged over each of the 14 etch processes to the variance of individual feature selectivities within the processes yielded a significance level below 0.1% by F-test, indicating that good etch-to-etch process repeatability was not attained.
One of the 14 arrays had feature etch-depths close enough to design specifications for optical testing, but 5-micron IR illumination of the 1-4 and 1-2 beamfanners yielded no convincing results of beam splitting in the detector plane 340 microns from the surface of the beamfanner array.
Fabrication of Cantilever-Bump Type Si Probe Card
NASA Astrophysics Data System (ADS)
Park, Jeong-Yong; Lee, Dong-Seok; Kim, Dong-Kwon; Lee, Jong-Hyun
2000-12-01
The probe card is the most important part of the test system that screens integrated circuit (IC) chips as good or bad. A silicon vertical probe card is able to test multiple semiconductor chips simultaneously. We present a cantilever-bump type vertical probe card. It was fabricated by dry etching using the RIE (reactive ion etching) technique and porous silicon micromachining using a silicon direct bonded (SDB) wafer. Cantilevers and bumps were fabricated by isotropic etching using RIE. Three-dimensional structures were formed by the porous silicon micromachining technique using the SDB wafer. The contact resistance of the fabricated probe card was less than 2 Ω and its lifetime was more than 200,000 turns. The process used in this work is very simple and reproducible, with good controllability of the tip dimension and spacing. It is expected that the fabricated probe card can reduce testing time, improve productivity, and enable burn-in testing.
Liquid Biopsy in Non-Small Cell Lung Cancer
Molina-Vila, Miguel A.; Mayo-de-las-Casas, Clara; Giménez-Capitán, Ana; Jordana-Ariza, Núria; Garzón, Mónica; Balada, Ariadna; Villatoro, Sergi; Teixidó, Cristina; García-Peláez, Beatriz; Aguado, Cristina; Catalán, María José; Campos, Raquel; Pérez-Rosado, Ana; Bertran-Alamillo, Jordi; Martínez-Bueno, Alejandro; Gil, María-de-los-Llanos; González-Cao, María; González, Xavier; Morales-Espinosa, Daniela; Viteri, Santiago; Karachaliou, Niki; Rosell, Rafael
2016-01-01
Liquid biopsy analyses are already incorporated in the routine clinical practice in many hospitals and oncology departments worldwide, improving the selection of treatments and monitoring of lung cancer patients. Although they have not yet reached its full potential, liquid biopsy-based tests will soon be as widespread as “standard” biopsies and imaging techniques, offering invaluable diagnostic, prognostic, and predictive information. This review summarizes the techniques available for the isolation and analysis of circulating free DNA and RNA, exosomes, tumor-educated platelets, and circulating tumor cells from the blood of cancer patients, presents the methodological challenges associated with each of these materials, and discusses the clinical applications of liquid biopsy testing in lung cancer. PMID:28066769
Active-learning strategies in computer-assisted drug discovery.
Reker, Daniel; Schneider, Gisbert
2015-04-01
High-throughput compound screening is time and resource consuming, and considerable effort is invested into screening compound libraries, profiling, and selecting the most promising candidates for further testing. Active-learning methods assist the selection process by focusing on areas of chemical space that have the greatest chance of success while considering structural novelty. The core feature of these algorithms is their ability to adapt the structure-activity landscapes through feedback. Instead of full-deck screening, only focused subsets of compounds are tested, and the experimental readout is used to refine molecule selection for subsequent screening cycles. Once implemented, these techniques have the potential to reduce costs and save precious materials. Here, we provide a comprehensive overview of the various computational active-learning approaches and outline their potential for drug discovery. Copyright © 2014 Elsevier Ltd. All rights reserved.
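The screening-cycle feedback loop described above can be sketched as follows. This is a toy illustration, not any published algorithm: compounds are single numeric descriptors, the "oracle" is a hypothetical assay, and uncertainty is a geometric proxy (distance to the nearest already-assayed compound) standing in for true model-based predictive variance.

```python
def uncertainty(x, assayed):
    # Proxy for model uncertainty: distance to the nearest compound
    # that has already been assayed (infinite before any assays).
    return min((abs(x - a) for a in assayed), default=float("inf"))

def active_learning(pool, oracle, batch=1, cycles=3):
    labeled = {}
    for _ in range(cycles):
        # Rank untested compounds by uncertainty, then assay only the
        # top batch instead of screening the full deck.
        ranked = sorted((x for x in pool if x not in labeled),
                        key=lambda x: -uncertainty(x, labeled))
        for x in ranked[:batch]:
            labeled[x] = oracle(x)
    return labeled

# Hypothetical assay: activity peaks at descriptor value 7.
pool = list(range(11))
labeled = active_learning(pool, oracle=lambda x: -(x - 7) ** 2)
```

After three cycles only 3 of the 11 pool compounds have consumed assay material; real implementations balance this exploratory sampling against exploitation of predicted activity.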
The U.S. Army surveyed innovative treatment techniques for restoration of hazardous waste lagoons and selected solvent extraction as cost-effective restoration for further study. This treatability study focuses on treatment of organic (explosive) contaminated lagoon sediments w...
NASA Technical Reports Server (NTRS)
Granaas, Michael M.; Rhea, Donald C.
1989-01-01
The requirements for the development of real-time displays are reviewed. Of particular interest are the psychological aspects of design such as the layout, color selection, real-time response rate, and the interactivity of displays. Some existing Western Aeronautical Test Range displays are analyzed.
Equal Employment Legislation: Alternative Means of Compliance.
ERIC Educational Resources Information Center
Daum, Jeffrey W.
Alternative means of compliance available to organizations to bring their manpower uses into line with existing equal employment legislation are discussed in this paper. The first area addressed concerns the classical approach to selection and placement based on testing methods. The second area discussed reviews various nontesting techniques, such…
Marketing Education Through Benefit Segmentation. AIR Forum 1981 Paper.
ERIC Educational Resources Information Center
Goodnow, Wilma Elizabeth
The applicability of the "benefit segmentation" marketing technique to education was tested at the College of DuPage in 1979. Benefit segmentation identified target markets homogeneous in benefits expected from a program offering and may be useful in combatting declining enrollments. The 487 randomly selected students completed the 223…
Photoacoustic sensor for medical diagnostics
NASA Astrophysics Data System (ADS)
Wolff, Marcus; Groninga, Hinrich G.; Harde, Hermann
2004-03-01
The development of new optical sensor technologies has a major impact on the progress of diagnostic methods. Of the permanently increasing number of non-invasive breath tests, the 13C-Urea Breath Test (UBT) for the detection of Helicobacter pylori is the most prominent. However, many recent developments, like the detection of cancer by breath test, go beyond gastroenterological applications. We present a new detection scheme for breath analysis that employs an especially compact and simple set-up. Photoacoustic Spectroscopy (PAS) represents an offset-free technique that allows for short absorption paths and small sample cells. Using a single-frequency diode laser and taking advantage of acoustical resonances of the sample cell, we performed extremely sensitive and selective measurements. The smart data processing method contributes to the extraordinary sensitivity and selectivity as well. Also, the reasonable acquisition cost and low operational cost make this detection scheme attractive for many biomedical applications. The experimental set-up and data processing method, together with exemplary isotope-selective measurements on carbon dioxide, are presented.
NASA Technical Reports Server (NTRS)
Krause, D. R.
1972-01-01
A conceptual design was developed for an MLI system which will meet the design constraints of an ILRV used for 7- to 30-day missions. The ten tasks are briefly described: (1) material survey and procurement, material property tests, and selection of composites to be considered; (2) definition of environmental parameters and tooling requirements, and thermal and structural design verification test definition; (3) definition of tanks and associated hardware to be used, and definition of MLI concepts to be considered; (4) thermal analyses, including purge, evacuation, and reentry repressurization analyses; (5) structural analyses; (6) thermal degradation tests of composite and structural tests of fastener; (7) selection of MLI materials and system; (8) definition of a conceptual MLI system design; (9) evaluation of nondestructive inspection techniques and definition of procedures for repair of damaged areas; and (10) preparation of preliminary specifications.
A novel attack method about double-random-phase-encoding-based image hiding method
NASA Astrophysics Data System (ADS)
Xu, Hongsheng; Xiao, Zhijun; Zhu, Xianchen
2018-03-01
By using optical image processing techniques, a novel text encryption and hiding method based on the double-random phase-encoding technique is proposed in the paper. The first step is that the secret message is transformed into a 2-dimensional array. The higher bits of the elements in the array are filled with the bit stream of the secret text, while the lower bits store specific values. Then, the transformed array is encoded by the double random phase encoding technique. Finally, the encoded array is embedded on a public host image to obtain the image embedded with hidden text. The performance of the proposed technique is tested via analytical modeling and test data streams. Experimental results show that the secret text can be recovered either accurately or almost accurately, while maintaining the quality of the host image embedded with hidden data by properly selecting the method of transforming the secret text into an array and the superimposition coefficient.
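The first step (secret text bits packed into the high bits of array elements, fixed values in the low bits) can be sketched as below. Only that bit-packing step is shown; the double-random phase encoding and host-image superimposition are omitted, and the 4-bit high/low split and the 0x5 marker value are assumptions made for illustration.

```python
def embed_text(text):
    # Pack the secret text's bits, four at a time, into the high nibble
    # of each array element; the low nibble holds a fixed marker (0x5).
    bits = "".join(f"{byte:08b}" for byte in text.encode())
    return [(int(bits[i:i + 4], 2) << 4) | 0x5
            for i in range(0, len(bits), 4)]

def extract_text(array):
    # Reverse step: read the high nibbles back into a bit stream,
    # then regroup the bits into bytes.
    bits = "".join(f"{v >> 4:04b}" for v in array)
    return bytes(int(bits[i:i + 8], 2)
                 for i in range(0, len(bits), 8)).decode()

encoded = embed_text("secret")
```

In the full scheme this integer array, not the raw text, is what passes through the two random phase masks, which is why recovery quality depends on how the text-to-array transform is chosen.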
The effects of solar incidence angle over digital processing of LANDSAT data
NASA Technical Reports Server (NTRS)
Parada, N. D. J. (Principal Investigator); Novo, E. M. L. M.
1983-01-01
A technique to extract the topography modulation component from digital data is described. The enhancement process is based on the fact that the pixel contains two types of information: (1) reflectance variation due to the target; (2) reflectance variation due to the topography. In order to enhance the signal variation due to topography, the technique recommends the extraction from original LANDSAT data of the component resulting from target reflectance. Considering that the role of topographic modulation over the pixel information will vary with solar incidence angle, the results of this technique of digital processing will differ from one season to another, mainly in highly dissected topography. In this context, the effects of solar incidence angle on the topographic modulation technique were evaluated. Two sets of MSS/LANDSAT data, with solar elevation angles varying from 22 to 41 deg, were selected to implement the digital processing on the Image-100 System. A secondary watershed (Rio Bocaina) draining into Rio Paraiba do Sul (Sao Paulo State) was selected as a test site. The results showed that the technique used was more appropriate to MSS data acquired under higher Sun elevation angles. Topographic modulation components applied to low Sun elevation angles lessen rather than enhance topography.
1981-01-01
explanatory variable has been omitted. Ramsey (1974) has developed a rather interesting test for detecting specification errors using estimates of the... Kennedy, Peter (1979), A Guide to Econometrics, Cambridge, MA: The MIT Press. Ramsey, J. B. (1974), "Classical Model Selection Through Specification Error Tests," in P. Zarembka, ed., Frontiers in Econometrics, New York: Academic Press. Theil, Henri (1971), Principles of Econometrics, New York: John Wiley.
Efficient live face detection to counter spoof attack in face recognition systems
NASA Astrophysics Data System (ADS)
Biswas, Bikram Kumar; Alam, Mohammad S.
2015-03-01
Face recognition is a critical tool used in almost all major biometrics based security systems. But recognition, authentication and liveness detection of the face of an actual user is a major challenge because an imposter or a non-live face of the actual user can be used to spoof the security system. In this research, a robust technique is proposed which detects liveness of faces in order to counter spoof attacks. The proposed technique uses a three-dimensional (3D) fast Fourier transform to compare spectral energies of a live face and a fake face in a mathematically selective manner. The mathematical model involves evaluation of energies of selective high frequency bands of average power spectra of both live and non-live faces. It also carries out proper recognition and authentication of the face of the actual user using the fringe-adjusted joint transform correlation technique, which has been found to yield the highest correlation output for a match. Experimental tests show that the proposed technique yields excellent results for identifying live faces.
A Parameter Subset Selection Algorithm for Mixed-Effects Models
Schmidt, Kathleen L.; Smith, Ralph C.
2016-01-01
Mixed-effects models are commonly used to statistically model phenomena that include attributes associated with a population or general underlying mechanism as well as effects specific to individuals or components of the general mechanism. This can include individual effects associated with data from multiple experiments. However, the parameterizations used to incorporate the population and individual effects are often unidentifiable in the sense that parameters are not uniquely specified by the data. As a result, the current literature focuses on model selection, by which insensitive parameters are fixed or removed from the model. Model selection methods that employ information criteria are applicable to both linear and nonlinear mixed-effects models, but such techniques are limited in that they are computationally prohibitive for large problems due to the number of possible models that must be tested. To limit the scope of possible models for model selection via information criteria, we introduce a parameter subset selection (PSS) algorithm for mixed-effects models, which orders the parameters by their significance. In conclusion, we provide examples to verify the effectiveness of the PSS algorithm and to test the performance of mixed-effects model selection that makes use of parameter subset selection.
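The ordering step can be illustrated with a greedy, pivoted-QR-style ranking of a sensitivity matrix, a common ingredient of parameter subset selection schemes. This is a generic sketch, not the paper's exact algorithm: at each step it picks the parameter whose sensitivity column has the largest component orthogonal to the already-selected columns.

```python
import numpy as np

def rank_parameters(S):
    """Greedy ordering of parameters by identifiability.

    S : (n_obs, n_par) sensitivity matrix, S[i, j] = d y_i / d theta_j.
    Returns parameter indices, most significant first.
    """
    S = S.astype(float).copy()
    order = []
    remaining = list(range(S.shape[1]))
    while remaining:
        # Pick the remaining column with the largest residual norm.
        norms = {j: np.linalg.norm(S[:, j]) for j in remaining}
        best = max(norms, key=norms.get)
        order.append(best)
        remaining.remove(best)
        # Deflate: remove the selected direction from the other columns.
        v = S[:, best]
        nv = np.linalg.norm(v)
        if nv > 1e-12:
            u = v / nv
            for j in remaining:
                S[:, j] -= u * (u @ S[:, j])
    return order
```

A parameter whose column is nearly a linear combination of already-ranked columns is deflated to near zero and therefore ranked last, mirroring how unidentifiable parameters end up as candidates for fixing or removal.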
Chandrasekhar, Shalini; Prasad, Madu Ghanashyam; Radhakrishna, Ambati Naga; Saujanya, Kaniti; Raviteja, N V K; Deepthi, B; Ramakrishna, J
2018-01-01
The aim of this study was to evaluate the efficiency of four different obturating techniques in filling the radicular space in primary teeth. This clinical trial was carried out on 34 healthy, cooperative children (5-9 years) who had 63 carious primary teeth indicated for pulpectomy. The teeth were divided into four groups, such that in each group a total of 40 canals were allotted for obturation with the respective technique. The root canals of selected primary teeth were filled with Endoflas obturating material using either bi-directional spiral (Group 1), incremental technique (Group 2), Past Inject (Group 3) or lentulo spiral (Group 4), according to the groups assigned. The effectiveness of the obturation techniques was assessed using postoperative radiographs. The assessment was made for depth of fill in the canal and the presence of any voids, using the modified Coll and Sadrian criteria. The obtained data were analyzed using the ANOVA test and unpaired t-test. Bi-directional spiral and lentulo spiral were superior to the other techniques in providing optimally filled canals (P < 0.05). The bi-directional spiral was superior to the lentulo spiral in preventing overfill (P < 0.05). Based on the present study results, the bi-directional spiral can be recommended as an alternative obturating technique in primary teeth.
Supervised learning for infection risk inference using pathology data.
Hernandez, Bernard; Herrero, Pau; Rawson, Timothy Miles; Moore, Luke S P; Evans, Benjamin; Toumazou, Christofer; Holmes, Alison H; Georgiou, Pantelis
2017-12-08
Antimicrobial resistance is threatening our ability to treat common infectious diseases, and overuse of antimicrobials to treat human infections in hospitals is accelerating this process. Clinical Decision Support Systems (CDSSs) have been proven to enhance quality of care by promoting change in prescription practices through antimicrobial selection advice. However, bypassing an initial assessment to determine the existence of an underlying disease that justifies the need of antimicrobial therapy might lead to indiscriminate and often unnecessary prescriptions. From pathology laboratory tests, six biochemical markers were selected and combined with microbiology outcomes from susceptibility tests to create a unique dataset with over one and a half million daily profiles to perform infection risk inference. Outliers were discarded using the inter-quartile range rule and several sampling techniques were studied to tackle the class imbalance problem. The model development proceeded in two phases: the first selects the most effective and robust model during training using ten-fold stratified cross-validation; the second evaluates the final model after isotonic calibration in scenarios with missing inputs and imbalanced class distributions. More than 50% of infected profiles have daily requested laboratory tests for the six biochemical markers, with very promising infection inference results: area under the receiver operating characteristic curve (0.80-0.83), sensitivity (0.64-0.75) and specificity (0.92-0.97). Standardization consistently outperforms normalization, and sensitivity is enhanced by using the SMOTE sampling technique. Furthermore, models operated without noticeable loss in performance if at least four biomarkers were available. The selected biomarkers comprise enough information to perform infection risk inference with a high degree of confidence even in the presence of incomplete and imbalanced data.
Since they are commonly available in hospitals, Clinical Decision Support Systems could benefit from these findings to assist clinicians in deciding whether or not to initiate antimicrobial therapy to improve prescription practices.
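The SMOTE oversampling mentioned above generates synthetic minority samples by interpolating between nearest neighbours. A minimal numpy sketch of that core idea (the neighbour count and interface are illustrative; the study's pipeline also used standardization and isotonic calibration, which are not reproduced here):

```python
import numpy as np

def smote_oversample(X_min, n_new, k=3, rng=None):
    """Generate n_new synthetic minority-class samples by interpolating
    between each chosen sample and one of its k nearest neighbours.

    X_min : (n, d) array of minority-class feature vectors, n > k.
    """
    rng = np.random.default_rng(rng)
    X_min = np.asarray(X_min, float)
    n = len(X_min)
    synthetic = []
    for _ in range(n_new):
        i = rng.integers(n)
        d = np.linalg.norm(X_min - X_min[i], axis=1)
        neighbours = np.argsort(d)[1:k + 1]   # skip the point itself
        j = rng.choice(neighbours)
        lam = rng.random()
        # New point lies on the segment between the two real samples.
        synthetic.append(X_min[i] + lam * (X_min[j] - X_min[i]))
    return np.array(synthetic)
```

Because every synthetic point is a convex combination of two real minority samples, the oversampled set stays within the envelope of the observed minority class rather than duplicating points exactly.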
Mallik, Rangan; Wa, Chunling; Hage, David S.
2008-01-01
Two techniques were developed for the immobilization of proteins and other ligands to silica through sulfhydryl groups. These methods made use of maleimide-activated silica (the SMCC method) or iodoacetyl-activated silica (the SIA method). The resulting supports were tested for use in high-performance affinity chromatography by employing human serum albumin (HSA) as a model protein. Studies with normal and iodoacetamide-modified HSA indicated that these methods had a high selectivity for sulfhydryl groups on this protein, which accounted for the coupling of 77–81% of this protein to maleimide- or iodoacetyl-activated silica. These supports were also evaluated in terms of their total protein content, binding capacity, specific activity, non-specific binding, stability and chiral selectivity for several test solutes. HSA columns prepared using maleimide-activated silica gave the best overall results for these properties when compared to HSA that had been immobilized to silica through the Schiff base method (i.e., an amine-based coupling technique). A key advantage of the supports developed in this work is that they offer the potential of giving greater site-selective immobilization and ligand activity than amine-based coupling methods. These features make these supports attractive in the development of protein columns for such applications as the study of biological interactions and chiral separations. PMID:17297940
A novel feature ranking method for prediction of cancer stages using proteomics data
Saghapour, Ehsan; Sehhati, Mohammadreza
2017-01-01
Proteomic analysis of cancer stages has provided new opportunities for the development of novel, highly sensitive diagnostic tools that help early detection of cancer. This paper introduces a new feature ranking approach called FRMT. FRMT is based on the Technique for Order of Preference by Similarity to Ideal Solution (TOPSIS), which selects the most discriminative proteins from proteomics data for cancer staging. In this approach, the outcomes of 10 feature selection techniques were combined by the TOPSIS method to select the final discriminative proteins from seven different proteomic databases of protein expression profiles. In the proposed workflow, feature selection methods and protein expressions were considered as criteria and alternatives in TOPSIS, respectively. The proposed method was tested on seven various classifier models in a 10-fold cross-validation procedure repeated 30 times on the seven cancer datasets. The obtained results demonstrated the higher stability and superior classification performance of the method in comparison with other methods, and showed that it is less sensitive to the applied classifier. Moreover, the final introduced proteins are informative and have the potential for application in real medical practice. PMID:28934234
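TOPSIS itself is compact enough to sketch: normalize the decision matrix, weight it, and score each alternative by its relative closeness to the ideal and anti-ideal points. This is a generic implementation of the method named above, not the FRMT ensemble around it.

```python
import numpy as np

def topsis(decision, weights, benefit):
    """Rank alternatives with TOPSIS.

    decision : (n_alt, n_crit) matrix of criterion scores.
    weights  : criterion weights (summing to 1).
    benefit  : bool per criterion, True = larger is better.
    Returns closeness scores in [0, 1]; higher = better.
    """
    D = np.asarray(decision, float)
    w = np.asarray(weights, float)
    # Vector (Euclidean) normalization per criterion, then weighting.
    V = D / np.linalg.norm(D, axis=0) * w
    ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
    anti = np.where(benefit, V.min(axis=0), V.max(axis=0))
    d_pos = np.linalg.norm(V - ideal, axis=1)
    d_neg = np.linalg.norm(V - anti, axis=1)
    return d_neg / (d_pos + d_neg)
```

In the paper's setup, each protein would be an alternative and each of the 10 feature selection methods a criterion, so a protein ranked highly by most selectors ends up close to the ideal point.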
Use of system identification techniques for improving airframe finite element models using test data
NASA Technical Reports Server (NTRS)
Hanagud, Sathya V.; Zhou, Weiyu; Craig, James I.; Weston, Neil J.
1993-01-01
A method for using system identification techniques to improve airframe finite element models using test data was developed and demonstrated. The method uses linear sensitivity matrices to relate changes in selected physical parameters to changes in the total system matrices. The values for these physical parameters were determined using constrained optimization with singular value decomposition. The method was confirmed using both simple and complex finite element models for which pseudo-experimental data was synthesized directly from the finite element model. The method was then applied to a real airframe model which incorporated all of the complexities and details of a large finite element model and for which extensive test data was available. The method was shown to work, and the differences between the identified model and the measured results were considered satisfactory.
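The sensitivity-based update can be illustrated as a linear least-squares problem: stack the parameter sensitivity matrices as columns and solve for the parameter changes that best explain the test/analysis residual. `np.linalg.lstsq` performs the SVD-based solve; the constrained optimization of the actual method is not reproduced here.

```python
import numpy as np

def update_parameters(residual, sensitivities):
    """Solve for physical-parameter changes from a system-matrix residual.

    residual      : (n, n) difference between test and analysis matrices.
    sensitivities : list of (n, n) matrices dK/dp_k, one per parameter.
    Returns the least-squares parameter changes dp such that
    sum_k dp[k] * sensitivities[k] ~= residual.
    """
    A = np.column_stack([S.ravel() for S in sensitivities])
    b = np.asarray(residual, float).ravel()
    dp, *_ = np.linalg.lstsq(A, b, rcond=None)
    return dp
```

With pseudo-experimental data synthesized directly from the model, as in the paper's verification step, the identified parameter changes should reproduce the injected perturbation.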
Dense mesh sampling for video-based facial animation
NASA Astrophysics Data System (ADS)
Peszor, Damian; Wojciechowska, Marzena
2016-06-01
The paper describes an approach for the selection of feature points on a three-dimensional triangle mesh obtained using various techniques from several video footages. This approach has a dual purpose. First, it allows the data stored for the purpose of facial animation to be minimized, so that instead of storing the position of each vertex in each frame, one could store only a small subset of vertices for each frame and calculate the positions of the others based on the subset. The second purpose is to select feature points that could be used for anthropometry-based retargeting of recorded mimicry to another model, with sampling density beyond that which can be achieved using marker-based performance capture techniques. The developed approach was successfully tested on artificial models, models constructed using a structured light scanner, and models constructed from video footages using stereophotogrammetry.
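One simple way to realize the first purpose, reconstructing the full mesh from a stored subset, is a linear map fitted on training frames by least squares. This is an illustrative sketch only; the paper's actual selection and reconstruction schemes may differ.

```python
import numpy as np

def fit_reconstructor(frames, subset):
    """Fit a linear map predicting every vertex coordinate from the
    coordinates of a stored feature-point subset.

    frames : (n_frames, n_coords) flattened vertex positions per frame.
    subset : indices of the coordinates kept for each animation frame.
    """
    X = frames[:, subset]
    # Least-squares solve of X @ W ~= frames.
    W, *_ = np.linalg.lstsq(X, frames, rcond=None)
    return W

def reconstruct(subset_coords, W):
    """Recover the full coordinate vector(s) from the stored subset."""
    return subset_coords @ W
```

When the mesh motion is well approximated by a low-dimensional space spanned by the feature points, storing only the subset per frame plus one map `W` recovers the remaining vertices accurately.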
Systems design study of the Pioneer Venus spacecraft. Volume 2. Preliminary program development plan
NASA Technical Reports Server (NTRS)
1973-01-01
The preliminary development plan for the Pioneer Venus program is presented. This preliminary plan treats only developmental aspects that would have a significant effect on program cost. These significant development areas were: master program schedule planning; test planning, both unit and system testing for probes/orbiter/probe bus; ground support equipment; performance assurance; and science integration. Various test planning options and test method techniques were evaluated in terms of achieving a low-cost program without degrading mission performance or system reliability. The approaches studied and the methodology of the selected approach are defined.
NASA Technical Reports Server (NTRS)
Harrison, P. Ann
1993-01-01
All the NASA VEGetation Workbench (VEG) goals except the Learning System provide the scientist with several different techniques. When VEG is run, rules assist the scientist in selecting the best of the available techniques to apply to the sample of cover type data being studied. The techniques are stored in the VEG knowledge base. A new interface that enables the scientist to add techniques to VEG without assistance from the developer was designed and implemented. This interface does not require the scientist to have a thorough knowledge of the Knowledge Engineering Environment (KEE) by Intellicorp or a detailed knowledge of the structure of VEG. The interface prompts the scientist to enter the required information about the new technique: the Common Lisp functions for executing the technique and the left-hand side of the rule that causes the technique to be selected. A template for each function and rule, and detailed instructions about the arguments of the functions, the values they should return, and the format of the rule, are displayed. Checks are made to ensure that the required data were entered, the functions compiled correctly, and the rule parsed correctly before the new technique is stored. The additional techniques are stored separately from the VEG knowledge base and are not normally loaded when the VEG knowledge base is loaded. The interface gives the scientist the option of adding all the previously defined new techniques before running VEG. When the techniques are added, the units required to store them are created automatically in the correct places in the VEG knowledge base. The methods file containing the functions required by the additional techniques is loaded.
New rule units are created to store the new rules. The interface that allows the scientist to select which techniques to use is updated automatically to include the new techniques. Task H was completed: the interface that allows the scientist to add techniques to VEG was implemented and comprehensively tested. The Common Lisp code for the Add Techniques system is listed in Appendix A.
Carvalho, Gabriela Lanna Xavier de; Moreira, Luciano Evangelista; Pena, João Luiz; Marinho, Carolina Coimbra; Bahia, Maria Terezinha; Machado-Coelho, George Luiz Lins
2012-02-01
This study compares the diagnostic accuracy of the TF-Test(®) (TFT) for human parasitosis with results obtained using the traditional Kato-Katz (KK), Hoffman-Pons-Janer (HPJ), Willis and Baermann-Moraes (BM) techniques. Overall, four stool samples were taken from each individual; three alternate-day TFT stool samples and another sample that was collected in a universal container. Stool samples were taken from 331 inhabitants of the community of Quilombola Santa Cruz. The gold standard (GS) for protozoa detection was defined as the combined results for TFT, HPJ and Willis coproscopic techniques; for helminth detection, GS was defined as the combined results for all five coproscopic techniques (TFT, KK, HPJ, Willis and BM). The positivity rate of each method was compared using the McNemar test. While the TFT exhibited similar positivity rates to the GS for Entamoeba histolytica/dispar (82.4%) and Giardia duodenalis (90%), HPJ and Willis techniques exhibited significantly lower positivity rates for these protozoa. All tests exhibited significantly lower positivity rates compared with GS for the diagnosis of helminths. The KK technique had the highest positivity rate for diagnosing Schistosoma mansoni (74.6%), while the TFT had the highest positivity rates for Ascaris lumbricoides (58.1%) and hookworm (75%); HPJ technique had the highest positivity rate for Strongyloides stercoralis (50%). Although a combination of tests is the most accurate method for the diagnosis of enteral parasites, the TFT reliably estimates the prevalence of protozoa and selected helminths, such as A. lumbricoides and hookworm. Further studies are needed to evaluate the detection accuracy of the TFT in samples with varying numbers of parasites.
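The McNemar comparison used above depends only on the discordant pairs, i.e. samples positive by one technique but not the other. A minimal sketch (the counts in the usage note are illustrative, not the study's data):

```python
def mcnemar_statistic(b, c):
    """McNemar chi-square statistic for paired binary outcomes.

    b : samples positive by method A but negative by method B.
    c : samples negative by method A but positive by method B.
    Without continuity correction; compare against 3.84 for
    significance at alpha = 0.05 (chi-square, 1 df).
    """
    if b + c == 0:
        return 0.0
    return (b - c) ** 2 / (b + c)
```

For example, if one coproscopic technique alone were positive in 20 samples and the other alone in 5 (hypothetical counts), the statistic (20 - 5)**2 / 25 = 9.0 exceeds 3.84, so the positivity rates would differ significantly.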
Comparison between two surgical techniques for root coverage with an acellular dermal matrix graft.
Andrade, Patrícia F; Felipe, Maria Emília M C; Novaes, Arthur B; Souza, Sérgio L S; Taba, Mário; Palioto, Daniela B; Grisi, Márcio F M
2008-03-01
The aim of this randomized, controlled, clinical study was to compare two surgical techniques with the acellular dermal matrix graft (ADMG) to evaluate which technique could provide better root coverage. Fifteen patients with bilateral Miller Class I gingival recession areas were selected. In each patient, one recession area was randomly assigned to the control group, while the contra-lateral recession area was assigned to the test group. The ADMG was used in both groups. The control group was treated with a broader flap and vertical-releasing incisions, and the test group was treated with the proposed surgical technique, without releasing incisions. The clinical parameters evaluated before the surgeries and after 12 months were: gingival recession height, probing depth, relative clinical attachment level and the width and thickness of keratinized tissue. There were no statistically significant differences between the groups for all parameters at baseline. After 12 months, there was a statistically significant reduction in recession height in both groups, and there was no statistically significant difference between the techniques with regard to root coverage. Both surgical techniques provided significant reduction in gingival recession height after 12 months, and similar results in relation to root coverage.
Journée, H-L; Polak, H E; De Kleuver, M
2007-12-01
Despite the use of multipulse trains, transcranial electrical stimulation (TES) is still insufficient in a subgroup of patients to elicit motor-evoked potentials during intraoperative neurophysiological monitoring (IONM). Classic facilitation methods used in awake patients are precluded under general anaesthesia. Conditioning techniques can be used in this situation. To present clinical experimental data and models of motor-neuron (MN) excitability for homonymous and heteronymous conditioning and discuss their applications in IONM. Data were obtained in a prospective study on multipulse TES-conditioning of the monosynaptic H-reflex and double multipulse TES. The principle of facilitation by conditioning stimulation is to apply a test stimulus when motor neurons (MNs) have been made maximally excitable by a conditioning stimulus. Both conditioning and test stimuli recruit separate populations of MNs. The overlapping fraction of MNs controls the efficacy of facilitation. Heteronymous conditioning stimulation, which is performed at a different site from the test stimulus, is illustrated by the TES-conditioned H-reflex (HR). Homonymous conditioning stimulation, which is performed at the same stimulation site, is illustrated by double-train TES (dt-TES). The facilitation curves obtained by conditioning stimulation are often trimodal and show peaks of facilitation at short intertrain intervals (S-ITIs) of 10 ms and between 15 and 20 ms, and at longer intertrain intervals (L-ITIs) of over 100 ms. The facilitation curves from the HR and dt-TES are not always identical, since different alpha-MN pools are involved. Dt-TES is often successful in neurologically impaired patients, whereas facilitation of the HR can be used when conditioned by TES at subthreshold levels, allowing continuous IONM without movement in the surgical field.
Alternatively, facilitation by conditioning from peripheral-nerve stimulation can be used for selective transmission of subthreshold TES motor responses to peripheral muscles, permitting motor-monitoring by a so-called selective motor-gating technique. Facilitation techniques offer many possibilities in IONM by enhancing low-amplitude TES-MEP responses. They can also selectively enhance responses in a few muscle groups for the reduction of movement.
Wang, Hongmei; Feng, Qing; Li, Ning; Xu, Sheng
2016-12-01
Limited information is available regarding the metal-ceramic bond strength of dental Co-Cr alloys fabricated by casting (CAST), computer numerical control (CNC) milling, and selective laser melting (SLM). The purpose of this in vitro study was to evaluate the metal-ceramic bond characteristics of 3 dental Co-Cr alloys fabricated by casting, computer numerical control milling, and selective laser melting techniques using the 3-point bend test (International Organization for Standardization [ISO] standard 9693). Forty-five specimens (25×3×0.5 mm) made of dental Co-Cr alloys were prepared by CAST, CNC milling, and SLM techniques. The morphology of the oxidation surface of metal specimens was evaluated by scanning electron microscopy (SEM). After porcelain application, the interfacial characterization was evaluated by SEM equipped with energy-dispersive spectrometry (EDS) analysis, and the metal-ceramic bond strength was assessed with the 3-point bend test. Failure type and elemental composition on the debonding interface were assessed by SEM/EDS. The bond strength was statistically analyzed by 1-way ANOVA and Tukey honest significant difference test (α=.05). The oxidation surfaces of the CAST, CNC, and SLM groups were different. They were porous in the CAST group but compact and irregular in the CNC and SLM groups. The metal-ceramic interfaces of the SLM and CNC groups showed excellent combination compared with those of the CAST group. The bond strength was 37.7 ±6.5 MPa for CAST, 43.3 ±9.2 MPa for CNC, and 46.8 ±5.1 MPa for the SLM group. Statistically significant differences were found among the 3 groups tested (P=.028). The debonding surfaces of all specimens exhibited cohesive failure mode. The oxidation surface morphologies and thicknesses of dental Co-Cr alloys are dependent on the different fabrication techniques used. 
The bond strengths of all 3 groups exceeded the minimum acceptable value of 25 MPa recommended by ISO 9693; hence, dental Co-Cr alloy fabricated with the SLM technique could be a promising alternative for metal-ceramic restorations.
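For context, the classical three-point-bend flexural stress formula is sigma = 3FL / (2bh^2). Note that ISO 9693 itself converts the crack-initiation force to a bond strength via a tabulated coefficient k that depends on the metal's thickness and Young's modulus, so the helper below is the generic beam formula only, not the standard's procedure (dimensions in N and mm give MPa):

```python
def flexural_stress(force_n, span_mm, width_mm, thickness_mm):
    """Generic three-point-bend flexural stress, sigma = 3FL / (2bh^2).

    force_n      : applied load F at failure, in newtons.
    span_mm      : support span L, in mm.
    width_mm     : specimen width b, in mm.
    thickness_mm : specimen thickness h, in mm.
    Returns stress in MPa.
    """
    return 3 * force_n * span_mm / (2 * width_mm * thickness_mm ** 2)
```

The strong h**2 dependence is why ISO 9693 fixes the 25 x 3 x 0.5 mm strip geometry: small thickness deviations would otherwise dominate the comparison between fabrication techniques.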
Development of Gold Standard Ion-Selective Electrode-Based Methods for Fluoride Analysis
Martínez-Mier, E.A.; Cury, J.A.; Heilman, J.R.; Katz, B.P.; Levy, S.M.; Li, Y.; Maguire, A.; Margineda, J.; O’Mullane, D.; Phantumvanit, P.; Soto-Rojas, A.E.; Stookey, G.K.; Villa, A.; Wefel, J.S.; Whelton, H.; Whitford, G.M.; Zero, D.T.; Zhang, W.; Zohouri, V.
2011-01-01
Background/Aims: Currently available techniques for fluoride analysis are not standardized. Therefore, this study was designed to develop standardized methods for analyzing fluoride in biological and nonbiological samples used for dental research. Methods: A group of nine laboratories analyzed a set of standardized samples for fluoride concentration using their own methods. The group then reviewed existing analytical techniques for fluoride analysis, identified inconsistencies in the use of these techniques and conducted testing to resolve differences. Based on the results of the testing undertaken to define the best approaches for the analysis, the group developed recommendations for direct and microdiffusion methods using the fluoride ion-selective electrode. Results: Initial results demonstrated that there was no consensus regarding the choice of analytical techniques for different types of samples. Although for several types of samples the results of the fluoride analyses were similar among some laboratories, greater differences were observed for saliva, food and beverage samples. In spite of these initial differences, precise and true values of fluoride concentration, as well as smaller differences between laboratories, were obtained once the standardized methodologies were used. Intraclass correlation coefficients ranged from 0.90 to 0.93 for the analysis of a certified reference material using the standardized methodologies. Conclusion: The results of this study demonstrate that the development and use of standardized protocols for fluoride analysis significantly decreased differences among laboratories and resulted in more precise and true values. PMID:21160184
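The direct ion-selective-electrode method rests on the Nernstian relation E = E0 - S * log10(C), with the slope S near 59.2 mV per decade at 25 degrees C. A two-point calibration sketch (a generic illustration of the electrode response, not the standardized protocol itself):

```python
import math

def calibrate_ise(conc1, mv1, conc2, mv2):
    """Two-point calibration of a fluoride ion-selective electrode.

    Assumes the Nernstian response E = E0 - S * log10(C).
    Returns (E0, S) from two standards of known concentration.
    """
    S = (mv1 - mv2) / (math.log10(conc2) - math.log10(conc1))
    E0 = mv1 + S * math.log10(conc1)
    return E0, S

def concentration(mv, E0, S):
    """Invert the calibrated response to get the sample concentration."""
    return 10 ** ((E0 - mv) / S)
```

Checking that the fitted slope falls near the theoretical 59.2 mV/decade is a routine sanity check on electrode condition before running samples.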
Quasi-Uniform High Speed Foam Crush Testing Using a Guided Drop Mass Impact
NASA Technical Reports Server (NTRS)
Jones, Lisa E. (Technical Monitor); Kellas, Sotiris
2004-01-01
A relatively simple method for measuring the dynamic crush response of foam materials at various loading rates is described. The method utilizes a drop mass impact configuration with mass and impact velocity selected such that the crush speed remains approximately uniform during the entire sample crushing event. Instrumentation, data acquisition, and data processing techniques are presented, and limitations of the test method are discussed. The objective of the test method is to produce input data for dynamic finite element modeling involving crash and energy absorption characteristics of foam materials.
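The mass and velocity selection follows from a simple energy balance: the impact kinetic energy must dwarf the energy absorbed over the crush stroke, so the speed change stays small. A sketch under an assumed constant crush force (the real test works from the measured foam response):

```python
import math

def min_drop_mass(crush_force_n, stroke_m, v0, speed_drop_tol=0.05):
    """Smallest drop mass keeping the crush speed near-uniform.

    Energy balance: 0.5*m*v0**2 - F*s = 0.5*m*vf**2.  Requiring
    vf >= (1 - tol) * v0 gives
    m >= 2*F*s / (v0**2 * (1 - (1 - tol)**2)).
    """
    return (2 * crush_force_n * stroke_m
            / (v0 ** 2 * (1 - (1 - speed_drop_tol) ** 2)))

def final_speed(mass, crush_force_n, stroke_m, v0):
    """Impactor speed at the end of the crush stroke."""
    return math.sqrt(v0 ** 2 - 2 * crush_force_n * stroke_m / mass)
```

For example, a foam sample crushing at roughly 5 kN over a 50 mm stroke, hit at 6 m/s, would need about 140 kg of drop mass to keep the speed loss within 5% (illustrative numbers, not from the report).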
NASA Astrophysics Data System (ADS)
Kal, Subhadeep; Mohanty, Nihar; Farrell, Richard A.; Franke, Elliott; Raley, Angelique; Thibaut, Sophie; Pereira, Cheryl; Pillai, Karthik; Ko, Akiteru; Mosden, Aelan; Biolsi, Peter
2017-04-01
Scaling beyond the 7 nm technology node demands significant control over variability, down to a few angstroms, in order to achieve reasonable yield. For example, to meet the current scaling targets it is highly desirable to achieve sub-30 nm pitch line/space features at back end of line (BEOL) or front end of line (FEOL), and uniform and precise contact/hole patterning at middle of line (MOL). One of the quintessential requirements for such precise and possibly self-aligned patterning strategies is superior etch selectivity between the target films while other masks/films are exposed. The need to achieve high etch selectivity becomes more evident for unit process development at MOL and BEOL, as a result of low-density film choices (compared to FEOL film choices) due to the lower temperature budget. Low etch selectivity with conventional plasma and wet chemical etch techniques causes significant gouging (unintended etching of the etch stop layer, as shown in Fig. 1), high line edge roughness (LER)/line width roughness (LWR), non-uniformity, etc. In certain circumstances this may lead to added downstream process stochastics. Furthermore, conventional plasma etches may also have the added disadvantage of plasma VUV damage and corner rounding (Fig. 1). Finally, the above-mentioned factors can potentially compromise edge placement error (EPE) and/or yield. Therefore a process flow enabled with extremely highly selective etches, inherent to film properties and/or etch chemistries, is a significant advantage. To improve etch selectivity for certain etch steps during a process flow, we have to implement alternate, highly selective, plasma-free techniques in conjunction with conventional plasma etches (Fig. 2). In this article, we present our plasma-free, chemical gas-phase etch technique using chemistries that have high selectivity towards a spectrum of films owing to the reaction mechanism (as shown in Fig. 1).
Gas-phase etches also help eliminate plasma damage to the features during the etch process. Herein we also demonstrate a test case of how a combination of plasma-assisted and plasma-free etch techniques has the potential to improve process performance of a 193 nm immersion-based self-aligned quadruple patterning (SAQP) flow for BEOL-compliant films (an example is shown in Fig. 2). In addition, we present the application of gas etches for (1) profile improvement, (2) selective mandrel pull, and (3) critical dimension trim of mandrels, with an analysis of advantages over conventional techniques in terms of LER and EPE.
NASA Astrophysics Data System (ADS)
Torres Astorga, Romina; Velasco, Hugo; Dercon, Gerd; Mabit, Lionel
2017-04-01
Soil erosion and the associated sediment transportation and deposition processes are key environmental problems in central Argentinian watersheds. Several land use practices, such as intensive grazing and crop cultivation, are considered likely to increase significantly land degradation and soil/sediment erosion processes. Characterized by highly erodible soils, the sub-catchment Estancia Grande (12.3 km2), located 23 km northeast of San Luis, has been investigated using sediment source fingerprinting techniques to identify critical hot spots of land degradation. The authors created 4 artificial mixtures using known quantities of the most representative sediment sources of the studied catchment. The first mixture was made using four rotation crop soil sources. The second and the third mixtures were created using different proportions of 4 different soil sources including soils from a feedlot, a rotation crop, a walnut forest and a grazing soil. The last tested mixture contained the same sources as the third mixture but with the addition of a fifth soil source (i.e. a native bank soil). The Energy-Dispersive X-Ray Fluorescence (EDXRF) analytical technique was used to reconstruct the source sediment proportions of the original mixtures. Besides using traditional methods of fingerprint selection, such as the Kruskal-Wallis H-test and Discriminant Function Analysis (DFA), the authors used the actual source proportions in the mixtures and selected, from the subset of tracers that passed the statistical tests, specific elemental tracers that were in agreement with the expected mixture contents. The selection process ended with testing in a mixing model all possible combinations of the reduced number of tracers obtained. Alkaline earth metals, especially strontium (Sr) and barium (Ba), were identified as the most effective fingerprints and provided a reduced Mean Absolute Error (MAE) of approximately 2% when reconstructing the 4 artificial mixtures.
This study demonstrates that the EDXRF fingerprinting approach performed very well in reconstructing our original mixtures especially in identifying and quantifying the contribution of the 4 rotation crop soil sources in the first mixture.
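A mixing model of the kind used to reconstruct the artificial mixtures can be sketched as constrained least squares on the tracer concentrations. The clipping/renormalization shortcut below is an illustration, not the authors' model:

```python
import numpy as np

def unmix(sources, mixture, constraint_weight=1e3):
    """Estimate source proportions from elemental tracer concentrations.

    sources : (n_sources, n_tracers) mean tracer value per source.
    mixture : (n_tracers,) tracer values measured in the mixture.
    Solves mixture ~= proportions @ sources by least squares with a
    heavily weighted sum-to-one row; negative proportions are clipped
    and the vector renormalized.
    """
    sources = np.asarray(sources, float)
    mixture = np.asarray(mixture, float)
    A = np.vstack([sources.T,
                   constraint_weight * np.ones((1, len(sources)))])
    b = np.append(mixture, constraint_weight)
    p, *_ = np.linalg.lstsq(A, b, rcond=None)
    p = np.clip(p, 0.0, None)
    return p / p.sum()
```

Comparing the estimated proportions against the known quantities in an artificial mixture gives exactly the kind of Mean Absolute Error check the study used to screen tracer combinations.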
An improved dual-frequency technique for the remote sensing of ocean currents and wave spectra
NASA Technical Reports Server (NTRS)
Schuler, D. L.; Eng, W. P.
1984-01-01
A two-frequency microwave radar technique for the remote sensing of directional ocean wave spectra and surface currents is investigated. This technique is conceptually attractive because its operational physical principle involves a spatial electromagnetic scattering resonance with a single, but selectable, long gravity wave. Multiplexing of signals having different spacings of the two transmitted frequencies allows measurements of the entire long-wave ocean spectrum to be carried out. A new scatterometer is developed and experimentally tested which is capable of making measurements having much larger signal/background values than previously possible. This instrument couples the resonance technique with coherent, frequency-agility radar capabilities. The scatterometer is presently configured for supporting a program of surface current measurements.
Surface contamination analysis technology team overview
NASA Technical Reports Server (NTRS)
Burns, H. Dewitt
1995-01-01
A team was established which consisted of representatives from NASA (Marshall Space Flight Center and Langley Research Center), Thiokol Corporation, the University of Alabama in Huntsville, AC Engineering, SAIC, Martin Marietta, and Aerojet. The team's purpose was to bring together the appropriate personnel to determine what surface inspection techniques were applicable to multiprogram bonding surface cleanliness inspection. In order to identify appropriate techniques and their sensitivity to various contaminant families, calibration standards were developed. Producing standards included development of consistent low level contamination application techniques. Oxidation was also considered for effect on inspection equipment response. Ellipsometry was used for oxidation characterization. Verification testing was then accomplished to show that selected inspection techniques could detect subject contaminants at levels found to be detrimental to critical bond systems of interest. Once feasibility of identified techniques was shown, selected techniques and instrumentation could then be incorporated into a multipurpose inspection head and integrated with a robot for critical surface inspection. Inspection techniques currently being evaluated include optically stimulated electron emission (OSEE); near infrared (NIR) spectroscopy utilizing fiber optics; Fourier transform infrared (FTIR) spectroscopy; and ultraviolet (UV) fluorescence. Current plans are to demonstrate an integrated system in MSFC's Productivity Enhancement Complex within five years from initiation of this effort in 1992 assuming appropriate funding levels are maintained. This paper gives an overview of work accomplished by the team and future plans.
Properties of radio-frequency heated argon confined uranium plasmas
NASA Technical Reports Server (NTRS)
1976-01-01
Pure uranium hexafluoride (UF6) was injected into an argon-confined, steady-state, rf-heated plasma within a fused silica peripheral wall test chamber. Exploratory tests conducted using an 80 kW rf facility and different test chamber flow configurations permitted selection of the configuration demonstrating the best confinement characteristics and minimum uranium compound wall coating. The overall test results demonstrated that workable flow schemes and associated diagnostic techniques had been developed for the fluid-mechanical confinement and characterization of uranium within an rf plasma discharge when pure UF6 is injected for long test times into an argon-confined, high-temperature, high-pressure, rf-heated plasma.
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
1969-07-01
The Fifth International Conference on Nondestructive Testing was held in Montreal, Canada, for the purpose of promoting international collaboration in all matters related to the development and use of nondestructive test methods. A total of 82 papers were selected for presentation. Session titles included: evaluation of material quality; ultrasonics - identification and measurements; thermal methods; testing of welds; visual aids in nondestructive testing; measurements of stress and elastic properties; magnetic and eddy-current methods; surface methods and neutron radiography; standardization - general; ultrasonics at elevated temperatures; applications; x-ray techniques; radiography; ultrasonic standardization; training and qualification; and correlation of weld defects.
Vapor Hydrogen Peroxide as Alternative to Dry Heat Microbial Reduction
NASA Technical Reports Server (NTRS)
Cash, Howard A.; Kern, Roger G.; Chung, Shirley Y.; Koukol, Robert C.; Barengoltz, Jack B.
2006-01-01
The Jet Propulsion Laboratory, in conjunction with the NASA Planetary Protection Officer, has selected the vapor phase hydrogen peroxide (VHP) sterilization process for continued development as a NASA-approved sterilization technique for spacecraft subsystems and systems. The goal is to include this technique, with appropriate specification, in NPG 8020.12C as a low-temperature complementary technique to the dry heat sterilization process. A series of experiments was conducted in vacuum to determine VHP process parameters that provided significant reductions in spore viability while allowing survival of sufficient spores for statistically significant enumeration. With this knowledge of D values, sensible margins can be applied in a planetary protection specification. The outcome of this study provided an optimization of test sterilizer process conditions: VHP concentration, process duration, a process temperature range for which the worst-case D value may be imposed, a process humidity range for which the worst-case D value may be imposed, and robustness to selected spacecraft material substrates.
Barros, Raquel R M; Novaes, Arthur B Júnior; Grisi, Márcio F M; Souza, Sérgio L S; Taba, Mário Júnior; Palioto, Daniela B
2004-10-01
The acellular dermal matrix graft (ADMG) has become widely used in periodontal surgeries as a substitute for the subepithelial connective tissue graft (SCTG). These grafts exhibit different healing processes due to their distinct cellular and vascular structures. Therefore, the surgical technique primarily developed for the autograft may not be adequate for the allograft. This study compared the clinical results of two surgical techniques, the "conventional" and a modified procedure, for the treatment of localized gingival recessions with the ADMG. A total of 32 bilateral Miller Class I or II gingival recessions were selected and randomly assigned to test and control groups. The control group received the SCTG and the test group the modified surgical technique. Probing depth (PD), relative clinical attachment level (RCAL), gingival recession (GR), and width of keratinized tissue (KT) were measured 2 weeks prior to surgery and 6 months post-surgery. Both procedures improved all the evaluated parameters after 6 months. Comparisons between the groups by Mann-Whitney rank sum test revealed no statistically significant differences in terms of RCAL gain, PD reduction, and increase in KT from baseline to the 6-month evaluation. However, there was a statistically significant greater reduction of GR favoring the modified technique (P = 0.002). The percentage of root coverage was 79% for the test group and 63.9% for the control group. We conclude that the modified technique is more suitable for root coverage procedures with the ADMG, since it had statistically significantly better clinical results compared to the traditional technique.
Williams, Jennifer A.; Schmitter-Edgecombe, Maureen; Cook, Diane J.
2016-01-01
Introduction: Reducing the amount of testing required to accurately detect cognitive impairment is clinically relevant. The aim of this research was to determine the fewest number of clinical measures required to accurately classify participants as healthy older adult, mild cognitive impairment (MCI), or dementia using a suite of classification techniques. Methods: Two variable-selection machine learning models (i.e., naive Bayes, decision tree), a logistic regression, and two participant datasets (i.e., clinical diagnosis; clinical dementia rating, CDR) were explored. Participants classified using clinical diagnosis criteria included 52 individuals with dementia, 97 with MCI, and 161 cognitively healthy older adults. Participants classified using CDR included 154 individuals with CDR = 0, 93 individuals with CDR = 0.5, and 25 individuals with CDR = 1.0+. Twenty-seven demographic, psychological, and neuropsychological variables were available for variable selection. Results: No significant difference was observed between naive Bayes, decision tree, and logistic regression models for classification of either the clinical diagnosis or the CDR dataset. Participant classification (70.0 - 99.1%), geometric mean (60.9 - 98.1%), sensitivity (44.2 - 100%), and specificity (52.7 - 100%) were generally satisfactory. Unsurprisingly, the MCI/CDR = 0.5 participant group was the most challenging to classify. Through variable selection, only 2 - 9 variables were required for classification, and these varied between datasets in a clinically meaningful way. Conclusions: The current study results reveal that machine learning techniques can accurately classify cognitive impairment and reduce the number of measures required for diagnosis. PMID:26332171
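The geometric mean reported above balances sensitivity against specificity for imbalanced diagnostic groups. A minimal sketch of how all three metrics derive from a binary confusion matrix (the counts below are illustrative, not the study's data):

```python
# Sensitivity, specificity, and geometric mean from a 2x2 confusion
# matrix (impaired vs. healthy). Counts are illustrative only.

def screening_metrics(tp, fn, tn, fp):
    sensitivity = tp / (tp + fn)   # true-positive rate
    specificity = tn / (tn + fp)   # true-negative rate
    geometric_mean = (sensitivity * specificity) ** 0.5
    return sensitivity, specificity, geometric_mean

sens, spec, gmean = screening_metrics(tp=45, fn=5, tn=140, fp=20)
print(f"sensitivity={sens:.3f} specificity={spec:.3f} g-mean={gmean:.3f}")
```

Unlike raw accuracy, the geometric mean collapses toward zero if either rate is poor, which is why it is informative for the small dementia group here.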
Uniting statistical and individual-based approaches for animal movement modelling.
Latombe, Guillaume; Parrott, Lael; Basille, Mathieu; Fortin, Daniel
2014-01-01
The dynamic nature of their internal states and the environment directly shapes animals' spatial behaviours and gives rise to emergent properties at broader scales in natural systems. However, integrating these dynamic features into habitat selection studies remains challenging, because field work to access internal states is practically impossible and current statistical models cannot produce dynamic outputs. To address these issues, we developed a robust method that combines statistical and individual-based modelling. Using a statistical technique for forward modelling of the individual-based model (IBM) has the advantage of being faster to parameterize than a pure inverse modelling technique and allows for robust selection of parameters. Using GPS locations from caribou monitored in Québec, caribou movements were modelled based on generative mechanisms accounting for dynamic variables at a low level of emergence. These variables were accessed by replicating real individuals' movements in parallel sub-models, and movement parameters were then empirically parameterized using Step Selection Functions. The final IBM was validated using both k-fold cross-validation and emergent-pattern validation, and was tested for two different scenarios with varying hardwood encroachment. Our results highlighted a functional response in habitat selection, which suggests that our method was able to capture the complexity of the natural system and adequately provided projections on future possible states of the system in response to different management plans. This is especially relevant for testing the long-term impact of scenarios corresponding to environmental configurations that have yet to be observed in real systems.
Investigating ESL Students' Academic Performance in Tenses
ERIC Educational Resources Information Center
Javed, Muhammad; Ahmad, Atezaz
2013-01-01
The present study intends to assess the ESL students' performance in tenses at the secondary school level. Grade 10 students were the target population of the study. A sample of 396 students (255 male and 141 female) was selected through a convenience sampling technique from the District of Bahawalnagar, Pakistan. A test focusing on five different types…
Simultaneous Estimation of Regression Functions for Marine Corps Technical Training Specialties.
ERIC Educational Resources Information Center
Dunbar, Stephen B.; And Others
This paper considers the application of Bayesian techniques for simultaneous estimation to the specification of regression weights for selection tests used in various technical training courses in the Marine Corps. Results of a method for m-group regression developed by Molenaar and Lewis (1979) suggest that common weights for training courses…
Two-Phase Item Selection Procedure for Flexible Content Balancing in CAT
ERIC Educational Resources Information Center
Cheng, Ying; Chang, Hua-Hua; Yi, Qing
2007-01-01
Content balancing is an important issue in the design and implementation of computerized adaptive testing (CAT). Content-balancing techniques that have been applied in fixed content balancing, where the number of items from each content area is fixed, include constrained CAT (CCAT), the modified multinomial model (MMM), modified constrained CAT…
ERIC Educational Resources Information Center
Educational Communications, Inc., Lake Forest, IL.
Fourteen articles are directed to college-bound students regarding student financial aid information, the Scholastic Aptitude Test (SAT), advanced placement and credit by examination, college selection, types of colleges, choosing a major, and earning power after graduation. Techniques and publications that may help students gather accurate…
Application and principles of photon-doppler velocimetry for explosives testing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Briggs, Matthew Ellsworth; Hill, Larry; Hull, Larry
2010-01-01
The velocimetry technique PDV is easier to field than its predecessors VISAR and Fabry-Perot, works on a broader variety of experiments, and is more accurate and simpler to analyze. Experiments and analysis have now demonstrated the accuracy, precision, and interpretation of what PDV does and does not measure, and the successful application of PDV to basic and applied detonation problems. We present a selection of results intended to help workers assess the capabilities of PDV. First we present general considerations about the technique: various PDV configurations, including single-signal, multisignal (e.g., triature), and frequency-shifted PDV; what types of motion are sensed and missed by PDV; analysis schemes for velocity and position extraction; accuracy and precision of the results; and experimental considerations for probe selection and positioning. We then present the status of various applications: detonation speeds and wall motion in cylinder tests, breakout velocity distributions from bare HE, ejecta, measurements from fibers embedded in HE, projectile velocity, and resolving 2- and 3-D velocity vectors. This paper is an overview of work done by many groups around the world.
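The core relation behind PDV is that surface motion Doppler-shifts the returned light, and the beat between the reference and returned light has frequency f_beat = 2v/λ. A minimal sketch of that conversion, assuming a 1550 nm telecom probe laser (common in PDV, but not stated in the abstract):

```python
# PDV infers surface velocity from the beat between reference light and
# Doppler-shifted returned light: f_beat = 2*v/lambda, so v = f_beat*lambda/2.
# The 1550 nm wavelength below is an assumption, typical of telecom-fiber PDV.

WAVELENGTH_M = 1550e-9  # assumed probe-laser wavelength (m)

def velocity_from_beat(f_beat_hz):
    """Surface velocity (m/s) from a measured beat frequency (Hz)."""
    return f_beat_hz * WAVELENGTH_M / 2.0

# A 1 GHz beat corresponds to 775 m/s, a plausible detonation-driven
# wall velocity; GHz-bandwidth digitizers thus cover most experiments.
print(velocity_from_beat(1.0e9))
```

In practice the beat frequency is extracted from short-time Fourier transforms of the digitized signal, which is where the analysis-scheme choices mentioned above come in.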
Development of an inflatable radiator system. [for space shuttles
NASA Technical Reports Server (NTRS)
Leach, J. W.
1976-01-01
Conceptual designs of an inflatable radiator system developed for supplying short duration supplementary cooling of space vehicles are described along with parametric trade studies, materials evaluation/selection studies, thermal and structural analyses, and numerous element tests. Fabrication techniques developed in constructing the engineering models and performance data from the model thermal vacuum tests are included. Application of these data to refining the designs of the flight articles and to constructing a full scale prototype radiator is discussed.
Photoacoustic sensor for VOCs: first step towards a lung cancer breath test
NASA Astrophysics Data System (ADS)
Wolff, Marcus; Groninga, Hinrich G.; Dressler, Matthias; Harde, Hermann
2005-08-01
Development of new optical sensor technologies has a major impact on the progression of diagnostic methods. Specifically, the optical analysis of breath is an extraordinarily promising technique. Spectroscopic sensors for non-invasive 13C-breath tests (the Urea Breath Test for detection of Helicobacter pylori is the most prominent) are by now well established. However, recent research and development go beyond gastroenterological applications. Sensitive and selective detection of certain volatile organic compounds (VOCs) in a patient's breath could enable the diagnosis of diseases that are very difficult to diagnose with contemporary techniques. For instance, an appropriate VOC biomarker for early-stage bronchial carcinoma (lung cancer) is n-butane (C4H10). We present a new optical detection scheme for VOCs that employs an especially compact and simple set-up based on photoacoustic spectroscopy (PAS). This method makes use of the transformation of absorbed modulated radiation into a sound wave. Employing a wavelength-modulated distributed feedback (DFB) diode laser and taking advantage of acoustical resonances of the sample cell, we performed very sensitive and selective measurements on butane. A detection limit for butane in air in the ppb range was achieved. In subsequent research the sensitivity will be successively improved to match the requirements of the medical application. Upon optimization, our photoacoustic sensor has the potential to enable future breath tests for early-stage lung cancer diagnostics.
Stress corrosion cracking properties of 15-5PH steel
NASA Technical Reports Server (NTRS)
Rosa, Ferdinand
1993-01-01
The unexpected occurrence of failures due to stress corrosion cracking (SCC) of structural components indicates a need for improved characterization of materials and more advanced analytical procedures for reliably predicting structural performance. Accordingly, the purpose of this study was to determine the stress corrosion susceptibility of 15-5PH steel over a wide range of applied strain rates in a highly corrosive environment. The selected environment for this investigation was a highly acidified sodium chloride (NaCl) aqueous solution. The selected alloy for the study was a 15-5PH steel in the H900 condition. The slow strain rate technique was selected to test the metal specimens.
Lean Stability augmentation study
NASA Technical Reports Server (NTRS)
Mcvey, J. B.; Kennedy, J. B.
1979-01-01
An analytical and experimental program was conducted to investigate techniques and develop technology for improving the lean combustion limits of premixing, prevaporizing combustors applicable to gas turbine engine main burners. Three concepts for improving lean stability limits were selected for experimental evaluation among twelve approaches considered. Concepts were selected on the basis of the potential for improving stability limits and achieving emission goals, the technological risks associated with development of practical burners employing the concepts, and the penalties to airline direct operating costs resulting from decreased combustor performance, increased engine cost, increased maintenance cost and increased engine weight associated with implementation of the concepts. Tests of flameholders embodying the selected concepts were conducted.
CO and NO2 Selective Monitoring by ZnO-Based Sensors
Hjiri, Mokhtar; El Mir, Lassaad; Leonardi, Salvatore Gianluca; Donato, Nicola; Neri, Giovanni
2013-01-01
ZnO nanomaterials with different shapes were synthesized, characterized and tested in the selective monitoring of low concentrations of CO and NO2 in air. ZnO nanoparticles (NPs) and nanofibers (NFs) were synthesized by a modified sol-gel method in supercritical conditions and by an electrospinning technique, respectively. CO and NO2 sensing tests have demonstrated that the annealing temperature and shape of the zinc oxide nanomaterials are the key factors in modulating the electrical and sensing properties. Specifically, ZnO NPs annealed at high temperature (700 °C) were found to be sensitive to CO, while they displayed negligible response to NO2. The opposite behavior was registered for the one-dimensional ZnO NFs annealed at medium temperature (400 °C). Due to their adaptable sensitivity/selectivity characteristics, the developed sensors show promising applications in dual air-quality control systems for closed environments such as automotive cabins, parking garages and tunnels. PMID:28348340
NASA Astrophysics Data System (ADS)
Jones, Louis Chin
This thesis entails the synthesis, automated catalytic testing, and in situ molecular characterization of supported Pt and Pt-alloy nanoparticle (NP) catalysts, with emphasis on how to assess the molecular distributions of Pt environments that affect overall catalytic activity and selectivity. We have taken the approach of (a) manipulating nucleation and growth of NPs using oxide supports, surfactants, and inorganic complexes to create Pt NPs with uniform size, shape, and composition, (b) automating batch and continuous flow catalytic reaction tests, and (c) characterizing the molecular environments of Pt surfaces using in situ infrared (IR) spectroscopy and solid-state 195Pt NMR. The following will highlight the synthesis and characterization of Ag-doped Pt NPs and their influence on C2H2 hydrogenation selectivity, and the implementation of advanced solid-state 195Pt NMR techniques to distinguish how distributions of molecular Pt environments vary with nanoparticle size, support, and surface composition.
Chung, Seungjoon; Seo, Chang Duck; Choi, Jae-Hoon; Chung, Jinwook
2014-01-01
Membrane distillation (MD) is an emerging desalination technology and an energy-saving alternative to conventional distillation and reverse osmosis. The selection of an appropriate membrane is a prerequisite for the design of an optimized MD process. We proposed a simple approximation method to evaluate the performance of membranes for the MD process. Three hollow fibre-type commercial membranes with different thicknesses and pore sizes were tested. Experimental results showed that one membrane was advantageous due to its highest flux, whereas another was advantageous due to its lowest feed temperature drop. Regression analyses and multi-stage calculations were used to account for the trade-off effects of flux and feed temperature drop. The most desirable membrane was selected from the tested membranes in terms of the mean flux in a multi-stage process. This method would be useful for the selection of membranes without complicated simulation techniques.
Fuzzy/Neural Software Estimates Costs of Rocket-Engine Tests
NASA Technical Reports Server (NTRS)
Douglas, Freddie; Bourgeois, Edit Kaminsky
2005-01-01
The Highly Accurate Cost Estimating Model (HACEM) is a software system for estimating the costs of testing rocket engines and components at Stennis Space Center. HACEM is built on a foundation of adaptive-network-based fuzzy inference systems (ANFIS), a hybrid software concept that combines the adaptive capabilities of neural networks with the ease of development and additional benefits of fuzzy-logic-based systems. In ANFIS, fuzzy inference systems are trained by use of neural networks. HACEM includes selectable subsystems that utilize various numbers and types of inputs, various numbers of fuzzy membership functions, and various input-preprocessing techniques. The inputs to HACEM are parameters of specific tests or series of tests. These parameters include test type (component or engine test), number and duration of tests, and thrust level(s) (in the case of engine tests). The ANFIS in HACEM are trained by use of sets of these parameters, along with costs of past tests. Thereafter, the user feeds HACEM a simple input text file that contains the parameters of a planned test or series of tests, the user selects the desired HACEM subsystem, and the subsystem processes the parameters into an estimate of cost(s).
Differences in head impulse test results due to analysis techniques.
Cleworth, Taylor W; Carpenter, Mark G; Honegger, Flurin; Allum, John H J
2017-01-01
Different analysis techniques are used to define vestibulo-ocular reflex (VOR) gain between eye and head angular velocity during the video head impulse test (vHIT). Comparisons would aid selection of the gain techniques best related to head impulse characteristics and promote standardisation. Our objective was to compare and contrast known methods of calculating vHIT VOR gain. We examined lateral canal vHIT responses recorded from 20 patients twice within 13 weeks of acute unilateral peripheral vestibular deficit onset. Ten patients were tested with an ICS Impulse system (GN Otometrics) and 10 with an EyeSeeCam (ESC) system (Interacoustics). Mean gain and variance were computed with area, average sample gain, and regression techniques over specific head angular velocity (HV) and acceleration (HA) intervals. Results for the same gain technique were not different between measurement systems. Area and average sample gain yielded equally lower variances than regression techniques. Gains computed over the whole impulse duration were larger than those computed for increasing HV. Gain over decreasing HV was associated with larger variances. Gains computed around peak HV were smaller than those computed around peak HA. The median gain over 50-70 ms was not different from gain around peak HV. However, depending on the technique used, the gain over increasing HV was different from gain around peak HA. Conversion equations between gains obtained with standard ICS and ESC methods were computed. For low gains, the conversion was dominated by a constant that needed to be added to ESC gains to equal ICS gains. We recommend manufacturers standardize vHIT gain calculations using 2 techniques: area gain around peak HA and peak HV.
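Two of the gain definitions being compared can be sketched directly: area gain is the ratio of the areas under the eye- and head-velocity curves over the chosen interval, while regression gain is the least-squares slope of eye on head velocity. The traces below are synthetic with a built-in gain of 0.8 (real vHIT eye velocity is opposite in sign to head velocity; same-sign values are used here for simplicity):

```python
# Area gain vs. zero-intercept regression gain for sampled head (hv) and
# eye (ev) angular-velocity traces. Synthetic data, true gain 0.8.

def area_gain(ev, hv):
    """Ratio of areas (sums of samples) under eye and head velocity."""
    return sum(ev) / sum(hv)

def regression_gain(ev, hv):
    """Zero-intercept least-squares slope of eye on head velocity."""
    return sum(e * h for e, h in zip(ev, hv)) / sum(h * h for h in hv)

hv = [10.0, 60.0, 150.0, 220.0, 150.0, 60.0, 10.0]  # head velocity (deg/s)
ev = [0.8 * v for v in hv]                           # ideal eye response
print(area_gain(ev, hv), regression_gain(ev, hv))
```

On ideal data the two agree; on noisy patient data the regression estimate is weighted toward high-velocity samples, which is one source of the variance differences the study reports.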
Dastmalchi, Nafiseh; Jafarzadeh, Hamid; Moradi, Saeed
2012-09-01
The ideal technique for the evaluation of pulp vitality should be noninvasive, painless, objective, reliable, and reproducible. To achieve this, the most routine tests are sensitivity tests. However, a major shortcoming of these tests is that they indirectly indicate pulp vitality by measuring a neural response. Pulse oximetry is a well-established oxygen saturation monitoring technique broadly used in medicine. However, its efficacy as a pulp vitality test needs to be evaluated. The aim of this study was to design and build a custom-made pulse oximeter dental probe and to evaluate its efficacy in comparison with an electric pulp tester, cold spray, and a rubber cup in pulp vitality testing. Twenty-four single-canal mandibular premolars needing endodontic treatment were selected. The patients did not have systemic disease and did not consume drugs. Also, they had no clinically relevant signs of necrosis. The selected teeth were pulpally tested with four kinds of tests: pulse oximetry, the electric test, cold spray, and the rubber cup. After endodontic treatment of these teeth, which revealed the actual status of the pulp, the results were analyzed by the kappa test to show the efficacy of these tests. When comparing the electric, cold, heat (rubber cup), and pulse oximeter tests with the gold standard, the kappa agreement coefficient was 18%, 18%, 14%, and 91%, respectively. The sensitivity of pulse oximetry, the rubber cup, the electric test, and cold spray was 0.93, 0.60, 0.60, and 0.53, respectively. The specificity of these tests was 1.00, 0.55, 0.22, and 0.66, respectively. Pulp testing by using pulse oximetry is more reliable than the electric test, rubber cup, and cold spray. The custom-made pulse oximeter dental probe is an effective and objective method for pulp vitality assessment. Copyright © 2012 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.
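The kappa coefficient used above corrects observed agreement for agreement expected by chance. A minimal sketch for a 2x2 table of test result versus the gold standard; the counts below are illustrative and merely chosen to land near the 91% figure reported for pulse oximetry, not the study's raw data:

```python
# Cohen's kappa for agreement between a pulp test and the gold standard
# (vital vs. necrotic). Counts are illustrative, not the study's data.

def cohens_kappa(a, b, c, d):
    """a: both positive, b: test+/gold-, c: test-/gold+, d: both negative."""
    n = a + b + c + d
    p_observed = (a + d) / n
    p_expected = ((a + b) * (a + c) + (c + d) * (b + d)) / n ** 2
    return (p_observed - p_expected) / (1 - p_expected)

print(round(cohens_kappa(a=14, b=0, c=1, d=9), 2))
```

Kappa of 0 means chance-level agreement and 1 means perfect agreement, which is why the 18% and 14% values for the sensitivity tests indicate near-chance performance.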
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reyhan, M; Yue, N
Purpose: To validate an automated image processing algorithm designed to detect the center of radiochromic film used for in vivo film dosimetry against the current gold standard of manual selection. Methods: An image processing algorithm was developed to automatically select the region of interest (ROI) in *.tiff images that contain multiple pieces of radiochromic film (0.5 × 1.3 cm²). After a user has linked a calibration file to the processing algorithm and selected a *.tiff file for processing, an ROI is automatically detected for all films by a combination of thresholding and erosion, which removes edges and any additional markings for orientation. Calibration is applied to the mean pixel values from the ROIs, and a *.tiff image is output displaying the original image with an overlay of the ROIs and the measured doses. Validation of the algorithm was determined by comparing in vivo dose determined using the current gold standard (manually drawn ROIs) versus automated ROIs for n = 420 scanned films. Bland-Altman analysis, a paired t-test, and linear regression were performed to demonstrate agreement between the processes. Results: The measured doses ranged from 0.2-886.6 cGy. Bland-Altman analysis of the two techniques (automatic minus manual) revealed a bias of -0.28 cGy and a 95% confidence interval of (-6.1 cGy, 5.5 cGy). These values demonstrate excellent agreement between the two techniques. Paired t-test results showed no statistical differences between the two techniques (p = 0.98). Linear regression with a forced zero intercept demonstrated that Automatic = 0.997 × Manual, with a Pearson correlation coefficient of 0.999. The minimal differences between the two techniques may be explained by the fact that the hand-drawn ROIs were not identical to the automatically selected ones. The average processing time was 6.7 seconds in Matlab on an Intel Core 2 Duo processor.
Conclusion: An automated image processing algorithm has been developed and validated, which will help minimize user interaction and processing time of radiochromic film used for in vivo dosimetry.
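The Bland-Altman statistics reported above reduce to the mean of the paired differences (bias) and limits of agreement at bias ± 1.96 × SD of the differences. A minimal sketch with synthetic stand-ins for the film doses (not the study's n = 420 measurements):

```python
# Bland-Altman agreement between automatic and manual film doses:
# bias = mean(auto - manual); 95% limits = bias +/- 1.96 * SD of differences.
# The dose pairs below are synthetic stand-ins, not the study's data.

def bland_altman(auto, manual):
    diffs = [a - m for a, m in zip(auto, manual)]
    n = len(diffs)
    bias = sum(diffs) / n
    sd = (sum((d - bias) ** 2 for d in diffs) / (n - 1)) ** 0.5
    return bias, bias + 1.96 * sd, bias - 1.96 * sd

auto = [100.2, 250.1, 399.7, 549.8, 700.4]    # automatic ROI doses (cGy)
manual = [100.0, 250.5, 400.0, 550.0, 700.0]  # manual ROI doses (cGy)
bias, upper, lower = bland_altman(auto, manual)
print(f"bias={bias:.2f} LoA=({lower:.2f}, {upper:.2f})")
```

Because the limits scale with the spread of the differences, a near-zero bias with narrow limits (as in the study) is the signature of interchangeable techniques.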
Information Gain Based Dimensionality Selection for Classifying Text Documents
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dumidu Wijayasekara; Milos Manic; Miles McQueen
2013-06-01
Selecting the optimal dimensions for various knowledge extraction applications is an essential component of data mining. Dimensionality selection techniques are utilized in classification applications to increase the classification accuracy and reduce the computational complexity. In text classification, where the dimensionality of the dataset is extremely high, dimensionality selection is even more important. This paper presents a novel, genetic algorithm based methodology for dimensionality selection in text mining applications that utilizes information gain. The presented methodology uses the information gain of each dimension to change the mutation probability of chromosomes dynamically. Since the information gain is calculated a priori, the computational complexity is not affected. The presented method was tested on a specific text classification problem and compared with conventional genetic algorithm based dimensionality selection. The results show an improvement of 3% in the true positives and 1.6% in the true negatives over conventional dimensionality selection methods.
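The two ingredients above are the entropy-based information gain of each dimension and a mutation probability that depends on it. A minimal sketch under an assumed scaling rule (the paper's exact rule is not given here): dimensions with low gain are made more likely to mutate.

```python
# Information gain per dimension, and a gain-biased GA mutation schedule.
# The linear scaling rule in mutation_probs is an assumption for
# illustration; the paper's exact dynamic rule may differ.

from math import log2

def entropy(pos, neg):
    """Shannon entropy of a binary class distribution."""
    total = pos + neg
    h = 0.0
    for count in (pos, neg):
        if count:
            p = count / total
            h -= p * log2(p)
    return h

def information_gain(pos, neg, splits):
    """splits: (pos, neg) counts after partitioning on the dimension."""
    total = pos + neg
    remainder = sum((p + n) / total * entropy(p, n) for p, n in splits)
    return entropy(pos, neg) - remainder

def mutation_probs(gains, base=0.05):
    """Assumed rule: low-gain dimensions get mutated (toggled) more."""
    top = max(gains)
    return [base * (1.0 - g / top) for g in gains]

# A perfectly separating term vs. an uninformative one (10 pos / 10 neg docs).
g_perfect = information_gain(10, 10, [(10, 0), (0, 10)])
g_useless = information_gain(10, 10, [(5, 5), (5, 5)])
print(g_perfect, g_useless, mutation_probs([g_perfect, g_useless]))
```

Since the gains are computed once from the training counts, biasing mutation this way adds no per-generation cost, consistent with the complexity claim in the abstract.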
Design and Evaluation of Perceptual-based Object Group Selection Techniques
NASA Astrophysics Data System (ADS)
Dehmeshki, Hoda
Selecting groups of objects is a frequent task in graphical user interfaces. It is required prior to many standard operations such as deletion, movement, or modification. Conventional selection techniques are lasso, rectangle selection, and the selection and de-selection of items through the use of modifier keys. These techniques may become time-consuming and error-prone when target objects are densely distributed or when the distances between target objects are large. Perceptual-based selection techniques can considerably improve selection tasks when targets have a perceptual structure, for example when arranged along a line. Current methods to detect such groups use ad hoc grouping algorithms that are not based on results from perception science. Moreover, these techniques do not allow selecting groups with arbitrary arrangements or permit modifying a selection. This dissertation presents two domain-independent perceptual-based systems that address these issues. Based on established group detection models from perception research, the proposed systems detect perceptual groups formed by the Gestalt principles of good continuation and proximity. The new systems provide gesture-based or click-based interaction techniques for selecting groups with curvilinear or arbitrary structures as well as clusters. Moreover, the gesture-based system is adapted for the graph domain to facilitate path selection. This dissertation includes several user studies that show the proposed systems outperform conventional selection techniques when targets form salient perceptual groups and are still competitive when targets are semi-structured.
Power laws and extreme values in antibody repertoires
NASA Astrophysics Data System (ADS)
Boyer, Sebastien; Biswas, Dipanwita; Scaramozzino, Natale; Kumar, Ananda Soshee; Nizak, Clément; Rivoire, Olivier
2015-03-01
Evolution by natural selection involves the succession of three steps: mutations, selection and proliferation. We are interested in describing and characterizing the result of selection over a population of many variants. After selection, this population will be dominated by the few best variants, with the highest propensity to be selected, or highest "selectivity." We ask the following question: how is the selectivity of the best variants distributed in the population? Extreme value theory, which characterizes the extreme tail of probability distributions in terms of a few universality classes, has been proposed to describe it. To test this proposition and identify the relevant universality class, we performed quantitative in vitro experimental selections of libraries of >10^5 antibodies using the technique of phage display. Data obtained by high-throughput sequencing allow us to fit the selectivity distribution over more than two decades. In most experiments, the results show a striking power law for the selectivity distribution of the top antibodies, consistent with extreme value theory.
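A standard way to quantify such a power-law tail from the top-ranked values is the Hill estimator, which estimates the tail exponent from the k largest order statistics. A sketch on synthetic Pareto-distributed draws (exponent 2), standing in for the antibody selectivity data, which is not reproduced here:

```python
# Hill estimator of a power-law tail exponent from the k largest values.
# Samples are synthetic Pareto(alpha=2) draws, not the antibody data.

from math import log
import random

def hill_estimator(values, k):
    """Tail-exponent estimate from the k largest of `values`."""
    top = sorted(values, reverse=True)[: k + 1]
    threshold = top[-1]  # the (k+1)-th largest value
    return k / sum(log(x / threshold) for x in top[:-1])

random.seed(0)
samples = [random.paretovariate(2.0) for _ in range(100000)]
print(hill_estimator(samples, k=1000))  # close to the true exponent 2
```

The choice of k trades bias against variance; in practice (as with the two-decade fits mentioned above) the estimate is checked for stability across a range of k.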
Doherty, Carolynne M; Forbes, Raeburn B
2014-01-01
Diagnostic Lumbar Puncture is one of the most commonly performed invasive tests in clinical medicine. Evaluation of an acute headache and investigation of inflammatory or infectious disease of the nervous system are the most common indications. Serious complications are rare, and correct technique will minimise diagnostic error and maximise patient comfort. We review the technique of diagnostic Lumbar Puncture including anatomy, needle selection, needle insertion, measurement of opening pressure, Cerebrospinal Fluid (CSF) specimen handling and after care. We also make some quality improvement suggestions for those designing services incorporating diagnostic Lumbar Puncture. PMID:25075138
The use of an image registration technique in the urban growth monitoring
NASA Technical Reports Server (NTRS)
Parada, N. D. J. (Principal Investigator); Foresti, C.; Deoliveira, M. D. L. N.; Niero, M.; Parreira, E. M. D. M. F.
1984-01-01
The use of an image registration program in studies of urban growth is described. This program permits quick identification of growing areas by overlapping the same scene from different periods and applying adequate filters. The city of Brasilia, Brazil, was selected as the test area. The dynamics of Brasilia's urban growth were analyzed by overlapping scenes dated June 1973, 1978, and 1983. The results demonstrated the utility of the image registration technique for monitoring dynamic urban growth.
Effect of γ-irradiation on the optical and electrical properties of fiber reinforced composites
NASA Astrophysics Data System (ADS)
Anwar, Ahmad; Elfiky, Dalia; Ramadan, Ahmed M.; Hassan, G. M.
2017-05-01
The effect of gamma irradiation on the optical and electrical properties of reinforced-fiber polymer-based materials has become an important issue. Fiberglass/epoxy and Kevlar fiber/epoxy samples, manufactured by the hand lay-up technique without autoclave curing, were selected for investigation. The selected technique is simple and low cost, though rarely used in space materials production. The electrical conductivity and dielectric constant of these samples were measured as the gamma radiation dose increased. Moreover, the absorptivity, band gap, and color change were determined. Fourier transform infrared (FTIR) spectroscopy was performed on each of the materials' constituents to evaluate the changes in the investigated materials due to the radiation exposure dose. In this study, the electrical properties of both investigated materials showed only slight variation with increasing gamma dose, remaining within the insulator range; the tested samples showed stable insulating behavior throughout the test period. The optical properties of both composite specimens showed maximum absorptivity at a gamma dose of 750 kGy. These materials are suitable as structural and thermal-control materials for orbital lifetimes of less than 7 years. In addition, the transparency of the epoxy matrix was degraded; however, there was no color change for either the Kevlar fiber or the fiberglass.
Application of additive laser technologies in the gas turbine blades design process
NASA Astrophysics Data System (ADS)
Shevchenko, I. V.; Rogalev, A. N.; Osipov, S. K.; Bychkov, N. M.; Komarov, I. I.
2017-11-01
The emergence of modern innovative technologies requires the development of new design and production processes and the modernization of existing ones. This is especially relevant for the design of high-temperature turbines for gas turbine engines, whose development is characterized by a transition to higher working-medium parameters in order to improve efficiency. This article presents a design technique for gas turbine blades based on predictive verification of the thermal and hydraulic models of their cooling systems through testing of a blade prototype fabricated by selective laser melting. The technique was proven during development of the cooling system for the first-stage blade of a high-pressure turbine. An experimental procedure was developed for verifying the thermal model of blades with convective cooling systems, based on comparing the heat-flux density obtained from numerical simulation with the results of tests in a liquid-metal thermostat. The technique makes it possible to obtain an experimentally validated blade version and to eliminate experimental adjustment after the start of mass production.
Mass balance for on-line alphakLa estimation in activated sludge oxidation ditch.
Chatellier, P; Audic, J M
2001-01-01
The capacity of an aeration system to transfer oxygen to a given activated sludge oxidation ditch is characterised by the alphakLa parameter. This parameter is difficult to measure under normal plant working conditions; the measurement usually involves off-gas techniques or a static mass balance. Therefore an on-line technique has been developed and tested to evaluate alphakLa. This technique deduces alphakLa from analysis of low-cost sensor data: two flow meters and one oxygen probe. It involves a dynamic mass balance applied to aeration cycles selected according to given criteria. The technique was applied to one wastewater treatment plant for four years, during which significant variations in the alphakLa values were detected as the number of blowers changed, and to another plant for two months.
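A simplified stand-in for the dynamic mass balance idea is the classic re-aeration method: during an aeration cycle with negligible oxygen uptake, dissolved oxygen follows dC/dt = αkLa·(Cs − C), so ln(Cs − C) is linear in time with slope −αkLa. The sketch below assumes the saturation concentration Cs is known and uses synthetic probe data, not the paper's algorithm or criteria:

```python
import math

def estimate_alpha_kla(times, do_conc, c_sat):
    """Estimate alpha*kLa by ordinary least squares on ln(Cs - C) vs t:
    the model dC/dt = alpha*kLa*(Cs - C) makes that line's slope -alpha*kLa."""
    y = [math.log(c_sat - c) for c in do_conc]
    n = len(times)
    t_mean = sum(times) / n
    y_mean = sum(y) / n
    slope = (sum((t - t_mean) * (v - y_mean) for t, v in zip(times, y))
             / sum((t - t_mean) ** 2 for t in times))
    return -slope

# Synthetic aeration cycle: alpha*kLa = 0.12 min^-1, Cs = 9 mg/L, C0 = 2 mg/L.
c_sat, akla_true, c0 = 9.0, 0.12, 2.0
ts = [float(t) for t in range(0, 31)]                     # minutes
do = [c_sat - (c_sat - c0) * math.exp(-akla_true * t) for t in ts]
akla_est = estimate_alpha_kla(ts, do, c_sat)
```

In practice the oxygen uptake rate of the sludge would enter the balance as an extra sink term, which is part of what makes the on-line estimation non-trivial.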
Zweerink, Alwin; Allaart, Cornelis P; Kuijer, Joost P A; Wu, LiNa; Beek, Aernout M; van de Ven, Peter M; Meine, Mathias; Croisille, Pierre; Clarysse, Patrick; van Rossum, Albert C; Nijveldt, Robin
2017-12-01
Although myocardial strain analysis is a potential tool to improve patient selection for cardiac resynchronization therapy (CRT), there is currently no validated clinical approach to derive segmental strains. We evaluated the novel segment length in cine (SLICE) technique to derive segmental strains from standard cardiovascular MR (CMR) cine images in CRT candidates. Twenty-seven patients with left bundle branch block underwent CMR examination including cine imaging and myocardial tagging (CMR-TAG). SLICE was performed by measuring segment length between anatomical landmarks throughout all phases on short-axis cines. This measure of frame-to-frame segment length change was compared to CMR-TAG circumferential strain measurements. Subsequently, conventional markers of CRT response were calculated. Segmental strains showed good to excellent agreement between SLICE and CMR-TAG (septum strain, intraclass correlation coefficient (ICC) 0.76; lateral wall strain, ICC 0.66). Conventional markers of CRT response also showed close agreement between both methods (ICC 0.61-0.78). Reproducibility of SLICE was excellent for intra-observer testing (all ICC ≥0.76) and good for interobserver testing (all ICC ≥0.61). The novel SLICE post-processing technique on standard CMR cine images offers both accurate and robust segmental strain measures compared to the 'gold standard' CMR-TAG technique, and has the advantage of being widely available. • Myocardial strain analysis could potentially improve patient selection for CRT. • Currently a well validated clinical approach to derive segmental strains is lacking. • The novel SLICE technique derives segmental strains from standard CMR cine images. • SLICE-derived strain markers of CRT response showed close agreement with CMR-TAG. • Future studies will focus on the prognostic value of SLICE in CRT candidates.
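The core SLICE measurement, strain as fractional segment-length change relative to a reference frame, reduces to a one-line computation. This is a sketch of the general Lagrangian strain definition applied to per-frame segment lengths, not the authors' full post-processing chain:

```python
def segment_strain(lengths_mm, ref_index=0):
    """Lagrangian strain (%) of a myocardial segment per cine frame:
    100 * (L_t - L_ref) / L_ref, with L_ref conventionally end-diastole."""
    l_ref = lengths_mm[ref_index]
    return [100.0 * (l - l_ref) / l_ref for l in lengths_mm]

# Hypothetical segment shortening from 40 mm at end-diastole to 32 mm at
# end-systole, then re-lengthening: peak strain is -20%.
strains = segment_strain([40.0, 37.0, 34.0, 32.0, 36.0, 40.0])
```

Negative circumferential strain indicates shortening; comparing these per-segment curves between septum and lateral wall is what yields the dyssynchrony markers discussed above.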
Investigation of electroforming techniques. [fabrication of regeneratively cooled thrust chambers
NASA Technical Reports Server (NTRS)
Malone, G. A.
1975-01-01
Copper and nickel electroforming was examined for the purpose of establishing the necessary processes and procedures for repeatable, successful fabrication of the outer structures of regeneratively cooled thrust chambers. The selection of electrolytes for copper and nickel deposition is described. The development studies performed to refine and complete the processes necessary for successful chamber shell fabrication, and the testing employed to verify the applicability of the processes and procedures to small-scale hardware, are described. Specifications were developed to provide a guideline for the electroforming of high-quality outer shells on regeneratively cooled thrust chamber liners. Test results indicated repeatable mechanical properties could be produced in copper deposits from the copper sulfate electrolyte with periodic current reversal and in nickel deposits from the sulfamate solution. The use of inert, removable channel fillers, and techniques for rendering them conductive, are described. Techniques (verified by test) which produce high-integrity bonds to copper and copper alloy liners are discussed.
NASA Technical Reports Server (NTRS)
Baumgardner, M. F. (Principal Investigator)
1974-01-01
The author has identified the following significant results. Multispectral scanner data obtained by ERTS-1 over six test sites in the Central United States were analyzed and interpreted. ERTS-1 data for some of the test sites were geometrically corrected and temporally overlaid. Computer-implemented pattern recognition techniques were used in the analysis of all multispectral data. These techniques were used to evaluate ERTS-1 data as a tool for soil survey. Geology maps and land use inventories were prepared by digital analysis of multispectral data. Identification and mapping of crop species and rangelands were achieved through the analysis of 1972 and 1973 ERTS-1 data. Multiple dates of ERTS-1 data were examined to determine the variation with time of the areal extent of surface water resources on the Southern Great Plains.
Selected Performance Measurements of the F-15 Active Axisymmetric Thrust-vectoring Nozzle
NASA Technical Reports Server (NTRS)
Orme, John S.; Sims, Robert L.
1998-01-01
Flight tests recently completed at the NASA Dryden Flight Research Center evaluated performance of a hydromechanically vectored axisymmetric nozzle onboard the F-15 ACTIVE. A flight-test technique whereby strain gages installed onto engine mounts provided for the direct measurement of thrust and vector forces has proven to be extremely valuable. Flow turning and thrust efficiency, as well as nozzle static pressure distributions were measured and analyzed. This report presents results from testing at an altitude of 30,000 ft and a speed of Mach 0.9. Flow turning and thrust efficiency were found to be significantly different than predicted, and moreover, varied substantially with power setting and pitch vector angle. Results of an in-flight comparison of the direct thrust measurement technique and an engine simulation fell within the expected uncertainty bands. Overall nozzle performance at this flight condition demonstrated the F100-PW-229 thrust-vectoring nozzles to be highly capable and efficient.
Selected Performance Measurements of the F-15 ACTIVE Axisymmetric Thrust-Vectoring Nozzle
NASA Technical Reports Server (NTRS)
Orme, John S.; Sims, Robert L.
1999-01-01
Flight tests recently completed at the NASA Dryden Flight Research Center evaluated performance of a hydromechanically vectored axisymmetric nozzle onboard the F-15 ACTIVE. A flight-test technique whereby strain gages installed onto engine mounts provided for the direct measurement of thrust and vector forces has proven to be extremely valuable. Flow turning and thrust efficiency, as well as nozzle static pressure distributions were measured and analyzed. This report presents results from testing at an altitude of 30,000 ft and a speed of Mach 0.9. Flow turning and thrust efficiency were found to be significantly different than predicted, and moreover, varied substantially with power setting and pitch vector angle. Results of an in-flight comparison of the direct thrust measurement technique and an engine simulation fell within the expected uncertainty bands. Overall nozzle performance at this flight condition demonstrated the F100-PW-229 thrust-vectoring nozzles to be highly capable and efficient.
Snowpack ground truth: Radar test site, Steamboat Springs, Colorado, 8-16 April 1976
NASA Technical Reports Server (NTRS)
Howell, S.; Jones, E. B.; Leaf, C. F.
1976-01-01
Ground-truth data taken at Steamboat Springs, Colorado are presented. Data taken during the period April 8-16, 1976 included the following: (1) snow depths and densities at selected locations (using a Mount Rose snow tube); (2) snow pits for temperature, density, and liquid water determinations using the freezing calorimetry technique and vertical layer classification; (3) snow walls of various cross sections, constructed and documented with respect to sizes and snow characteristics; (4) soil moisture at selected locations; and (5) appropriate air temperature and weather data.
Woven TPS Mechanical Property Evaluation
NASA Technical Reports Server (NTRS)
Gonzales, Gregory Lewis; Kao, David Jan-Woei; Stackpoole, Margaret M.
2013-01-01
Woven Thermal Protection Systems (WTPS) is a relatively new program funded by the Office of the Chief Technologist (OCT). The WTPS approach to producing TPS architectures uses precisely engineered 3-D weaving techniques that allow tailoring the material characteristics needed to meet specific mission requirements. A series of mechanical tests were performed to evaluate the performance of different weave types and to gain a better understanding of the failure modes expected in these three-dimensional architectures. These properties will aid in material down-selection and guide selection of the appropriate WTPS for a potential mission.
NASA/ESA CV-990 spacelab simulation
NASA Technical Reports Server (NTRS)
Reller, J. O., Jr.
1976-01-01
Simplified techniques were applied to conduct an extensive spacelab simulation using the airborne laboratory. The scientific payload was selected to perform studies in upper atmospheric physics and infrared astronomy. The mission was successful and provided extensive data relevant to spacelab objectives on overall management of a complex international payload; experiment preparation, testing, and integration; training for proxy operation in space; data handling; multiexperimenter use of common experimenter facilities (telescopes); multiexperiment operation by experiment operators; selection criteria for spacelab experiment operators; and schedule requirements to prepare for such a spacelab mission.
NASA Astrophysics Data System (ADS)
Khai Tiu, Ervin Shan; Huang, Yuk Feng; Ling, Lloyd
2018-03-01
An accurate streamflow forecasting model is important for the development of a flood mitigation plan to ensure sustainable development of a river basin. This study adopted the Variational Mode Decomposition (VMD) data-preprocessing technique to process and denoise rainfall data before feeding it into a Support Vector Machine (SVM) streamflow forecasting model, in order to improve the model's performance. Rainfall data and river water level data for the period 1996-2016 were used. Homogeneity tests (the Standard Normal Homogeneity Test, the Buishand Range Test, the Pettitt Test and the Von Neumann Ratio Test) and normality tests (the Shapiro-Wilk Test, Anderson-Darling Test, Lilliefors Test and Jarque-Bera Test) were carried out on the rainfall series. All stations yielded homogeneous and non-normally distributed data. From the recorded rainfall data, it was observed that the Dungun River Basin receives higher monthly rainfall from November to February, during the Northeast Monsoon. Thus, the monthly and seasonal rainfall series of this monsoon were the main focus of this research, as floods usually happen during the Northeast Monsoon period. The water levels predicted by the SVM model were assessed against the observed water levels using non-parametric statistical tests (the Biased Method, Kendall's Tau B Test and Spearman's Rho Test).
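Of the normality tests listed, the Jarque-Bera statistic is simple enough to sketch directly from its textbook definition (an illustrative implementation on toy data, not the software used in the study):

```python
def jarque_bera(x):
    """Jarque-Bera statistic JB = n/6 * (S^2 + (K - 3)^2 / 4), where S is
    the sample skewness and K the sample kurtosis. Under normality JB is
    asymptotically chi-square with 2 df, so large values reject normality."""
    n = len(x)
    mean = sum(x) / n
    m2 = sum((v - mean) ** 2 for v in x) / n
    m3 = sum((v - mean) ** 3 for v in x) / n
    m4 = sum((v - mean) ** 4 for v in x) / n
    skew = m3 / m2 ** 1.5
    kurt = m4 / m2 ** 2
    return n / 6.0 * (skew ** 2 + (kurt - 3.0) ** 2 / 4.0)

# A symmetric toy sample has zero skewness; its flat shape (kurtosis < 3)
# still contributes to the statistic through the (K - 3)^2 term.
jb = jarque_bera([1.0, 2.0, 3.0, 4.0, 5.0])
```

For this five-point sample the skewness term vanishes and JB ≈ 0.352, far below the chi-square(2) critical value, so normality would not be rejected; on real rainfall series the other three tests listed above would be run alongside it.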
NASA Technical Reports Server (NTRS)
Roman, Juan A.; Stitt, George F.; Roman, Felix R.
1997-01-01
This paper will provide a general overview of the molecular contamination philosophy of the Space Simulation Test Engineering Section and how the National Aeronautics and Space Administration (NASA) Goddard Space Flight Center (GSFC) space simulation laboratory controls and maintains the cleanliness of all its facilities, thereby minimizing down time between tests. It will also briefly cover the proper selection of, and safety precautions needed when using, chemical solvents for wiping, washing, or spraying thermal shrouds when molecular contaminants increase to unacceptable background levels.
NASA Advances Technologies for Additive Manufacturing of GRCop-84 Copper Alloy
NASA Technical Reports Server (NTRS)
Gradl, Paul; Protz, Chris
2017-01-01
The Low Cost Upper Stage Propulsion project has successfully developed and matured Selective Laser Melting (SLM) fabrication of the NASA-developed GRCop-84 copper alloy. Several parts have been printed in house and at a commercial vendor; these parts have been successfully machined and have undergone further fabrication steps to allow hot-fire testing. Hot-fire testing has demonstrated that parts manufactured with this technique can survive and perform well in the relevant environments for liquid rocket propulsion systems.
Develop real-time dosimetry concepts and instrumentation for long term missions
NASA Technical Reports Server (NTRS)
Braby, L. A.
1982-01-01
The development of a rugged portable instrument to evaluate dose and dose equivalent is described. A tissue-equivalent proportional counter simulating a 2 micrometer spherical tissue volume was operated satisfactorily for over a year. The basic elements of the electronic system were designed and tested. Finally, the most suitable mathematical technique for evaluating dose equivalent with a portable instrument was selected. Design and fabrication of a portable prototype, based on the previously tested circuits, is underway.
Electronics reliability fracture mechanics. Volume 2: Fracture mechanics
NASA Astrophysics Data System (ADS)
Kallis, J.; Duncan, L.; Buechler, D.; Backes, P.; Sandkulla, D.
1992-05-01
This is the second of two volumes. The other volume (WL-TR-92-3015) is 'Causes of Failures of Shop Replaceable Units and Hybrid Microcircuits.' The objective of the Electronics Reliability Fracture Mechanics (ERFM) program was to develop and demonstrate a life prediction technique for electronic assemblies, when subjected to environmental stresses of vibration and thermal cycling, based upon the mechanical properties of the materials and packaging configurations which make up an electronic system. The application of fracture mechanics to microscale phenomena in electronic assemblies was a pioneering research effort. The small scale made the experiments very difficult; for example, the 1-mil-diameter bond wires in microelectronic devices are 1/3 the diameter of a human hair. A number of issues had to be resolved to determine whether a fracture mechanics modelling approach is correct for the selected failures; specifically, the following two issues had to be resolved: What fraction of the lifetime is spent in crack initiation? Are macro fracture mechanics techniques, used in large structures such as bridges, applicable to the tiny structures in electronic equipment? The following structural failure mechanisms were selected for modelling: bondwire fracture from mechanical cycling; bondwire fracture from thermal (power) cycling; plated through hole (PTH) fracture from thermal cycling. The bondwire fracture test specimens were Al-1 percent Si wires, representative of wires used in the parts in the modules selected for detailed investigation in this program (see Vol. 1 of this report); 1-mil-diameter wires were tested in this program. The PTH test specimens were sections of 14-layer printed wiring boards of the type used.
Edward, Joseph; Aziz, Mubarak A; Madhu Usha, Arjun; Narayanan, Jyothi K
2017-12-01
Extractions are routine procedures in dental surgery. Traditional extraction techniques use a combination of severing the periodontal attachment, luxation with an elevator, and removal with forceps. This study introduces a new technique for extraction of the maxillary third molar, the Joedds technique, and compares it with the conventional technique. One hundred patients were included in the study and were divided into two groups by simple random sampling. In one group the conventional technique of maxillary third molar extraction was used; in the second, the Joedds technique was used. Statistical analysis was carried out with Student's t test. Analysis of the 100 patients showed that the novel Joedds technique caused minimal trauma to surrounding tissues and fewer tuberosity and root fractures, and the time taken for extraction was <2 min compared to the other group. This novel technique proved better than the conventional third molar extraction technique, with minimal complications, provided cases are properly selected and the correct technique is used.
Posada, David; Buckley, Thomas R
2004-10-01
Model selection is a topic of special relevance in molecular phylogenetics that affects many, if not all, stages of phylogenetic inference. Here we discuss some fundamental concepts and techniques of model selection in the context of phylogenetics. We start by reviewing different aspects of the selection of substitution models in phylogenetics from a theoretical, philosophical and practical point of view, and summarize this comparison in table format. We argue that the most commonly implemented model selection approach, the hierarchical likelihood ratio test, is not the optimal strategy for model selection in phylogenetics, and that approaches like the Akaike Information Criterion (AIC) and Bayesian methods offer important advantages. In particular, the latter two methods are able to simultaneously compare multiple nested or nonnested models, assess model selection uncertainty, and allow for the estimation of phylogenies and model parameters using all available models (model-averaged inference or multimodel inference). We also describe how the relative importance of the different parameters included in substitution models can be depicted. To illustrate some of these points, we have applied AIC-based model averaging to 37 mitochondrial DNA sequences from the subgenus Ohomopterus (genus Carabus) ground beetles described by Sota and Vogler (2001).
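The AIC-based model averaging advocated above rests on Akaike weights; the following is a minimal sketch from the standard definitions, with made-up log-likelihoods and parameter counts rather than values from the beetle dataset:

```python
import math

def akaike_weights(log_likelihoods, n_params):
    """Compute AIC_i = 2*k_i - 2*lnL_i for each candidate model, then the
    Akaike weights w_i = exp(-delta_i/2) / sum_j exp(-delta_j/2), where
    delta_i = AIC_i - min(AIC). The weights sum to 1 and gauge the relative
    support for each model, enabling model-averaged inference."""
    aic = [2 * k - 2 * lnl for lnl, k in zip(log_likelihoods, n_params)]
    best = min(aic)
    raw = [math.exp(-(a - best) / 2.0) for a in aic]
    total = sum(raw)
    return [r / total for r in raw]

# Three hypothetical substitution models with increasing fit but also
# increasing numbers of free parameters (values are illustrative only).
weights = akaike_weights([-2500.0, -2480.0, -2478.5], [1, 5, 9])
```

Here the middle model wins most of the weight: its two extra log-likelihood units over the simplest model outweigh its parameter penalty, while the richest model's small further gain does not justify four more parameters.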
Selected topics in experimental aeroelasticity at the NASA Langley Research Center
NASA Technical Reports Server (NTRS)
Ricketts, R. H.
1985-01-01
The results of selected studies that have been conducted by the NASA Langley Research Center in the last three years are presented. The topics presented focus primarily on the ever-important transonic flight regime and include the following: body-freedom flutter of a forward-swept-wing configuration with and without relaxed static stability; instabilities associated with a new tilt-rotor vehicle; effects of winglets, supercritical airfoils, and spanwise curvature on wing flutter; wind-tunnel investigation of a flutter-like oscillation on a high-aspect-ratio flight research wing; results of wind-tunnel demonstration of the NASA decoupler pylon concept for passive suppression of wing/store flutter; and, new flutter testing methods which include testing at cryogenic temperatures for full scale Reynolds number simulation, subcritical response techniques for predicting onset of flutter, and a two-degree-of-freedom mount system for testing side-wall-mounted models.
Selected topics in experimental aeroelasticity at the NASA Langley Research Center
NASA Technical Reports Server (NTRS)
Ricketts, R. H.
1985-01-01
The results of selected studies that have been conducted by the NASA Langley Research Center in the last three years are presented. The topics presented focus primarily on the ever-important transonic flight regime and include the following: body-freedom flutter of a forward-swept-wing configuration with and without relaxed static stability; instabilities associated with a new tilt-rotor vehicle; effects of winglets, supercritical airfoils, and spanwise curvature on wing flutter; wind-tunnel investigation of a flutter-like oscillation on a high-aspect-ratio flight research wing; results of wind-tunnel demonstration of the NASA decoupler pylon concept for passive suppression of wing/store flutter; and, new flutter testing methods which include testing at cryogenic temperatures for full scale Reynolds number simulation, subcritical response techniques for predicting onset of flutter, and a two-degree-of-freedom mount system for testing side-wall-mounted models.
Development of dry coal feeders
NASA Technical Reports Server (NTRS)
Bonin, J. H.; Cantey, D. E.; Daniel, A. D., Jr.; Meyer, J. W.
1977-01-01
Design and fabrication of equipment to feed coal into pressurized environments were investigated. Concepts were selected based on feeder-system performance and economic projections. These systems include two approaches using rotating components, a gas- or steam-driven ejector, and a modified standpipe feeder concept. Results of development testing of critical components, design procedures, and performance prediction techniques are reviewed.
Thermoplastics for aircraft interiors
NASA Technical Reports Server (NTRS)
Silverman, B.
1978-01-01
The goal for this contract is the development of processes and techniques for molding thermally stable, fire retardant, low smoke emitting polymeric materials. Outlined in this presentation are: (1) the typical molding types; (2) a program schedule; (3) physical properties of molding types with the test methods to be used; (4) general properties of injection molding materials; and (5) preliminary materials selection.
Implementation of Structured Inquiry Based Model Learning toward Students' Understanding of Geometry
ERIC Educational Resources Information Center
Salim, Kalbin; Tiawa, Dayang Hjh
2015-01-01
The purpose of this study is the implementation of a structured inquiry learning model in the instruction of geometry. The study used a quasi-experimental design with two sample classes selected from a population of ten classes using the cluster random sampling technique. The data collection tool consists of a test item…
David N. Cole
2008-01-01
When crafting the U.S. Wilderness Act, Howard Zahniser selected the word untrammeled rather than undisturbed to describe wilderness (Harvey 2005). This reflected his belief that places that had been disturbed by humans should be considered for wilderness designation because impaired ecosystems could be restored. Like many others, he hoped that restoration could be...
ERIC Educational Resources Information Center
Egaga, Patrick I.; Aderibigbe, S. Akinwumi
2015-01-01
The study examined the efficacy of Information and Communication Technology (ICT) in enhancing learning outcomes of students with hearing impairment in Ibadan. The study adopted a pretest, post-test, control group quasi-experimental research design. A purposive sampling technique was used for the selection of thirty participants…
Apical extrusion of debris and irrigant using hand and rotary systems: A comparative study
Ghivari, Sheetal B; Kubasad, Girish C; Chandak, Manoj G; Akarte, NR
2011-01-01
Aim: To evaluate and compare the amount of debris and irrigant extruded quantitatively by using two hand and rotary nickel–titanium (Ni–Ti) instrumentation techniques. Materials and Methods: Eighty freshly extracted mandibular premolars having similar canal length and curvature were selected and mounted in a debris collection apparatus. After each instrument change, 1 ml of distilled water was used as an irrigant and the amount of irrigant extruded was measured using the Meyers and Montgomery method. After drying, the debris was weighed using an electronic microbalance to determine its weight. Statistical analysis used: The data were analyzed statistically to determine the mean difference between the groups. The mean weight of the dry debris and irrigant within each group and between the groups was calculated by one-way ANOVA and the multiple comparison (Dunnett D) test. Results: The step-back technique extruded a greater quantity of debris and irrigant in comparison to the other hand and rotary Ni–Ti systems. Conclusions: Since all instrumentation techniques extrude debris and irrigant, it is prudent for the clinician to select the instrumentation technique that extrudes the least amount of debris and irrigant, to prevent flare-up phenomena. PMID:21814364
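The between-group comparison above uses a one-way ANOVA; the F statistic it rests on can be sketched in a few lines (toy group values for illustration, not the study's debris weights, and the Dunnett post-hoc step is omitted):

```python
def one_way_anova_f(groups):
    """One-way ANOVA F statistic, F = MS_between / MS_within, for k groups
    of possibly unequal size. Large F means group means differ more than
    within-group scatter would explain."""
    all_vals = [v for g in groups for v in g]
    n, k = len(all_vals), len(groups)
    grand_mean = sum(all_vals) / n
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2
                     for g in groups)
    ss_within = sum(sum((v - sum(g) / len(g)) ** 2 for v in g)
                    for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Three toy groups with means 2, 3, 4 and identical spread: F = 3.0
# (SS_between = 6 over 2 df, SS_within = 6 over 6 df).
f_stat = one_way_anova_f([[1, 2, 3], [2, 3, 4], [3, 4, 5]])
```

The computed F is then compared against the F distribution with (k−1, n−k) degrees of freedom; only if it is significant does a multiple-comparison procedure such as Dunnett's test identify which technique differs.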
Deng, Jie; Larson, Andrew C.
2010-01-01
Objectives To test the feasibility of combining inner-volume imaging (IVI) techniques with conventional multishot periodically rotated overlapping parallel lines with enhanced reconstruction (PROPELLER) techniques for targeted-PROPELLER magnetic resonance imaging. Materials and Methods Perpendicular section-selective gradients for spatially selective excitation and refocusing RF pulses were applied to limit the refocused field-of-view (FOV) along the phase-encoding direction for each rectangular blade image. We performed comparison studies in phantoms and normal volunteers by using targeted-PROPELLER methods for a wide range of imaging applications that commonly use turbo-spin-echo (TSE) approaches (brain, abdominal, vessel wall, cardiac). Results In these initial studies, we demonstrated the feasibility of using targeted-PROPELLER approaches to limit the imaging FOV thereby reducing the number of blades or permitting increased spatial resolution without commensurate increases in scan time. Both phantom and in vivo motion studies demonstrated the potential for more robust regional self-navigated motion correction compared with conventional full FOV PROPELLER methods. Conclusion We demonstrated that the reduced FOV targeted-PROPELLER technique offers the potential for reducing imaging time, increasing spatial resolution, and targeting specific areas for robust regional motion correction. PMID:19465860
Spindle speed variation technique in turning operations: Modeling and real implementation
NASA Astrophysics Data System (ADS)
Urbikain, G.; Olvera, D.; de Lacalle, L. N. López; Elías-Zúñiga, A.
2016-11-01
Chatter remains one of the most challenging problems in machining vibrations. Researchers have focused their efforts on preventing, avoiding, or reducing chatter vibrations by introducing more accurate predictive physical methods. Among them, techniques based on varying the rotational speed of the spindle (SSV, Spindle Speed Variation) have gained great relevance. However, several problems need to be addressed for technical and practical reasons. On one hand, SSV can generate harmful overheating of the spindle, especially at high speeds; on the other hand, the machine may be unable to perform the interpolation properly. Moreover, it is not trivial to select the most appropriate tuning parameters. This paper studies the real implementation of the SSV technique in turning systems. First, a stability model based on perturbation theory was developed for simulation purposes. Second, a procedure to realistically implement the technique in a conventional turning center was developed and tested. The balance between improved stability margins and acceptable spindle behavior is ensured by energy consumption measurements. The mathematical model shows good agreement with experimental cutting tests.
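The most common SSV strategy in the literature is a sinusoidal modulation of the nominal speed, tuned by an amplitude ratio (RVA) and a frequency ratio (RVF); these are the kind of tuning parameters the abstract refers to. The sketch below is a generic parametrization of that waveform, not the exact profile or notation used in the paper:

```python
import math

def ssv_speed(t, n0, rva, rvf):
    """Sinusoidal spindle speed variation n(t) = n0 * (1 + RVA*sin(2*pi*f_v*t)),
    where RVA is the amplitude ratio (delta_n / n0) and the modulation
    frequency is f_v = RVF * n0 / 60 for a nominal speed n0 in rev/min."""
    f_v = rvf * n0 / 60.0
    return n0 * (1.0 + rva * math.sin(2.0 * math.pi * f_v * t))

# Nominal 2000 rpm with a 10% amplitude ratio: the commanded speed swings
# between 1800 and 2200 rpm, sampled every 1 ms over 2 s.
n0, rva, rvf = 2000.0, 0.10, 0.5
profile = [ssv_speed(t / 1000.0, n0, rva, rvf) for t in range(2001)]
```

Larger RVA widens the stable depth-of-cut lobes but also raises the thermal load on the spindle drive, which is exactly the trade-off the paper monitors through energy consumption measurements.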
Magenes, G; Bellazzi, R; Malovini, A; Signorini, M G
2016-08-01
The onset of fetal pathologies can be screened during pregnancy by means of Fetal Heart Rate (FHR) monitoring and analysis. Noticeable advances in understanding FHR variations were obtained in the last twenty years, thanks to the introduction of quantitative indices extracted from the FHR signal. This study seeks to discriminate between normal and Intra-Uterine Growth Restricted (IUGR) fetuses by applying data mining techniques to FHR parameters, obtained from recordings in a population of 122 fetuses (61 healthy and 61 IUGR), through the standard CTG non-stress test. We computed N=12 indices (N=4 related to time-domain FHR analysis, N=4 to frequency-domain and N=4 to non-linear analysis) and normalized them with respect to the gestational week. We compared, through a 10-fold cross-validation procedure, 15 data mining techniques in order to select the most reliable approach for identifying IUGR fetuses. The results of this comparison highlight that two techniques (Random Forest and Logistic Regression) show the best classification accuracy and that both outperform the best single parameter in terms of mean AUROC on the test sets.
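The 10-fold cross-validation protocol behind such a comparison can be sketched as a fold generator (a generic pure-Python version, not the study's code; the 15 classifiers themselves would come from a machine learning library):

```python
import random

def k_fold_indices(n_samples, k, seed=0):
    """Shuffle sample indices once, then yield (train, test) index lists
    so that every sample appears in exactly one test fold. Each candidate
    model is trained on `train` and scored (e.g. by AUROC) on `test`."""
    idx = list(range(n_samples))
    random.Random(seed).shuffle(idx)
    folds = [idx[i::k] for i in range(k)]
    for i in range(k):
        test = folds[i]
        train = [j for f in folds[:i] + folds[i + 1:] for j in f]
        yield train, test

# 122 fetal recordings, 10 folds: each recording is tested exactly once,
# and the per-fold test scores are averaged into a mean AUROC per model.
splits = list(k_fold_indices(122, 10))
```

Averaging the per-fold test AUROC over the 10 splits is what allows the 15 techniques to be ranked on the same held-out data.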
NASA Technical Reports Server (NTRS)
Vavra, M. H.; Hammer, J. E.; Bell, L. E.
1972-01-01
Experimental data are presented for the tangential and radial stresses in the disks of the 36,000-horsepower, 4000-rpm turbine for the M-1 engine oxidizer turbopump. The two-stage Curtis turbine is a special lightweight design utilizing thin conical disks with hollow sheet metal blades attached by electron-beam welding techniques. The turbine was fabricated from Inconel 718, a nickel-chromium alloy. The stresses were obtained by strain-gage measurements using a slip-ring assembly to transmit the electrical signals. Measurements were made at different rotational speeds and different thermal loads. In addition to presenting test data, the report describes test equipment, design of associated hardware, test procedures, instrumentation, and tests for the selection and calibration of strain gages.
NASA Astrophysics Data System (ADS)
Ben-Zikri, Yehuda Kfir; Linte, Cristian A.
2016-03-01
Region of interest detection is a precursor to many medical image processing and analysis applications, including segmentation, registration and other image manipulation techniques. The optimal region of interest is often selected manually, based on empirical knowledge and features of the image dataset. However, if inconsistently identified, the selected region of interest may greatly affect the subsequent image analysis or interpretation steps, in turn leading to incomplete assessment during computer-aided diagnosis, or to incomplete visualization or identification of the surgical targets if employed in the context of pre-procedural planning or image-guided interventions. Therefore, the need for robust, accurate and computationally efficient region of interest localization techniques is prevalent in many modern computer-assisted diagnosis and therapy applications. Here we propose a fully automated, robust, a priori learning-based approach that provides reliable estimates of the left and right ventricle features from cine cardiac MR images. The proposed approach leverages the temporal frame-to-frame motion extracted across a range of short-axis left ventricle slice images, with a small training set generated from less than 10% of the population. The approach uses histogram-of-oriented-gradients features, weighted by local intensities, to first identify an initial region of interest depicting the left and right ventricles that exhibits the greatest extent of cardiac motion. This region is then correlated, using feature-vector correlation techniques, with the homologous region of the training dataset that best matches the test image. Lastly, the optimal left ventricle region of interest of the test image is identified based on the correlation of known ground-truth segmentations associated with the training dataset deemed closest to the test image.
The proposed approach was tested on a population of 100 patient datasets and was validated against the ground-truth regions of interest of the test images manually annotated by experts. The tool successfully identified a mask around the LV and RV, and furthermore the minimal region of interest that fully enclosed the left ventricle, for all testing datasets, yielding a 98% overlap with the corresponding ground truth. The mean absolute distance error between the two contours, normalized by the radius of the ground truth, was 0.20 +/- 0.09.
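The feature-extraction and matching steps can be sketched as follows. This is a simplified, single-cell reading of "histogram of oriented gradients features weighted by local intensities" plus normalized-dot-product matching; the function names and the exact weighting scheme are assumptions, not the paper's implementation.

```python
import numpy as np

def weighted_hog(img, bins=8):
    """Orientation histogram of image gradients, with each pixel's vote
    weighted by gradient magnitude times local intensity (assumption)."""
    gy, gx = np.gradient(img.astype(float))
    mag = np.hypot(gx, gy)
    ang = np.mod(np.arctan2(gy, gx), np.pi)          # unsigned orientation
    hist, _ = np.histogram(ang, bins=bins, range=(0.0, np.pi),
                           weights=mag * img)
    return hist / (np.linalg.norm(hist) + 1e-12)     # L2-normalize

def best_training_match(test_vec, train_vecs):
    """Index of the training feature vector most similar to the test
    vector, by normalized dot product (a simple correlation surrogate)."""
    sims = [float(np.dot(test_vec, v) / (np.linalg.norm(v) + 1e-12))
            for v in train_vecs]
    return int(np.argmax(sims))
```

The matched training case's known ground-truth segmentation would then supply the final LV region of interest for the test image.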
Knick, Steven T.; Rotenberry, J.T.
1998-01-01
We tested the potential of a GIS mapping technique, using a resource selection model developed for black-tailed jackrabbits (Lepus californicus) and based on the Mahalanobis distance statistic, to track changes in shrubsteppe habitats in southwestern Idaho. If successful, the technique could be used to predict animal use areas, or those undergoing change, in different regions from the same selection function and variables without additional sampling. We determined the multivariate mean vector of 7 GIS variables that described habitats used by jackrabbits. We then ranked the similarity of all cells in the GIS coverage by their Mahalanobis distance to the mean habitat vector. The resulting map accurately depicted areas where we sighted jackrabbits on verification surveys. We then simulated an increase in shrublands (which are important habitats). Contrary to expectation, the new configurations were classified as having lower similarity relative to the original mean habitat vector. Because the selection function is based on a unimodal mean, any deviation, even if biologically positive, creates larger Mahalanobis distances and lower similarity values. We recommend the Mahalanobis distance technique for mapping animal use areas when animals are distributed optimally, the landscape is well sampled to determine the mean habitat vector, and the distributions of the habitat variables do not change.
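The core of this mapping technique — ranking every GIS cell by its Mahalanobis distance from the mean habitat vector of used sites — can be sketched in a few lines of NumPy. This is an illustrative reimplementation, not the authors' code:

```python
import numpy as np

def mahalanobis_similarity(cells, used_sites):
    """Mahalanobis distance of each GIS cell's habitat vector from the
    mean vector of sites used by the animal; smaller distance means
    higher habitat similarity.  cells: (n_cells, n_vars) array;
    used_sites: (n_used, n_vars) array of habitat variables."""
    mu = used_sites.mean(axis=0)
    cov = np.cov(used_sites, rowvar=False)
    cov_inv = np.linalg.inv(cov)
    diff = cells - mu
    return np.sqrt(np.einsum('ij,jk,ik->i', diff, cov_inv, diff))
```

The abstract's caveat is visible directly in the formula: the distance is measured from a single unimodal mean, so any shift in a cell's habitat vector, including a biologically favorable one, increases the distance and lowers the similarity score.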
Marchan, Shivaughn M.; Coldero, Larry; White, Daniel; Smith, William A. J.; Rafeek, Reisha N.
2009-01-01
Objective. This in vitro study uses measurements of fracture resistance to compare maxillary premolars restored with the bonded amalgam technique using a new resin luting cement, glass ionomer, and resin-modified glass ionomer as the bonding agents. Materials. Eighty-five sound maxillary premolars were selected and randomly assigned to one of five test groups of 17 teeth each. One group of intact teeth served as the control. The remaining groups were prepared to a standard cavity form relative to the dimensions of the overall tooth and restored with amalgam alone or a bonded amalgam using one of three luting agents: RelyX Arc (a new resin luting cement), RelyX luting (a resin-modified glass ionomer), or Ketac-Cem μ (a glass ionomer) as the bonding agents. Each tooth was then subjected to compressive testing until catastrophic failure occurred. The mean loads at failure of each group were statistically compared using ANOVA with a post hoc Bonferroni test. Results. It was found that regardless of the luting cement used for the amalgam bonding technique, there was little effect on the fracture resistance of teeth. Conclusion. Cusp fracture resistance of premolars prepared with conservative MOD cavity preparations is not improved by using an amalgam-bonding technique compared to similar cavities restored with amalgam alone. PMID:20339450
Method for Reducing Pumping Damage to Blood
NASA Technical Reports Server (NTRS)
Bozeman, Richard J., Jr. (Inventor); Akkerman, James W. (Inventor); Aber, Gregory S. (Inventor); VanDamm, George Arthur (Inventor); Bacak, James W. (Inventor); Svejkovsky, Robert J. (Inventor); Benkowski, Robert J. (Inventor)
1997-01-01
Methods are provided for minimizing damage to blood in a blood pump wherein the blood pump comprises a plurality of pump components that may affect blood damage such as clearance between pump blades and housing, number of impeller blades, rounded or flat blade edges, variations in entrance angles of blades, impeller length, and the like. The process comprises selecting a plurality of pump components believed to affect blood damage such as those listed herein before. Construction variations for each of the plurality of pump components are then selected. The pump components and variations are preferably listed in a matrix for easy visual comparison of test results. Blood is circulated through a pump configuration to test each variation of each pump component. After each test, total blood damage is determined for the blood pump. Preferably each pump component variation is tested at least three times to provide statistical results and check consistency of results. The least hemolytic variation for each pump component is preferably selected as an optimized component. If no statistical difference as to blood damage is produced for a variation of a pump component, then the variation that provides preferred hydrodynamic performance is selected. To compare the variation of pump components such as impeller and stator blade geometries, the preferred embodiment of the invention uses a stereolithography technique for realizing complex shapes within a short time period.
Bakutra, Gaurav; Shankarapillai, Rajesh; Mathur, Lalit; Manohar, Balaji
2017-01-01
Introduction: There are various treatment modalities to remove black patches of melanin pigmentation. The aim of this study is to clinically compare diode laser ablation and the surgical stripping technique for gingival depigmentation and to evaluate their effect on the histological changes in melanocyte activity. Materials and Methods: A total of 40 sites in 20 patients with bilateral melanin hyperpigmentation were treated with the surgical stripping and diode laser ablation techniques. Change in Hedin index score, change in area of pigmentation using image-analysis software, pain perception, and patient treatment preference were recorded. All 40 sites were selected for immunohistochemical analysis using the HMB-45 immunohistochemical marker. Results: At the 12-month postoperative visit, repigmentation with different grades of the Hedin index was observed in all sites. Paired t-test, analysis of variance, and Chi-square tests were used for statistical analysis. Repigmentation after surgical stripping was significantly less than after laser ablation. Fewer melanocytes were found on immunohistochemical examination at 12 months postoperatively. Comparisons of patient preference and pain indices gave statistically significant values favoring the diode laser technique. Conclusion: Gingival hyperpigmentation is effectively managed by both the diode laser ablation technique and the surgical stripping method. In this study, the surgical stripping technique was found to be better than diode laser ablation. PMID:28539864
NASA Technical Reports Server (NTRS)
Bates, Seth P.
1990-01-01
Students are introduced to methods and concepts for the systematic selection and evaluation of materials used to manufacture specific products in industry. For this laboratory exercise, students are asked to work in groups to identify and describe a product, then to work through the selection process to produce a list of three candidate materials from which to make the item. The exercise draws on knowledge of mechanical, physical, and chemical properties, common materials testing techniques, and resource management skills in finding and assessing property data. A very important part of the exercise is the students' introduction to decision-making algorithms and learning how to apply them to a complex decision-making process.
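One decision-making algorithm of the kind such an exercise might use is the weighted-sum decision matrix: each candidate is scored on each property, scores are multiplied by importance weights, and candidates are ranked by the total. The materials, scores, and weights below are entirely hypothetical, chosen only to show the mechanics:

```python
def rank_candidates(property_scores, weights):
    """Weighted-sum decision matrix: each candidate's total is the sum
    of (importance weight x property score); returns candidates
    best-first.  Higher scores mean better performance on a property."""
    totals = {mat: sum(weights[p] * s for p, s in props.items())
              for mat, props in property_scores.items()}
    return sorted(totals, key=totals.get, reverse=True)

# Hypothetical example: choosing a material for a bicycle frame.
scores = {
    "aluminium 6061": {"strength": 6, "cost": 8, "corrosion": 7},
    "steel 4130":     {"strength": 8, "cost": 9, "corrosion": 4},
    "CFRP":           {"strength": 9, "cost": 2, "corrosion": 9},
}
weights = {"strength": 0.5, "cost": 0.3, "corrosion": 0.2}
```

With these numbers, steel 4130 totals 7.5, CFRP 6.9, and aluminium 6061 6.8, so the weighting (here favoring strength and cost) drives the ranking; students would debate the weights as much as the scores.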
Use of system identification techniques for improving airframe finite element models using test data
NASA Technical Reports Server (NTRS)
Hanagud, Sathya V.; Zhou, Weiyu; Craig, James I.; Weston, Neil J.
1991-01-01
A method for using system identification techniques to improve airframe finite element models was developed and demonstrated. The method uses linear sensitivity matrices to relate changes in selected physical parameters to changes in total system matrices. The values for these physical parameters were determined using constrained optimization with singular value decomposition. The method was confirmed using both simple and complex finite element models for which pseudo-experimental data was synthesized directly from the finite element model. The method was then applied to a real airframe model which incorporated all the complexities and details of a large finite element model and for which extensive test data was available. The method was shown to work, and the differences between the identified model and the measured results were considered satisfactory.
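The parameter-update step can be illustrated in NumPy: given a linear sensitivity matrix relating changes in selected physical parameters to changes in the measured quantities, a truncated-SVD pseudo-inverse yields the least-squares parameter update. This is a generic sketch of the idea, not the paper's constrained formulation:

```python
import numpy as np

def update_parameters(S, residual, rel_tol=1e-10):
    """Least-squares update dp solving S @ dp ~ residual, where S is a
    linear sensitivity matrix and residual is the discrepancy between
    measured and model-predicted quantities.  The SVD pseudo-inverse
    discards near-singular directions, stabilizing the solution."""
    U, s, Vt = np.linalg.svd(S, full_matrices=False)
    s_inv = np.where(s > rel_tol * s.max(), 1.0 / s, 0.0)
    return Vt.T @ (s_inv * (U.T @ residual))
```

In a real model-updating loop, the sensitivities would be recomputed and the update repeated until the identified model matches the test data within tolerance.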
NASA Technical Reports Server (NTRS)
Zell, P. T.; Hoffmann, J.; Sandlin, D. R.
1985-01-01
A study was performed in order to develop the criteria for the selection of flow direction indicators for use in the Integrated Systems Tests (ISTs) of the 40 by 80/80 by 120 Foot Wind Tunnel System. The problems, requirements, and limitations of flow direction measurement in the wind tunnel were investigated. The locations and types of flow direction measurements planned in the facility were discussed. A review of current methods of flow direction measurement was made, and the most suitable technique for each location was chosen. A flow direction vane that employs a Hall-effect transducer was then developed and evaluated for application during the ISTs.
3D model assisted fully automated scanning laser Doppler vibrometer measurements
NASA Astrophysics Data System (ADS)
Sels, Seppe; Ribbens, Bart; Bogaerts, Boris; Peeters, Jeroen; Vanlanduit, Steve
2017-12-01
In this paper, a new fully automated scanning laser Doppler vibrometer (LDV) measurement technique is presented. In contrast to existing scanning LDV techniques, which use a 2D camera for the manual selection of sample points, we use a 3D time-of-flight camera in combination with a CAD file of the test object to automatically obtain measurements at pre-defined locations. The proposed procedure allows users to test prototypes in a shorter time because physical measurement locations are determined without user interaction. Another benefit of this methodology is that it incorporates automatic mapping between a CAD model and the vibration measurements. This mapping can be used to visualize measurements directly on a 3D CAD model. The proposed method is illustrated with vibration measurements of an unmanned aerial vehicle.
Targeted versus statistical approaches to selecting parameters for modelling sediment provenance
NASA Astrophysics Data System (ADS)
Laceby, J. Patrick
2017-04-01
One effective field-based approach to modelling sediment provenance is the source fingerprinting technique. Arguably, one of the most important steps in this approach is selecting the appropriate suite of parameters, or fingerprints, used to model source contributions. Accordingly, approaches to selecting parameters for sediment source fingerprinting will be reviewed. Thereafter, opportunities and limitations of these approaches and some future research directions will be presented. For properties to be effective tracers of sediment, they must discriminate between sources whilst behaving conservatively. Conservative behavior is characterized by constancy in sediment properties: the properties selected for sediment source fingerprinting should remain constant through sediment detachment, transportation and deposition processes, or at the very least vary in a predictable and measurable way. One approach to selecting conservative properties for sediment source fingerprinting is to identify targeted tracers, such as caesium-137, that provide specific source information (e.g. surface versus subsurface origins). A second approach is to use statistical tests to select an optimal suite of conservative properties capable of modelling sediment provenance. In general, statistical approaches use a combination of discrimination statistics (e.g. Kruskal-Wallis H-test, Mann-Whitney U-test) and parameter selection statistics (e.g. Discriminant Function Analysis or Principal Component Analysis). The challenge is that modelling sediment provenance is often not straightforward, and there is increasing debate in the literature surrounding the most appropriate approach to selecting elements for modelling.
Moving forward, it would be beneficial for researchers to test their results with multiple modelling approaches, artificial mixtures, and multiple lines of evidence to provide secondary support to their initial modelling results. Indeed, element selection can greatly impact modelling results, and having multiple lines of evidence will help provide confidence when modelling sediment provenance.
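For the discrimination step mentioned above, a minimal pure-Python version of the Kruskal-Wallis H statistic (assuming untied values and omitting the tie correction) might look like this; it asks whether a candidate tracer property takes systematically different values across the source groups:

```python
def kruskal_h(groups):
    """Kruskal-Wallis H statistic for k groups of measurements.
    Illustrative version: assumes no tied values, no tie correction.
    Large H (vs. a chi-square with k-1 df) suggests the property
    discriminates between the source groups."""
    pooled = sorted(v for g in groups for v in g)
    rank = {v: i + 1 for i, v in enumerate(pooled)}   # 1-based ranks
    n = len(pooled)
    h = sum(sum(rank[v] for v in g) ** 2 / len(g) for g in groups)
    return 12.0 / (n * (n + 1)) * h - 3.0 * (n + 1)
```

Properties whose H exceeds the critical value would pass to the parameter-selection stage (e.g. Discriminant Function Analysis) before entering the un-mixing model.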
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yan, Shiju; Qian, Wei; Guan, Yubao
2016-06-15
Purpose: This study aims to investigate the potential to improve lung cancer recurrence risk prediction performance for stage I NSCLC patients by integrating oversampling, feature selection, and score fusion techniques, and to develop an optimal prediction model. Methods: A dataset involving 94 early-stage lung cancer patients was retrospectively assembled, which includes CT images, nine clinical and biological (CB) markers, and the outcome of 3-yr disease-free survival (DFS) after surgery. Among the 94 patients, 74 remained DFS and 20 had cancer recurrence. Applying a computer-aided detection scheme, tumors were segmented from the CT images and 35 quantitative image (QI) features were initially computed. Two normalized Gaussian radial basis function network (RBFN) based classifiers were built based on QI features and CB markers separately. To improve prediction performance, the authors applied a synthetic minority oversampling technique (SMOTE) and a BestFirst based feature selection method to optimize the classifiers, and also tested fusion methods to combine the QI and CB based prediction results. Results: Using a leave-one-case-out cross-validation method, the computed areas under a receiver operating characteristic curve (AUCs) were 0.716 ± 0.071 and 0.642 ± 0.061 when using the QI and CB based classifiers, respectively. By fusion of the scores generated by the two classifiers, the AUC significantly increased to 0.859 ± 0.052 (p < 0.05), with an overall prediction accuracy of 89.4%. Conclusions: This study demonstrated the feasibility of improving prediction performance by integrating SMOTE, feature selection, and score fusion techniques. Combining QI features and CB markers and performing SMOTE prior to feature selection in classifier training enabled the RBFN based classifier to yield improved prediction accuracy.
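The SMOTE step can be sketched as follows. This is a generic reimplementation of the oversampling idea (random interpolation between a minority-class sample and one of its nearest minority-class neighbours), not the authors' code; in practice a library implementation would be used, and the fused score would then simply combine the two classifiers' outputs (e.g. a weighted average).

```python
import numpy as np

def smote(X_minority, n_synthetic, k=3, seed=0):
    """Synthetic minority oversampling (SMOTE): each synthetic case is
    placed at a random point on the line segment between a minority
    sample and one of its k nearest minority-class neighbours."""
    rng = np.random.default_rng(seed)
    X = np.asarray(X_minority, dtype=float)
    synthetic = []
    for _ in range(n_synthetic):
        i = rng.integers(len(X))
        dist = np.linalg.norm(X - X[i], axis=1)
        neighbours = np.argsort(dist)[1:k + 1]   # exclude the sample itself
        j = rng.choice(neighbours)
        synthetic.append(X[i] + rng.random() * (X[j] - X[i]))
    return np.array(synthetic)
```

Here the 20 recurrence cases would be oversampled toward balance with the 74 DFS cases before feature selection and classifier training.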
Page, M E; Detke, M J; Dalvi, A; Kirby, L G; Lucki, I
1999-11-01
The forced swimming test (FST) is a behavioral test in rodents that predicts the clinical efficacy of many types of antidepressant treatments. Recently, a behavior sampling technique was developed that scores individual response categories, including swimming, climbing and immobility. Although all antidepressant drugs reduce immobility in the FST, at least two distinct active behavioral patterns are produced by pharmacologically selective antidepressant drugs. Serotonin-selective reuptake inhibitors increase swimming behavior, while drugs acting primarily to increase extracellular levels of norepinephrine or dopamine increase climbing behavior. Distinct patterns of active behaviors in the FST may be mediated by distinct neurotransmitters, but this has not been shown directly. The present study examined the role of serotonin in mediating active behaviors in the forced swimming test after treatment with two antidepressant drugs: the selective serotonin reuptake inhibitor fluoxetine and the selective norepinephrine reuptake inhibitor desipramine. Endogenous serotonin was depleted by administering para-chlorophenylalanine (PCPA, 150 mg/kg, IP) to rats 72 h and 48 h prior to the swim test. Fluoxetine (10 mg/kg, SC) or desipramine (10 mg/kg, SC) was given three times over a 24-h period prior to the FST. Behavioral responses, including immobility, swimming and climbing, were counted during the 5-min test. Pretreatment with PCPA blocked the fluoxetine-induced reduction in immobility and increase in swimming behavior during the FST. In contrast, PCPA pretreatment did not interfere with the ability of desipramine to reduce immobility and increase climbing behavior. Thus, depletion of serotonin prevented the behavioral effects of the selective serotonin reuptake inhibitor fluoxetine in the rat FST, but had no impact on the behavioral effects induced by the selective norepinephrine reuptake inhibitor desipramine.
The effects of antidepressant drugs on FST-induced immobility may be exerted by distinguishable contributions from different neurotransmitter systems.
Ivezic, Nenad; Potok, Thomas E.
2003-09-30
A method for automatically evaluating a manufacturing technique comprises the steps of: receiving from a user manufacturing process step parameters characterizing a manufacturing process; accepting from the user a selection for an analysis of a particular lean manufacturing technique; automatically compiling process step data for each process step in the manufacturing process; automatically calculating process metrics from a summation of the compiled process step data for each process step; and, presenting the automatically calculated process metrics to the user. A method for evaluating a transition from a batch manufacturing technique to a lean manufacturing technique can comprise the steps of: collecting manufacturing process step characterization parameters; selecting a lean manufacturing technique for analysis; communicating the selected lean manufacturing technique and the manufacturing process step characterization parameters to an automatic manufacturing technique evaluation engine having a mathematical model for generating manufacturing technique evaluation data; and, using the lean manufacturing technique evaluation data to determine whether to transition from an existing manufacturing technique to the selected lean manufacturing technique.
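As an illustration of "automatically calculating process metrics from a summation of the compiled process step data", the sketch below rolls per-step records into three common lean metrics. The metric choices and field names are assumptions for illustration; the patent's actual mathematical model is not given in the abstract.

```python
def process_metrics(steps):
    """Roll per-step data up into overall process metrics: total cycle
    time, process cycle efficiency (value-added fraction of total
    time), and rolled throughput yield (product of per-step yields)."""
    total_cycle = sum(s["cycle_time"] for s in steps)
    value_added = sum(s["value_added_time"] for s in steps)
    rty = 1.0
    for s in steps:
        rty *= 1.0 - s["defect_rate"]
    return {
        "total_cycle_time": total_cycle,
        "process_cycle_efficiency": value_added / total_cycle,
        "rolled_throughput_yield": rty,
    }
```

Comparing these metrics computed for the existing batch process against those predicted for the candidate lean technique is what would inform the transition decision.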
Hybrid real-code ant colony optimisation for constrained mechanical design
NASA Astrophysics Data System (ADS)
Pholdee, Nantiwat; Bureerat, Sujin
2016-01-01
This paper proposes a hybrid meta-heuristic based on integrating a local search simplex downhill (SDH) method into the search procedure of real-code ant colony optimisation (ACOR). This hybridisation leads to five hybrid algorithms where a Monte Carlo technique, a Latin hypercube sampling technique (LHS) and a translational propagation Latin hypercube design (TPLHD) algorithm are used to generate an initial population. Also, two numerical schemes for selecting an initial simplex are investigated. The original ACOR and its hybrid versions along with a variety of established meta-heuristics are implemented to solve 17 constrained test problems where a fuzzy set theory penalty function technique is used to handle design constraints. The comparative results show that the hybrid algorithms are the top performers. Using the TPLHD technique gives better results than the other sampling techniques. The hybrid optimisers are a powerful design tool for constrained mechanical design problems.
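Of the three initial-population generators compared, Latin hypercube sampling is easy to sketch: each dimension is divided into n equal strata and every stratum receives exactly one point. A minimal NumPy version (plain LHS, not the TPLHD variant) might be:

```python
import numpy as np

def latin_hypercube(n, dim, seed=0):
    """Latin hypercube sample of n points in [0, 1)^dim: each of the n
    equal-width strata in every dimension contains exactly one point."""
    rng = np.random.default_rng(seed)
    # one uniform draw inside stratum i of each dimension, then shuffle
    # each column independently to decouple the dimensions
    u = (rng.random((n, dim)) + np.arange(n)[:, None]) / n
    for j in range(dim):
        rng.shuffle(u[:, j])
    return u
```

The unit-cube sample would then be rescaled to the design variables' bounds before seeding the ACOR population.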
Differential Deposition Technique for Figure Corrections in Grazing Incidence X-ray Optics
NASA Technical Reports Server (NTRS)
Kilaru, Kiranmayee; Ramsey, Brian D.; Gubarev, Mikhail
2009-01-01
A differential deposition technique is being developed to correct the low- and mid-spatial-frequency deviations in the axial figure profile of Wolter-type grazing incidence X-ray optics. These deviations arise from various factors in the fabrication process, and they degrade the performance of the optics by limiting the achievable angular resolution. In the differential deposition technique, material of varying thickness is selectively deposited along the length of the optic to minimize these deviations, thereby improving the overall figure. High-resolution focusing optics being developed at MSFC for small animal radionuclide imaging are being coated to test the differential deposition technique. The required spatial resolution for these optics is 100 μm. This base resolution is achievable with the regular electroform-nickel-replication fabrication technique used at MSFC. However, by improving the figure quality of the optics through differential deposition, we aim to improve the resolution significantly beyond this value.
Abuo-Rahma, Gamal El-Din A A; Abdel-Aziz, Mohamed; Farag, Nahla A; Kaoud, Tamer S
2014-08-18
A novel series of 1,2,4-triazole derivatives was synthesized and confirmed with different spectroscopic techniques. The prepared compounds exhibited remarkable anti-inflammatory activity comparable to that of indomethacin and celecoxib after 3 h. The tested compounds exhibited a very low incidence of gastric ulceration compared to indomethacin. Most of the newly developed compounds showed excellent selectivity towards human COX-2, with selectivity indices (COX-1 IC50/COX-2 IC50) ranging from 62.5 to 2127. Docking study results revealed that the highly selective tested compounds 6h and 6j showed lower CDOCKER energies, which means that they require less energy for proper interaction with the enzyme. Additional H-bonds between the oxygen of the amide and/or the H of the amide NH and the amino acid residues may be responsible for the higher binding affinity of this group of compounds towards COX-2. Copyright © 2014 Elsevier Masson SAS. All rights reserved.
Which Approach Is More Effective in the Selection of Plants with Antimicrobial Activity?
Silva, Ana Carolina Oliveira; Santana, Elidiane Fonseca; Saraiva, Antonio Marcos; Coutinho, Felipe Neves; Castro, Ricardo Henrique Acre; Pisciottano, Maria Nelly Caetano; Amorim, Elba Lúcia Cavalcanti; Albuquerque, Ulysses Paulino
2013-01-01
The development of the present study was based on selections using random, direct ethnopharmacological, and indirect ethnopharmacological approaches, aiming to evaluate which method is the best for bioprospecting new antimicrobial plant drugs. A crude extract of 53 species of herbaceous plants collected in the semiarid region of Northeast Brazil was tested against 11 microorganisms. Well-agar diffusion and minimum inhibitory concentration (MIC) techniques were used. Ten extracts from direct, six from random, and three from indirect ethnopharmacological selections exhibited activities that ranged from weak to very active against the organisms tested. The strain most susceptible to the evaluated extracts was Staphylococcus aureus. The MIC analysis revealed the best result for the direct ethnopharmacological approach, considering that some species yielded extracts classified as active or moderately active (MICs between 250 and 1000 µg/mL). Furthermore, one species from this approach inhibited the growth of the three Candida strains. Thus, it was concluded that the direct ethnopharmacological approach is the most effective when selecting species for bioprospecting new plant drugs with antimicrobial activities. PMID:23878595
Statistical analysis for validating ACO-KNN algorithm as feature selection in sentiment analysis
NASA Astrophysics Data System (ADS)
Ahmad, Siti Rohaidah; Yusop, Nurhafizah Moziyana Mohd; Bakar, Azuraliza Abu; Yaakub, Mohd Ridzwan
2017-10-01
This research paper aims to propose a hybrid of ant colony optimization (ACO) and k-nearest neighbor (KNN) algorithms as a feature selection method for selecting and choosing relevant features from customer review datasets. Information gain (IG), genetic algorithm (GA), and rough set attribute reduction (RSAR) were used as baseline algorithms in a performance comparison with the proposed algorithm. This paper also discusses the significance test, which was used to evaluate the performance differences between the ACO-KNN, IG-GA, and IG-RSAR algorithms. This study evaluated the performance of the ACO-KNN algorithm using precision, recall, and F-score, which were validated using parametric statistical significance tests. The evaluation process has statistically proven that the ACO-KNN algorithm performs significantly better than the baseline algorithms. In addition, the experimental results have proven that ACO-KNN can be used as a feature selection technique in sentiment analysis to obtain a high-quality, optimal feature subset that can represent the actual data in customer review data.
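The significance testing of per-fold scores can be illustrated with a paired t statistic on matched cross-validation folds. The sketch below, with made-up fold F-scores, is a generic version of such a parametric test, not the paper's exact procedure:

```python
import math

def paired_t_statistic(a, b):
    """Paired t statistic for two algorithms' scores on the same
    cross-validation folds (a[i] and b[i] come from fold i).  A t value
    beyond the critical value for n-1 degrees of freedom indicates a
    statistically significant difference."""
    diffs = [x - y for x, y in zip(a, b)]
    n = len(diffs)
    mean = sum(diffs) / n
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)
    return mean / math.sqrt(var / n)
```

Pairing by fold removes fold-to-fold variation, which is why it is the standard way to compare two feature-selection methods evaluated on identical splits.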
NASA Technical Reports Server (NTRS)
Adams, K. M.; Lucas, J. J.
1975-01-01
The development of a frame/stringer/skin fabrication technique for composite airframe construction was studied as a low-cost approach to the manufacture of large helicopter airframe components. A center cabin aluminum airframe section of the Sikorsky CH-53D helicopter was selected for evaluation as a composite structure. The design, as developed, is composed of a woven KEVLAR-49/epoxy skin and graphite/epoxy frames and stringers. To support the selection of this specific design concept, a materials study was conducted to develop and select a cure-compatible graphite and KEVLAR-49/epoxy resin system, and a foam system capable of maintaining shape and integrity under the processing conditions established. The materials selected were Narmco 5209/Thornel T-300 graphite, Narmco 5209/KEVLAR-49 woven fabric, and Stathane 8747 polyurethane foam. Eight specimens were fabricated, representative of the frame, stringer, and splice joint attachments. Evaluation of the analysis and test results indicates that design predictions are good to excellent, except for some conservatism in the complex frame splice.
Sakkas, Denny; Ramalingam, Mythili; Garrido, Nicolas; Barratt, Christopher L.R.
2015-01-01
BACKGROUND In natural conception only a few sperm cells reach the ampulla or the site of fertilization. This population is a selected group of cells since only motile cells can pass through cervical mucus and gain initial entry into the female reproductive tract. In animals, some studies indicate that the sperm selected by the reproductive tract and recovered from the uterus and the oviducts have higher fertilization rates but this is not a universal finding. Some species show less discrimination in sperm selection and abnormal sperm do arrive at the oviduct. In contrast, assisted reproductive technologies (ART) utilize a more random sperm population. In this review we contrast the journey of the spermatozoon in vivo and in vitro and discuss this in the context of developing new sperm preparation and selection techniques for ART. METHODS A review of the literature examining characteristics of the spermatozoa selected in vivo is compared with recent developments in in vitro selection and preparation methods. Contrasts and similarities are presented. RESULTS AND CONCLUSIONS New technologies are being developed to aid in the diagnosis, preparation and selection of spermatozoa in ART. To date progress has been frustrating and these methods have provided variable benefits in improving outcomes after ART. It is more likely that examining the mechanisms enforced by nature will provide valuable information in regard to sperm selection and preparation techniques in vitro. Identifying the properties of those spermatozoa which do reach the oviduct will also be important for the development of more effective tests of semen quality. In this review we examine the value of sperm selection to see how much guidance for ART can be gleaned from the natural selection processes in vivo. PMID:26386468
Afantitis, Antreas; Melagraki, Georgia; Sarimveis, Haralambos; Koutentis, Panayiotis A; Igglessi-Markopoulou, Olga; Kollias, George
2010-05-01
A novel QSAR workflow is constructed that combines MLR with LS-SVM classification techniques to identify quinazolinone analogs as "active" or "non-active" CXCR3 antagonists. The accuracy of the LS-SVM classification for the training and test sets was 100% and 90%, respectively. For the "active" analogs, a validated MLR QSAR model accurately estimates their IP-10 IC50 inhibition values. The accuracy of the QSAR model (R^2 = 0.80) is illustrated using several validation techniques, including the leave-one-out procedure (R^2_LOO = 0.67) and validation through an external test set (R^2_pred = 0.78). The key conclusion of this study is that the selected molecular descriptors (Highest Occupied Molecular Orbital energy (HOMO), the Principal Moments of Inertia PMIX and PMIZ, Polar Surface Area (PSA), presence of a triple bond (PTrplBnd), and the Kier shape descriptor 1-kappa) demonstrate discriminatory and pharmacophore abilities.
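The leave-one-out statistic (R^2_LOO, often called Q^2) quoted above can be computed as in this minimal numpy sketch. The descriptor matrix and activity values are random stand-ins, not the paper's quinazolinone data, and the simple intercept-plus-linear model is an assumption for illustration:

```python
import numpy as np

def loo_q2(X, y):
    """Leave-one-out cross-validated R^2 (Q^2) for a multiple linear
    regression model with an intercept term."""
    X = np.column_stack([np.ones(len(y)), X])  # add intercept column
    preds = np.empty_like(y, dtype=float)
    for i in range(len(y)):
        mask = np.arange(len(y)) != i          # hold out sample i
        beta, *_ = np.linalg.lstsq(X[mask], y[mask], rcond=None)
        preds[i] = X[i] @ beta                 # predict the held-out sample
    ss_res = np.sum((y - preds) ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

# Illustrative data: 6 hypothetical descriptors for 30 compounds.
rng = np.random.default_rng(0)
X = rng.normal(size=(30, 6))
y = X @ np.array([1.5, -0.8, 0.4, 0.0, 0.2, -0.3]) + rng.normal(scale=0.3, size=30)
print(round(loo_q2(X, y), 2))
```

A Q^2 substantially below the fitted R^2, as in the abstract (0.67 vs 0.80), is the usual sign that part of the apparent fit does not survive cross-validation.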
Advancements in optical techniques and imaging in the diagnosis and management of bladder cancer.
Rose, Tracy L; Lotan, Yair
2018-03-01
Accurate detection and staging are critical to the appropriate management of urothelial cancer (UC). The use of advanced optical techniques during cystoscopy is becoming more widespread to prevent recurrent nonmuscle invasive bladder cancer. Standard of care for muscle-invasive UC includes the use of computed tomography and/or magnetic resonance imaging, but the staging accuracy of these tests remains imperfect. Novel imaging modalities are being developed to improve current test performance. Positron emission tomography/computed tomography has a role in the initial evaluation of select patients with muscle-invasive bladder cancer and in disease recurrence in some cases. Several novel immuno-positron emission tomography tracers are currently in development to address the inadequacy of current imaging modalities for monitoring of tumor response to newer immune-based treatments. This review summarizes the current standards and recent advances in optical techniques and imaging modalities in localized and metastatic UC. Copyright © 2018 Elsevier Inc. All rights reserved.
Hypervelocity Impact Test Facility: A gun for hire
NASA Technical Reports Server (NTRS)
Johnson, Calvin R.; Rose, M. F.; Hill, D. C.; Best, S.; Chaloupka, T.; Crawford, G.; Crumpler, M.; Stephens, B.
1994-01-01
An affordable technique has been developed to duplicate the types of impacts observed on spacecraft, including the Shuttle, by use of a certified Hypervelocity Impact Facility (HIF) which propels particulates using capacitor driven electric gun techniques. The fully operational facility provides a flux of particles in the 10-100 micron diameter range with a velocity distribution covering the space debris and interplanetary dust particle environment. HIF measurements of particle size, composition, impact angle and velocity distribution indicate that such parameters can be controlled in a specified, tailored test designed for or by the user. Unique diagnostics enable researchers to fully describe the impact for evaluating the 'targets' under full power or load. Users regularly evaluate space hardware, including solar cells, coatings, and materials, exposing selected portions of space-qualified items to a wide range of impact events and environmental conditions. Benefits include corroboration of data obtained from impact events, flight simulation of designs, accelerated aging of systems, and development of manufacturing techniques.
An adaptive technique to maximize lossless image data compression of satellite images
NASA Technical Reports Server (NTRS)
Stewart, Robert J.; Lure, Y. M. Fleming; Liou, C. S. Joe
1994-01-01
Data compression will play an increasingly important role in the storage and transmission of image data within NASA science programs as the Earth Observing System comes into operation. It is important that the science data be preserved at the fidelity the instrument and the satellite communication systems were designed to produce. Lossless compression must therefore be applied, at least, to archive the processed instrument data. In this paper, we present an analysis of the performance of lossless compression techniques and develop an adaptive approach which applies image remapping, feature-based image segmentation to determine regions of similar entropy, and high-order arithmetic coding to obtain significant improvements over the use of conventional compression techniques alone. Image remapping is used to transform the original image into a lower entropy state. Several techniques were tested on satellite images, including differential pulse code modulation, bi-linear interpolation, and block-based linear predictive coding. The results of these experiments are discussed, and trade-offs between computation requirements and entropy reductions are used to identify the optimum approach for a variety of satellite images. Further entropy reduction can be achieved by segmenting the image based on local entropy properties and then applying a coding technique which maximizes compression for the region. Experimental results are presented showing the effect of different coding techniques on regions of different entropy. A rule base is developed through which the technique giving the best compression is selected. The paper concludes that maximum compression can be achieved cost effectively and at acceptable performance rates with a combination of techniques which are selected based on image contextual information.
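The remapping idea above can be illustrated with the simplest of the listed predictors, differential pulse code modulation (DPCM): replacing each sample by its difference from the previous sample lowers the first-order entropy of smooth data, which is what the downstream entropy coder exploits. The synthetic "image" row below is an invented stand-in, not the paper's satellite imagery:

```python
import numpy as np

def shannon_entropy(a):
    """First-order Shannon entropy in bits per sample."""
    _, counts = np.unique(a, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

# Synthetic scan line with smooth spatial structure (a random walk),
# standing in for correlated satellite image data.
rng = np.random.default_rng(1)
img = np.cumsum(rng.integers(-2, 3, size=4096)).astype(np.int64)

# DPCM remap: encode each sample as the difference from its predecessor.
# For smooth data the differences concentrate near zero, lowering entropy.
dpcm = np.diff(img, prepend=img[:1])

print(shannon_entropy(img), shannon_entropy(dpcm))
```

The remapped signal's entropy bounds the achievable lossless rate for a memoryless coder, which is why the paper weighs entropy reduction against the computation each predictor costs.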
Endodontic filling removal procedure: an ex vivo comparative study between two rotary techniques.
Vale, Mônica Sampaio do; Moreno, Melinna dos Santos; Silva, Priscila Macêdo França da; Botelho, Thereza Cristina Farias
2013-01-01
In this study, we compared the ex vivo removal capacity of two endodontic rotary techniques and determined whether there was a significant quantitative difference in residual material when comparing root thirds. Forty extracted molars were used. The palatal roots were selected, and the canals were prepared using a step-back technique and filled using a lateral condensation technique with gutta-percha points and Endofill sealer. After two weeks of storage in a 0.9% saline solution at 37 ºC in an oven, the specimens were divided into 2 groups of 20, with group 1 samples subjected to Gates-Glidden drills and group 2 samples subjected to the ProTaper retreatment System. Hedstroem files and eucalyptol solvent were used in both groups to complete the removal procedure. Then, the root thirds were radiographed and the images were submitted to the NIH ImageJ program to measure the residual filling material in mm. Each root third was related to the total area of the root canals. The data were analyzed using Student's t test. There was a statistically significant difference between the two techniques, as more filling material was removed by technique 2 (ProTaper) than technique 1 (Gates-Glidden drills, p < 0.05). The apical third had a greater amount of residual filling material than the cervical and middle thirds, and the difference was statistically significant (p < 0.05). None of the selected techniques removed all filling material, and the material was most difficult to remove from the apical third. The ProTaper files removed more material than the Gates-Glidden drills.
A lab-based study of subground passive cooling system for indoor temperature control
NASA Astrophysics Data System (ADS)
Chok, Mun-Hong; Chan, Chee-Ming
2017-11-01
Passive cooling is an alternative cooling technique that helps reduce high energy consumption. Meanwhile, dredged marine soil (DMS) is commonly dumped or disposed of as a waste material, and dredging works incur high labor costs, which motivates reusing DMS as fill along coastal areas. In this study, DMS was chosen to examine the effectiveness of a passive cooling system through model tests. Soil characterization was carried out according to BS 1377: Part 2: 1990. The model was built at a scale of 3 cm to 1 m. The heat exchange unit consists of three pipe designs: parallel, ramp, and spiral. Preliminary tests, including a flow rate test and soil sample selection, were conducted to select the best heat exchange unit for the model test. The model test comprised two conditions, day and night, each with four configurations for which temperatures were recorded. The results show that the configuration with the window left open and the fan switched on (WO/FO) produced the most effective cooling, from 29 °C to 27 °C, a drop of 6.9%.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1982-06-01
The objective of this study is to assess the effectiveness of air sprays and foam systems for dust control on longwall double-drum shearer faces. Laboratory testing has been conducted using foam systems and promising results have been obtained. Upon Bureau approval, underground testing will be scheduled to assess the effectiveness of foam systems under actual operating conditions. Laboratory testing of air sprays is being conducted at present. This report presents the results of the laboratory testing of foam systems. Specifically, the results obtained in the evaluation of selected foaming agents are presented, the feasibility of flushing foam through the shearer drum is demonstrated, and the conceptual layout of the foam system on the shearer is discussed. The laboratory investigation of the selected foaming agents reveals that the Onyx Microfoam, Onyx Maprosyl and DeTer Microfoam foaming agents have higher expansion ratios than the others tested. Flushing foam through the shearer drum is entirely feasible and could be a viable technique for dust suppression on longwall faces.
NASA Technical Reports Server (NTRS)
Moxson, V. S.; Moracz, D. J.; Bhat, B. N.; Dolan, F. J.; Thom, R.
1987-01-01
Traditionally, vacuum-melted 440C stainless steel is used for high performance bearings in aerospace cryogenic systems where corrosion due to condensation is a major concern. For the Space Shuttle Main Engine (SSME), however, 440C performance in the high-pressure turbopumps has been marginal. A basic assumption of this study was that powder metallurgy processing, rather than cast/wrought processing, would provide the finest, most homogeneous bearing alloy structure. Preliminary testing of P/M alloys (hardness, corrosion resistance, wear resistance, fatigue resistance, and fracture toughness) was used to 'de-select' alloys which did not perform as well as baseline 440C. Five of eleven candidate materials (14-4/6V, X-405, MRC-2001, T-440V, and D-5) were selected on the basis of this preliminary screening for the actual rolling-sliding five-ball testing. The results of this test were compared with those for high-performance vacuum-melted M50 bearing steel. The results indicated outstanding performance of two P/M alloys, X-405 and MRC-2001, which will eventually be further evaluated by full-scale bearing testing.
Indications for endoscopic third ventriculostomy in normal pressure hydrocephalus.
Paidakakos, Nikolaos; Borgarello, S; Naddeo, M
2012-01-01
Controversy remains regarding the proper diagnostic studies, the prediction of outcome, and the management of patients with normal pressure hydrocephalus (NPH). We propose a preoperative assessment routine whose aim is to correctly select NPH patients and to differentiate between them in terms of surgical treatment, identifying probable endoscopic third ventriculostomy (ETV) responders. We prospectively considered a group of 44 patients with suspected NPH on the basis of clinical symptoms and neuroradiological evidence, who underwent supplemental diagnostic testing (tap test, external lumbar drainage, cerebrospinal fluid outflow resistance [Rout] determination through lumbar and ventricular infusion test). All 44 of these patients were treated with either shunt procedures or ETV. To choose the kind of treatment (shunt or ETV), we evaluated the individual response during infusion tests. The efficacy of both surgical techniques was approximately 70%, with a significantly lower complication rate for ETV. We evaluated the correlation between the various tests and the postoperative outcomes both for shunting and for ETV. Rout proved useful for preoperative assessment and choice of treatment. In carefully selected patients, ETV had qualitative results similar to shunting, with significantly fewer complications.
NASA Astrophysics Data System (ADS)
Dubey, Satish Kumar; Singh Mehta, Dalip; Anand, Arun; Shakher, Chandra
2008-01-01
We demonstrate simultaneous topography and tomography of latent fingerprints using full-field swept-source optical coherence tomography (OCT). The swept-source OCT system comprises a superluminescent diode (SLD) as broad-band light source, an acousto-optic tunable filter (AOTF) as frequency tuning device, and a compact, nearly common-path interferometer. Both the amplitude and the phase map of the interference fringe signal are reconstructed. Optical sectioning of the latent fingerprint sample is obtained by selective Fourier filtering, and the topography is retrieved from the phase map. Interferometry, selective filtering, and low coherence (and hence better resolution) are some of the advantages of the proposed system over conventional fingerprint detection techniques. The present technique is non-invasive and does not require any physical or chemical processing. The quality of the sample is therefore unaltered, and the same fingerprint can be used for other types of forensic tests. Exploiting low-coherence interferometry for fingerprint detection itself provides an edge over other existing techniques, as fingerprints can be lifted even from low-reflecting surfaces. The proposed system is very economical and compact.
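The selective Fourier filtering and phase-map retrieval described above can be sketched in one dimension: isolate the fringe carrier's positive sideband in the Fourier domain, inverse-transform, and take the argument to recover the phase. The carrier frequency, phase modulation, and pass band below are all invented for illustration and are not the authors' system parameters:

```python
import numpy as np

# Simulated 1-D interference fringe: a carrier modulated by a slowly
# varying "topography" phase term (illustrative values only).
N = 1024
x = np.arange(N)
phase = 0.8 * np.sin(2 * np.pi * x / N * 3)   # topography term, 3 cycles
carrier = 2 * np.pi * (100 / N) * x           # fringe carrier, 100 cycles
signal = 1.0 + np.cos(carrier + phase)

# Selective Fourier filtering: keep only the +1 carrier sideband, then
# take the argument of the inverse transform to recover the phase map.
F = np.fft.fft(signal)
freqs = np.fft.fftfreq(N)
window = (freqs > 0.05) & (freqs < 0.15)      # pass band around the carrier
analytic = np.fft.ifft(F * window)            # ~0.5 * exp(i*(carrier+phase))
recovered = np.unwrap(np.angle(analytic)) - carrier

err = np.abs((recovered - recovered.mean()) - (phase - phase.mean())).max()
print(err)
```

Filtering out the DC term and the negative sideband is what turns the real-valued fringe into a complex analytic signal whose angle carries the topography.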
Strain gage selection in loads equations using a genetic algorithm
NASA Technical Reports Server (NTRS)
1994-01-01
Traditionally, structural loads are measured using strain gages. A loads calibration test must be done before loads can be accurately measured. In one measurement method, a series of point loads is applied to the structure, and loads equations are derived via the least squares curve fitting algorithm using the strain gage responses to the applied point loads. However, many research structures are highly instrumented with strain gages, and the number and selection of gages used in a loads equation can be problematic. This paper presents an improved technique that uses a genetic algorithm to choose the strain gages used in the loads equations. Also presented is a comparison of the genetic algorithm's performance with the current T-value technique and a variant known as the Best Step-down technique. Examples are shown using aerospace vehicle wings of high and low aspect ratio. In addition, a significant limitation in the current methods is revealed. The genetic algorithm arrived at a comparable or superior set of gages with significantly less human effort, and could be applied in instances when the current methods could not.
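The gage-subset search described above can be sketched as follows: each candidate solution is a subset of gages, and its fitness is the least-squares residual of the loads equation built from those gages. The calibration data, gage count, subset size, and GA operators here are all illustrative assumptions, not the paper's actual flight-test setup:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical calibration: 40 point-load cases, 12 candidate gages.
n_cases, n_gages, k = 40, 12, 4
loads = rng.normal(size=n_cases)
# Gage responses: a few gages carry the load signal, the rest are noise.
R = rng.normal(scale=0.5, size=(n_cases, n_gages))
R[:, [1, 4, 7]] += np.outer(loads, [2.0, 1.5, 1.0])

def fitness(subset):
    """Residual of the least-squares loads equation built from a gage
    subset (lower is better)."""
    A = R[:, list(subset)]
    coef, *_ = np.linalg.lstsq(A, loads, rcond=None)
    return float(np.sum((A @ coef - loads) ** 2))

def ga_select(pop_size=30, gens=40, p_mut=0.3):
    # Individuals are k-gage subsets, encoded as sorted tuples.
    pop = [tuple(sorted(rng.choice(n_gages, k, replace=False)))
           for _ in range(pop_size)]
    for _ in range(gens):
        elite = sorted(pop, key=fitness)[: pop_size // 2]  # keep best half
        children = []
        while len(children) < pop_size - len(elite):
            a, b = rng.choice(len(elite), 2, replace=False)
            genes = set(elite[a]) | set(elite[b])          # crossover pool
            child = set(rng.choice(sorted(genes), k, replace=False))
            if rng.random() < p_mut:                       # mutate: swap a gage
                child.discard(rng.choice(sorted(child)))
                while len(child) < k:
                    child.add(int(rng.integers(n_gages)))
            children.append(tuple(sorted(child)))
        pop = elite + children
    return min(pop, key=fitness)

best = ga_select()
print(best, fitness(best))
```

With 12 gages and subsets of 4 the search space is small, but the same loop scales to the hundreds of gages on a highly instrumented wing, where exhaustive enumeration is impossible.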
Yu, Yinan; Diamantaras, Konstantinos I; McKelvey, Tomas; Kung, Sun-Yuan
2018-02-01
In kernel-based classification models, given limited computational power and storage capacity, operations over the full kernel matrix become prohibitive. In this paper, we propose a new supervised learning framework using kernel models for sequential data processing. The framework is based on two components that both aim at enhancing the classification capability with a subset selection scheme. The first part is a subspace projection technique in the reproducing kernel Hilbert space using a CLAss-specific Subspace Kernel representation for kernel approximation. In the second part, we propose a novel structural risk minimization algorithm, called adaptive margin slack minimization, to iteratively improve the classification accuracy through adaptive data selection. We motivate each part separately and then integrate them into learning frameworks for large scale data. We propose two such frameworks: memory efficient sequential processing for sequential data processing, and parallelized sequential processing for distributed computing with sequential data acquisition. We test our methods on several benchmark data sets and compare them with state-of-the-art techniques to verify the validity of the proposed techniques.
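The paper's CLAss-specific Subspace Kernel is not reproduced here, but the core motivation (avoiding operations over the full n x n kernel matrix by working through a selected subset) can be illustrated with a generic landmark-subset (Nystrom-style) kernel approximation. The kernel, data, landmark count, and bandwidth are all assumptions for this sketch:

```python
import numpy as np

def rbf(A, B, gamma=0.1):
    """RBF kernel matrix between row sets A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

rng = np.random.default_rng(3)
X = rng.normal(size=(200, 3))

# Subset-based approximation: represent the full 200 x 200 kernel through
# a small set of m = 20 landmark points, so storage and solves scale with
# m rather than n.
idx = rng.choice(len(X), 20, replace=False)
K_nm = rbf(X, X[idx])                     # n x m cross-kernel
K_mm = rbf(X[idx], X[idx])                # m x m landmark kernel
K_approx = K_nm @ np.linalg.pinv(K_mm) @ K_nm.T

K_full = rbf(X, X)
err = np.linalg.norm(K_full - K_approx) / np.linalg.norm(K_full)
print(err)
```

Any learning step that only needs products with the kernel matrix can then be run against the factored form K_nm pinv(K_mm) K_nm^T without ever materializing the full matrix.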
Student nurse selection and predictability of academic success: The Multiple Mini Interview project.
Gale, Julia; Ooms, Ann; Grant, Robert; Paget, Kris; Marks-Maran, Di
2016-05-01
With recent reports of public enquiries into failure to care, universities are under pressure to ensure that candidates selected for undergraduate nursing programmes demonstrate academic potential as well as characteristics and values such as compassion, empathy and integrity. The Multiple Mini Interview (MMI) was used in one university as a way of ensuring that candidates had the appropriate numeracy and literacy skills; a range of communication, empathy, decision-making and problem-solving skills; and ethical insight, integrity, initiative and team-work. The aims were to ascertain whether there is evidence of bias in MMIs (gender, age, nationality and location of secondary education) and to determine the extent to which the MMI is predictive of academic success in nursing. A longitudinal retrospective analysis of student demographics, MMI data and the assessment marks for years 1, 2 and 3 was conducted at one university in southwest London, on one cohort of students who commenced their programme in September 2011, including students in all four fields of nursing (adult, child, mental health and learning disability). Analyses used inferential statistics and a Bayesian multilevel model. The MMI in conjunction with the MMI numeracy test and MMI literacy test shows little or no bias in terms of age, gender, nationality or location of secondary school education. Although the MMI in conjunction with numeracy and literacy testing is predictive of academic success, it is only weakly predictive. The MMI used in conjunction with literacy and numeracy testing appears to be a successful technique for selecting candidates for nursing. However, other selection methods such as psychological profiling or testing of emotional intelligence may add to the extent to which selection methods are predictive of academic success in nursing. Copyright © 2016 Elsevier Ltd. All rights reserved.
Can tutoring improve performance on a reasoning task under deadline conditions?
Osman, Magda
2007-03-01
The present study examined the effectiveness of a tutoring technique that has been used to identify and address participants' misunderstandings in Wason's selection task. In particular, the study investigated whether the technique would lead to improvements in performance when the task was presented in a deadline format (a condition in which time restrictions are imposed). In Experiment 1, the effects of tutoring on performance were compared in free time (conditions in which no time restrictions are imposed) and deadline task formats. In Experiment 2, improvements in performance were studied in deadline task formats, in which the tutoring and test phases were separated by an interval of 1 day. The results suggested that tutoring improved performance on the selection task under deadline and in free time conditions. Additionally, the study showed that participants made errors because they had misinterpreted the task. With tutoring, they were able to modify their initial misunderstandings.
Overview of selected surrogate technologies for continuous suspended-sediment monitoring
Gray, J.R.; Gartner, J.W.
2006-01-01
Surrogate technologies for inferring selected characteristics of suspended sediments in surface waters are being tested by the U.S. Geological Survey and several partners with the ultimate goal of augmenting or replacing traditional monitoring methods. Optical properties of water such as turbidity and optical backscatter are the most commonly used surrogates for suspended-sediment concentration, but use of other techniques such as those based on acoustic backscatter, laser diffraction, digital photo-optic, and pressure-difference principles is increasing for concentration and, in some cases, particle-size distribution and flux determinations. The potential benefits of these technologies include acquisition of automated, continuous, quantifiably accurate data obtained with increased safety and at less expense. When suspended-sediment surrogate data meet consensus accuracy criteria and appropriate sediment-record computation techniques are applied, these technologies have the potential to revolutionize the way fluvial-sediment data are collected, analyzed, and disseminated.
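A turbidity surrogate record is typically converted to suspended-sediment concentration (SSC) through a site-specific rating fitted to paired physical samples, commonly a power law fitted in log-log space. The paired values below are invented for illustration and are not USGS data:

```python
import numpy as np

# Hypothetical paired samples: turbidity (FNU) and measured suspended-
# sediment concentration (mg/L); values are illustrative only.
turb = np.array([5., 12., 30., 55., 90., 140., 220., 400.])
ssc  = np.array([8., 20., 55., 100., 170., 260., 430., 800.])

# Fit a power-law rating SSC = a * turbidity^b by linear regression in
# log-log space, a common form for turbidity surrogate ratings.
b, log_a = np.polyfit(np.log(turb), np.log(ssc), 1)
a = np.exp(log_a)

def ssc_from_turbidity(t):
    """Estimate SSC (mg/L) from a continuous turbidity reading."""
    return a * t ** b

print(a, b, ssc_from_turbidity(100.0))
```

Once such a rating meets the accuracy criteria mentioned above, the continuous turbidity sensor output can be translated into a continuous SSC (and flux) record between physical sampling visits.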
Sensor data validation and reconstruction. Phase 1: System architecture study
NASA Technical Reports Server (NTRS)
1991-01-01
The sensor validation and data reconstruction task reviewed relevant literature and selected applicable validation and reconstruction techniques for further study; analyzed the selected techniques and emphasized those which could be used for both validation and reconstruction; analyzed Space Shuttle Main Engine (SSME) hot fire test data to determine statistical and physical relationships between various parameters; developed statistical and empirical correlations between parameters to perform validation and reconstruction tasks, using a computer aided engineering (CAE) package; and conceptually designed an expert system based knowledge fusion tool, which allows the user to relate diverse types of information when validating sensor data. The host hardware for the system is intended to be a Sun SPARCstation, but could be any RISC workstation with a UNIX operating system and a windowing/graphics system such as Motif or Dataviews. The information fusion tool is intended to be developed using the NEXPERT Object expert system shell, and the C programming language.
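The statistical-correlation approach described above (learn empirical relationships between parameters, then use them to validate one sensor and reconstruct its value when it fails) can be sketched as follows. The three channels, their relationships, and the injected spike are hypothetical, not actual SSME hot-fire data:

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical test history: three related sensor channels driven by the
# same underlying process.
t = np.linspace(0, 10, 500)
base = np.sin(t) + 0.1 * t
s1 = base + rng.normal(scale=0.02, size=t.size)
s2 = 2.0 * base + 1.0 + rng.normal(scale=0.02, size=t.size)
s3 = -0.5 * base + rng.normal(scale=0.02, size=t.size)

# Learn an empirical correlation: reconstruct s1 from s2 and s3.
A = np.column_stack([np.ones_like(t), s2, s3])
coef, *_ = np.linalg.lstsq(A, s1, rcond=None)
recon = A @ coef

# Validation: flag samples where the measured value deviates from the
# reconstruction by much more than the typical residual.
faulty = s1.copy()
faulty[250] += 5.0                       # inject a spike failure
resid = np.abs(faulty - recon)
flags = np.where(resid > 5 * resid.std())[0]
print(flags)
```

On a flagged sample, the regression output `recon` also serves directly as the reconstructed value, which is the dual use of validation and reconstruction the task emphasized.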
Advantages of Synthetic Noise and Machine Learning for Analyzing Radioecological Data Sets.
Shuryak, Igor
2017-01-01
The ecological effects of accidental or malicious radioactive contamination are insufficiently understood because of the hazards and difficulties associated with conducting studies in radioactively-polluted areas. Data sets from severely contaminated locations can therefore be small. Moreover, many potentially important factors, such as soil concentrations of toxic chemicals, pH, and temperature, can be correlated with radiation levels and with each other. In such situations, commonly-used statistical techniques like generalized linear models (GLMs) may not be able to provide useful information about how radiation and/or these other variables affect the outcome (e.g. abundance of the studied organisms). Ensemble machine learning methods such as random forests offer powerful alternatives. We propose that analysis of small radioecological data sets by GLMs and/or machine learning can be made more informative by using the following techniques: (1) adding synthetic noise variables to provide benchmarks for distinguishing the performances of valuable predictors from irrelevant ones; (2) adding noise directly to the predictors and/or to the outcome to test the robustness of analysis results against random data fluctuations; (3) adding artificial effects to selected predictors to test the sensitivity of the analysis methods in detecting predictor effects; (4) running a selected machine learning method multiple times (with different random-number seeds) to test the robustness of the detected "signal"; (5) using several machine learning methods to test the "signal's" sensitivity to differences in analysis techniques. Here, we applied these approaches to simulated data, and to two published examples of small radioecological data sets: (I) counts of fungal taxa in samples of soil contaminated by the Chernobyl nuclear power plant accident (Ukraine), and (II) bacterial abundance in soil samples under a ruptured nuclear waste storage tank (USA).
We show that the proposed techniques were advantageous compared with the methodology used in the original publications where the data sets were presented. Specifically, our approach identified a negative effect of radioactive contamination in data set I, and suggested that in data set II stable chromium could have been a stronger limiting factor for bacterial abundance than the radionuclides 137Cs and 99Tc. This new information, which was extracted from these data sets using the proposed techniques, can potentially enhance the design of radioactive waste bioremediation.
PMID:28068401
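Technique (1) above, benchmarking real predictors against synthetic noise variables, can be sketched with a simple linear model in place of the paper's random forests. The "radiation" and "pH" data below are simulated stand-ins, not the Chernobyl or waste-tank measurements:

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy radioecology-style data: abundance driven by "radiation" plus one
# truly irrelevant measured covariate (all values simulated).
n = 120
radiation = rng.normal(size=n)
ph = rng.normal(size=n)
abundance = -0.8 * radiation + rng.normal(scale=0.5, size=n)

# Technique (1): append synthetic noise variables as importance benchmarks.
noise = rng.normal(size=(n, 5))
X = np.column_stack([radiation, ph, noise])
X = (X - X.mean(0)) / X.std(0)        # standardize so weights are comparable

coef, *_ = np.linalg.lstsq(
    np.column_stack([np.ones(n), X]), abundance, rcond=None)
importance = np.abs(coef[1:])         # skip the intercept

# A predictor is credible only if it outranks every noise benchmark.
noise_ceiling = importance[2:].max()
print(importance[0] > noise_ceiling, importance[1] > noise_ceiling)
```

The same comparison carries over to random-forest variable importances: a real predictor whose importance does not exceed the best synthetic noise variable should not be interpreted as a detected effect.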
APPLICATIONS OF THE PLASTIC FILM TECHNIQUE IN THE ISOLATION AND STUDY OF ANAEROBIC BACTERIA
Shank, J. L.
1963-01-01
Shank, J. L. (Swift & Co., Chicago, Ill.). Applications of the plastic film technique in the isolation and study of anaerobic bacteria. J. Bacteriol. 86:95–100. 1963.—The use of plastic films as oxygen barriers on the surface of agar pour plates, in conjunction with thioglycolate and other selective and differential agents, allows the primary isolation and enumeration of clostridia and other anaerobes. Quantitative studies reveal little if any inhibition of the test organisms under these conditions, and toxin production, where it occurs, is shown to be virtually unimpaired. Images PMID:14051828
Applications of remote sensing, volume 3
NASA Technical Reports Server (NTRS)
Landgrebe, D. A. (Principal Investigator)
1977-01-01
The author has identified the following significant results. Of the four change detection techniques (post-classification comparison, delta data, spectral/temporal, and layered spectral/temporal), post-classification comparison was selected for further development. This selection was based upon the test performances of the four change detection methods, the straightforwardness of the procedures, and the output products desired. A standardized, modified supervised classification procedure for analyzing the Texas coastal zone data was compiled. This procedure was developed so that all quadrangles in the study area would be classified using similar analysis techniques, allowing meaningful comparisons and evaluations of the classifications.
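The selected post-classification comparison technique amounts to classifying each date independently and then differencing the class maps; the tiny land-cover maps and class codes below are invented to show the mechanics:

```python
import numpy as np

# Two hypothetical classified land-cover maps of the same scene at two
# dates (class codes: 0 = water, 1 = marsh, 2 = urban).
date1 = np.array([[0, 0, 1],
                  [1, 1, 2],
                  [2, 2, 2]])
date2 = np.array([[0, 1, 1],
                  [1, 2, 2],
                  [2, 2, 2]])

# Post-classification comparison: difference the independently produced
# class maps to locate change pixels.
changed = date1 != date2
# Encode each change as a (from, to) transition code for reporting.
transitions = date1[changed] * 10 + date2[changed]

print(changed.sum(), sorted(transitions.tolist()))
```

Because change is derived from two independent classifications, the accuracy of the change map is bounded by the product of the two per-date classification accuracies, which is why per-date accuracy mattered in the selection.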
Is it worth changing pattern recognition methods for structural health monitoring?
NASA Astrophysics Data System (ADS)
Bull, L. A.; Worden, K.; Cross, E. J.; Dervilis, N.
2017-05-01
The key element of this work is to demonstrate alternative strategies for using pattern recognition algorithms in structural health monitoring. This paper seeks to determine whether the choice among a range of established classification techniques, from decision trees and support vector machines to Gaussian processes, makes any practical difference. The classification algorithms are first tested on adjustable synthetic data to establish performance metrics, and then all techniques are applied to real SHM data. To aid the selection of training data, an informative chain of artificial intelligence tools is used to explore an active learning interaction with meaningful clusters of data.
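The benchmarking workflow described above (generate adjustable synthetic data, then score each classifier on it) can be sketched with two deliberately simple stand-in classifiers; the paper's actual comparison uses decision trees, support vector machines, and Gaussian processes, which are not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(6)

# Adjustable synthetic SHM-style data: two "damage state" classes whose
# separation `sep` can be tuned to stress-test the classifiers.
def make_data(n=200, sep=2.0):
    X0 = rng.normal(loc=0.0, size=(n, 2))
    X1 = rng.normal(loc=sep, size=(n, 2))
    X = np.vstack([X0, X1])
    y = np.r_[np.zeros(n), np.ones(n)]
    return X, y

def nearest_centroid(Xtr, ytr, Xte):
    c0, c1 = Xtr[ytr == 0].mean(0), Xtr[ytr == 1].mean(0)
    return (np.linalg.norm(Xte - c1, axis=1)
            < np.linalg.norm(Xte - c0, axis=1)).astype(float)

def one_nn(Xtr, ytr, Xte):
    d = ((Xte[:, None, :] - Xtr[None, :, :]) ** 2).sum(-1)
    return ytr[d.argmin(axis=1)]

Xtr, ytr = make_data()
Xte, yte = make_data()
for clf in (nearest_centroid, one_nn):
    acc = (clf(Xtr, ytr, Xte) == yte).mean()
    print(clf.__name__, round(acc, 3))
```

Sweeping `sep` downward shows where each classifier's accuracy degrades, which is the kind of controlled comparison the paper uses before moving to real SHM data.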
NASA Technical Reports Server (NTRS)
Porro, A. R.; Hingst, W. R.; Davis, D. O.; Blair, A. B., Jr.
1991-01-01
The feasibility of using a contoured honeycomb model to generate a thick boundary layer in high-speed, compressible flow was investigated. The contour of the honeycomb was tailored to selectively remove momentum in a minimum of streamwise distance to create an artificially thickened turbulent boundary layer. Three wind tunnel experiments were conducted to verify the concept. Results indicate that this technique is a viable concept, especially for high-speed inlet testing applications. In addition, the compactness of the honeycomb boundary layer simulator allows relatively easy integration into existing wind tunnel model hardware.
ERIC Educational Resources Information Center
Crocker, Linda M.; Mehrens, William A.
Four new methods of item analysis were used to select subsets of items which would yield measures of attitude change. The sample consisted of 263 students at Michigan State University who were tested on the Inventory of Beliefs as freshmen and retested on the same instrument as juniors. Item change scores and total change scores were computed for…
NCFES
1966-01-01
Included are (1) 22 technical papers (by researchers from many sections of the United States and Canada) pertaining to selection and progeny testing, radiation genetics, intraspecific variation, natural and artificial hybridization, breeding systems, breeding methodology and specialized tree breeding techniques, and applied breeding and allied fields; (2) concise...
Composite materials flown on the Long Duration Exposure Facility
NASA Technical Reports Server (NTRS)
George, Pete E.; Dursch, Harry W.; Pippin, H. Gary
1995-01-01
Organic composite test specimens were flown on several LDEF experiments. Both bare and coated composites were flown. Atomic oxygen eroded bare composite material, with the resins being recessed at a greater rate than the fibers. Selected coating techniques protected the composite substrate in each case. Tensile and optical properties are reported for numerous specimens. Fiberglass and metal matrix composites were also flown.
Grace Sun; Rebecca Ibach; Marek Gnatowski; Jessie Glaeser; Mathew Leung; John Haight
2014-01-01
Various instrumental techniques were used to study the fungal decay process in wood plastic composite (WPC) boards. Commercial boards exposed near Hilo, Hawaii (HI) for eight years in both sun and shadow locations were inspected and tested periodically. After eight years of exposure, both boards were evaluated using magnetic resonance imaging (MRI), while a selected...
The Contribution of Human Factors in Military System Development: Methodological Considerations
1980-07-01
Topics excerpted include risk/uncertainty analysis, project scoring, utility scales, relevance tree techniques (reverse factor analysis), and computer simulation; cited works include Souder, W.E., on scoring methodology and the effectiveness of mathematical models for R&D project selection (Management Science, April 1973, 18). Indexed terms include proficiency test scores (written), radiation effects on aircrew performance in radiation environments, and reaction time.
NASA Astrophysics Data System (ADS)
Kuraszkiewicz, Bożena
2011-01-01
The purpose of this review is to present selected tests available with the potential to detect the development of respiratory muscle fatigue in normal subjects and patients. All reviewed techniques represent a part of a variety of measures and indices, which have been employed to assess this complex process at the present time.
Skill Analysis as a Technique for Predicting Vocational Success of the Mentally Retarded.
ERIC Educational Resources Information Center
Human Resources Center, Albertson, NY.
This study was designed to develop a skill analysis test battery which would aid in the prediction of achievement in two specific areas of training. A total of forty educable mentally retarded students in work study classes were selected for training. A three-part rating scale developed specifically for this study was used as the criterion measure against the…
ERIC Educational Resources Information Center
Valine, Warren J.
This study examines the relative effectiveness of 3 group counseling techniques and a control group in counseling with underachieving college freshmen. The effectiveness of each method was determined through comparison of grade point averages (GPA) as well as by pre- and post-test scores on selected self concept variables of the Tennessee Self…
Elderly quality of life impacted by traditional chinese medicine techniques
Figueira, Helena A; Figueira, Olivia A; Figueira, Alan A; Figueira, Joana A; Giani, Tania S; Dantas, Estélio HM
2010-01-01
Background: The shift in age structure is having a profound impact, suggesting that the aged should be consulted as reporters on the quality of their own lives. Objectives: The aim of this research was to establish the possible impact of traditional Chinese medicine (TCM) techniques on the quality of life (QOL) of the elderly. Sample: Two non-selected, volunteer groups of Rio de Janeiro municipality inhabitants: a control group (36 individuals), not using TCM, and an experimental group (28 individuals), using TCM at ABACO/Sohaku-in Institute, Brazil. Methods: A questionnaire on elderly QOL devised by the World Health Organization, the WHOQOL-Old, was adopted and descriptive statistical techniques were used: mean and standard deviation. The Shapiro–Wilk test checked the normality of the distribution. Furthermore, based on its normality distribution for the intergroup comparison, the Student t test was applied to facets 2, 4, 5, 6, and total score, and the Mann–Whitney U rank test to facets 1 and 3, both tests aiming to analyze the P value between experimental and control groups. The significance level utilized was 95% (P < 0.05). Results: The experimental group reported the highest QOL for every facet and the total score. Conclusions: The results suggest that TCM raises the level of QOL. PMID:21103400
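The decision rule described (a Shapiro-Wilk normality check, then Student's t test for normally distributed facets and the Mann-Whitney U test otherwise, at a 0.05 significance level) can be sketched as follows; the scores and the `compare_groups` helper are illustrative assumptions, not the study's data or code.

```python
# Illustrative sketch: normality check, then parametric or
# non-parametric intergroup comparison (alpha = 0.05).
from scipy import stats

# Invented facet scores for two groups (not the study's data).
experimental = [72.5, 68.0, 75.2, 70.1, 69.8, 74.3, 71.0, 73.6]
control = [61.2, 64.5, 60.8, 63.0, 59.9, 62.4, 65.1, 58.7]

def compare_groups(a, b, alpha=0.05):
    # Use the parametric test only if both samples look normal.
    normal = (stats.shapiro(a).pvalue > alpha and
              stats.shapiro(b).pvalue > alpha)
    if normal:
        result = stats.ttest_ind(a, b)
    else:
        result = stats.mannwhitneyu(a, b, alternative="two-sided")
    return normal, result.pvalue

parametric, p = compare_groups(experimental, control)
print(parametric, p < 0.05)
```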
NASA Technical Reports Server (NTRS)
1972-01-01
The experimental determination of purge bag material properties, development of purge bag manufacturing techniques, experimental evaluation of a subscale purge bag under simulated operating conditions, and experimental evaluation of the purge pin concept for MLI purging are discussed. The basic purge bag material, epoxy fiberglass bounded by skins of FEP Teflon, showed no significant permeability to helium flow under normal operating conditions. Purge bag small-scale manufacturing tests were conducted to develop tooling and fabrication techniques for use in full-scale bag manufacture. A purge bag material layup technique was developed whereby the two plies of epoxy fiberglass enclosed between skins of FEP Teflon are vacuum-bag cured in an oven in a single operation. The material is cured on a tool with the shape of a purge bag half. Plastic tooling was selected for use in bag fabrication. A model purge bag 0.6 m in diameter was fabricated and subjected to a series of structural and environmental tests simulating various flight-type environments. Pressure cycling tests at high (450 K) and low (200 K) temperature as well as acoustic loading tests were performed. The purge bag concept proved to be structurally sound and was used for the detailed design of the full-scale bag.
Kumar, Sasi; Adiga, Kasturi Ramesh; George, Anice
2014-01-01
Old age is a period when people need physical, emotional, and psychological support. Depression is the most prevalent mental health problem among older adults; it contributes to increased medical morbidity and mortality, reduces quality of life, and elevates health care costs. Therefore, early diagnosis and effective management are required to improve the quality of life of older adults suffering from depression. Interventions such as Mindfulness-Based Stress Reduction (MBSR) are powerful relaxation techniques that can relieve depression and negative emotions by increasing mindfulness. The study was undertaken to assess the effectiveness of MBSR on depression among elderly residing in residential homes in Bangalore. A quasi-experimental pre-test post-test control group research design was used. There were two groups, experimental and control; each group had 30 participants selected from different residential homes by a non-probability convenience sampling technique. Pre-test depression and mindfulness were assessed before the first day of intervention. Experimental group participants received the MBSR intervention. Post-test depression and mindfulness were assessed at the end of the intervention programme for both groups. The study revealed a significant reduction in depression (p < 0.001) and increase in mindfulness (p < 0.001) among elderly in the experimental group who were subjected to the MBSR technique.
Protein and genome evolution in Mammalian cells for biotechnology applications.
Majors, Brian S; Chiang, Gisela G; Betenbaugh, Michael J
2009-06-01
Mutation and selection are the essential steps of evolution. Researchers have long used in vitro mutagenesis, expression, and selection techniques in laboratory bacteria and yeast cultures to evolve proteins with new properties, termed directed evolution. Unfortunately, the nature of mammalian cells makes applying these mutagenesis and whole-organism evolution techniques to mammalian protein expression systems laborious and time consuming. Mammalian evolution systems would be useful to test unique mammalian cell proteins and protein characteristics, such as complex glycosylation. Protein evolution in mammalian cells would allow for generation of novel diagnostic tools and designer polypeptides that can only be tested in a mammalian expression system. Recent advances have shown that mammalian cells of the immune system can be utilized to evolve transgenes during their natural mutagenesis processes, thus creating proteins with unique properties, such as fluorescence. On a more global level, researchers have shown that mutation systems that affect the entire genome of a mammalian cell can give rise to cells with unique phenotypes suitable for commercial processes. This review examines the advances in mammalian cell and protein evolution and the application of this work toward advances in commercial mammalian cell biotechnology.
Jung, Gyu-Un; Kim, Jun Hwan; Lim, Nam Hun; Yoon, Gil Ho; Han, Ji-Young
2017-06-01
Ridge splitting techniques are used for horizontal ridge augmentation in implant dentistry. Recently, a novel engine-driven ridge splitting technique was introduced. This study compared the mechanical forces produced by conventional and engine-driven ridge splitting techniques in porcine mandibles. In 33 pigs, mandibular premolar areas were selected for the ridge splitting procedures, designed as a randomized split-mouth study. The conventional group underwent a chisel-and-mallet procedure (control group, n = 20), and percussive impulse (Newton second, Ns) was measured using a sensor attached to the mallet. In the engine-driven ridge spreader group (test group, n = 23), a load cell was used to measure torque values (Newton centimeter, Ncm). Horizontal acceleration generated during procedures (control group, n = 10 and test group, n = 10) was compared between the groups. After ridge splitting, the alveolar crest width was significantly increased both in the control (1.23 ± 0.45 mm) and test (0.98 ± 0.41 mm) groups with no significant differences between the groups. The average impulse of the control group was 4.74 ± 1.05 Ns. Torque generated by rotation in the test group was 9.07 ± 2.15 Ncm. Horizontal acceleration was significantly less in the test group (0.82 ± 1.05 g) than the control group (64.07 ± 42.62 g) (P < 0.001). Narrow edentulous ridges can be expanded by novel engine-driven ridge spreaders. Within the limits of this study, the results suggested that an engine-driven ridge splitting technique may be less traumatic and less invasive than a conventional ridge splitting technique. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Mapping chemicals in air using an environmental CAT scanning system: evaluation of algorithms
NASA Astrophysics Data System (ADS)
Samanta, A.; Todd, L. A.
A new technique is being developed that creates near real-time maps of chemical concentrations in air for environmental and occupational applications. This technique, which we call Environmental CAT Scanning, combines the real-time measuring capability of open-path Fourier transform infrared spectroscopy with the mapping capabilities of computed tomography to produce two-dimensional concentration maps. With this system, a network of open-path measurements is obtained over an area; the measurements are then processed using a tomographic algorithm to reconstruct the concentrations. This research focused on the process of evaluating and selecting appropriate reconstruction algorithms, for use in the field, using test concentration data from both computer simulation and laboratory chamber studies. Four algorithms were tested using three types of data: (1) experimental open-path data from studies that used a prototype open-path Fourier transform/computed tomography system in an exposure chamber; (2) synthetic open-path data generated from maps created by kriging point samples taken in the chamber studies (in 1); and (3) synthetic open-path data generated using a chemical dispersion model to create time series maps. The iterative algorithms used to reconstruct the concentration data were: Algebraic Reconstruction Technique without Weights (ART1), Algebraic Reconstruction Technique with Weights (ARTW), Maximum Likelihood with Expectation Maximization (MLEM), and Multiplicative Algebraic Reconstruction Technique (MART). Maps were evaluated quantitatively and qualitatively. In general, MART and MLEM performed best, followed by ARTW and ART1. However, algorithm performance varied under different contaminant scenarios. This study showed the importance of using a variety of maps, particularly those generated using dispersion models. The time series maps provided a more rigorous test of the algorithms and allowed distinctions to be made among them.
A comprehensive evaluation of algorithms for the environmental application of tomography requires, before field implementation, a battery of test concentration data that models reality and tests the limits of the algorithms.
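As a rough illustration of the iterative reconstruction step, here is a minimal MLEM sketch on a toy 2x2 pixel grid; the path matrix and concentrations are invented, not the chamber geometry, and the example only shows the multiplicative update shared by this family of algorithms.

```python
# Minimal numpy sketch of the MLEM iteration: reconstruct pixel
# concentrations from path-integrated (ray-sum) measurements.
import numpy as np

# Each row is one open path: entries are path lengths through the
# four pixels of a toy 2x2 grid (an assumption for illustration).
A = np.array([[1.0, 1.0, 0.0, 0.0],
              [0.0, 0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0, 0.0],
              [0.0, 1.0, 0.0, 1.0]])
x_true = np.array([2.0, 1.0, 3.0, 4.0])   # "true" pixel concentrations
y = A @ x_true                            # simulated path measurements

x = np.ones(4)                            # positive initial guess
sens = A.T @ np.ones(len(y))              # sensitivity (column sums)
for _ in range(200):
    x *= (A.T @ (y / (A @ x))) / sens     # multiplicative MLEM update

# With only four paths the system is underdetermined, so the
# reconstruction matches the data (A @ x == y) but need not
# equal x_true exactly.
print(np.round(x, 2))
```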
Solomon Technique Versus Selective Coagulation for Twin-Twin Transfusion Syndrome.
Slaghekke, Femke; Oepkes, Dick
2016-06-01
Monochorionic twin pregnancies can be complicated by twin-to-twin transfusion syndrome (TTTS). The best treatment option for TTTS is fetoscopic laser coagulation of the vascular anastomoses between donor and recipient. After laser therapy, residual anastomoses were seen in up to 33% of cases. These residual anastomoses can cause twin anemia polycythemia sequence (TAPS) and recurrent TTTS. In order to reduce the number of residual anastomoses and their complications, a new technique, the Solomon technique, in which the whole vascular equator is coagulated, was introduced. The Solomon technique showed a reduction of recurrent TTTS compared to the selective technique. The incidence of recurrent TTTS after the Solomon technique ranged from 0% to 3.9%, compared to 5.3-8.5% after the selective technique. The incidence of TAPS after the Solomon technique ranged from 0% to 2.9%, compared to 4.2-15.6% after the selective technique. The Solomon technique may improve dual survival rates, ranging from 64% to 85% compared to 46-76% for the selective technique. There was no difference reported in procedure-related complications such as intrauterine infection and preterm premature rupture of membranes. The Solomon technique significantly reduced the incidence of TAPS and recurrent TTTS and may improve survival and neonatal outcome, without identifiable adverse outcomes or complications; therefore, the Solomon technique is recommended for the treatment of TTTS.
NASA Astrophysics Data System (ADS)
Findeis, Dirk; Gryzagoridis, Jasson; Musonda, Vincent
2008-09-01
Digital Shearography and Infrared Thermography (IRT) techniques were employed to non-destructively test samples of composite aircraft structures. Background information on the techniques is presented, and it is noted that much of the inspection work reviewed in the literature has focused on qualitative rather than quantitative evaluation of defects. There is, however, a need to quantify the defects if a threshold rejection criterion, determining whether the inspected component is fit for service, is to be established. In this paper an attempt to quantify induced defects on a helicopter main rotor blade and Unmanned Aerospace Vehicle (UAV) composite material is presented. The fringe patterns exhibited by Digital Shearography were used to quantify the defects by relating the number of fringes created to the depth of the defect or flaw. Qualitative evaluation of defects with IRT was achieved through a hot-spot temperature indication above the flaw on the surface of the material. The results of the work indicate that the Shearographic technique proved to be more sensitive than the IRT technique. It should be mentioned that there is no set standard procedure tailored for the testing of composites. Each composite material tested is likely to respond differently to defect detection, depending generally on the component geometry and a suitable selection of the loading system to suit a particular test. The experimental procedure reported in this paper can be used as a basis for designing a testing or calibration procedure for defect detection on any particular composite material component or structure.
NASA Astrophysics Data System (ADS)
Vatutin, Eduard
2017-12-01
The article analyzes the effectiveness of heuristic methods based on limited depth-first search techniques for the test problem of finding the shortest path in a graph. It briefly describes the group of methods, based on limiting the number of branches of the combinatorial search tree and limiting the depth of the analyzed subtrees, used to solve the problem. The methodology for comparing experimental data to estimate solution quality is based on computational experiments with samples of pseudo-randomly structured graphs with selected numbers of vertices and arcs, using the BOINC platform. The experimental results identify the areas of preferable usage of the selected subset of heuristic methods depending on the size of the problem and the strength of the constraints. It is shown that the considered pair of methods is ineffective for the selected problem and significantly inferior, in solution quality, to the ant colony optimization method and its modification with combinatorial returns.
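A minimal sketch of the kind of method studied: a depth-first search that limits both the number of branches expanded per node and the depth of the explored subtree. The `limited_dfs` helper and the toy graph are illustrative assumptions, not the article's code.

```python
# Depth-first shortest-path heuristic with two limits: at each node
# only the `width` cheapest outgoing edges are expanded, and paths
# longer than `depth` edges are abandoned.
def limited_dfs(graph, start, goal, width=2, depth=5):
    best = [float("inf"), None]          # best cost and path found so far

    def walk(node, cost, path):
        if cost >= best[0] or len(path) > depth:
            return                       # prune: too costly or too deep
        if node == goal:
            best[:] = [cost, path]
            return
        # branch limit: expand only the `width` cheapest edges
        for nxt, w in sorted(graph.get(node, []), key=lambda e: e[1])[:width]:
            if nxt not in path:          # avoid cycles
                walk(nxt, cost + w, path + [nxt])

    walk(start, 0, [start])
    return best

graph = {"a": [("b", 1), ("c", 4), ("d", 10)],
         "b": [("c", 2), ("d", 6)],
         "c": [("d", 3)]}
print(limited_dfs(graph, "a", "d"))
```

Because the search tree is truncated, the result is only a heuristic answer; on unfavorable graphs the pruned branches may contain the true shortest path, which is the quality loss the article measures.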
NASA Technical Reports Server (NTRS)
Langhans, Robert W.
1997-01-01
The aims under this training grant, as under the subsequent fellowship, were to elaborate the theory and technique of cultivar evaluation for specialized controlled environments, then to employ the technique on selected crops, ultimately conducting cultivar trials, and making the knowledge gained available for use in NASA's space program. We undertook a comprehensive search of the Cornell agricultural library (Mann Library) and its databases for any and all material relating to cultivar evaluation of vegetable crops, and also developed the logic of how to narrow down the field of contending cultivars when undertaking cultivar trials. The results of this work, the principal outcome of the grant, are reflected in David's MS thesis, particularly in Chapter 2, "Commercial and Scientific Literature," and even more so in Chapter 8, "Selecting cultivars and lines for screening." David also attended annual conferences of vegetable crop plant breeders, annual yield trials and breeding trials for vegetable crops, and relevant professional conferences such as the ASHS annual meetings, among others.
Murphy, Christian; Vaughan, Moses; Ilahi, Waseem; Kaiser, Gail
2010-01-01
For large, complex software systems, it is typically impossible in terms of time and cost to reliably test the application in all possible execution states and configurations before releasing it into production. One proposed way of addressing this problem has been to continue testing and analysis of the application in the field, after it has been deployed. A practical limitation of many such automated approaches is the potentially high performance overhead incurred by the necessary instrumentation. However, it may be possible to reduce this overhead by selecting test cases and performing analysis only in previously-unseen application states, thus reducing the number of redundant tests and analyses that are run. Solutions for fault detection, model checking, security testing, and fault localization in deployed software may all benefit from a technique that ignores application states that have already been tested or explored. In this paper, we present a solution that ensures that deployment environment tests are only executed in states that the application has not previously encountered. In addition to discussing our implementation, we present the results of an empirical study that demonstrates its effectiveness, and explain how the new approach can be generalized to assist other automated testing and analysis techniques intended for the deployment environment. PMID:21197140
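The core idea, executing a deployment-time test only in previously-unseen application states, can be sketched as below; the fingerprinting scheme and all names are illustrative assumptions, not the paper's implementation.

```python
# Sketch: fingerprint the application state and run a test only if
# that state has not been encountered before, avoiding redundant
# tests and their instrumentation overhead.
import hashlib
import json

seen_states = set()

def state_fingerprint(state: dict) -> str:
    # Canonical serialization so equivalent states hash identically.
    return hashlib.sha256(
        json.dumps(state, sort_keys=True).encode()).hexdigest()

def maybe_run_test(state: dict, test) -> bool:
    """Run `test` only in previously-unseen states; return True if run."""
    fp = state_fingerprint(state)
    if fp in seen_states:
        return False                 # redundant: state already explored
    seen_states.add(fp)
    test(state)
    return True

runs = []
maybe_run_test({"cache": "warm", "users": 3}, runs.append)
maybe_run_test({"users": 3, "cache": "warm"}, runs.append)  # same state
print(len(runs))
```

The practical questions, which parts of the state to include in the fingerprint and how to keep the lookup cheap in production, are exactly what such techniques must address.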
Prediction of Baseflow Index of Catchments using Machine Learning Algorithms
NASA Astrophysics Data System (ADS)
Yadav, B.; Hatfield, K.
2017-12-01
We present the results of eight machine learning techniques for predicting the baseflow index (BFI) of ungauged basins using surrogates of catchment-scale climate and physiographic data. The tested algorithms include ordinary least squares, ridge regression, least absolute shrinkage and selection operator (lasso), elastic net, support vector machine, gradient boosted regression trees, random forests, and extremely randomized trees. Our work seeks to identify the dominant controls of BFI that can be readily obtained from ancillary geospatial databases and remote sensing measurements, such that the developed techniques can be extended to ungauged catchments. More than 800 gauged catchments spanning the continental United States were selected to develop the general methodology. The BFI calculation was based on baseflow separated from the daily streamflow hydrograph using the HYSEP filter. The surrogate catchment attributes were compiled from multiple sources, including digital elevation models, soil, land use, and climate data, and other publicly available ancillary and geospatial data. 80% of the catchments were used to train the ML algorithms, and the remaining 20% were used as an independent test set to measure the generalization performance of the fitted models. k-fold cross-validation with an exhaustive grid search was used to fit the hyperparameters of each model. Initial model development was based on 19 independent variables, but after variable selection and feature ranking we generated revised sparse models of BFI prediction based on only six catchment attributes. These key predictive variables, selected after careful evaluation of the bias-variance tradeoff, include average catchment elevation, slope, fraction of sand, permeability, temperature, and precipitation.
The most promising algorithms exceeding an accuracy score (r-square) of 0.7 on test data include support vector machine, gradient boosted regression trees, random forests, and extremely randomized trees. Considering both the accuracy and the computational complexity of these algorithms, we identify the extremely randomized trees as the best performing algorithm for BFI prediction in ungauged basins.
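The train/test workflow described above can be sketched with scikit-learn; the synthetic data stands in for the catchment attributes, and the hyperparameters are arbitrary (the grid search and feature-ranking steps are omitted).

```python
# Sketch of the 80/20 split and an extremely-randomized-trees
# regressor scored by r-square, as in the described workflow.
# The data below is synthetic, not the catchment dataset.
import numpy as np
from sklearn.ensemble import ExtraTreesRegressor
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(800, 6))        # six attributes (elevation, slope, ...)
bfi = 0.5 * X[:, 0] - 0.3 * X[:, 1] + 0.1 * rng.normal(size=800)

X_tr, X_te, y_tr, y_te = train_test_split(X, bfi, test_size=0.2,
                                          random_state=0)
model = ExtraTreesRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print(round(r2_score(y_te, model.predict(X_te)), 2))
```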
Practical input optimization for aircraft parameter estimation experiments. Ph.D. Thesis, 1990
NASA Technical Reports Server (NTRS)
Morelli, Eugene A.
1993-01-01
The object of this research was to develop an algorithm for the design of practical, optimal flight test inputs for aircraft parameter estimation experiments. A general, single-pass technique was developed which allows global optimization of the flight test input design for parameter estimation using the principles of dynamic programming, with the input forms limited to square waves only. Provision was made for practical constraints on the input, including amplitude constraints, control system dynamics, and selected input frequency range exclusions. In addition, the input design was accomplished while imposing output amplitude constraints required by model validity and considerations of safety during the flight test. The algorithm has multiple input design capability, with optional inclusion of a constraint that only one control moves at a time, so that a human pilot can implement the inputs. It is shown that the technique can be used to design experiments for estimation of open loop model parameters from closed loop flight test data. The report includes a new formulation of the optimal input design problem, a description of a new approach to the solution, and a summary of the characteristics of the algorithm, followed by three example applications that demonstrate the quality and expanded capabilities of the input designs produced by the new technique. In all cases, the new input design approach showed significant improvement over previous input design methods in terms of achievable parameter accuracies.
Lung Function Measurements in Rodents in Safety Pharmacology Studies
Hoymann, Heinz Gerd
2012-01-01
The ICH guideline S7A requires safety pharmacology tests including measurements of pulmonary function. In the first step – as part of the “core battery” – lung function tests in conscious animals are requested. If potential adverse effects raise concern for human safety, these should be explored in a second step as a “follow-up study”. For these two stages of safety pharmacology testing, both non-invasive and invasive techniques are needed, which should be as precise and reliable as possible. A short overview of typical in vivo measurement techniques is given, their advantages and disadvantages are discussed, and out of these the non-invasive head-out body plethysmography and the invasive but repeatable body plethysmography in orotracheally intubated rodents are presented in detail. For validation purposes, the changes in the respective parameters, such as tidal midexpiratory flow (EF50) or lung resistance, have been recorded in the same animals in typical bronchoconstriction models and compared. In addition, the technique of head-out body plethysmography has been shown to be useful for measuring lung function in juvenile rats starting from day two of age. This allows safety pharmacology testing and toxicological studies in juvenile animals as a model for the young developing organism, as requested by the regulatory authorities (e.g., EMEA Guideline 1/2008). It is concluded that both invasive and non-invasive pulmonary function tests are capable of detecting effects and alterations on the respiratory system, with different selectivity and areas of operation. The use of both techniques in a large number of studies in mice and rats in recent years has demonstrated that they provide useful and reliable information on pulmonary mechanics in safety pharmacology and toxicology testing, in investigations of respiratory disorders, and in pharmacological efficacy studies. PMID:22973226
A New Continuous Cooling Transformation Diagram for AISI M4 High-Speed Tool Steel
NASA Astrophysics Data System (ADS)
Briki, Jalel; Ben Slima, Souad
2008-12-01
The increasing evolution of dilatometric techniques now allows for the identification of structural transformations with very low signal. The use of dilatometric techniques, coupled with more common techniques such as metallography, hardness testing, and x-ray diffraction, allows the plotting of a new CCT diagram for AISI M4 high-speed tool steel. This diagram is useful for a better selection of alternate solution, hardening, and tempering heat treatments. A more accurate determination of the various fields of transformation of austenite during cooling was made. The precipitation of carbides highlighted at high temperature is at the origin of the martensitic transformation occurring in two stages (splitting phenomenon). For slow cooling rates, it was possible to highlight the ferritic, pearlitic, and bainitic transformations.
NASA Technical Reports Server (NTRS)
Baum, J. D.; Levine, J. N.
1980-01-01
The selection of a satisfactory numerical method for calculating the propagation of steep-fronted, shock-like waveforms in a solid rocket motor combustion chamber is discussed. A number of different numerical schemes were evaluated by comparing the results obtained for three problems: the shock tube problem, the linear wave equation, and nonlinear wave propagation in a closed tube. The most promising method, a combination of the Lax-Wendroff, Hybrid, and Artificial Compression techniques, was incorporated into an existing nonlinear instability program. The capability of the modified program to treat steep-fronted wave instabilities in low-smoke tactical motors was verified by solving a number of motor test cases with disturbance amplitudes as high as 80% of the mean pressure.
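For reference, the basic Lax-Wendroff update for the linear wave (advection) equation, one of the three test problems named, looks as follows; the grid, CFL number, and initial pulse are arbitrary choices, and the scheme shown is the plain method without the hybrid or artificial-compression additions evaluated in the report.

```python
# Lax-Wendroff step for u_t + c u_x = 0 on a periodic grid:
# u_new = u - (nu/2)(u[i+1]-u[i-1]) + (nu^2/2)(u[i+1]-2u[i]+u[i-1]),
# where nu = c*dt/dx is the CFL number (stable for nu <= 1).
import numpy as np

nx, c = 200, 1.0
dx, dt = 1.0 / nx, 0.4 / nx                      # CFL = c*dt/dx = 0.4
x = np.linspace(0.0, 1.0, nx, endpoint=False)
u = np.exp(-200.0 * (x - 0.3) ** 2)              # initial Gaussian pulse
nu = c * dt / dx

for _ in range(100):                             # advance 100 time steps
    up = np.roll(u, -1)                          # u[i+1] (periodic)
    um = np.roll(u, 1)                           # u[i-1]
    u = u - 0.5 * nu * (up - um) + 0.5 * nu**2 * (up - 2 * u + um)

# The pulse should have advected by c*100*dt = 0.2, to x = 0.5.
print(round(float(x[int(np.argmax(u))]), 3))
```

For steep-fronted waves this second-order scheme produces dispersive oscillations near the front, which is precisely why the report combines it with hybrid and artificial-compression techniques.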
Analysis of 3D printing parameters of gears for hybrid manufacturing
NASA Astrophysics Data System (ADS)
Budzik, Grzegorz; Przeszlowski, Łukasz; Wieczorowski, Michal; Rzucidlo, Arkadiusz; Gapinski, Bartosz; Krolczyk, Grzegorz
2018-05-01
The paper deals with the analysis and selection of parameters for rapid prototyping of gears by selective sintering of metal powders. The presented results show the wide spectrum of application of RP systems in manufacturing processes for machine elements, based on an analysis of the market in terms of the application of additive manufacturing technology in different sectors of industry. Considerable growth of these methods over the past years can be observed. The characteristic errors of the printed model with respect to the ideal one were pointed out for each technique. Special attention was paid to the method of preparation of numerical data (CAD/STL/RP). Moreover, an analysis of the manufacturing processes of gear-type elements was presented. The tested gears were modeled with different allowances for final machining and made by DMLS. Metallographic analysis and strength tests on prepared specimens were performed. The above-mentioned analysis and tests were used to compare the real properties of the material with the nominal ones. To improve the surface quality after sintering, the gears were subjected to final machining. An analysis of the geometry of the gears after the hybrid manufacturing method was performed (fig. 1). The manufacturing process was defined in a traditional way as well as with the aid of modern manufacturing techniques. The methodology and obtained results can be used for machine elements other than gears and contribute to the general theory of production processes in rapid prototyping methods as well as in the design and implementation of production.
Clones identification of Sequoia sempervirens (D. Don) Endl. in Chile by using PCR-RAPDs technique.
Toral Ibañez, Manuel; Caru, Margarita; Herrera, Miguel A; Gonzalez, Luis; Martin, Luis M; Miranda, Jorge; Navarro-Cerrillo, Rafael M
2009-02-01
A protocol of polymerase chain reaction-random amplified polymorphic DNAs (PCR-RAPDs) was established to analyse the gene diversity and genotype identification for clones of Sequoia sempervirens (D. Don) Endl. in Chile. Ten (out of 34) clones from introduction trial located in Voipir-Villarrica, Chile, were studied. The PCR-RAPDs technique and a modified hexadecyltrimethylammonium bromide (CTAB) protocol were used for genomic DNA extraction. The PCR tests were carried out employing 10-mer random primers. The amplification products were detected by electrophoresis in agarose gels. Forty nine polymorphic bands were obtained with the selected primers (BG04, BF07, BF12, BF13, and BF14) and were ordered according to their molecular size. The genetic similarity between samples was calculated by the Jaccard index and a dendrogram was constructed using a cluster analysis of unweighted pair group method using arithmetic averages (UPGMA). Of the primers tested, 5 (out of 60) RAPD primers were selected for their reproducibility and high polymorphism. A total of 49 polymorphic RAPD bands were detected out of 252 bands. The genetic similarity analysis demonstrates an extensive genetic variability between the tested clones and the dendrogram depicts the genetic relationships among the clones, suggesting a geographic relationship. The results indicate that the RAPD markers permitted the identification of the assayed clones, although they are derived from the same geographic origin.
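The similarity and clustering steps described (pairwise Jaccard index on band presence/absence, then UPGMA, i.e. average-linkage clustering) can be sketched with scipy; the band matrix below is invented, not the study's data.

```python
# Sketch: RAPD bands scored as presence/absence, pairwise Jaccard
# dissimilarity, then UPGMA (average linkage) for the dendrogram.
import numpy as np
from scipy.cluster.hierarchy import linkage
from scipy.spatial.distance import pdist

# rows = clones, columns = polymorphic RAPD bands (True = band present)
bands = np.array([[1, 1, 0, 1, 0],
                  [1, 1, 0, 0, 0],
                  [0, 0, 1, 1, 1],
                  [0, 1, 1, 1, 1]], dtype=bool)

jaccard_dist = pdist(bands, metric="jaccard")   # 1 - Jaccard similarity
tree = linkage(jaccard_dist, method="average")  # UPGMA merge table

print(np.round(1 - jaccard_dist, 2))            # pairwise similarities
```

`scipy.cluster.hierarchy.dendrogram(tree)` would then draw the dendrogram from which geographic or genetic groupings are read off.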
NASA Astrophysics Data System (ADS)
McIntyre, Gregory; Neureuther, Andrew; Slonaker, Steve; Vellanki, Venu; Reynolds, Patrick
2006-03-01
The initial experimental verification of a polarization monitoring technique is presented. A series of phase shifting mask patterns produce polarization dependent signals in photoresist and are capable of monitoring the Stokes parameters of any arbitrary illumination scheme. Experiments on two test reticles have been conducted. The first reticle consisted of a series of radial phase gratings (RPG) and employed special apertures to select particular illumination angles. Measurement sensitivities of about 0.3 percent of the clear field per percent change in polarization state were observed. The second test reticle employed the more sensitive proximity effect polarization analyzers (PEPA), a more robust experimental setup, and a backside pinhole layer for illumination angle selection and to enable characterization of the full illuminator. Despite an initial complication with the backside pinhole alignment, the results correlate with theory. Theory suggests that, once the pinhole alignment is corrected in the near future, the second reticle should achieve a measurement sensitivity of about 1 percent of the clear field per percent change in polarization state. This corresponds to a measurement of the Stokes parameters after test mask calibration, to within about 0.02 to 0.03. Various potential improvements to the design, fabrication of the mask, and experimental setup are discussed. Additionally, to decrease measurement time, a design modification and double exposure technique is proposed to enable electrical detection of the measurement signal.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fossaceca, Rita, E-mail: rfossaceca@hotmail.com; Guzzardi, Giuseppe, E-mail: guz@libero.it; Cerini, Paolo, E-mail: cerini84@hotmail.it
Purpose. To evaluate the efficacy of percutaneous transluminal angioplasty (PTA) in a selected population of diabetic patients with below-the-knee (BTK) disease and to analyze the reliability of the angiosome model. Methods. We made a retrospective analysis of the results of PTA performed in 201 diabetic patients with BTK-only disease treated at our institute from January 2005 to December 2011. We evaluated the postoperative technical success, and at 1, 6, and 12 months' follow-up, we assessed the rates and values of partial and complete ulcer healing, restenosis, major and minor amputation, limb salvage, and percutaneous oximetry (TcPO₂) (Student's t test). We used the angiosome model to compare different clinicolaboratory outcomes in patients treated by direct revascularization (DR) with those in patients treated by the indirect revascularization (IR) technique (Student's t test and the χ² test). Results. At a mean ± standard deviation follow-up of 17.5 ± 12 months, we observed a mortality rate of 3.5 %, a major amputation rate of 9.4 %, and a limb salvage rate of 87 %, with a statistically significant increase of TcPO₂ values at follow-up compared to baseline (p < 0.05). In 34 patients, treatment was performed with the IR technique and in 167 by DR; in both groups, there was a statistically significant increase of TcPO₂ values at follow-up compared to baseline (p < 0.05), without statistically significant differences in therapeutic efficacy. Conclusion. PTA of BTK-only disease is a safe and effective option. The DR technique is the first treatment option; we believe, however, that IR is similarly effective, with good results over time.
Llorente Ballesteros, M T; Navarro Serrano, I; López Colón, J L
2015-01-01
The aim of this report is to propose a scheme for validation of an analytical technique according to ISO 17025. The fundamental parameters tested were: selectivity, calibration model, precision, accuracy, uncertainty of measurement, and analytical interference. A protocol has been developed and applied successfully to quantify zinc in serum by atomic absorption spectrometry. It is demonstrated that our method is selective, linear, accurate, and precise, making it suitable for use in routine diagnostics. Copyright © 2015 SECA. Published by Elsevier Espana. All rights reserved.
Statistical approach for selection of biologically informative genes.
Das, Samarendra; Rai, Anil; Mishra, D C; Rai, Shesh N
2018-05-20
Selection of informative genes from high-dimensional gene expression data has emerged as an important research area in genomics. Many of the gene selection techniques proposed so far are based on either a relevancy or a redundancy measure. Further, the performance of these techniques has been adjudged through post-selection classification accuracy computed through a classifier using the selected genes. This performance metric may be statistically sound but may not be biologically relevant. A statistical approach, i.e. Boot-MRMR, was proposed based on a composite measure of maximum relevance and minimum redundancy, which is both statistically sound and biologically relevant for informative gene selection. For comparative evaluation of the proposed approach, we developed two biological sufficiency criteria, i.e. Gene Set Enrichment with QTL (GSEQ) and a biological similarity score based on Gene Ontology (GO). Further, a systematic and rigorous evaluation of the proposed technique against 12 existing gene selection techniques was carried out using five gene expression datasets. This evaluation was based on a broad spectrum of statistically sound (e.g. subject classification) and biologically relevant (based on QTL and GO) criteria under a multiple-criteria decision-making framework. The performance analysis showed that the proposed technique selects informative genes which are more biologically relevant. The proposed technique is also quite competitive with the existing techniques with respect to subject classification and computational time. Our results also showed that under the multiple-criteria decision-making setup, the proposed technique is best for informative gene selection over the available alternatives. Based on the proposed approach, an R package, BootMRMR, has been developed and is available at https://cran.r-project.org/web/packages/BootMRMR.
This study will provide a practical guide to selecting statistical techniques for identifying informative genes from high-dimensional expression data for breeding and systems biology studies. Published by Elsevier B.V.
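The relevance-redundancy trade-off behind Boot-MRMR can be illustrated with a minimal greedy mRMR sketch. This is not the authors' Boot-MRMR implementation (which adds a bootstrap layer and is distributed as the BootMRMR R package); it is a plain Python sketch on invented toy data, using absolute Pearson correlation for both relevance and redundancy:

```python
import math

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy) if sx and sy else 0.0

def mrmr_select(genes, labels, k):
    """Greedy maximum-relevance minimum-redundancy gene selection.

    genes: dict of gene name -> expression values across samples.
    labels: numeric class labels, one per sample.
    Score of a candidate = relevance to the labels minus its mean
    redundancy (absolute correlation) with already-selected genes.
    """
    selected = []
    remaining = set(genes)
    relevance = {g: abs(pearson(genes[g], labels)) for g in genes}
    while remaining and len(selected) < k:
        def score(g):
            if not selected:
                return relevance[g]
            red = sum(abs(pearson(genes[g], genes[s]))
                      for s in selected) / len(selected)
            return relevance[g] - red
        best = max(remaining, key=score)
        selected.append(best)
        remaining.remove(best)
    return selected

# Toy data: hypothetical gene "g1" tracks the class labels exactly
genes = {"g1": [0, 0, 1, 1], "g2": [0, 1, 0, 1], "g3": [0, 0, 1, 2]}
labels = [0, 0, 1, 1]
print(mrmr_select(genes, labels, 1))  # ['g1']
```

The bootstrap variant repeats this selection over resampled subject sets and keeps genes that are selected consistently.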
NASA Astrophysics Data System (ADS)
Lynch, John T.
1987-02-01
The present technique for coping with fading and burst noise on HF channels used in digital voice communications transmits digital voice only during high S/N time intervals, and speeds up the speech when necessary to avoid conversation-hindering delays. On the basis of informal listening tests, four test conditions were selected in order to characterize those conditions of speech interruption which would render it comprehensible or incomprehensible. One of the test conditions, 2 secs on and 1/2-sec off, yielded test scores comparable to the reference continuous speech case and is a reasonable match to the temporal variations of a disturbed ionosphere.
Chacko, Shiny
2014-01-01
The conceptual framework of the study, undertaken in select health centres of New Delhi, was based on the General System Model. The research approach was evaluative with a one-group pre-test and post-test design. The study population comprised Community Health Workers working in selected centres in Najafgarh, Delhi. A purposive sampling technique was used to select a sample of 30 Community Health Workers. A structured knowledge questionnaire was developed to assess the knowledge of subjects. A Structured Teaching Programme was developed to enhance the knowledge of Community Health Workers. The pre-test was given on day 1 and the Structured Teaching Programme administered the same day. The post-test was conducted on day 7. Most of the Community Health Workers were in the age group of 21-30 years with academic qualification up to Higher Secondary level. Most Community Health Workers had professional qualification as ANM/MPHW (female). The majority of the Community Health Workers had experience up to 5 years. Initially there was a deficit in the knowledge scores of Community Health Workers regarding the Visual Inspection with Acetic Acid (VIA) test. Mean post-test knowledge scores of Community Health Workers were found to be significantly higher than their mean pre-test knowledge scores, indicating a significant gain in knowledge after exposure to the Structured Teaching Programme. The study reveals the efficacy of the Structured Teaching Programme in enhancing the knowledge of Community Health Workers regarding the VIA test, and a need for conducting regular and well-planned health teaching programmes on the VIA test to improve their knowledge for the early detection and diagnosis of cervical cancer.
Richardson, Alice M; Lidbury, Brett A
2017-08-14
Data mining techniques such as support vector machines (SVMs) have been successfully used to predict outcomes for complex problems, including for human health. Much health data is imbalanced, with many more controls than positive cases. The impact of three balancing methods and one feature selection method is explored, to assess the ability of SVMs to classify imbalanced diagnostic pathology data associated with the laboratory diagnosis of hepatitis B (HBV) and hepatitis C (HCV) infections. Random forests (RFs) for predictor variable selection, and data reshaping to overcome a large imbalance of negative to positive test results in relation to HBV and HCV immunoassay results, are examined. The methodology is illustrated using data from ACT Pathology (Canberra, Australia), consisting of laboratory test records from 18,625 individuals who underwent hepatitis virus testing over the decade from 1997 to 2007. Overall, the prediction of HCV test results by immunoassay was more accurate than for HBV immunoassay results associated with identical routine pathology predictor variable data. HBV and HCV negative results were vastly in excess of positive results, so three approaches to handling the negative/positive data imbalance were compared. Generating datasets by the Synthetic Minority Oversampling Technique (SMOTE) resulted in significantly more accurate prediction than single downsizing or multiple downsizing (MDS) of the dataset. For downsized data sets, applying a RF for predictor variable selection had a small effect on the performance, which varied depending on the virus. For SMOTE, a RF had a negative effect on performance. An analysis of variance of the performance across settings supports these findings. Finally, age and assay results for alanine aminotransferase (ALT), sodium for HBV and urea for HCV were found to have a significant impact upon laboratory diagnosis of HBV or HCV infection using an optimised SVM model. 
Laboratories looking to include machine learning via SVM as part of their decision support need to be aware that the balancing method, predictor variable selection and the virus type interact to affect the laboratory diagnosis of hepatitis virus infection with routine pathology laboratory variables in different ways depending on which combination is being studied. This awareness should lead to careful use of existing machine learning methods, thus improving the quality of laboratory diagnosis.
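The SMOTE step that proved decisive above can be sketched in a few lines. This simplified version (not the canonical imbalanced-learn implementation) interpolates synthetic points between a minority-class sample and one of its nearest minority neighbours:

```python
import random

def smote(minority, n_new, k=2, rng=random.Random(0)):
    """Generate n_new synthetic minority samples (simplified SMOTE).

    For each new point: pick a minority sample at random, find its k
    nearest minority neighbours (squared Euclidean distance), pick one
    neighbour at random, and interpolate a random fraction of the way
    along the segment between the two.
    """
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))

    synthetic = []
    for _ in range(n_new):
        base = rng.choice(minority)
        neighbours = sorted((p for p in minority if p is not base),
                            key=lambda p: dist2(base, p))[:k]
        nb = rng.choice(neighbours)
        t = rng.random()
        synthetic.append([x + t * (y - x) for x, y in zip(base, nb)])
    return synthetic

# Toy minority class of three 2-D points (hypothetical feature vectors)
minority = [[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]]
print(len(smote(minority, 5)))  # 5
```

Because each synthetic point lies on a segment between two real minority samples, the oversampled class fills in its own region of feature space rather than duplicating points, which is what gave SMOTE its edge over downsizing here.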
Felipe, Maria Emília M C; Andrade, Patrícia F; Grisi, Marcio F M; Souza, Sérgio L S; Taba, Mário; Palioto, Daniela B; Novaes, Arthur B
2007-07-01
The aim of this randomized, controlled, clinical investigation was to compare two surgical techniques for root coverage with the acellular dermal matrix graft to evaluate which technique provided better root coverage, a better esthetic result, and less postoperative discomfort. Fifteen patients with bilateral Miller Class I or II gingival recessions were selected. Fifteen pairs of recessions were treated and assigned randomly to the test group, and the contralateral recessions were assigned to the control group. The control group was treated with a broader flap and vertical releasing incisions; the test group was treated with the proposed surgical technique, without vertical releasing incisions. The clinical parameters evaluated were probing depth, relative clinical attachment level, gingival recession (GR), width of keratinized tissue, thickness of keratinized tissue, esthetic result, and pain evaluation. The measurements were taken before the surgeries and after 6 months. At baseline, all parameters were similar for both groups. At 6 months, a statistically significant greater reduction in GR favored the control group. The percentage of root coverage was 68.98% and 84.81% for the test and control groups, respectively. The esthetic result was equivalent between the groups, and all patients tolerated both procedures well. Both techniques provided significant root coverage, good esthetic results, and similar levels of postoperative discomfort. However, the control technique had statistically significantly better results for root coverage of localized gingival recessions.
Feathering effect detection and artifact agglomeration index-based video deinterlacing technique
NASA Astrophysics Data System (ADS)
Martins, André Luis; Rodrigues, Evandro Luis Linhari; de Paiva, Maria Stela Veludo
2018-03-01
Several video deinterlacing techniques have been developed, and each one performs better under certain conditions. Occasionally, even the most modern deinterlacing techniques create frames of worse quality than primitive deinterlacing processes. This paper shows that the final image quality can be improved by combining different types of deinterlacing techniques. The proposed strategy is able to select between two types of deinterlaced frames and, if necessary, make local corrections of the defects. This decision is based on an artifact agglomeration index obtained from a feathering effect detection map. Starting from a deinterlaced frame produced by the "interfield average" method, the defective areas are identified and, if deemed appropriate, replaced by pixels generated through the "edge-based line average" method. Test results show that the proposed technique is able to produce video frames of higher quality than any single deinterlacing technique, by taking the best from both intra- and interfield methods.
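The two base methods the strategy selects between are simple to state for a single missing scan line. The sketch below assumes grayscale pixel values and is an illustration of the generic techniques, not the paper's implementation:

```python
def interfield_average(above, below):
    """Inter-field average: plain vertical mean of the neighbouring lines."""
    return [(a + b) / 2 for a, b in zip(above, below)]

def ela_line(above, below):
    """Edge-based line average: interpolate one missing scan line.

    For each pixel, compare absolute differences along three candidate
    edge directions (left diagonal, vertical, right diagonal) between
    the lines above and below, then average along the direction with
    the smallest difference, following the edge rather than crossing it.
    """
    out = []
    n = len(above)
    for i in range(n):
        candidates = []
        for d in (-1, 0, 1):
            j, k = i + d, i - d
            if 0 <= j < n and 0 <= k < n:
                diff = abs(above[j] - below[k])
                candidates.append((diff, (above[j] + below[k]) / 2))
        out.append(min(candidates)[1])
    return out

# On uniform lines both methods agree (toy grayscale values)
print(ela_line([5, 5, 5], [5, 5, 5]))  # [5.0, 5.0, 5.0]
```

The combining strategy described in the abstract keeps the inter-field result where no feathering is detected and swaps in the edge-based estimate only inside flagged artifact agglomerations.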
Systematic Review of Retraction Devices for Laparoscopic Surgery.
Vargas-Palacios, Armando; Hulme, Claire; Veale, Thomas; Downey, Candice L
2016-02-01
Retraction plays a vital role in optimizing the field of vision in minimal-access surgery. As such, a number of devices have been marketed to aid the surgeon in laparoscopic retraction. This systematic review explores the advantages and disadvantages of the different instruments in order to aid surgeons and their institutions in selecting the appropriate device. Primary outcome measures include operation time, length of stay, use of staff, patient morbidity, ease of use, conversion rates to open surgery, and cost. Systematic literature searches were performed in MEDLINE, EMBASE, The Cochrane Library, Current Controlled Trials, and ClinicalTrials.gov. The search strategy focused on studies testing a retraction device. The selection process was based on a predefined set of inclusion and exclusion criteria. Data were then extracted and analyzed. Out of 1360 papers initially retrieved, 12 articles were selected for data extraction and analysis. A total of 10 instruments or techniques were tested. Devices included the Nathanson's liver retractor, liver suspension tape, the V-List technique, a silicone disk with or without a snake retractor, the Endoloop, the Endograb, a magnetic retractor, the VaroLift, a laparoscope holder, and a retraction sponge. None of the instruments reported were associated with increased morbidity. No studies found increased rates of conversion to open surgery. All articles reported that the tested instruments might spare the use of an assistant during the procedure. It was not possible to determine the impact on length of stay or operation time. Each analyzed device facilitates retraction, providing a good field of view while allowing reduced staff numbers and minimal patient morbidity. Due to economic and environmental advantages, reusable devices may be preferable to disposable instruments, although the choice must be primarily based on clinical judgement. © The Author(s) 2015.
Cho, Chak-Lam; Majzoub, Ahmad; Esteves, Sandro C.
2017-01-01
Sperm DNA fragmentation (SDF) testing has been emerging as a valuable tool for male fertility evaluation. While the essential role of sperm DNA integrity in human reproduction has been extensively studied, the clinical indication for SDF testing is less clear. This clinical practice guideline provides evidence-supported recommendations on the clinical utility of the test. It is intended to serve as a reference for fertility specialists in identifying the circumstances in which SDF testing should be of greatest clinical value. SDF testing is recommended in patients with clinical varicocele and borderline to normal semen parameters, as it can better select varicocelectomy candidates. Outcomes of natural pregnancy and assisted reproductive techniques (ART) can be predicted by the results of SDF tests. High SDF is also linked with recurrent pregnancy loss (RPL) and failure of ART. The result of SDF testing may change the management decision by selecting the most appropriate ART with the highest success rate for infertile couples. Several studies have demonstrated the benefit of using testicular instead of ejaculated sperm in men with high SDF, oligozoospermia, or recurrent in vitro fertilization (IVF) failure. Infertile men with modifiable lifestyle factors may benefit from SDF testing by reinforcing risk factor modification and monitoring the patient's progress after intervention. PMID:29082206
Mapping land use changes in the carboniferous region of Santa Catarina, report 2
NASA Technical Reports Server (NTRS)
Valeriano, D. D. (Principal Investigator); Bitencourtpereira, M. D.
1983-01-01
The techniques applied to MSS-LANDSAT data in the land-use mapping of the Criciuma region (Santa Catarina state, Brazil) are presented along with the results of a classification accuracy estimate tested on the resulting map. The MSS-LANDSAT digital data processing involves noise suppression, feature selection, and a hybrid classifier. The accuracy test is made through comparisons with aerial photographs of sampled points. The use of digital processing to map the classes agricultural lands, forest lands, and urban areas is recommended, while the coal refuse areas should be mapped visually.
Gazdik, Gertrude C.; Behum, Paul T.
1983-01-01
During the recent U.S. Bureau of Mines field investigation, 21 samples were collected (fig. 2) and were submitted to the Bureau's Reno Metallurgy Research Center, Reno, Nev., for analysis. All samples were tested for 40 elements by semiquantitative spectrographic analyses. Additional testing by atomic absorption, neutron activation, and wet chemical techniques was performed for selected elements on some samples. Two shale samples were submitted to the Bureau of Mines, Tuscaloosa Metallurgy Research Center, Tuscaloosa, Ala., for the evaluation of ceramic properties.
Demonstration of Advanced C/SiC Cooled Ramp
NASA Technical Reports Server (NTRS)
Bouquet, Clement; Laithier, Frederic; Lawrence, Timothy; Eckel, Andrew; Munafo, Paul M. (Technical Monitor)
2002-01-01
Under a NASA contract, SPS is evaluating its C/SiC-to-metal brazing technique for the development of light, composite, actively cooled panels. The program first consisted of defining a system applicable to the X-33 nozzle ramp. SPS then performed evaluation tests for tube, composite, and braze material selection, and for the adaptation of braze process parameters to the part geometry. SPS is presently manufacturing a 250 x 60 mm specimen, including 10 metallic tubes, which will be cycled in the NASA/GRC-CELL-22 test bed under engine-representative conditions.
Carbon Mineralization by Aqueous Precipitation for Beneficial Use of CO2 from Flue Gas
DOE Office of Scientific and Technical Information (OSTI.GOV)
Devenney, Martin; Gilliam, Ryan; Seeker, Randy
2013-08-01
The objective of this project is to demonstrate an innovative process to mineralize CO2 from flue gas directly to reactive carbonates and maximize the value and versatility of its beneficial use products. The program scope includes the design, construction, and testing of a CO2 Conversion to Material Products (CCMP) Pilot Demonstration Plant utilizing CO2 from the flue gas of a power production facility in Moss Landing, CA. This topical report covers Subphase 2a, the design phase of the pilot demonstration subsystems. Materials of construction have been selected and proven in both lab-scale and prototype testing to be acceptable for the reagent conditions of interest. The target application for the reactive carbonate material has been selected based upon small-scale feasibility studies, and the design of a continuous fiber board production line has been completed. The electrochemical cell architecture and components have been selected based upon both lab-scale and prototype testing. The appropriate quality control and diagnostic techniques have been developed and tested along with the required instrumentation and controls. Finally, the demonstration site infrastructure, NEPA categorical exclusion, and permitting are all ready for the construction and installation of the new units and upgrades.
Flight testing of a luminescent surface pressure sensor
NASA Technical Reports Server (NTRS)
Mclachlan, B. G.; Bell, J. H.; Espina, J.; Gallery, J.; Gouterman, M.; Demandante, C. G. N.; Bjarke, L.
1992-01-01
NASA ARC has conducted flight tests of a new type of aerodynamic pressure sensor based on a luminescent surface coating. Flights were conducted at the NASA ARC-Dryden Flight Research Facility. The luminescent pressure sensor is based on a surface coating which, when illuminated with ultraviolet light, emits visible light with an intensity dependent on the local air pressure on the surface. This technique makes it possible to obtain pressure data over the entire surface of an aircraft, as opposed to conventional instrumentation, which can only make measurements at pre-selected points. The objective of the flight tests was to evaluate the effectiveness and practicality of a luminescent pressure sensor in the actual flight environment. A luminescent pressure sensor was installed on a fin, the Flight Test Fixture (FTF), that is attached to the underside of an F-104 aircraft. The response of one particular surface coating was evaluated at low supersonic Mach numbers (M = 1.0-1.6) in order to provide an initial estimate of the sensor's capabilities. This memo describes the test approach, the techniques used, and the pressure sensor's behavior under flight conditions. A direct comparison between data provided by the luminescent pressure sensor and that produced by conventional pressure instrumentation shows that the luminescent sensor can provide quantitative data under flight conditions. However, the test results also show that the sensor has a number of limitations which must be addressed if this technique is to prove useful in the flight environment.
Treatment of atomic and molecular line blanketing by opacity sampling
NASA Technical Reports Server (NTRS)
Johnson, H. R.; Krupp, B. M.
1976-01-01
A sampling technique for treating the radiative opacity of large numbers of atomic and molecular lines in cool stellar atmospheres is subjected to several tests. In this opacity sampling (OS) technique, the global opacity is sampled at only a selected set of frequencies, and at each of these frequencies the total monochromatic opacity is obtained by summing the contribution of every relevant atomic and molecular line. In accord with previous results, we find that the structure of atmospheric models is accurately fixed by the use of 1000 frequency points, and 100 frequency points are adequate for many purposes. The effects of atomic and molecular lines are separately studied. A test model computed using the OS method agrees very well with a model having identical atmospheric parameters, but computed with the giant line (opacity distribution function) method.
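The core of the OS method — summing every relevant line's contribution, but only at a pre-selected frequency grid — can be sketched as follows. The Gaussian (Doppler) line profile and all numerical values are illustrative assumptions in arbitrary toy units, not the paper's opacity data:

```python
import math

def sampled_opacity(freqs, lines, continuum=1e-4):
    """Total monochromatic opacity at each sampled frequency.

    freqs: the pre-selected set of sampling frequencies (the OS grid).
    lines: list of (centre, strength, doppler_width) tuples; each line
    contributes a normalized Gaussian profile. The sum runs over every
    relevant line, but is evaluated only on the sampled grid.
    """
    def line_opacity(nu, centre, strength, width):
        x = (nu - centre) / width
        return strength / (width * math.sqrt(math.pi)) * math.exp(-x * x)

    return [continuum + sum(line_opacity(nu, *ln) for ln in lines)
            for nu in freqs]

# One toy line centred at nu = 1.0; sample on and off the line
lines = [(1.0, 1.0, 0.01)]
on_line, off_line = sampled_opacity([1.0, 2.0], lines)
print(on_line > off_line)  # True
```

With ~1000 such grid points the model atmosphere integrals over frequency are approximated statistically, which is why the OS structure matches the opacity distribution function result.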
NASA Technical Reports Server (NTRS)
Guasp, Edwin; Manzo, Michelle A.
1997-01-01
Secondary alkaline batteries, such as nickel-cadmium and silver-zinc, are commonly used for aerospace applications. The uniform evaluation and comparison of separator properties for these systems is dependent upon the measurement techniques. This manual presents a series of standard test procedures that can be used to evaluate, compare, and select separator materials for use in alkaline batteries. Detailed test procedures evaluating the following characteristics are included in this manual: physical measurements of thickness and area weight, dimensional stability measurements, electrolyte retention, resistivity, permeability as measured via bubble pressure, surface evaluation via SEM, chemical stability, and tensile strength.
The use of applied software for the professional training of students studying humanities
NASA Astrophysics Data System (ADS)
Sadchikova, A. S.; Rodin, M. M.
2017-01-01
Research practice is an integral part of the training process for humanities students. The training process therefore needs to incorporate modern information technologies. This paper examines the most popular applied software products used for data processing in social science. For testing purposes we selected the most commonly preferred professional packages: MS Excel, IBM SPSS Statistics, STATISTICA, and STADIA. The article also contains testing results for the specialized software Prikladnoy Sotsiolog, which is applicable to the preparation stage of research. The software packages were tested during one term in groups of students studying humanities.
[Development of critical thinking skill evaluation scale for nursing students].
You, So Young; Kim, Nam Cho
2014-04-01
The purpose of this study was to develop a Critical Thinking Skill Test for Nursing Students. The construct concepts were drawn from a literature review and in-depth interviews with hospital nurses, and surveys were conducted among students (n=607) from nursing colleges. The data were collected from September 13 to November 23, 2012 and analyzed using the SAS program, version 9.2. The KR-20 coefficient was computed for reliability, and the difficulty index, discrimination index, item-total correlation, and known-group technique were used for validity. Four domains and 27 skills were identified and 35 multiple-choice items were developed. Thirty multiple-choice items with scores higher than .80 on the content validity index were selected for the pre-test. From the analysis of the pre-test data, a modified set of 30 items was selected for the main test. In the main test, the KR-20 coefficient was .70 and the corrected item-total correlations ranged from .11 to .38. There was a statistically significant difference between the two academic systems (p=.001). The developed instrument is the first critical thinking skill test reflecting nursing perspectives in hospital settings and is expected to be utilized as a tool which contributes to improvement of the critical thinking ability of nursing students.
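The KR-20 reliability coefficient reported for the main test is easy to compute from a 0/1 item-response matrix. A sketch follows; the response matrix is invented for illustration and assumes non-constant total scores:

```python
def kr20(responses):
    """Kuder-Richardson 20 reliability for dichotomous (0/1) items.

    responses: one list per examinee, one 0/1 entry per item.
    KR-20 = k/(k-1) * (1 - sum(p_i * q_i) / var(total scores)),
    where p_i is the proportion answering item i correctly, q_i = 1 - p_i,
    and var is the population variance of the examinees' total scores.
    """
    n = len(responses)
    k = len(responses[0])
    totals = [sum(r) for r in responses]
    mean = sum(totals) / n
    var = sum((t - mean) ** 2 for t in totals) / n  # population variance
    pq = 0.0
    for i in range(k):
        p = sum(r[i] for r in responses) / n
        pq += p * (1 - p)
    return k / (k - 1) * (1 - pq / var)

# Hypothetical 4 examinees x 3 items (1 = correct)
responses = [[1, 1, 1], [1, 1, 0], [1, 0, 0], [0, 0, 0]]
print(round(kr20(responses), 2))  # 0.75
```

KR-20 is the dichotomous special case of Cronbach's alpha, so values near .70, as reported here, indicate acceptable internal consistency for a new instrument.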
Potential Audiological and MRI Markers of Tinnitus.
Gopal, Kamakshi V; Thomas, Binu P; Nandy, Rajesh; Mao, Deng; Lu, Hanzhang
2017-09-01
Subjective tinnitus, or ringing sensation in the ear, is a common disorder with no accepted objective diagnostic markers. The purpose of this study was to identify possible objective markers of tinnitus by combining audiological and imaging-based techniques. Case-control studies. Twenty adults drawn from our audiology clinic served as participants. The tinnitus group consisted of ten participants with chronic bilateral constant tinnitus, and the control group consisted of ten participants with no history of tinnitus. Each participant with tinnitus was closely matched with a control participant on the basis of age, gender, and hearing thresholds. Data acquisition focused on systematic administration and evaluation of various audiological tests, including auditory-evoked potentials (AEP) and otoacoustic emissions, and magnetic resonance imaging (MRI) tests. A total of 14 objective test measures (predictors) obtained from audiological and MRI tests were subjected to statistical analyses to identify the best predictors of tinnitus group membership. The least absolute shrinkage and selection operator technique for feature extraction, supplemented by the leave-one-out cross-validation technique, was used to extract the best predictors. This approach provided a conservative model that was highly regularized, with its error within 1 standard error of the minimum. The model selected increased frontal cortex (FC) functional MRI activity in response to pure tones matching the participants' respective tinnitus pitch, and augmented AEP wave N₁ amplitude growth in the tinnitus group, as the top two predictors of tinnitus group membership. These findings suggest that the amplified responses to acoustic signals and hyperactivity in attention regions of the brain may be a result of overattention among individuals who experience chronic tinnitus.
These results suggest that increased functional MRI activity in the FC to sounds and augmented N₁ amplitude growth may potentially be the objective diagnostic indicators of tinnitus. However, due to the small sample size and lack of subgroups within the tinnitus population in this study, more research is needed before generalizing these findings. American Academy of Audiology
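The leave-one-out cross-validation machinery used above to guard against overfitting with only 20 participants can be sketched generically. The classifier below is a placeholder 1-nearest-neighbour rule on invented data, not the LASSO model the study actually fitted:

```python
def loocv_accuracy(X, y, classify):
    """Leave-one-out cross-validation accuracy.

    Each case is held out in turn; the classifier is fit on the
    remaining n-1 cases and asked to predict the held-out label.
    """
    correct = 0
    for i in range(len(X)):
        train_X = X[:i] + X[i + 1:]
        train_y = y[:i] + y[i + 1:]
        if classify(train_X, train_y, X[i]) == y[i]:
            correct += 1
    return correct / len(X)

def nearest_neighbour(train_X, train_y, x):
    """Placeholder 1-nearest-neighbour classifier (squared Euclidean)."""
    d2 = lambda a, b: sum((p - q) ** 2 for p, q in zip(a, b))
    best = min(range(len(train_X)), key=lambda j: d2(train_X[j], x))
    return train_y[best]

# Toy 1-D data: two well-separated groups (hypothetical predictor values)
X = [[0.0], [0.1], [5.0], [5.1]]
y = [0, 0, 1, 1]
print(loocv_accuracy(X, y, nearest_neighbour))  # 1.0
```

For n as small as here, leave-one-out uses the data maximally while never letting a participant influence their own prediction.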
Neutron Imaging for Selective Laser Melting Inconel Hardware with Internal Passages
NASA Technical Reports Server (NTRS)
Tramel, Terri L.; Norwood, Joseph K.; Bilheux, Hassina
2014-01-01
Additive manufacturing is showing great promise for the development of new innovative designs and large potential life-cycle cost reductions for the aerospace industry. However, more development work is required to move this technology into space flight hardware production. With selective laser melting (SLM), hardware that once consisted of multiple, carefully machined and inspected pieces joined together can be made in one part. However, standard inspection techniques cannot be used to verify that the internal passages are within dimensional tolerances or surface finish requirements. NASA/MSFC traveled to Oak Ridge National Lab's (ORNL) Spallation Neutron Source to perform non-destructive, proof-of-concept imaging measurements to assess the capability to understand internal dimensional tolerances and internal passage surface roughness. This presentation will describe 1) the goals of this proof-of-concept testing, 2) the lessons learned when designing and building these Inconel 718 test specimens to minimize beam time, 3) the neutron imaging test setup and test procedure used to get the images, 4) the initial results as images, a volume reconstruction, and a video, 5) an assessment of using this imaging technique to gather real data for designing internal flow passages in SLM-manufactured aerospace hardware, and lastly 6) how proper cleaning of the internal passages is critically important. In summary, the initial results are very promising, and continued development of a technique to assist in SLM development for aerospace components is desired by both NASA and ORNL. A plan forward that benefits both ORNL and NASA will also be presented, based on the promising initial results. The initial images and volume reconstruction showed that clean, clear images of the internal passage geometry are obtainable. These clear images of the internal passages of simple geometries will be compared to the build model to determine any differences.
One surprising result was that a new cleaning process was used on these simply geometric specimens that resulted in what appears to be very smooth internal surfaces, when compared to other aerospace hardware cleaning methods.
Using Ensemble Decisions and Active Selection to Improve Low-Cost Labeling for Multi-View Data
NASA Technical Reports Server (NTRS)
Rebbapragada, Umaa; Wagstaff, Kiri L.
2011-01-01
This paper seeks to improve low-cost labeling in terms of training set reliability (the fraction of correctly labeled training items) and test set performance for multi-view learning methods. Co-training is a popular multi-view learning method that combines high-confidence example selection with low-cost (self) labeling. However, co-training with certain base learning algorithms significantly reduces training set reliability, causing an associated drop in prediction accuracy. We propose the use of ensemble labeling to improve reliability in such cases. We also discuss and show promising results on combining low-cost ensemble labeling with active (low-confidence) example selection. We unify these example selection and labeling strategies under collaborative learning, a family of techniques for multi-view learning that we are developing for distributed, sensor-network environments.
Selected techniques in water resources investigations, 1965
Mesnier, Glennon N.; Chase, Edith B.
1966-01-01
Increasing world activity in water-resources development has created an interest in techniques for conducting investigations in the field. In the United States, the Geological Survey has the responsibility for extensive and intensive hydrologic studies, and the Survey places considerable emphasis on discovering better ways to carry out its responsibility. For many years, the dominant interest in field techniques has been "in house," but the emerging world interest has led to a need for published accounts of this progress. In 1963 the Geological Survey published "Selected Techniques in Water Resources Investigations" (Water-Supply Paper 1669-Z) as part of the series "Contributions to the Hydrology of the United States." The report was so favorably received that successive volumes are planned, of which this is the first. The present report contains 25 papers that represent new ideas being tested or applied in the hydrologic field program of the Geological Survey. These ideas range from a proposed system for monitoring fluvial sediment to how to construct stream-gaging wells from steel oil drums. The original papers have been revised and edited by the compilers, but the ideas presented are those of the authors. The general description of the bubble gage on page 2 has been given by the compilers as supplementary information.
NASA Astrophysics Data System (ADS)
Guoxing, Zheng; Minghu, Jiang; Hongliang, Gong; Nannan, Zhang; Jianguang, Wei
2018-02-01
Following the basic principles for combining series of strata and the requirements of the same-well injection-production technique, an optimization design method for the technique's injection-production circulation system is given. Based on an oil-water two-phase model for an arbitrary well network, a dynamic forecasting method for reservoirs developed with same-well injection-production is established, taking into account the requirements and capacity of the technique, and sample wells are selected to carry out a forecast evaluation and analysis of the effect of applying same-well injection-production to the reservoir. The results show that single-test-well composite water cut decreases by 4.7% and test-well-group composite water cut decreases by 1.56% under an essentially unchanged surface water injection rate. The method provides theoretical support for demonstrating the reservoir-development improvement achieved by the same-well injection-production technique and for further field tests.
Non-surgical and non-chemical attempts to treat echinococcosis: do they work?
Tamarozzi, Francesca; Vuitton, Lucine; Brunetti, Enrico; Vuitton, Dominique Angèle; Koch, Stéphane
2014-01-01
Cystic echinococcosis (CE) and alveolar echinococcosis (AE) are chronic, complex and neglected diseases. Their treatment depends on a number of factors related to the lesion, setting and patient. We performed a literature review of curative or palliative non-surgical, non-chemical interventions in CE and AE. In CE, some of these techniques, such as radiofrequency thermal ablation (RFA), were shelved after initial attempts, while others, such as high-intensity focused ultrasound, appear promising but are still in a pre-clinical phase. In AE, RFA has never been tested; radiotherapy and heavy-ion therapies have been attempted in experimental models, but their application to humans remains questionable. In CE, although prospective clinical studies are still lacking, therapeutic, non-surgical drainage techniques, such as PAIR (puncture, aspiration, injection, re-aspiration) and its derivatives, are now considered a useful option in selected cases. Finally, palliative, non-surgical drainage techniques such as US- or CT-guided percutaneous biliary drainage, drainage of centro-parasitic abscesses, and vascular stenting have been performed successfully. Recently, endoscopic retrograde cholangiopancreatography (ERCP)-associated techniques have become increasingly used to manage biliary fistulas in CE and biliary obstructions in AE. Development of pre-clinical animal models would allow testing in AE of techniques developed for other indications, e.g. cancer. Prospective trials are required to determine the best use of PAIR and associated procedures, and the indications and techniques of palliative drainage. PMID:25531730
The Colour Test for drug susceptibility testing of Mycobacterium tuberculosis strains.
Toit, K; Mitchell, S; Balabanova, Y; Evans, C A; Kummik, T; Nikolayevskyy, V; Drobniewski, F
2012-08-01
Tartu, Estonia. To assess the performance and feasibility of introducing the thin-layer agar MDR/XDR-TB Colour Test (Colour Test) as a non-commercial method of drug susceptibility testing (DST). The Colour Test combines the thin-layer agar technique with a simple colour-coded quadrant format, a selective medium to reduce contamination, and colorimetric indication of bacterial growth to simplify interpretation. DST patterns for isoniazid (INH), rifampicin (RMP) and ciprofloxacin (CFX) were determined using the Colour Test for 201 archived Mycobacterium tuberculosis isolates. Susceptibilities were compared to blinded DST results obtained routinely using the BACTEC™ Mycobacteria Growth Indicator Tube™ (MGIT) 960 to assess performance characteristics. In all, 98% of the isolates produced interpretable results, with an average time to positivity of 13 days. The Colour Test detected drug resistance with 98% sensitivity for INH, RMP and CFX, and 99% for multidrug-resistant tuberculosis; the corresponding specificities were 100% (95%CI 82-100), 88% (95%CI 69-97), 91% (95%CI 83-96) and 90% (95%CI 74-98). Agreement between the Colour Test and BACTEC MGIT 960 was 98%, 96%, 94% and 97%, respectively. The Colour Test could be an economical, accurate and simple technique for testing tuberculosis strains for drug resistance. As it requires little specialist equipment, it may be particularly useful in resource-constrained settings with growing drug resistance rates.
Development of test methodology for dynamic mechanical analysis instrumentation
NASA Technical Reports Server (NTRS)
Allen, V. R.
1982-01-01
Dynamic mechanical analysis instrumentation was used to develop specific test methodology for determining engineering parameters of selected materials, especially plastics and elastomers, over a broad temperature range in selected environments. The methodology for routine procedures was established with specific attention given to sample geometry, sample size, and mounting techniques. The basic software of the duPont 1090 thermal analyzer was used for data reduction, which simplified the theoretical interpretation. Clamps were developed which allowed 'relative' damping during the cure cycle to be measured for fiber-glass-supported resin. An attempted correlation of fracture energy 'toughness' (or impact strength) with the low-temperature (glassy) relaxation responses for a 'rubber-modified' epoxy system gave a negative result, because the low-temperature dispersion mode (-80 C) of the modifier coincided with that of the epoxy matrix, making quantitative comparison unrealistic.
Regulski, Miłosz; Piotrowska-Kempisty, Hanna; Prukała, Wiesław; Dutkiewicz, Zbigniew; Regulska, Katarzyna; Stanisz, Beata; Murias, Marek
2018-01-01
Twenty-five new trans-stilbene and trans-stilbazole derivatives were investigated using in vitro and in silico techniques. The selectivity and potency of the compounds were assessed using a commercial ELISA test, and the results obtained were incorporated into a 2D QSAR assay. The most promising compound, 4-nitro-3',4',5'-trihydroxy-trans-stilbene (N1), was synthesized and its potency and selectivity were confirmed. N1 was classified as a preferential COX-2 inhibitor. Its ability to inhibit COX-2 in the MCF-7 cell line was established, and its cytotoxicity was assessed by the MTT test. The compound was more cytotoxic than celecoxib within the studied concentration range. Finally, the investigated trans-stilbene was docked into the COX-1 and COX-2 active sites using the "CDOCKER" protocol. Copyright © 2017 Elsevier Ltd. All rights reserved.
History and Evolution of the Barium Swallow for Evaluation of the Pharynx and Esophagus.
Levine, Marc S; Rubesin, Stephen E
2017-02-01
This article reviews the history of the barium swallow from its early role in radiology to its current status as an important diagnostic test in modern radiology practice. Though a variety of diagnostic procedures can be performed to evaluate patients with dysphagia or other pharyngeal or esophageal symptoms, the barium study has evolved into a readily available, non-invasive, and cost-effective technique that can facilitate the selection of additional diagnostic tests and guide decisions about medical, endoscopic, or surgical management. This article focuses on the evolution of fluoroscopic equipment, radiography, and contrast media for evaluating the pharynx and esophagus, the importance of understanding pharyngoesophageal relationships, and major advances that have occurred in the radiologic diagnosis of select esophageal diseases, including gastroesophageal reflux disease, infectious esophagitis, eosinophilic esophagitis, esophageal carcinoma, and esophageal motility disorders.
Ebner, T; Shebl, O; Moser, M; Mayer, R B; Arzt, W; Tews, G
2011-01-01
Sperm DNA fragmentation is increased in poor-quality semen samples and correlates with failed fertilization, impaired preimplantation development and reduced pregnancy outcome. Common sperm preparation techniques may reduce the percentage of strandbreak-positive spermatozoa, but, to date, there is no reliable approach to exclusively accumulate strandbreak-free spermatozoa. To analyse the efficiency of special sperm selection chambers (Zech-selectors made of glass or polyethylene) in terms of strand-break reduction, 39 subfertile men were recruited and three preparations (native, density gradient and Zech-selector) were checked for strand breaks using the sperm chromatin dispersion test. The mean percentage of affected spermatozoa in the ejaculate was 15.8 ± 7.8% (range 5.0–42.1%). Density gradient centrifugation did not significantly improve the quality of the spermatozoa selected (14.2 ± 7.0%). However, glass chambers removed 90% of the spermatozoa showing strand breaks and polyethylene chambers removed 76%. Both types of Zech-selector were equivalent in their efficiency, significantly reduced DNA damage (P < 0.001) and, in this respect, performed better than density gradient centrifugation (P < 0.001). As far as is known, this is the first report on a sperm preparation technique that concentrates spermatozoa unaffected by DNA damage. The special chambers most probably select for sperm motility and/or maturity. Copyright © 2010 Reproductive Healthcare Ltd. Published by Elsevier Ltd. All rights reserved.
Efficient Text Encryption and Hiding with Double-Random Phase-Encoding
Sang, Jun; Ling, Shenggui; Alam, Mohammad S.
2012-01-01
In this paper, a double-random phase-encoding-based text encryption and hiding method is proposed. First, the secret text is transformed into a two-dimensional array: the higher bits of the elements in the transformed array store the bit stream of the secret text, while the lower bits are filled with specific values. Then, the transformed array is encoded with the double-random phase-encoding technique. Finally, the encoded array is superimposed on an expanded host image to obtain the image embedded with hidden data. The performance of the proposed technique, including the hiding capacity, the recovery accuracy of the secret text, and the quality of the image embedded with hidden data, was tested via analytical modeling and with test data streams. Experimental results show that the secret text can be recovered accurately or almost accurately, while maintaining the quality of the host image embedded with hidden data, by properly selecting the method of transforming the secret text into an array and the superimposition coefficient. By using optical information processing techniques, the proposed method has been found to significantly improve the security of text information transmission, while ensuring hiding capacity at a prescribed level. PMID:23202003
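The core double-random phase-encoding step the abstract describes (modulate the array with one random phase mask in the spatial domain, then with a second in the Fourier domain) can be sketched as below. This is a minimal illustration, not the paper's implementation: the 8x8 "text array", function names, and seed are all assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def drpe_encrypt(arr, phase1, phase2):
    # multiply by the first random phase mask in the spatial domain,
    # Fourier transform, apply the second mask, inverse transform
    return np.fft.ifft2(np.fft.fft2(arr * phase1) * phase2)

def drpe_decrypt(enc, phase1, phase2):
    # undo the two unit-modulus phase modulations in reverse order
    return np.fft.ifft2(np.fft.fft2(enc) * np.conj(phase2)) * np.conj(phase1)

# hypothetical 8x8 array standing in for the transformed secret text
text_array = rng.integers(0, 256, (8, 8)).astype(float)
p1 = np.exp(2j * np.pi * rng.random((8, 8)))  # random phase mask 1
p2 = np.exp(2j * np.pi * rng.random((8, 8)))  # random phase mask 2

enc = drpe_encrypt(text_array, p1, p2)        # noise-like complex field
rec = drpe_decrypt(enc, p1, p2).real          # recovers the original array
```

Because both masks have unit modulus, decryption with the conjugate masks inverts encryption exactly (up to floating-point error), which is why the abstract can report accurate text recovery.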
Usability evaluation techniques in mobile commerce applications: A systematic review
NASA Astrophysics Data System (ADS)
Hussain, Azham; Mkpojiogu, Emmanuel O. C.
2016-08-01
There is a considerable literature on the usability of mobile commerce (m-commerce) applications and related areas, but it does not adequately document the usability techniques used in empirical usability evaluations of m-commerce applications. This paper therefore aims to identify the usability techniques most frequently used in usability evaluation of m-commerce applications. To achieve this objective, a systematic literature review was employed. Sixty-seven papers on usability evaluation for m-commerce and related areas were downloaded, and the twenty-one most relevant studies were selected for review in order to extract the appropriate information. The results of the review show that heuristic evaluation, formal testing and think-aloud methods are the most commonly used methods for m-commerce applications, compared with cognitive walkthrough and informal testing. Moreover, most of the studies applied controlled experiments (33.3% of the total), while 14.28% applied case studies for usability evaluation. The results of this paper provide the usability practitioner and research communities with additional knowledge of the current state and use of usability techniques in m-commerce applications.
Effectiveness of touch and feel (TAF) technique on first aid measures for visually challenged.
Mary, Helen; Sasikalaz, D; Venkatesan, Latha
2013-01-01
There is a common perception that a blind person cannot even help himself. To challenge that view, a workshop was conceived to help visually-impaired people develop the skills to be independent and productive members of society. An experimental study was conducted at the National Institute of Visually Handicapped, Chennai, with the objective of assessing the effectiveness of the Touch and Feel (TAF) technique on first aid measures for the visually challenged. A total of 25 visually challenged people were selected by a non-probability purposive sampling technique, and data were collected using demographic variables and a structured knowledge questionnaire. Scores were categorised into three levels: inadequate (0-8), moderately adequate (8-17), and adequate (17-25). The study revealed that in the pre-test 40% of the visually challenged had inadequate knowledge, 56% had moderately adequate knowledge and only a few (4%) had adequate knowledge, whereas in the post-test most (68%) had adequate knowledge, a statistically significant improvement (t = 6.779, p < 0.001). This shows that the TAF technique was effective for the visually challenged. There was no association between the demographic variables and the level of knowledge regarding first aid.
Selectivity/Specificity Improvement Strategies in Surface-Enhanced Raman Spectroscopy Analysis
Wang, Feng; Cao, Shiyu; Yan, Ruxia; Wang, Zewei; Wang, Dan; Yang, Haifeng
2017-01-01
Surface-enhanced Raman spectroscopy (SERS) is a powerful technique for the discrimination, identification, and potential quantification of certain compounds/organisms. However, its real-world application is challenging owing to interference from complicated detection matrices. Selective/specific detection is therefore crucial for the practical application of the SERS technique. We summarize in this review five selective/specific detection techniques (chemical reaction, antibody, aptamer, molecularly imprinted polymers and microfluidics), which enable rapid and reliable selective/specific detection when coupled with the SERS technique. PMID:29160798
Donald J. Kaczmarek; Randall Rousseau; Jeff A. Wright; Brian Wachelka
2014-01-01
Four eastern cottonwood clones, including standard operational clone ST66 and three advanced clonal selections, were produced and included in a test utilizing five different plant propagation methods. Despite relatively large first-year growth differences among clones, all clones demonstrated similar responses to the treatments and clone × cutting treatment interactions...
Numerical Solution for Navier-Stokes Equations
NASA Technical Reports Server (NTRS)
Warsi, Z. U. A.; Weed, R. A.; Thompson, J. F.
1982-01-01
Carefully selected blend of computational techniques solves complete set of equations for viscous, unsteady, hypersonic flow in general curvilinear coordinates. New algorithm was tested by computing axially directed flow about blunt body having shape similar to that of such practical bodies as wide-body aircraft or artillery shells. Method offers significant computational advantages because of conservation-law form of equations and because it reduces amount of metric data required.
Results from the Thermal Vacuum Tests of the Chinese-Brazilian Earth Resources Satellite - CBERS FM2
NASA Astrophysics Data System (ADS)
Almeida, J. S.; Garcia, E. C.; Santos, M. B.; Fu, P. Z.
2002-01-01
This paper details the approach adopted for the thermal vacuum test campaign of the CBERS Flight Model #2 spacecraft, successfully performed at the Integration and Tests Laboratory (LIT) of the National Institute for Space Research (INPE), in São José dos Campos, SP, from September 7th to 28th, 2001. Measuring approximately 1.8 x 2.0 x 2.2 m, weighing 1,500 kg and carrying three cameras as its main payload, the spacecraft is scheduled to be launched in China and will orbit the Earth at 778 km as part of its remote sensing mission. Exploiting the capabilities of LIT's 3 m x 3 m thermal vacuum chamber, by appropriately selecting and adjusting its cold shroud temperatures and applying some low-cost heat input/sink techniques, the spacecraft was separated at the interface between its service and payload modules so that each part could fit inside the T/V chamber one at a time. With all the necessary functional and test cabling between the two modules routed through the chamber walls, so that the spacecraft could be electrically operated as an integrated system, specific thermal test techniques were applied to obtain the required hot and cold acceptance temperature levels at the spacecraft subsystems and structural surfaces, simulating the thermal conditioning of the distinct orbital configurations. These thermal simulation techniques consisted of a combination of skin heaters, the thermal vacuum chamber's main shrouds and dedicated LN2 cold plates, leading to a reliable and very satisfactory testing methodology. Taking more than 350 hours and directly involving 67 people, including teams from both Brazil and China, this test can be considered a very important accomplishment, both for its distinct spacecraft testing technique and for the satisfactory working relationship between two quite different cultures.
Reibnegger, Gilbert; Caluba, Hans-Christian; Ithaler, Daniel; Manhal, Simone; Neges, Heide Maria; Smolle, Josef
2011-08-01
Admission to medical studies in Austria since academic year 2005-2006 has been regulated by admission tests. At the Medical University of Graz, an admission test focusing on secondary-school-level knowledge in natural sciences has been used for this purpose. The impact of this important change on dropout rates of female versus male students and older versus younger students is reported. All 2,860 students admitted to the human medicine diploma program at the Medical University of Graz from academic years 2002-2003 to 2008-2009 were included. Nonparametric and semiparametric survival analysis techniques were employed to compare cumulative probability of dropout between demographic groups. Cumulative probability of dropout was significantly reduced in students selected by active admission procedure versus those admitted openly (P < .0001). Relative hazard ratio of selected versus openly admitted students was only 0.145 (95% CI, 0.106-0.198). Among openly admitted students, but not for selected ones, the cumulative probabilities for dropout were higher for females (P < .0001) and for older students (P < .0001). Generally, dropout hazard is highest during the second year of study. The introduction of admission testing significantly decreased the cumulative probability for dropout. In openly admitted students a significantly higher risk for dropout was found in female students and in older students, whereas no such effects can be detected after admission testing. Future research should focus on the sex dependence, with the aim of improving success rates among female applicants on the admission tests.
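The nonparametric survival analysis the abstract refers to typically rests on the Kaplan-Meier estimator of the cumulative dropout probability. A minimal sketch is shown below, using made-up toy (time, event) pairs rather than the study's records; the function name and data are assumptions for illustration.

```python
def km_cumulative_dropout(times, events):
    """Kaplan-Meier estimate of cumulative dropout probability.

    times:  year of study in which dropout or censoring occurred
    events: 1 = dropout observed, 0 = censored (still enrolled / graduated)
    Returns a list of (time, cumulative dropout probability) pairs.
    """
    at_risk = len(times)
    surv = 1.0
    curve = []
    for t in sorted(set(times)):
        # dropouts at time t, among everyone still at risk
        d = sum(1 for ti, e in zip(times, events) if ti == t and e == 1)
        surv *= (1 - d / at_risk)
        curve.append((t, 1 - surv))  # cumulative probability of dropout by t
        # remove both dropouts and censored cases from the risk set
        at_risk -= sum(1 for ti in times if ti == t)
    return curve

# toy cohort: five students, three observed dropouts, two censored
curve = km_cumulative_dropout([1, 2, 2, 3, 3], [1, 1, 0, 1, 0])
```

Censoring is what makes this estimator preferable to a raw dropout fraction: students still enrolled at the end of observation reduce the risk set without counting as dropouts.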
Spectral unmixing of urban land cover using a generic library approach
NASA Astrophysics Data System (ADS)
Degerickx, Jeroen; Lordache, Marian-Daniel; Okujeni, Akpona; Hermy, Martin; van der Linden, Sebastian; Somers, Ben
2016-10-01
Remote sensing based land cover classification in urban areas generally requires the use of subpixel classification algorithms to take into account the high spatial heterogeneity. These spectral unmixing techniques often rely on spectral libraries, i.e. collections of pure material spectra (endmembers, EM), which ideally cover the large EM variability typically present in urban scenes. Despite the advent of several (semi-) automated EM detection algorithms, the collection of such image-specific libraries remains a tedious and time-consuming task. As an alternative, we suggest the use of a generic urban EM library, containing material spectra under varying conditions, acquired from different locations and sensors. This approach requires an efficient EM selection technique, capable of selecting only those spectra relevant for a specific image. In this paper, we evaluate and compare the potential of different existing library pruning algorithms (Iterative Endmember Selection and MUSIC) using simulated hyperspectral (APEX) data of the Brussels metropolitan area. In addition, we develop a new hybrid EM selection method which is shown to be highly efficient in dealing with both image-specific and generic libraries, subsequently yielding more robust land cover classification results compared to existing methods. Future research will include further optimization of the proposed algorithm and additional tests on both simulated and real hyperspectral data.
NASA Astrophysics Data System (ADS)
Hsu, T.; Tien, K. C.
2005-05-01
This research investigates selected South Dakota science educational outcomes as a function of selected educational reform policies. In the state of South Dakota, echoing divergent reform initiatives from "A Nation at Risk" to "No Child Left Behind," new guidelines and requirements have been instituted, yet very little effort has been made to assess the progress of these educational changes. In this study, selected educational outcomes (SAT8/9/10 scores) were examined as a function of selected South Dakota educational reform policies. School districts ranked in the top and bottom five percent of socioeconomic status (SES) in the state were selected for analysis. Comparisons of students' science educational outcomes were also made between the two major ethnic populations, Caucasians and Native Americans. All research questions were stated in the null form as hypotheses for statistical testing, and the critical-t statistic was the technique used to test them. The findings revealed that the selected reform policies in South Dakota appeared to help students from higher socioeconomic backgrounds perform better than pupils from lower socioeconomic backgrounds, while the academic performance of the ethnic and social-class minorities remained unchanged within the study's reform timeline. Examined through the prism of Michael Apple's critical theory, the selected South Dakota reform policies have paid little attention to issues of social equality. Continuing and collective efforts to promote equitable reform policies for enhancing the learning experience of all children in South Dakota seem necessary.
Evans, T M; LeChevallier, M W; Waarvick, C E; Seidler, R J
1981-01-01
The species of total coliform bacteria isolated from drinking water and untreated surface water by the membrane filter (MF), the standard most-probable-number (S-MPN), and modified most-probable-number (M-MPN) techniques were compared. Each coliform detection technique selected for a different profile of coliform species from both types of water samples. The MF technique indicated that Citrobacter freundii was the most common coliform species in water samples. However, the fermentation tube techniques displayed selectivity towards the isolation of Escherichia coli and Klebsiella. The M-MPN technique selected for more C. freundii and Enterobacter spp. from untreated surface water samples and for more Enterobacter and Klebsiella spp. from drinking water samples than did the S-MPN technique. The lack of agreement between the number of coliforms detected in a water sample by the S-MPN, M-MPN, and MF techniques was a result of the selection for different coliform species by the various techniques. PMID:7013706
[Reliable microbiological diagnosis of vulvovaginal candidiasis].
Baykushev, R; Ouzounova-Raykova, V; Stoykova, V; Mitov, I
2014-01-01
Vulvovaginal candidiasis is a common infection among those affecting the vulva and vagina. It is caused by representatives of the genus Candida, in most cases C. albicans (85-90%). An increase in the percentage of so-called non-albicans agents is seen, and these pathogens are often resistant to the antifungals most commonly used in practice. Faulty diagnosis, incorrect use of azoles, and self-treatment lead to selection of resistant strains and recurrent infections. The aim was identification of the Candida species associated with vulvovaginal candidiasis by conventional and PCR techniques. Over six months a total of 213 vaginal secretions were tested by Gram stain and cultivation on ChromAgar; API Candida fermentation tests and API 20C AUX assimilation tests were performed for species identification. DNA was extracted from all smears, with subsequent PCR detection of the different Candida species. Of the materials, 80.7% showed the presence of blastospores and/or hyphae. Positive culture results were detected in 60 (28.2%) samples. Species-specific identification revealed the presence of C. albicans in 51 (85%) smears, C. glabrata in 8 (13.3%), C. krusei in 2 (3.3%), and S. cerevisiae in 1 (2.1%). The PCR technique confirmed the results of the conventional methods. It is worth mentioning that 51 of the tested smears were positive for G. vaginalis by additional PCR. Correct diagnosis of the cause of vulvovaginal candidiasis helps in choosing appropriate antifungal therapy and prevents the development of recurrent infections and their consequences. The PCR-based method is rapid, specific and sensitive; it correlates perfectly with the results of the conventional diagnostic tests, so it could be selected as a method of choice for the diagnosis of vulvovaginal candidiasis.
Suarez, Ralph O.; Golby, Alexandra; Whalen, Stephen; Sato, Susumu; Theodore, William H.; Kufta, Conrad V.; Devinsky, Orrin; Balish, Marshall; Bromfield, Edward B.
2009-01-01
INTRODUCTION Although the substrates that mediate singing abilities in the human brain are not well understood, invasive brain mapping techniques used for clinical decision making, such as intracranial electrocortical testing and Wada testing, offer a rare opportunity to examine music-related function in a select group of subjects, affording exceptional spatial and temporal specificity. METHODS We studied eight patients with medically refractory epilepsy undergoing indwelling subdural electrode seizure focus localization. All patients underwent Wada testing for language lateralization. Functional assessment of language and music tasks was done by electrode grid cortical stimulation. One patient was also tested non-invasively with functional MRI. Functional organization of singing ability compared to language ability was determined based on four regions of interest: left and right inferior frontal gyrus (IFG), and left and right posterior superior temporal gyrus (pSTG). RESULTS In some subjects, electrical stimulation of dominant pSTG can interfere with speech and not singing, whereas stimulation of the non-dominant pSTG area can interfere with singing and not speech. Stimulation of the dominant IFG tends to interfere with both musical and language expression, while non-dominant IFG stimulation was often observed to cause no interference with either task; finally, stimulation of areas adjacent to but not within non-dominant pSTG typically does not affect either ability. fMRI mapping of one subject revealed a similar music/language dissociation with respect to activation asymmetry within the regions of interest. CONCLUSION Despite inherent limitations with respect to strictly research objectives, invasive clinical techniques offer a rare opportunity to probe musical and language cognitive processes of the brain in a select group of patients. PMID:19570530
Kala, K
2015-01-01
Iron deficiency anaemia is the most common form of malnutrition in the world; the prevalence of anaemia is 65.5 percent in South East Asia and 56 percent among adolescent girls in India. A study was conducted to assess the effectiveness of a structured teaching programme on the knowledge and attitude of adolescent girls regarding prevention of iron and folic acid deficiency anaemia at a selected corporation school. It adopted a one-group pre-test post-test design with 60 samples selected by a stratified random sampling technique. The study revealed that at pre-test 90 percent of the girls had inadequate knowledge and 65 percent had an unfavourable attitude towards iron and folic acid deficiency anaemia. After the structured teaching programme, knowledge and attitude improved (73% had adequate knowledge and 79% had a most favourable attitude). Overall, the structured teaching programme was found effective in improving the knowledge and attitude of adolescent girls in the prevention of iron and folic acid deficiency anaemia.
LOX/Hydrogen Coaxial Injector Atomization Test Program
NASA Technical Reports Server (NTRS)
Zaller, M.
1990-01-01
Quantitative information about the atomization of injector sprays is needed to improve the accuracy of computational models that predict the performance and stability margin of liquid propellant rocket engines. To obtain this data, a facility for the study of spray atomization is being established at NASA-Lewis to determine the drop size and velocity distributions occurring in vaporizing liquid sprays at supercritical pressures. Hardware configuration and test conditions are selected to make the cold flow simulant testing correspond as closely as possible to conditions in liquid oxygen (LOX)/gaseous H2 rocket engines. Drop size correlations from the literature, developed for liquid/gas coaxial injector geometries, are used to make drop size predictions for LOX/H2 coaxial injectors. The mean drop size predictions for a single element coaxial injector range from 0.1 to 2000 microns, emphasizing the need for additional studies of the atomization process in LOX/H2 engines. The selection of cold flow simulants, measurement techniques, and hardware for LOX/H2 atomization simulations is discussed.
Evaluation of the hydrometer for testing immunoglobulin G1 concentrations in Holstein colostrum.
Pritchett, L C; Gay, C C; Hancock, D D; Besser, T E
1994-06-01
Hydrometer measurements of globulin concentration and IgG1 concentrations measured by the radial immunodiffusion technique were compared for 915 samples of first-milking colostrum from Holstein cows. Least squares analysis of the relationship between hydrometer measurement and IgG1 concentration was improved by log transformation of IgG1 concentration and resulted in a significant linear relationship between hydrometer measurement and log10 IgG1 concentration; r2 = .469. At 50 mg of globulin/ml of colostrum, the recommended hydrometer cutoff point for colostrum selection, the sensitivity of the hydrometer as a test of IgG1 concentration in Holstein colostrum was 26%, and the negative predictive value was 67%. The negative predictive value and sensitivity of the hydrometer as a test of IgG1 in Holstein colostrum were improved, and the cost of misclassification of colostrum was minimized, when the cutoff point for colostrum selection was increased above the recommended 50 mg/ml.
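The test-performance figures reported above (sensitivity and negative predictive value at a cutoff) can be illustrated with a short sketch that tabulates outcomes against a chosen hydrometer cutoff; the function name and the numbers in the test are invented for illustration, not the study's data.

```python
def diagnostic_metrics(readings, igg1, cutoff_reading, igg1_threshold=50.0):
    """Sensitivity and negative predictive value of a hydrometer cutoff.

    A reading >= cutoff_reading counts as test-positive (colostrum kept);
    IgG1 >= igg1_threshold (mg/ml) counts as truly adequate.
    """
    tp = fp = tn = fn = 0
    for r, g in zip(readings, igg1):
        test_pos = r >= cutoff_reading
        truly_pos = g >= igg1_threshold
        if test_pos and truly_pos:
            tp += 1
        elif test_pos:
            fp += 1
        elif truly_pos:
            fn += 1
        else:
            tn += 1
    sensitivity = tp / (tp + fn) if (tp + fn) else float("nan")
    npv = tn / (tn + fn) if (tn + fn) else float("nan")
    return sensitivity, npv
```

Moving the cutoff reclassifies borderline samples, which is exactly the trade-off the study exploits when it raises the cutoff above 50 mg/ml.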
NASA Technical Reports Server (NTRS)
Koontz, S. L.; Albyn, K.; Leger, L.
1990-01-01
The use of thermal atom test methods as a materials selection and screening technique for low-earth orbit (LEO) spacecraft is critically evaluated. The chemistry and physics of thermal atom environments are compared with the LEO environment. The relative reactivities of a number of materials determined in thermal atom environments are compared with those observed in LEO and in high-quality LEO simulations. Reaction efficiencies (cu cm/atom) measured in a new type of thermal atom apparatus are one-thousandth to one ten-thousandth those observed in LEO, and many materials showing nearly identical reactivities in LEO show relative reactivities differing by as much as a factor of eight in thermal atom systems. A simple phenomenological kinetic model for the reaction of oxygen atoms with organic materials can be used to explain the differences in reactivity in different environments. Certain specific thermal atom test environments can be used as reliable materials screening tools.
Radioactivity in trinitite six decades later.
Parekh, Pravin P; Semkow, Thomas M; Torres, Miguel A; Haines, Douglas K; Cooper, Joseph M; Rosenberg, Peter M; Kitto, Michael E
2006-01-01
The first nuclear explosion test, named the Trinity test, was conducted on July 16, 1945 near Alamogordo, New Mexico. In the tremendous heat of the explosion, the radioactive debris fused with the local soil into a glassy material named Trinitite. Selected Trinitite samples from ground zero (GZ) of the test site were investigated in detail for radioactivity. The techniques used included alpha spectrometry, high-efficiency gamma-ray spectrometry, and low-background beta counting, following the radiochemistry for selected radionuclides. Specific activities were determined for fission products (90Sr, 137Cs), activation products (60Co, 133Ba, 152Eu, 154Eu, 238Pu, 241Pu), and the remnants of the nuclear fuel (239Pu, 240Pu). Additionally, specific activities of three natural radionuclides (40K, 232Th, 238U) and their progeny were measured. The determined specific activities of radionuclides and their relationships are interpreted in the context of the fission process, chemical behavior of the elements, as well as the nuclear explosion phenomenology.
A data base and analysis program for shuttle main engine dynamic pressure measurements
NASA Technical Reports Server (NTRS)
Coffin, T.
1986-01-01
A dynamic pressure data base management system is described for measurements obtained from space shuttle main engine (SSME) hot firing tests. The data were provided in terms of engine power level and rms pressure time histories, and power spectra of the dynamic pressure measurements at selected times during each test. Test measurements and engine locations are defined along with a discussion of data acquisition and reduction procedures. A description of the data base management analysis system is provided and subroutines developed for obtaining selected measurement means, variances, ranges and other statistics of interest are discussed. A summary of pressure spectra obtained at SSME rated power level is provided for reference. Application of the singular value decomposition technique to spectrum interpolation is discussed and isoplots of interpolated spectra are presented to indicate measurement trends with engine power level. Program listings of the data base management and spectrum interpolation software are given. Appendices are included to document all data base measurements.
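The singular-value-decomposition approach to spectrum interpolation mentioned above can be sketched as follows: spectra measured at several power levels are stacked into a matrix, the matrix is decomposed, and the modal amplitudes are interpolated to an intermediate power level before reconstruction. The function, rank choice, and data shapes are illustrative assumptions, not the report's actual implementation.

```python
import numpy as np

def interpolate_spectrum(power_levels, spectra, target_power, rank=2):
    """Interpolate a spectrum to target_power via a truncated SVD.

    power_levels: increasing 1-D array of engine power levels.
    spectra:      (n_levels, n_freqs) array, one measured spectrum per row.
    """
    U, s, Vt = np.linalg.svd(spectra, full_matrices=False)
    coeffs = U[:, :rank] * s[:rank]  # amplitude of each basis spectrum per level
    # Interpolate each modal amplitude to the target power level,
    # then rebuild the spectrum from the retained basis spectra.
    interp = np.array([np.interp(target_power, power_levels, coeffs[:, k])
                       for k in range(rank)])
    return interp @ Vt[:rank]
```

The basis spectra (rows of Vt) capture measurement trends shared across power levels, so only a few amplitudes need to be interpolated.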
NASA Astrophysics Data System (ADS)
Grohs, Jacob R.; Li, Yongqiang; Dillard, David A.; Case, Scott W.; Ellis, Michael W.; Lai, Yeh-Hung; Gittleman, Craig S.
Temperature and humidity fluctuations in operating fuel cells impose significant biaxial stresses in the constrained proton exchange membranes (PEMs) of a fuel cell stack. The strength of the PEM, and its ability to withstand cyclic environment-induced stresses, plays an important role in membrane integrity and consequently, fuel cell durability. In this study, a pressure loaded blister test is used to characterize the biaxial strength of Gore-Select® series 57 over a range of times and temperatures. Hencky's classical solution for a pressurized circular membrane is used to estimate biaxial strength values from burst pressure measurements. A hereditary integral is employed to construct the linear viscoelastic analog to Hencky's linear elastic exact solution. Biaxial strength master curves are constructed using traditional time-temperature superposition principle techniques and the associated temperature shift factors show good agreement with shift factors obtained from constitutive (stress relaxation) and fracture (knife slit) tests of the material.
Lap Shear Testing of Candidate Radiator Panel Adhesives
NASA Technical Reports Server (NTRS)
Ellis, David; Briggs, Maxwell; McGowan, Randy
2013-01-01
During testing of a subscale radiator section used to develop manufacturing techniques for a full-scale radiator panel, the adhesive bonds between the titanium heat pipes and the aluminum face sheets failed during installation and operation. Analysis revealed that the thermal expansion mismatch between the two metals resulted in relatively large shear stresses being developed even when operating the radiator at moderate temperatures. Lap shear testing of the adhesive used in the original joints demonstrated that the two-part epoxy adhesive fell far short of the strength required. A literature review resulted in several candidate adhesives being selected for lap shear joint testing at room temperature and 398 K, the nominal radiator operating temperature. The results showed that two-part epoxies cured at room and elevated temperatures generally did not perform well. Epoxy film adhesives cured at elevated temperatures, on the other hand, did very well with most being sufficiently strong to cause yielding in the titanium sheet used for the joints. The use of an epoxy primer generally improved the strength of the joint. Based upon these results, a new adhesive was selected for the second subscale radiator section.
Baseline Fracture Toughness and CGR testing of alloys X-750 and XM-19 (EPRI Phase I)
DOE Office of Scientific and Technical Information (OSTI.GOV)
J. H. Jackson; S. P. Teysseyre
2012-10-01
The Advanced Test Reactor National Scientific User Facility (ATR NSUF) and Electric Power Research Institute (EPRI) formed an agreement to test representative alloys used as reactor structural materials as a pilot program toward establishing guidelines for future ATR NSUF research programs. This report contains results from the portion of this program established as Phase I (of three phases) that entails baseline fracture toughness, stress corrosion cracking (SCC), and tensile testing of selected materials for comparison to similar tests conducted at GE Global Research. The intent of this Phase I research program is to determine baseline properties for the materials of interest prior to irradiation, and to ensure comparability between laboratories using similar testing techniques, prior to applying these techniques to the same materials after having been irradiated at the Advanced Test Reactor (ATR). The materials chosen for this research are the nickel based super alloy X-750, and nitrogen strengthened austenitic stainless steel XM-19. A spare core shroud upper support bracket of alloy X-750 was purchased by EPRI from Southern Co. and a section of XM-19 plate was purchased by EPRI from GE-Hitachi. These materials were sectioned at GE Global Research and provided to INL.
NASA Astrophysics Data System (ADS)
Batubara, I.; Suparto, I. H.; Wulandari, N. S.
2017-03-01
Guava leaves contain various compounds that have biological activity, such as kaempferol and quercetin as anticancer agents. Twelve extraction techniques were performed to obtain the best extraction technique to isolate kaempferol and quercetin from the guava leaves. Toxicity of the extracts was tested against Artemia salina larvae. All extracts were toxic (LC50 value less than 1000 ppm) except the extract from direct soxhletation of guava leaves and the extracts from sonication and soxhletation using n-hexane. The extract with a high content of total phenols and total flavonoids, a low content of tannins, and an intense spot color on the thin-layer chromatogram was selected for high performance liquid chromatography analysis. Direct sonication of guava leaves was chosen as the best extraction technique, with kaempferol and quercetin contents of 0.02% and 2.15%, respectively. In addition to the high content of kaempferol and quercetin, direct sonication was chosen due to the shortest extraction time, fewer impurities and high toxicity.
A leakage-free resonance sparse decomposition technique for bearing fault detection in gearboxes
NASA Astrophysics Data System (ADS)
Osman, Shazali; Wang, Wilson
2018-03-01
Most rotating machinery deficiencies are related to defects in rolling element bearings. Reliable bearing fault detection still remains a challenging task, especially for bearings in gearboxes as bearing-defect-related features are nonstationary and modulated by gear mesh vibration. A new leakage-free resonance sparse decomposition (LRSD) technique is proposed in this paper for early bearing fault detection of gearboxes. In the proposed LRSD technique, a leakage-free filter is suggested to remove strong gear mesh and shaft running signatures. A kurtosis and cosine distance measure is suggested to select appropriate redundancy r and quality factor Q. The signal residual is processed by signal sparse decomposition for highpass and lowpass resonance analysis to extract representative features for bearing fault detection. The effectiveness of the proposed technique is verified by a succession of experimental tests corresponding to different gearbox and bearing conditions.
Managing distribution changes in time series prediction
NASA Astrophysics Data System (ADS)
Matias, J. M.; Gonzalez-Manteiga, W.; Taboada, J.; Ordonez, C.
2006-07-01
When a problem is modeled statistically, a single distribution model is usually postulated that is assumed to be valid for the entire space. Nonetheless, this practice may be somewhat unrealistic in certain application areas, in which the conditions of the process that generates the data may change; as far as we are aware, however, no techniques have been developed to tackle this problem. This article proposes a technique for modeling and predicting this change in time series with a view to improving estimates and predictions. The technique is applied, among other models, to the recently proposed hypernormal distribution. When tested on real data from a range of stock market indices, the technique produces better results than when a single distribution model is assumed to be valid for the entire period of time studied. Moreover, when a global model is postulated, it is highly recommended to select the hypernormal distribution parameter in the same likelihood maximization process.
Liquid electrolyte informatics using an exhaustive search with linear regression.
Sodeyama, Keitaro; Igarashi, Yasuhiko; Nakayama, Tomofumi; Tateyama, Yoshitaka; Okada, Masato
2018-06-14
Exploring new liquid electrolyte materials is a fundamental target for developing new high-performance lithium-ion batteries. In contrast to solid materials, disordered liquid solution properties have been less studied by data-driven information techniques. Here, we examined the estimation accuracy and efficiency of three information techniques, multiple linear regression (MLR), least absolute shrinkage and selection operator (LASSO), and exhaustive search with linear regression (ES-LiR), by using coordination energy and melting point as test liquid properties. We then confirmed that ES-LiR gives the most accurate estimation among the techniques. We also found that ES-LiR can provide the relationship between the "prediction accuracy" and "calculation cost" of the properties via a weight diagram of descriptors. This technique makes it possible to choose the balance of the "accuracy" and "cost" when the search of a huge amount of new materials was carried out.
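A minimal sketch of the exhaustive-search idea behind ES-LiR: fit ordinary least squares on every non-empty subset of descriptors and rank the subsets by leave-one-out cross-validation error, which exposes the prediction-accuracy-versus-descriptor-set trade-off the authors visualize with weight diagrams. All names and data here are illustrative, not from the paper.

```python
import numpy as np
from itertools import combinations

def loocv_error(X, y):
    """Leave-one-out cross-validation MSE for OLS with an intercept."""
    n = len(y)
    errs = []
    for i in range(n):
        mask = np.arange(n) != i
        A = np.column_stack([np.ones(n - 1), X[mask]])
        beta, *_ = np.linalg.lstsq(A, y[mask], rcond=None)
        pred = np.concatenate(([1.0], X[i])) @ beta
        errs.append((pred - y[i]) ** 2)
    return float(np.mean(errs))

def es_lir(X, y, names):
    """Score every descriptor subset; return (error, subset) pairs, best first."""
    results = []
    for r in range(1, X.shape[1] + 1):
        for subset in combinations(range(X.shape[1]), r):
            results.append((loocv_error(X[:, list(subset)], y),
                            [names[j] for j in subset]))
    return sorted(results)
```

With d descriptors there are 2^d - 1 subsets, so the exhaustive search is feasible only for modest d, which matches the "accuracy versus calculation cost" framing in the abstract.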
Blondeel, Evelyne; Depuydt, Veerle; Cornelis, Jasper; Chys, Michael; Verliefde, Arne; Van Hulle, Stijin Wim Henk
2015-01-01
Pilot-scale optimisation of different possible physical-chemical water treatment techniques was performed on the wastewater originating from three different recovery and recycling companies in order to select a (combination of) technique(s) for further full-scale implementation. This implementation is necessary to reduce the concentrations of both common pollutants (such as COD, nutrients and suspended solids) and potentially toxic metals, polyaromatic hydrocarbons and polychlorinated biphenyls, which are frequently above the discharge limits. The pilot-scale tests (at 250 L h(-1) scale) demonstrate that sand-anthracite filtration and coagulation/flocculation are interesting as first treatment techniques, with removal efficiencies of about 19% to 66% (sand-anthracite filtration) and 18% to 60% (coagulation/flocculation) for the above-mentioned pollutants (metals, polyaromatic hydrocarbons and polychlorinated biphenyls). If a second treatment step is required, the implementation of an activated carbon filter is recommended (about 46% to 86% additional removal is obtained).
Dual-band frequency selective surface with large band separation and stable performance
NASA Astrophysics Data System (ADS)
Zhou, Hang; Qu, Shao-Bo; Peng, Wei-Dong; Lin, Bao-Qin; Wang, Jia-Fu; Ma, Hua; Zhang, Jie-Qiu; Bai, Peng; Wang, Xu-Hua; Xu, Zhuo
2012-05-01
A new technique of designing a dual-band frequency selective surface with large band separation is presented. This technique is based on a delicately designed topology of L- and Ku-band microwave filters. The two band-pass responses are generated by a capacitively-loaded square-loop frequency selective surface and an aperture-coupled frequency selective surface, respectively. A Faraday cage is located between the two frequency selective surface structures to eliminate undesired couplings. Based on this technique, a dual-band frequency selective surface with large band separation is designed, which possesses large band separation, high selectivity, and stable performance under various incident angles and different polarizations.
Genome-wide selection components analysis in a fish with male pregnancy.
Flanagan, Sarah P; Jones, Adam G
2017-04-01
A major goal of evolutionary biology is to identify the genome-level targets of natural and sexual selection. With the advent of next-generation sequencing, whole-genome selection components analysis provides a promising avenue in the search for loci affected by selection in nature. Here, we implement a genome-wide selection components analysis in the sex role reversed Gulf pipefish, Syngnathus scovelli. Our approach involves a double-digest restriction-site associated DNA sequencing (ddRAD-seq) technique, applied to adult females, nonpregnant males, pregnant males, and their offspring. An FST comparison of allele frequencies among these groups reveals 47 genomic regions putatively experiencing sexual selection, as well as 468 regions showing a signature of differential viability selection between males and females. A complementary likelihood ratio test identifies similar patterns in the data as the FST analysis. Sexual selection and viability selection both tend to favor the rare alleles in the population. Ultimately, we conclude that genome-wide selection components analysis can be a useful tool to complement other approaches in the effort to pinpoint genome-level targets of selection in the wild. © 2017 The Author(s). Evolution © 2017 The Society for the Study of Evolution.
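For intuition, the group comparison of allele frequencies can be sketched with a textbook biallelic F_ST (G_ST-style) estimator for two equal-size groups; this is a simplification for illustration and not necessarily the estimator used in the study.

```python
def fst_biallelic(p1, p2):
    """F_ST = (H_T - H_S) / H_T for one biallelic locus, two equal-size groups.

    p1, p2: frequency of the same allele in each group.
    """
    p_bar = (p1 + p2) / 2.0
    h_t = 2.0 * p_bar * (1.0 - p_bar)                    # expected total heterozygosity
    h_s = (2.0 * p1 * (1.0 - p1) + 2.0 * p2 * (1.0 - p2)) / 2.0  # mean within-group value
    return 0.0 if h_t == 0.0 else (h_t - h_s) / h_t
```

Loci whose F_ST between groups (e.g. pregnant males versus offspring) stands out from the genome-wide background are the candidates flagged as experiencing selection.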
Towards a framework for testing general relativity with extreme-mass-ratio-inspiral observations
NASA Astrophysics Data System (ADS)
Chua, A. J. K.; Hee, S.; Handley, W. J.; Higson, E.; Moore, C. J.; Gair, J. R.; Hobson, M. P.; Lasenby, A. N.
2018-07-01
Extreme-mass-ratio-inspiral observations from future space-based gravitational-wave detectors such as LISA will enable strong-field tests of general relativity with unprecedented precision, but at prohibitive computational cost if existing statistical techniques are used. In one such test that is currently employed for LIGO black hole binary mergers, generic deviations from relativity are represented by N deformation parameters in a generalized waveform model; the Bayesian evidence for each of its 2N combinatorial submodels is then combined into a posterior odds ratio for modified gravity over relativity in a null-hypothesis test. We adapt and apply this test to a generalized model for extreme-mass-ratio inspirals constructed on deformed black hole spacetimes, and focus our investigation on how computational efficiency can be increased through an evidence-free method of model selection. This method is akin to the algorithm known as product-space Markov chain Monte Carlo, but uses nested sampling and improved error estimates from a rethreading technique. We perform benchmarking and robustness checks for the method, and find order-of-magnitude computational gains over regular nested sampling in the case of synthetic data generated from the null model.
Soluble Antigen Fluorescent-Antibody Technique
Toussaint, Andre J.; Anderson, Robert I.
1965-01-01
An indirect fluorescent-antibody (FA) procedure employing soluble antigen fixed onto an artificial matrix was developed, and a mechanical means for reading of test results was devised. The method employs two small cellulose acetate paper discs for each test. One disc contains soluble antigen diluted in 1% bovine serum albumin (BSA); the other contains only 1% BSA and serves as a control. After testing by the indirect FA procedure, the results of the tests are read on a fluorometer fitted with a paper chromatogram door. The instrument is set at zero with the control disc as a blank, and the specific fluorescence of the antigen disc is determined. Findings obtained with homologous and heterologous antisera indicated that the method yields excellent results. The soluble antigen fluorescent-antibody technique has definite advantages over the conventional indirect FA procedures. (i) The investigator may objectively select the antigen to be employed. (ii) It is possible to obtain objective mechanical reading of test results rather than the highly subjective readings required by conventional methods. (iii) The system compensates for any nonspecific fluorescence contributed either by the serum (e.g., drugs) or by free fluorescein in the conjugated antiserum. PMID:14339261
NASA Technical Reports Server (NTRS)
Wolfer, B. M.
1977-01-01
Features basic to the integrated utility system, such as solid waste incineration, heat recovery and usage, and water recycling/treatment, are compared in terms of cost, fuel conservation, and efficiency to conventional utility systems in the same mean-climatic area of Washington, D. C. The larger of the two apartment complexes selected for the test showed the more favorable results in the three areas of comparison. Restrictions concerning the sole use of currently available technology are hypothetically removed to consider the introduction and possible advantages of certain advanced techniques in an integrated utility system; recommendations are made and costs are estimated for each type of system.
NASA Astrophysics Data System (ADS)
Fetsco, Sara Elizabeth
There are several topics that introductory physics students typically have difficulty understanding. The purpose of this thesis is to investigate if multiple instructional techniques will help students to better understand and retain the material. The three units analyzed in this study are graphing motion, projectile motion, and conservation of momentum. For each unit students were taught using new or altered instructional methods including online laboratory simulations, inquiry labs, and interactive demonstrations. Additionally, traditional instructional methods such as lecture and problem sets were retained. Effectiveness was measured through pre- and post-tests and student opinion surveys. Results suggest that incorporating multiple instructional techniques into teaching will improve student understanding and retention. Students stated that they learned well from all of the instructional methods used except the online simulations.
Pseudo-random number generator for the Sigma 5 computer
NASA Technical Reports Server (NTRS)
Carroll, S. N.
1983-01-01
A technique is presented for developing a pseudo-random number generator based on the linear congruential form. The two numbers used for the generator are a prime number and a corresponding primitive root, where the prime is the largest prime number that can be accurately represented on a particular computer. The primitive root is selected by applying Marsaglia's lattice test. The technique presented was applied to write a random number program for the Sigma 5 computer. The new program, named S:RANDOM1, is judged to be superior to the older program named S:RANDOM. For applications requiring several independent random number generators, a table is included showing several acceptable primitive roots. The technique and programs described can be applied to any computer having word length different from that of the Sigma 5.
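The generator described above is the multiplicative congruential form x_{k+1} = (a * x_k) mod m with m prime and a a primitive root of m. The sketch below uses a small prime and a brute-force primitivity check for illustration; the Sigma 5 program would use the largest representable prime, and the report selects roots via Marsaglia's lattice test rather than this check.

```python
def is_primitive_root(a, m):
    """Brute-force check (fine for small prime m): does a generate 1..m-1?"""
    seen, x = set(), 1
    for _ in range(m - 1):
        x = (x * a) % m
        seen.add(x)
    return len(seen) == m - 1

def lcg(seed, a, m):
    """Multiplicative congruential generator yielding uniforms in (0, 1)."""
    x = seed
    while True:
        x = (a * x) % m
        yield x / m
```

With m = 7 and a = 3 the generator visits every nonzero residue before repeating; a full-size example of such a prime/primitive-root pair is the classic m = 2**31 - 1 with a = 16807.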
NASA Technical Reports Server (NTRS)
Tedesco, Marco; Kim, Edward J.
2005-01-01
In this paper, GA-based techniques are used to invert the equations of an electromagnetic model based on Dense Medium Radiative Transfer Theory (DMRT) under the Quasi Crystalline Approximation with Coherent Potential to retrieve snow depth, mean grain size and fractional volume from microwave brightness temperatures. The technique is initially tested on both noisy and not-noisy simulated data. During this phase, different configurations of genetic algorithm parameters are considered to quantify how their change can affect the algorithm performance. A configuration of GA parameters is then selected and the algorithm is applied to experimental data acquired during the NASA Cold Land Process Experiment. Snow parameters retrieved with the GA-DMRT technique are then compared with snow parameters measured in the field.
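A toy sketch of the GA inversion loop: candidate parameter vectors (snow depth, grain size, fractional volume) evolve to minimize the misfit between modeled and observed brightness temperatures. The linear "forward model" below is a stand-in for DMRT, and all GA settings (population size, elitism, mutation scale) are invented for illustration.

```python
import random

def forward(params):
    """Toy linear forward model standing in for DMRT (illustrative only)."""
    depth, grain, frac = params
    return (240.0 - 0.1 * depth + 20.0 * grain - 30.0 * frac,
            250.0 - 0.05 * depth + 10.0 * grain - 20.0 * frac)

def misfit(params, observed):
    """Sum of squared differences between modeled and observed temperatures."""
    return sum((m - o) ** 2 for m, o in zip(forward(params), observed))

def ga_retrieve(observed, bounds, pop=50, gens=200, seed=0):
    rng = random.Random(seed)
    popn = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop)]
    for _ in range(gens):
        popn.sort(key=lambda p: misfit(p, observed))
        elite = popn[:pop // 2]                           # keep the better half
        children = []
        while len(children) < pop - len(elite):
            a, b = rng.sample(elite, 2)
            child = [(x + y) / 2.0 for x, y in zip(a, b)]  # average crossover
            k = rng.randrange(len(child))                  # mutate one gene
            lo, hi = bounds[k]
            child[k] = min(hi, max(lo, child[k] + rng.gauss(0.0, 0.05 * (hi - lo))))
            children.append(child)
        popn = elite + children
    return min(popn, key=lambda p: misfit(p, observed))
```

Varying the population size, elitism fraction, and mutation scale is the kind of GA-parameter study the abstract describes before applying the retrieval to field data.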
Search for Superconductivity in Micrometeorites
Guénon, S.; Ramírez, J. G.; Basaran, Ali C.; Wampler, J.; Thiemens, M.; Taylor, S.; Schuller, Ivan K.
2014-01-01
We have developed a very sensitive, highly selective, non-destructive technique for screening inhomogeneous materials for the presence of superconductivity. This technique, based on phase-sensitive detection of microwave absorption, is capable of detecting 10^-12 cc of a superconductor embedded in a non-superconducting, non-magnetic matrix. For the first time, we apply this technique to the search for superconductivity in extraterrestrial samples. We tested approximately 65 micrometeorites collected from the water well at the Amundsen-Scott South pole station and compared their spectra with those of eight reference materials. None of these micrometeorites contained superconducting compounds, but we saw the Verwey transition of magnetite in our microwave system. This demonstrates that we are able to detect electro-magnetic phase transitions in extraterrestrial materials at cryogenic temperatures. PMID:25476841
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shedlock, Daniel; Dugan, Edward T.; Jacobs, Alan M.
X-ray backscatter radiography by selective detection (RSD) is a field tested and innovative approach to non-destructive evaluation (NDE). RSD is an enhanced single-side x-ray Compton backscatter imaging (CBI) technique which selectively detects scatter components to improve image contrast and quality. Scatter component selection is accomplished through a set of specially designed detectors with fixed and movable collimators. Experimental results have shown that this NDE technique can be used to detect boric acid deposition on a metallic plate through steel foil reflective insulation commonly covering reactor pressure vessels. The current system is capable of detecting boric acid deposits with sub-millimeter resolution through such insulating materials. Industrial systems have been built for Lockheed Martin Space Co. and NASA. Currently the x-ray backscatter RSD scanning systems developed by the University of Florida are being used to inspect the spray-on foam insulation (SOFI) used on the external tank of the space shuttle. RSD inspection techniques have found subsurface cracking in the SOFI thought to be responsible for the foam debris which separated from the external tank during the last shuttle launch. These industrial scanning systems can be customized for many applications, and a smaller, lighter, more compact unit design is being developed. The smaller design is approximately four inches wide, three inches high, and about 12 inches in length. This smaller RSD system can be used for NDE of areas that cannot be reached with larger equipment. X-ray backscatter RSD is a proven technology that has been tested on a wide variety of materials and applications. Currently the system has been used to inspect materials such as aluminum, plastics, honeycomb laminates, reinforced carbon composites, steel, and titanium.
The focus of RSD is one-sided detection for applications where conventional non-destructive examination methods either will not work or give poor results. Acquired images have clearly shown, for a variety of conditions, that proper selection of x-ray field scatter components leads to a significant improvement in image quality and contrast. Improvements are significant enough in some cases that objects not visible to conventional CBI or transmission radiography become readily discernable with RSD. (authors)
Development of materials for the rapid manufacture of die cast tooling
NASA Astrophysics Data System (ADS)
Hardro, Peter Jason
The focus of this research is to develop a material composition that can be processed by rapid prototyping (RP) in order to produce tooling for the die casting process. Where these rapidly produced tools will be superior to traditional tooling production methods by offering one or more of the following advantages: reduced tooling cost, shortened tooling creation time, reduced man-hours for tool creation, increased tool life, and shortened die casting cycle time. By utilizing RP's additive build process and vast material selection, there was a prospect that die cast tooling may be produced quicker and with superior material properties. To this end, the material properties that influence die life and cycle time were determined, and a list of materials that fulfill these "optimal" properties were highlighted. Physical testing was conducted in order to grade the processability of each of the material systems and to optimize the manufacturing process for the downselected material system. Sample specimens were produced and microscopy techniques were utilized to determine a number of physical properties of the material system. Additionally, a benchmark geometry was selected and die casting dies were produced from traditional tool materials (H13 steel) and techniques (machining) and from the newly developed materials and RP techniques (selective laser sintering (SLS) and laser engineered net shaping (LENS)). Once the tools were created, a die cast alloy was selected and a preset number of parts were shot into each tool. During tool creation, the manufacturing time and cost was closely monitored and an economic model was developed to compare traditional tooling to RP tooling. This model allows one to determine, in the early design stages, when it is advantageous to implement RP tooling and when traditional tooling would be best. 
The results of the physical testing and economic analysis have shown that RP tooling is able to achieve a number of the research objectives, namely, to reduce tooling cost, shorten tooling creation time, and reduce the man-hours needed for tool creation. Identifying the appropriate time to use RP tooling, however, appears to be the most important aspect in achieving successful implementation.
NASA Technical Reports Server (NTRS)
Kattan, Michael W.; Hess, Kenneth R.
1998-01-01
New computationally intensive tools for medical survival analyses include recursive partitioning (also called CART) and artificial neural networks. A challenge that remains is to better understand the behavior of these techniques in an effort to know when they will be effective tools. Theoretically, they may overcome limitations of the traditional multivariable survival technique, the Cox proportional hazards regression model. Experiments were designed to test whether the new tools would, in practice, overcome these limitations. Two datasets in which theory suggests CART and the neural network should outperform the Cox model were selected. The first was a published leukemia dataset manipulated to have a strong interaction that CART should detect. The second was a published cirrhosis dataset with pronounced nonlinear effects that a neural network should fit. Repeated sampling of 50 training and testing subsets was applied to each technique. The concordance index C was calculated as a measure of predictive accuracy by each technique on the testing dataset. In the interaction dataset, CART outperformed Cox (P < 0.05) with a C improvement of 0.1 (95% CI, 0.08 to 0.12). In the nonlinear dataset, the neural network outperformed the Cox model (P < 0.05), but by a very slight amount (0.015). As predicted by theory, CART and the neural network were able to overcome limitations of the Cox model. Experiments like these are important to increase our understanding of when one of these new techniques will outperform the standard Cox model. Further research is necessary to predict which technique will do best a priori and to assess the magnitude of superiority.
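The concordance index C used above to compare techniques can be illustrated with a small sketch. This is a simplified, uncensored version of Harrell's C (real survival data would also need handling of censored observations); the function name and the toy data are invented for the demo:

```python
import itertools

def concordance_index(times, predictions):
    """Harrell's C, uncensored sketch: the fraction of usable pairs in
    which the subject with the shorter survival time was assigned the
    higher predicted risk. Tied predictions count as half-concordant."""
    concordant, usable = 0.0, 0
    for (t1, p1), (t2, p2) in itertools.combinations(zip(times, predictions), 2):
        if t1 == t2:
            continue  # pairs with tied survival times are not usable here
        usable += 1
        if p1 == p2:
            concordant += 0.5
        elif (t1 < t2) == (p1 > p2):
            concordant += 1
    return concordant / usable

# Perfect risk ranking gives C = 1.0; random ranking hovers near 0.5.
print(concordance_index([2, 5, 9], [0.9, 0.5, 0.1]))  # → 1.0
```

A C of 0.5 corresponds to chance-level discrimination, which is why the reported improvements of 0.1 and 0.015 over the Cox model are judged on very different scales of practical importance.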
Investigation of thiol derivatized gold nanoparticle sensors for gas analysis
NASA Astrophysics Data System (ADS)
Stephens, Jared S.
Analysis of volatile organic compounds (VOCs) in air and exhaled breath by sensor array is a very useful testing technique. It can provide non-invasive, fast, inexpensive testing for many diseases. Breath analysis has been very successful in identifying cancer and other diseases by using a chemiresistor sensor or array with gold nanoparticles to detect biomarkers. Acetone is a biomarker for diabetes, and a portable testing device could help to monitor diabetic and therapeutic progress. An advantage of this testing method is that it is conducted at room temperature instead of at 200 degrees Celsius [3]. The objective of this research is to determine the effect of thiol-derivatized gold nanoparticles on the sensors' detection of VOCs. The VOCs to be tested are acetone, ethanol, and a mixture of acetone and ethanol. Each chip is tested under all three VOCs and three concentration levels (0.1, 1, and 5.0 ppm). VOC samples are used to test the sensors' ability to detect and differentiate VOCs. Sensors (also referred to as chips) are prepared using several types of thiol-derivatized gold nanoparticles. The factors are the thiol compound and the molar volume loading of the thiol in synthesis. The average resistance results are used to determine the VOC selectivity of the sensors tested. The results show a trend of increasing resistance as VOC concentration is increased relative to dry air, which is used as the baseline for VOCs. Several sensors show a high selectivity to one or more VOCs. Overall, the 57-micromole 4-methoxy-toluenethiol sensor shows the strongest selectivity for the VOCs tested. [3] Gerfen, Kurt. 2012. Detection of Acetone in Air Using Silver Ion Exchanged ZSM-5 and Zinc Oxide Sensing Films. Master of Science thesis, University of Louisville.
Duarte, Carlos; Núñez, Víctor; Wong, Yat; Vivar, Carlos; Benites, Elder; Rodriguez, Urso; Vergara, Carlos; Ponce, Jorge
2017-12-01
In assisted reproduction procedures, we need to develop and enhance new protocols to optimize sperm selection. The aim of this study is to evaluate the ability of the Z potential technique to select sperm with intact DNA in non-normospermic patients and to evaluate the impact of this selection on embryonic development. We analyzed a total of 174 human seminal samples with at least one altered parameter. We measured the DNA fragmentation index at baseline, after density gradients, and after density gradients + Z potential. To evaluate the impact of this technique on embryo development, 54 cases were selected. The embryo development parameters evaluated were fertilization rate, cleavage rate, top-quality embryos at the third day and blastocyst rate. We found significant differences in the study groups when we compared the sperm fragmentation index after adding the Z potential technique to density gradient selection vs. density gradients alone. Furthermore, there was no significant difference in the embryo development parameters between the low sperm fragmentation index group and the moderate and high sperm fragmentation index groups when sperm were selected with this new technique. The Z potential technique is a very useful tool for sperm selection; it significantly reduces the DNA fragmentation index and improves the parameters of embryo development. This technique could be considered routine for its simplicity and low cost.
Kimmel, April D.; Losina, Elena; Freedberg, Kenneth A.; Goldie, Sue J.
2006-01-01
We conducted a systematic review on the performance of diagnostic tests for clinical and laboratory monitoring of HIV-infected adults in developing countries. Diagnostic test information collected from computerized databases, bibliographies and the Internet was categorized as clinical (non-laboratory patient information), immunologic (information from immunologic laboratory tests), or virologic (information from virologic laboratory tests). Of the 51 studies selected for the review, 28 assessed immunologic tests, 12 virologic tests and seven clinical and immunologic tests. Methods of performance evaluation were primarily sensitivity and specificity for the clinical category and correlation coefficients for the immunologic and virologic categories. In the clinical category, the majority of test performance measures were reported as >70% sensitive and >65% specific. In the immunologic category, correlation coefficients ranged from r=0.54 to r=0.99 for different CD4 count enumeration techniques, while correlation for CD4 and total lymphocyte counts was between r=0.23 and r=0.74. In the virologic category, correlation coefficients for different human immunodeficiency virus (HIV) ribonucleic acid (RNA) quantification techniques ranged from r=0.54 to r=0.90. Future research requires consensus on designing studies, and on collecting and reporting data useful for decision-makers. We recommend classifying information into clinically relevant categories, using a consistent definition of disease across studies and providing measures of both association and accuracy. PMID:16878233
Brunelli, Alessandro; Charloux, Anne; Bolliger, Chris T; Rocco, Gaetano; Sculier, Jean-Paul; Varela, Gonzalo; Licker, Marc; Ferguson, Mark K; Faivre-Finn, Corinne; Huber, Rudolf Maria; Clini, Enrico M; Win, Thida; De Ruysscher, Dirk; Goldman, Lee
2009-07-01
The European Respiratory Society (ERS) and the European Society of Thoracic Surgeons (ESTS) established a joint task force with the purpose of developing clinical evidence-based guidelines on the evaluation of fitness for radical therapy in patients with lung cancer. The following topics were discussed, and are summarized in the final report along with graded recommendations: cardiologic evaluation before lung resection; lung function tests and exercise tests (limitations of ppoFEV1; DLCO: systematic or selective?; split function studies; exercise tests: systematic; low-tech exercise tests; cardiopulmonary (high-tech) exercise tests); future trends in preoperative work-up; physiotherapy/rehabilitation and smoking cessation; scoring systems; advanced care management (ICU/HDU); quality of life in patients submitted to radical treatment; combined cancer surgery and lung volume reduction surgery; compromised parenchymal sparing resections and minimally invasive techniques: the balance between oncological radicality and functional reserve; neoadjuvant chemotherapy and complications; definitive chemo- and radiotherapy: functional selection criteria and definition of risk; should surgical criteria be re-calibrated for radiotherapy?; the patient at prohibitive surgical risk: alternatives to surgery; who should treat thoracic patients, and where should these patients be treated?
Palomo, R; Casals-Coll, M; Sánchez-Benavides, G; Quintana, M; Manero, R M; Rognoni, T; Calvo, L; Aranciva, F; Tamayo, F; Peña-Casanova, J
2013-05-01
The Rey-Osterrieth Complex Figure (ROCF) and the Free and Cued Selective Reminding Test (FCSRT) are widely used in clinical practice. The ROCF assesses visual perception, constructional praxis, and visuo-spatial memory. The FCSRT assesses verbal learning and memory. In this study, as part of the Spanish normative studies project in young adults (NEURONORMA young adults), we present age- and education-adjusted normative data for both tests obtained by using linear regression techniques. The sample consisted of 179 healthy participants ranging in age from 18 to 49 years. We provide tables for converting raw scores to scaled scores in addition to tables with scores adjusted by socio-demographic factors. The results showed that education affects scores for some of the memory tests and the figure-copying task. Age was only found to have an effect on the performance of visuo-spatial memory tests, and the effect of sex was negligible. The normative data obtained will be extremely useful in the clinical neuropsychological evaluation of young Spanish adults. Copyright © 2011 Sociedad Española de Neurología. Published by Elsevier Espana. All rights reserved.
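The regression-based normative adjustment described above can be sketched as follows. All slopes, noise levels and simulated scores below are invented for illustration; the actual NEURONORMA regression equations and scaled-score tables are in the paper itself:

```python
import numpy as np

# Hypothetical illustration of a regression-based normative adjustment:
# regress raw test score on age and education in the normative sample,
# then express a new subject's score as a standardized residual.
rng = np.random.default_rng(0)
n = 179                                   # sample size, as in the study
age = rng.uniform(18, 49, n)
education = rng.uniform(8, 20, n)
# Simulated raw scores with an assumed education effect and small age effect
raw = 30 + 0.8 * education - 0.05 * age + rng.normal(0, 3, n)

X = np.column_stack([np.ones(n), age, education])
beta, *_ = np.linalg.lstsq(X, raw, rcond=None)
residual_sd = np.std(raw - X @ beta, ddof=X.shape[1])

def adjusted_score(raw_score, subj_age, subj_edu):
    """Z-residual of a raw score relative to its demographic expectation."""
    expected = beta[0] + beta[1] * subj_age + beta[2] * subj_edu
    return (raw_score - expected) / residual_sd
```

A subject scoring exactly at the demographic expectation gets an adjusted score of zero; clinical interpretation would then use cut-offs on this standardized scale.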
Dynamics and ethics of comprehensive preimplantation genetic testing: a review of the challenges.
Hens, Kristien; Dondorp, Wybo; Handyside, Alan H; Harper, Joyce; Newson, Ainsley J; Pennings, Guido; Rehmann-Sutter, Christoph; de Wert, Guido
2013-01-01
Genetic testing of preimplantation embryos has been used for preimplantation genetic diagnosis (PGD) and preimplantation genetic screening (PGS). Microarray technology is being introduced in both these contexts, and whole genome sequencing of blastomeres is also expected to become possible soon. The amount of extra information such tests will yield may prove to be beneficial for embryo selection, but will also raise various ethical issues. We present an overview of the developments and an agenda-setting exploration of the ethical issues. The paper is a joint endeavour by the presenters at an explorative 'campus meeting' organized by the European Society of Human Reproduction and Embryology in cooperation with the department of Health, Ethics & Society of Maastricht University (The Netherlands). The increasing amount and detail of information that new screening techniques such as microarrays and whole genome sequencing offer does not automatically coincide with an increasing understanding of the prospects of an embryo. From a technical point of view, the future of comprehensive embryo testing may go together with developments in preconception carrier screening. From an ethical point of view, the increasing complexity and amount of information yielded by comprehensive testing techniques will lead to challenges to the principle of reproductive autonomy and the right of the child to an open future, and may imply a larger responsibility of the clinician towards the welfare of the future child. Combinations of preconception carrier testing and embryo testing may solve some of these ethical questions but could introduce others. As comprehensive testing techniques are entering the IVF clinic, there is a need for a thorough rethinking of traditional ethical paradigms regarding medically assisted reproduction.
PGS-FISH in reproductive medicine and perspective directions for improvement: a systematic review.
Zamora, Sandra; Clavero, Ana; Gonzalvo, M Carmen; de Dios Luna Del Castillo, Juan; Roldán-Nofuentes, Jose Antonio; Mozas, Juan; Castilla, Jose Antonio
2011-08-01
Embryo selection can be carried out via morphological criteria or by using genetic studies based on Preimplantation Genetic Screening. In the present study, we evaluate the clinical validity of Preimplantation Genetic Screening with fluorescence in situ hybridization (PGS-FISH) compared with morphological embryo criteria. A systematic review was made of the bibliography, with the following goals: firstly, to determine the prevalence of embryo chromosome alteration in clinical situations in which the PGS-FISH technique has been used; secondly, to calculate the statistics of diagnostic efficiency (negative Likelihood Ratio), using 2 × 2 tables, derived from PGS-FISH. The results obtained were compared with those obtained from embryo morphology. We calculated the probability of transferring at least one chromosome-normal embryo when it was selected using either morphological criteria or PGS-FISH, and considered what diagnostic performance should be expected of an embryo selection test with respect to achieving greater clinical validity than that obtained from embryo morphology. After an embryo morphology selection that produced a negative result (normal morphology), the likelihood of embryo aneuploidies was found to range from a pre-test value of 65% (prevalence of embryo chromosome alteration registered in all the study groups) to a post-test value of 55% (Confidence interval: 50-61), while after PGS-FISH with a negative result (euploid), the post-test probability was 42% (Confidence interval: 35-49) (p < 0.05). The probability of transferring at least one euploid embryo was the same whether 3 embryos were selected according to morphological criteria or whether 2, selected by PGS-FISH, were transferred. Any embryo selection test, if it is to provide greater clinical validity than embryo morphology, must present a LR-value of 0.40 (Confidence interval: 0.32-0.51) in single embryo transfer, and 0.06 (CI: 0.05-0.07) in double embryo transfer. 
With currently available technology, and taking into account the number of embryos to be transferred, the clinical validity of PGS-FISH, although superior to that of morphological criteria, does not appear to be clinically relevant.
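The move from a 65% pre-test probability to a 42% post-test probability after a negative PGS-FISH result follows from Bayes' theorem applied in odds form with the negative likelihood ratio. A minimal sketch; the LR value of 0.39 below is back-calculated for illustration, not a figure from the paper:

```python
def post_test_probability(pre_test_prob, likelihood_ratio):
    """Bayes' theorem in odds form: post-test odds = pre-test odds * LR."""
    pre_odds = pre_test_prob / (1 - pre_test_prob)
    post_odds = pre_odds * likelihood_ratio
    return post_odds / (1 + post_odds)

# A 65% pre-test prevalence of aneuploidy combined with an assumed
# negative LR of 0.39 lands near the reported 42% post-test probability:
print(round(post_test_probability(0.65, 0.39), 2))  # → 0.42
```

This is also why the paper can state the LR- a selection test must achieve (0.40 for single embryo transfer) to beat morphology: the target post-test probability fixes the required likelihood ratio once the prevalence is known.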
Mechanical behavior of precipitation hardenable steels exposed to highly corrosive environment
NASA Technical Reports Server (NTRS)
Rosa, Ferdinand
1994-01-01
Unexpected occurrences of failures, due to stress corrosion cracking (SCC) of structural components, indicate a need for improved characterization of materials and more advanced analytical procedures for reliably predicting structural performance. Accordingly, the purpose of this study was to determine the stress corrosion susceptibility of 15-5 PH steel over a wide range of applied strain rates in a highly corrosive environment. The selected environment for this investigation was a 3.5 percent NaCl aqueous solution. The material selected for the study was 15-5 PH steel in the H900 condition. The Slow Strain Rate technique was used to test the metallic specimens.
Radiosurgery in the Management of Intractable Mesial Temporal Lobe Epilepsy.
Peñagarícano, José; Serletis, Demitre
2015-09-01
Mesial temporal lobe epilepsy (MTLE) describes recurrent seizure activity originating from the depths of the temporal lobe. MTLE patients who fail two trials of medication warrant evaluation for surgical candidacy at an epilepsy center. For these individuals, temporal lobectomy offers the greatest likelihood of seizure freedom (up to 80-90%); unfortunately, this procedure remains largely underutilized. Moreover, for select patients unable to tolerate open surgery, novel techniques are emerging for selective ablation of the mesial temporal structures, including stereotactic radiosurgery (SRS). We present here a review of SRS as a potential therapy for MTLE when open surgery is not an option.
Energy Harvesting from Salinity Gradient
NASA Astrophysics Data System (ADS)
Muhthassim, B.; Thian, X. K.; Hasan, K. N. Md
2018-04-01
Energy harvesting from salt water has received attention since the 1970s, but owing to varying interest in the field and the growing potential of other more promising sources, more work is required to fully establish it. This paper aims at identifying existing techniques of energy harvesting and determining an effective technique for small-scale applications. The capacitive deionization technique, which involves an electrochemical reaction, was chosen for further analysis. An experiment was conducted to analyze factors affecting its performance, including the electrode and the electrolyte. Electrode combinations of carbon/aluminium, copper/aluminium and carbon/copper were selected and tested with different concentrations of salty water. From the experiment, copper and aluminium electrodes were found to be the most effective. A DC-DC boost converter was used to step up the voltage. A physical implementation of the circuit was built and tested, in which an input voltage of 1.022 V was boosted to 1.255 V. The efficiency of the boost converter was 38.17% based on the input power and output power obtained.
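The reported converter efficiency is simply output power over input power. A small sketch of the calculation; the currents below are assumed for the demo, since the abstract reports only the two voltages and the 38.17% figure:

```python
def boost_efficiency(v_in, i_in, v_out, i_out):
    """Converter efficiency in percent: output power over input power."""
    return (v_out * i_out) / (v_in * i_in) * 100.0

# Illustrative values only: the paper reports 1.022 V stepped up to
# 1.255 V at 38.17% efficiency; the currents here are chosen to match.
print(round(boost_efficiency(1.022, 0.1, 1.255, 0.03108), 2))  # → 38.17
```

A low figure like this is typical for boost converters operated far below their rated input voltage, where switching and conduction losses dominate the small input power.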
A processing centre for the CNES CE-GPS experimentation
NASA Technical Reports Server (NTRS)
Suard, Norbert; Durand, Jean-Claude
1994-01-01
CNES is involved in a GPS (Global Positioning System) geostationary overlay experimentation. The purpose of this experimentation is to test various new techniques in order to select the optimal station synchronization method, as well as the geostationary spacecraft orbitography method. These new techniques are needed to develop the Ranging GPS Integrity Channel services. The CNES experimentation includes three transmitting/receiving ground stations (manufactured by IN-SNEC), one INMARSAT 2 C/L band transponder and a processing center named STE (Station de Traitements de l'Experimentation). Not all the techniques to be tested are implemented, but the experimental system has to include several functions: part of the future system simulation functions, such as a servo-loop function, and in particular a data collection function providing for rapid monitoring of system operation, analysis of existing ground station processes, and several weeks of data coverage for other scientific studies. This paper discusses the system architecture and some criteria used in its design, as well as the monitoring function, the approach used to develop a low-cost, short-lifetime processing center in collaboration with a CNES sub-contractor (ATTDATAID), and some results.
Optimising rigid motion compensation for small animal brain PET imaging
NASA Astrophysics Data System (ADS)
Spangler-Bickell, Matthew G.; Zhou, Lin; Kyme, Andre Z.; De Laat, Bart; Fulton, Roger R.; Nuyts, Johan
2016-10-01
Motion compensation (MC) in PET brain imaging of awake small animals is attracting increased attention in preclinical studies since it avoids the confounding effects of anaesthesia and enables behavioural tests during the scan. A popular MC technique is to use multiple external cameras to track the motion of the animal’s head, which is assumed to be represented by the motion of a marker attached to its forehead. In this study we have explored several methods to improve the experimental setup and the reconstruction procedures of this method: optimising the camera-marker separation; improving the temporal synchronisation between the motion tracker measurements and the list-mode stream; post-acquisition smoothing and interpolation of the motion data; and list-mode reconstruction with appropriately selected subsets. These techniques have been tested and verified on measurements of a moving resolution phantom and brain scans of an awake rat. The proposed techniques improved the reconstructed spatial resolution of the phantom by 27% and of the rat brain by 14%. We suggest a set of optimal parameter values to use for awake animal PET studies and discuss the relative significance of each parameter choice.
Development of systems and techniques for landing an aircraft using onboard television
NASA Technical Reports Server (NTRS)
Gee, S. W.; Carr, P. C.; Winter, W. R.; Manke, J. A.
1978-01-01
A flight program was conducted to develop a landing technique with which a pilot could consistently and safely land a remotely piloted research vehicle (RPRV) without outside visual reference except through television. Otherwise, instrumentation was standard. Such factors as the selection of video parameters, the pilot's understanding of the television presentation, the pilot's ground cockpit environment, and the operational procedures for landing were considered. About 30 landings were necessary for a pilot to become sufficiently familiar and competent with the test aircraft to make powered approaches and landings with outside visual references only through television. When steep approaches and landings were made by remote control, the pilot's workload was extremely high. The test aircraft was used as a simulator for the F-15 RPRV, and as such was considered to be essential to the success of landing the F-15 RPRV.
Wavelet regression model in forecasting crude oil price
NASA Astrophysics Data System (ADS)
Hamid, Mohd Helmie; Shabri, Ani
2017-05-01
This study presents the performance of the wavelet multiple linear regression (WMLR) technique in daily crude oil forecasting. The WMLR model was developed by integrating the discrete wavelet transform (DWT) and the multiple linear regression (MLR) model. The original time series was decomposed into sub-time series at different scales by wavelet theory. Correlation analysis was conducted to assist in the selection of optimal decomposed components as inputs for the WMLR model. The daily WTI crude oil price series was used in this study to test the prediction capability of the proposed model. The forecasting performance of the WMLR model was also compared with regular multiple linear regression (MLR), Autoregressive Integrated Moving Average (ARIMA) and Generalized Autoregressive Conditional Heteroscedasticity (GARCH) models using root mean square error (RMSE) and mean absolute error (MAE). Based on the experimental results, it appears that the WMLR model performs better than the other forecasting techniques tested in this study.
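The WMLR idea, decomposing the series with a DWT and regressing on the sub-series, can be sketched with a hand-rolled one-level Haar transform. The abstract does not specify the mother wavelet or the regression inputs; Haar and a one-step lag are used here only to keep the sketch self-contained:

```python
import numpy as np

def haar_dwt(x):
    """One-level Haar DWT: approximation and detail coefficients."""
    x = np.asarray(x, dtype=float)
    approx = (x[0::2] + x[1::2]) / np.sqrt(2)
    detail = (x[0::2] - x[1::2]) / np.sqrt(2)
    return approx, detail

def fit_wmlr(price, lag=1):
    """Regress next-day price on lagged wavelet sub-series (illustrative)."""
    approx, detail = haar_dwt(price)
    # Upsample the half-length sub-series back to the original length
    a = np.repeat(approx, 2)[:len(price)]
    d = np.repeat(detail, 2)[:len(price)]
    X = np.column_stack([np.ones(len(price) - lag), a[:-lag], d[:-lag]])
    y = np.asarray(price, dtype=float)[lag:]
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta

# Synthetic "crude oil" series just to exercise the pipeline
t = np.arange(64)
price = 50 + 5 * np.sin(t / 6) + np.random.default_rng(1).normal(0, 0.5, 64)
beta = fit_wmlr(price)
print(beta.shape)  # → (3,)
```

In the paper's setup the decomposed components are screened by correlation analysis before entering the regression; here both sub-series are used directly for brevity.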
Deeb, Omar; Shaik, Basheerulla; Agrawal, Vijay K
2014-10-01
Quantitative Structure-Activity Relationship (QSAR) models for the binding affinity constants (log Ki) of 78 flavonoid ligands towards the benzodiazepine site of the GABA(A) receptor complex were calculated using the machine learning methods artificial neural network (ANN) and support vector machine (SVM). The models obtained were compared with those obtained using multiple linear regression (MLR) analysis. The descriptor selection and model building were performed with 10-fold cross-validation using the training data set. The SVM and MLR coefficient of determination values are 0.944 and 0.879, respectively, for the training set and are higher than those of the ANN models. Though the SVM model shows better fitting of the training set, the ANN model was superior to SVM and MLR in predicting the test set. A randomization test was employed to check the suitability of the models.
Bruno, William; Martinuzzi, Claudia; Andreotti, Virginia; Pastorino, Lorenza; Spagnolo, Francesco; Dalmasso, Bruna; Cabiddu, Francesco; Gualco, Marina; Ballestrero, Alberto; Bianchi-Scarrà, Giovanna; Queirolo, Paola
2017-01-01
Finding the best technique to identify BRAF mutations with high sensitivity and specificity is mandatory for accurate patient selection for target therapy. BRAF mutation frequency ranges from 40 to 60% depending on melanoma clinical characteristics and the detection technique used. Intertumoral heterogeneity could lead to misinterpretation of BRAF mutational status; this is especially important if testing is performed on primary specimens, when metastatic lesions are unavailable. The aim of this study was to identify the best combination of methods for detecting BRAF mutations (among peptide nucleic acid (PNA) clamping real-time PCR, immunohistochemistry and capillary sequencing) and to investigate BRAF mutation heterogeneity in a series of 100 primary melanomas and a subset of 25 matched metastatic samples. Overall, we obtained a BRAF mutation frequency of 62%, based on the combination of at least two techniques. Concordance between mutation status in primary and metastatic tumor was good but not complete (67%) when agreement of at least two techniques was considered. Next generation sequencing was used to quantify the threshold of detected mutant alleles in discordant samples. Combining different methods excludes that the observed heterogeneity is technique-based. We propose an algorithm for BRAF mutation testing based on agreement between immunohistochemistry and PNA; a third molecular method could be added in case of discordant results. Testing the primary tumor when the metastatic sample is unavailable is a good option if at least two methods of detection are used; however, the presence of intertumoral heterogeneity or the occurrence of additional primaries should be carefully considered. PMID:28039443
NASA Astrophysics Data System (ADS)
Hespel, Camille; Blaisot, Jean-Bernard; Gazon, Matthieu; Godard, Gilles
2012-07-01
The characterization of diesel jets in the near field of the nozzle exit still presents challenges for experimenters. Detailed velocity measurements are needed to characterize diesel injector performance and also to establish boundary conditions for CFD codes. The present article examines the efficiency of laser correlation velocimetry (LCV) applied to diesel spray characterization. A new optical configuration based on a long-distance microscope was tested, and special care was taken to examine the spatial selectivity of the technique. Results show that the depth of the measurement volume (along the laser beam) of LCV extends beyond the depth of field of the imaging setup. The LCV results were also found to be particularly sensitive to high-speed elements of a spray. Results from high-pressure diesel jets in a back-pressure environment indicate that this technique is particularly suited to the very near field of the nozzle exit, where the flow is the narrowest and where the velocity distribution is not too large. It is also shown that the performance of the LCV technique is controlled by the filtering and windowing parameters used in the processing of the raw signals.
Garcia-Allende, P Beatriz; Mirapeix, Jesus; Conde, Olga M; Cobo, Adolfo; Lopez-Higuera, Jose M
2009-01-01
Plasma optical spectroscopy is widely employed in on-line welding diagnostics. The determination of the plasma electron temperature, which is typically selected as the output monitoring parameter, implies the identification of the atomic emission lines. As a consequence, additional processing stages are required, with a direct impact on the real-time performance of the technique. The line-to-continuum method is a feasible alternative spectroscopic approach, and it is particularly interesting in terms of its computational efficiency. However, the monitoring signal depends strongly on the chosen emission line. In this paper, a feature selection methodology is proposed to resolve the uncertainty regarding the selection of the optimum spectral band, which allows the employment of the line-to-continuum method for on-line welding diagnostics. Field tests have been conducted to demonstrate the feasibility of the solution.
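The line-to-continuum method reduces to a ratio of integrated intensities in two spectral bands, which is what makes it computationally cheap compared with fitting individual emission lines. A minimal sketch on a synthetic spectrum; the band limits, line position and spectral shape below are invented for the demo:

```python
import numpy as np

def line_to_continuum(wavelengths, intensities, line_band, continuum_band):
    """Ratio of integrated line emission to nearby continuum emission.
    Band limits are hypothetical; real limits depend on the chosen line."""
    wl = np.asarray(wavelengths)
    inten = np.asarray(intensities)
    line = inten[(wl >= line_band[0]) & (wl <= line_band[1])].sum()
    cont = inten[(wl >= continuum_band[0]) & (wl <= continuum_band[1])].sum()
    return line / cont

# Synthetic spectrum: flat continuum plus one Gaussian emission line
wl = np.linspace(500, 520, 400)
spec = 1.0 + 8.0 * np.exp(-((wl - 510) ** 2) / 0.05)
ratio = line_to_continuum(wl, spec, (509.5, 510.5), (515.0, 516.0))
print(ratio > 1)  # → True
```

The feature selection problem the paper addresses is precisely the choice of `line_band`: the monitoring signal changes with the emission line chosen, so the band must be optimized rather than picked by hand.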
Aeroelastic Model Structure Computation for Envelope Expansion
NASA Technical Reports Server (NTRS)
Kukreja, Sunil L.
2007-01-01
Structure detection is a procedure for selecting a subset of candidate terms, from a full model description, that best describes the observed output. This is a necessary procedure to compute an efficient system description which may afford greater insight into the functionality of the system or a simpler controller design. Structure computation as a tool for black-box modelling may be of critical importance in the development of robust, parsimonious models for the flight-test community. Moreover, this approach may lead to efficient strategies for rapid envelope expansion which may save significant development time and costs. In this study, a least absolute shrinkage and selection operator (LASSO) technique is investigated for computing efficient model descriptions of nonlinear aeroelastic systems. The LASSO minimises the residual sum of squares with the addition of an l1 penalty term on the parameter vector of the traditional l2 minimisation problem. Its use for structure detection is a natural extension of this constrained minimisation approach to pseudolinear regression problems which produces some model parameters that are exactly zero and, therefore, yields a parsimonious system description. Applicability of this technique for model structure computation for the F/A-18 Active Aeroelastic Wing using flight test data is shown for several flight conditions (Mach numbers) by identifying a parsimonious system description with a high percent fit for cross-validated data.
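The key property exploited for structure detection, that the l1 penalty drives weak parameters exactly to zero, can be shown with a small coordinate-descent LASSO. This is a generic implementation, not the aeroelastic model from the paper; the candidate-term matrix, true coefficients and penalty weight are all invented:

```python
import numpy as np

def soft_threshold(z, t):
    """Soft-thresholding operator: the proximal map of the l1 penalty."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_cd(X, y, lam, n_iter=500):
    """LASSO by cyclic coordinate descent: minimises
    0.5*||y - X b||^2 + lam*||b||_1, zeroing out weak candidate terms."""
    n, p = X.shape
    b = np.zeros(p)
    col_norm2 = (X ** 2).sum(axis=0)
    for _ in range(n_iter):
        for j in range(p):
            r_j = y - X @ b + X[:, j] * b[j]  # residual excluding term j
            b[j] = soft_threshold(X[:, j] @ r_j, lam) / col_norm2[j]
    return b

# Six candidate terms; only the first two truly enter the model
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 6))
y = 3.0 * X[:, 0] - 2.0 * X[:, 1] + rng.normal(0, 0.1, 200)
b = lasso_cd(X, y, lam=20.0)
print(np.nonzero(np.abs(b) > 1e-8)[0])  # spurious terms are exactly zero
```

Unlike ridge regression, which only shrinks coefficients, the soft-threshold step sets small coefficients to exactly zero, which is what makes the LASSO usable as a structure-detection tool rather than just a regularizer.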
Using deep learning for detecting gender in adult chest radiographs
NASA Astrophysics Data System (ADS)
Xue, Zhiyun; Antani, Sameer; Long, L. Rodney; Thoma, George R.
2018-03-01
In this paper, we present a method for automatically identifying the gender of an imaged person using their frontal chest x-ray images. Our work is motivated by the need to determine missing gender information in some datasets. The proposed method employs the technique of convolutional neural network (CNN) based deep learning and transfer learning to overcome the challenge of developing handcrafted features from limited data. Specifically, the method consists of four main steps: pre-processing, CNN feature extraction, feature selection, and classification. The method is tested on a combined dataset obtained from several sources with varying acquisition quality, with different pre-processing steps applied for each. For feature extraction, we tested and compared four CNN architectures, viz., AlexNet, VggNet, GoogLeNet, and ResNet. We applied a feature selection technique, since the feature length is larger than the number of images. Two popular classifiers, SVM and Random Forest, are used and compared. We evaluated the classification performance by cross-validation and used seven performance measures. The best performer is the VggNet-16 feature extractor with the SVM classifier, with an accuracy of 86.6% and an ROC area of 0.932 for 5-fold cross-validation. We also discuss several misclassified cases and describe future work for performance improvement.
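The situation described above, a feature length larger than the number of images, is exactly where a filter-style feature selection step helps before classification. A sketch using a Fisher-score ranking on simulated descriptors; the paper's actual selection technique is not detailed in the abstract, and the data, dimensions and signal placement below are all synthetic:

```python
import numpy as np

rng = np.random.default_rng(0)
# Stand-in for CNN descriptors: 120 "images", 4096-D features, so the
# feature length exceeds the number of images, as in the paper's setting
n, p = 120, 4096
gender = rng.integers(0, 2, n)            # hypothetical binary labels
X = rng.normal(size=(n, p))
X[:, :10] += gender[:, None] * 2.5        # only 10 features carry signal

def fisher_score(X, y):
    """Per-feature Fisher score: between-class over within-class spread."""
    m0, m1 = X[y == 0].mean(0), X[y == 1].mean(0)
    v0, v1 = X[y == 0].var(0), X[y == 1].var(0)
    return (m0 - m1) ** 2 / (v0 + v1 + 1e-12)

# Keep only the top-ranked features before handing off to a classifier
top = np.argsort(fisher_score(X, gender))[::-1][:10]
print(sorted(int(i) for i in top))  # the informative features dominate
```

The retained columns would then feed an SVM or Random Forest, as in the paper; reducing 4096 features to a short ranked list both speeds up training and reduces overfitting when images are scarce.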
A new approach for categorizing pig lying behaviour based on a Delaunay triangulation method.
Nasirahmadi, A; Hensel, O; Edwards, S A; Sturm, B
2017-01-01
Machine vision-based monitoring of pig lying behaviour is a fast and non-intrusive approach that could be used to improve animal health and welfare. Four pens with 22 pigs in each were selected at a commercial pig farm and monitored for 15 days using top view cameras. Three thermal categories were selected relative to room setpoint temperature. An image processing technique based on Delaunay triangulation (DT) was utilized. Different lying patterns (close, normal and far) were defined regarding the perimeter of each DT triangle and the percentages of each lying pattern were obtained in each thermal category. A method using a multilayer perceptron (MLP) neural network, to automatically classify group lying behaviour of pigs into three thermal categories, was developed and tested for its feasibility. The DT features (mean value of perimeters, maximum and minimum length of sides of triangles) were calculated as inputs for the MLP classifier. The network was trained, validated and tested and the results revealed that MLP could classify lying features into the three thermal categories with high overall accuracy (95.6%). The technique indicates that a combination of image processing, MLP classification and mathematical modelling can be used as a precise method for quantifying pig lying behaviour in welfare investigations.
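The perimeter-based classification of Delaunay triangles can be sketched with scipy. The point coordinates stand in for pig body centroids in one frame, and the perimeter thresholds are invented for the demo (the paper derives its categories from the data and room setpoint temperature):

```python
import numpy as np
from scipy.spatial import Delaunay

# Hypothetical pig centroids in one video frame (pixel coordinates)
points = np.array([[10, 10], [40, 12], [25, 35],
                   [70, 40], [55, 70], [15, 60]], dtype=float)
tri = Delaunay(points)

def triangle_perimeters(points, simplices):
    """Perimeter of each Delaunay triangle from its three side lengths."""
    perims = []
    for a, b, c in simplices:
        pa, pb, pc = points[a], points[b], points[c]
        perims.append(np.linalg.norm(pa - pb)
                      + np.linalg.norm(pb - pc)
                      + np.linalg.norm(pc - pa))
    return np.array(perims)

perims = triangle_perimeters(points, tri.simplices)
# Label each triangle close/normal/far by thresholding its perimeter
# (threshold values here are arbitrary for the sketch)
labels = np.where(perims < 80, "close",
                  np.where(perims < 120, "normal", "far"))
print(len(labels) == len(tri.simplices))  # → True
```

The percentages of each label per frame are the kind of features that, together with mean, maximum and minimum side lengths, would feed the MLP classifier described above.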
NASA Technical Reports Server (NTRS)
Mungas, Greg S.; Gursel, Yekta; Sepulveda, Cesar A.; Anderson, Mark; La Baw, Clayton; Johnson, Kenneth R.; Deans, Matthew; Beegle, Luther; Boynton, John
2008-01-01
Conducting high resolution field microscopy with coupled laser spectroscopy that can be used to selectively analyze the surface chemistry of individual pixels in a scene is an enabling capability for next generation robotic and manned spaceflight missions, civil, and military applications. In the laboratory, we use a range of imaging and surface preparation tools that provide us with in-focus images, context imaging for identifying features that we want to investigate at high magnification, and surface-optical coupling that allows us to apply optical spectroscopic analysis techniques for analyzing surface chemistry, particularly at high magnifications. The camera, hand lens, and microscope probe with scannable laser spectroscopy (CHAMP-SLS) is an imaging/spectroscopy instrument capable of imaging continuously from infinity down to high resolution microscopy (resolution of approx. 1 micron/pixel in a final camera format); the closer CHAMP-SLS is placed to a feature, the higher the resulting magnification. At hand lens to microscopic magnifications, the imaged scene can be selectively interrogated with point spectroscopic techniques such as Raman spectroscopy, microscopic laser induced breakdown spectroscopy (micro-LIBS), laser ablation mass spectrometry, fluorescence spectroscopy, and/or reflectance spectroscopy. This paper summarizes the optical design, development, and testing of the CHAMP-SLS optics.
Study of guided wave transmission through complex junction in sodium cooled reactor
DOE Office of Scientific and Technical Information (OSTI.GOV)
Elie, Q.; Le Bourdais, F.; Jezzine, K.
2015-07-01
Ultrasonic guided wave techniques are seen as suitable candidates for the inspection of welded structures within sodium cooled fast reactors (SFR), as the long range propagation of guided waves without amplitude attenuation can overcome the accessibility problem posed by the liquid sodium. In the context of the development of the Advanced Sodium Test Reactor for Industrial Demonstration (ASTRID), the French Atomic Energy Commission (CEA) investigates non-destructive testing techniques based on guided wave propagation. In this work, guided wave NDT methods are applied to control the integrity of welds located in a junction-type structure welded to the main vessel. The method presented in this paper is based on the analysis of scattering matrices peculiar to each expected defect, and takes advantage of the multi-modal and dispersive characteristics of guided wave generation. In a simulation study, an algorithm developed using the CIVA software is presented. It permits selecting appropriate incident modes to optimize detection and identification of expected flawed configurations. In the second part of this paper, experimental results corresponding to a first validation step of the simulation results are presented. The goal of the experiments is to estimate the effectiveness of the incident mode selection in plates. The results show good agreement between experiment and simulation. (authors)
Balogun, Anthony Gbenro; Balogun, Shyngle Kolawole; Onyencho, Chidi Victor
2017-02-13
This study investigated the moderating role of achievement motivation in the relationship between test anxiety and academic performance. Three hundred and ninety-three participants (192 males and 201 females), selected from a public university in Ondo State, Nigeria using a purposive sampling technique, participated in the study. They responded to measures of test anxiety and achievement motivation. Three hypotheses were tested using moderated hierarchical multiple regression analysis. Results showed that test anxiety had a negative impact on academic performance (β = -.23; p < .05). Achievement motivation had a positive impact on academic performance (β = .38; p < .05). Also, achievement motivation significantly moderated the relationship between test anxiety and academic performance (β = .10; p < .01). These findings suggest that university management should design appropriate psycho-educational interventions to enhance students' achievement motivation.
Machine learning search for variable stars
NASA Astrophysics Data System (ADS)
Pashchenko, Ilya N.; Sokolovsky, Kirill V.; Gavras, Panagiotis
2018-04-01
Photometric variability detection is often considered as a hypothesis testing problem: an object is variable if the null hypothesis that its brightness is constant can be ruled out given the measurements and their uncertainties. The practical applicability of this approach is limited by uncorrected systematic errors. We propose a new variability detection technique sensitive to a wide range of variability types while being robust to outliers and underestimated measurement uncertainties. We consider variability detection as a classification problem that can be approached with machine learning. Logistic Regression (LR), Support Vector Machines (SVM), k Nearest Neighbours (kNN), Neural Nets (NN), Random Forests (RF), and Stochastic Gradient Boosting classifier (SGB) are applied to 18 features (variability indices) quantifying scatter and/or correlation between points in a light curve. We use a subset of Optical Gravitational Lensing Experiment phase two (OGLE-II) Large Magellanic Cloud (LMC) photometry (30 265 light curves) that was searched for variability using traditional methods (168 known variable objects) as the training set and then apply the NN to a new test set of 31 798 OGLE-II LMC light curves. Among 205 candidates selected in the test set, 178 are real variables, while 13 low-amplitude variables are new discoveries. The machine learning classifiers considered are found to be more efficient (select more variables and fewer false candidates) compared to traditional techniques using individual variability indices or their linear combination. The NN, SGB, SVM, and RF show a higher efficiency compared to LR and kNN.
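A minimal example of one scatter-based variability index of the kind fed to the classifiers is the reduced chi-squared of a light curve against the constant-brightness null hypothesis (the paper uses 18 such indices; this is just one illustrative case):

```python
def reduced_chi2(mags, errs):
    """Reduced chi-squared of a light curve against the constant-brightness
    (null) hypothesis -- a classic scatter-based variability index.

    mags: magnitude measurements; errs: their reported uncertainties.
    """
    w = [1.0 / e ** 2 for e in errs]
    mean = sum(wi * m for wi, m in zip(w, mags)) / sum(w)
    chi2 = sum(((m - mean) / e) ** 2 for m, e in zip(mags, errs))
    return chi2 / (len(mags) - 1)

# A flat light curve scatters at the level of its errors (chi2_red ~ 1),
# while a genuinely variable one far exceeds it:
flat = reduced_chi2([15.00, 15.01, 14.99, 15.00], [0.01] * 4)
variable = reduced_chi2([15.0, 15.3, 14.7, 15.0], [0.01] * 4)
print(flat < 5 < variable)  # True
```

The point of the paper is precisely that thresholding any one such index is fragile under systematics, which motivates combining many indices in a trained classifier.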
Diabetic microangiopathy in capillaroscopic examination of juveniles with diabetes type 1.
Kaminska-Winciorek, Grażyna; Deja, Grażyna; Polańska, Joanna; Jarosz-Chobot, Przemysława
2012-01-30
The aim of this work was a quantitative and qualitative assessment of a selected part of the microcirculation in children with diabetes type 1 using videocapillaroscopy technique. The authors tested a group consisting of 145 children (70 boys, 75 girls) diagnosed and treated for diabetes type 1 in the Diabetic Clinic of GCZD in Katowice for at least one year. The study included history, clinical examination (including dermatological examination) and videocapillaroscopy. Capillaroscopy, a non-invasive, painless and easily repeatable test, was performed using videocapillaroscopy with digital storage of the obtained images. All nailfolds were examined in all children using videocapillaroscopy, and the obtained images were assessed quantitatively and qualitatively for changes in capillary loops in the tested children according to the defined diagnostic procedure. The analysis of capillaroscopic images described selected quantitative and qualitative characteristics. The conducted analysis showed an increase in the number of capillaries and their elongation, the presence of megacapillaries and Raynaud loops, which were accompanied by an intensive red background, indicating possible neoangiogenesis. The increase in the number of capillaries, disturbances in distribution of capillaries and the presence of abnormal capillaries were correlated with the longer duration of diabetes. Raynaud loops were more frequently found in the cases of increased mean values of HbA1c. Higher values of HbA1c influenced the capillaroscopic images, mainly the number of vessels, including Raynaud loops. Videocapillaroscopy technique could be a useful tool to detect the early changes of microangiopathy in children with diabetes type 1.
History-based route selection for reactive ad hoc routing protocols
NASA Astrophysics Data System (ADS)
Medidi, Sirisha; Cappetto, Peter
2007-04-01
Ad hoc networks rely on cooperation in order to operate, but in a resource constrained environment not all nodes behave altruistically. Selfish nodes preserve their own resources and do not forward packets that are not in their own self interest. These nodes degrade the performance of the network, but judicious route selection can help maintain performance despite this behavior. Many route selection algorithms place importance on the shortness of a route rather than its reliability. We introduce a light-weight route selection algorithm that uses past behavior, rather than solely the length of the route, to judge the quality of a route. It draws information from the underlying routing layer at no extra cost and selects routes with a simple algorithm. The technique maintains its data in a small table, which does not place a high demand on memory. History-based route selection's minimalism suits the needs of portable wireless devices and is easy to implement. We implemented our algorithm and tested it in the ns2 environment. Our simulation results show that history-based route selection achieves higher packet delivery rates and better stability than its length-based counterpart.
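The history-based idea can be sketched as a small table mapping routes to past delivery outcomes, with selection preferring reliability over shortness. This is an illustrative reconstruction under assumed tie-breaking rules, not the authors' exact algorithm:

```python
class RouteHistory:
    """Tiny history table: pick the known route with the best past delivery
    ratio, falling back to the shortest route when no history exists."""

    def __init__(self):
        self.stats = {}  # route (tuple of node ids) -> [delivered, attempted]

    def record(self, route, delivered):
        d, a = self.stats.setdefault(tuple(route), [0, 0])
        self.stats[tuple(route)] = [d + (1 if delivered else 0), a + 1]

    def select(self, candidates):
        def quality(route):
            d, a = self.stats.get(tuple(route), (0, 0))
            ratio = d / a if a else 0.0
            return (ratio, -len(route))  # reliability first, then shortness
        return max(candidates, key=quality)

h = RouteHistory()
short, long_ = ["A", "X", "B"], ["A", "Y", "Z", "B"]
for ok in (True, False, False):   # the short route drops 2 of 3 packets
    h.record(short, ok)
for ok in (True, True, True):     # the longer route delivers reliably
    h.record(long_, ok)
print(h.select([short, long_]))   # -> ['A', 'Y', 'Z', 'B']
```

With an empty table the length tie-breaker makes this degenerate to shortest-path selection, which matches the "at no extra cost" spirit of drawing on routing-layer data only as it accumulates.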
Artificial intelligence techniques for embryo and oocyte classification.
Manna, Claudio; Nanni, Loris; Lumini, Alessandra; Pappalardo, Sebastiana
2013-01-01
One of the most relevant aspects in assisted reproduction technology is the possibility of characterizing and identifying the most viable oocytes or embryos. In most cases, embryologists select them by visual examination and their evaluation is totally subjective. Recently, due to the rapid growth in the capacity to extract texture descriptors from a given image, a growing interest has been shown in the use of artificial intelligence methods for embryo or oocyte scoring/selection in IVF programmes. This work concentrates the efforts on the possible prediction of the quality of embryos and oocytes in order to improve the performance of assisted reproduction technology, starting from their images. The artificial intelligence system proposed in this work is based on a set of Levenberg-Marquardt neural networks trained using textural descriptors (the local binary patterns). The proposed system was tested on two data sets of 269 oocytes and 269 corresponding embryos from 104 women and compared with other machine learning methods already proposed in the past for similar classification problems. Although the results are only preliminary, they show an interesting classification performance. This technique may be of particular interest in those countries where legislation restricts embryo selection. Copyright © 2012 Reproductive Healthcare Ltd. Published by Elsevier Ltd. All rights reserved.
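The textural descriptor named in the abstract, the local binary pattern, is straightforward to sketch: each pixel is encoded by thresholding its 8 neighbours against the centre value. A minimal version follows (the bit ordering is one common convention; implementations vary):

```python
def lbp_code(img, r, c):
    """8-neighbour local binary pattern code for pixel (r, c).

    Neighbours are read clockwise from the top-left; each contributes one bit
    (set when the neighbour >= the centre), giving a code in 0..255.
    img is a 2D list of intensity values.
    """
    centre = img[r][c]
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    code = 0
    for bit, (dr, dc) in enumerate(offsets):
        if img[r + dr][c + dc] >= centre:
            code |= 1 << bit
    return code

# 3x3 patch with a bright top row and a dark bottom row:
patch = [[9, 9, 9],
         [5, 5, 5],
         [1, 1, 1]]
print(lbp_code(patch, 1, 1))  # -> 143 (bits 0-3 and 7 set)
```

A histogram of these codes over an oocyte or embryo image then serves as the texture feature vector handed to the neural networks.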
[Patient satisfaction in a laboratory test collection unit].
de Moura, Gisela Maria Schebella Souto; Hilleshein, Eunice Fabiani; Schardosim, Juliana Machado; Delgado, Kátia Simone
2008-06-01
This exploratory descriptive study aimed at identifying customer satisfaction attributes in the field of laboratory tests. Data were collected in 2006 through 104 interviews in a laboratory unit of a teaching hospital, using the critical incident technique, and submitted to content analysis. Three attribute categories were identified: time spent waiting for care, interpersonal contact, and technical skills. These results inform the assessment of the current satisfaction survey tool and point to its reformulation. They also allow the identification of improvement needs in customer attention, and provide elements to be taken into account in personnel selection, training programs, and personnel performance assessment.
NASA Technical Reports Server (NTRS)
Pearce, W. E.
1982-01-01
An evaluation was made of laminar flow control (LFC) system concepts for subsonic commercial transport aircraft. Configuration design studies, performance analyses, fabrication development, structural testing, wind tunnel testing, and contamination-avoidance techniques were included. As a result of trade studies, a configuration with LFC on the upper wing surface only, utilizing an electron beam-perforated suction surface, and employing a retractable high-lift shield for contamination avoidance, was selected as the most practical LFC system. The LFC aircraft was then compared with an advanced turbulent aircraft designed for the same mission. This comparison indicated significant fuel savings.
NASA Astrophysics Data System (ADS)
Lü, Chengxu; Jiang, Xunpeng; Zhou, Xingfan; Zhang, Yinqiao; Zhang, Naiqian; Wei, Chongfeng; Mao, Wenhua
2017-10-01
Wet gluten is a useful quality indicator for wheat, and short wave near infrared spectroscopy (NIRS) is a high performance technique with the advantages of being economic, rapid and nondestructive. To study the feasibility of short wave NIRS for analyzing wet gluten directly from wheat seed, 54 representative wheat seed samples were collected and scanned by spectrometer. Eight spectral pretreatment methods and a genetic algorithm (GA) variable selection method were used to optimize the analysis. Both quantitative and qualitative models of wet gluten were built by partial least squares regression and discriminant analysis. For quantitative analysis, normalization was the optimal pretreatment method; 17 wet-gluten-sensitive variables were selected by GA, and the GA model performed better than the all-variable model, with R2V = 0.88 and RMSEV = 1.47. For qualitative analysis, automatic weighted least squares baseline correction was the optimal pretreatment method, and the all-variable models performed better than the GA models. The correct classification rates for the three classes of <24%, 24-30% and >30% wet gluten content were 95.45%, 84.52% and 90.00%, respectively. The short wave NIRS technique shows potential for both quantitative and qualitative analysis of wet gluten in wheat seed.
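The GA variable selection step can be sketched as a search over bit masks of spectral variables. The fitness below is a stand-in (real chemometric GA selection couples fitness to model error, e.g. cross-validated PLS RMSE), and all numbers are invented for illustration:

```python
import random

def ga_select(n_vars, fitness, pop=20, gens=40, p_mut=0.05, seed=1):
    """Toy genetic-algorithm variable selection: chromosomes are bit masks
    over the spectral variables; higher fitness means a better subset.

    A deliberately small sketch of the generic GA loop
    (tournament selection, one-point crossover, bit-flip mutation).
    """
    rng = random.Random(seed)
    population = [[rng.randint(0, 1) for _ in range(n_vars)]
                  for _ in range(pop - 1)]
    population.append([0] * n_vars)          # seed one empty mask
    best = max(population, key=fitness)
    for _ in range(gens):
        def pick():                          # tournament of size 2
            a, b = rng.sample(population, 2)
            return a if fitness(a) >= fitness(b) else b
        children = []
        while len(children) < pop:
            pa, pb = pick(), pick()
            cut = rng.randrange(1, n_vars)   # one-point crossover
            child = pa[:cut] + pb[cut:]
            for i in range(n_vars):          # bit-flip mutation
                if rng.random() < p_mut:
                    child[i] ^= 1
            children.append(child)
        population = children
        gen_best = max(population, key=fitness)
        if fitness(gen_best) > fitness(best):
            best = gen_best
    return best

# Hypothetical fitness: variables 3, 7 and 11 carry the signal; every extra
# variable costs a penalty (a crude proxy for overfitting).
INFORMATIVE = {3, 7, 11}
def fitness(mask):
    chosen = {i for i, b in enumerate(mask) if b}
    return 2 * len(chosen & INFORMATIVE) - len(chosen - INFORMATIVE)

best = ga_select(17, fitness)   # 17 variables, as in the abstract's GA model
print(sorted(i for i, b in enumerate(best) if b))
```

In the actual workflow, each chromosome would trigger a PLS fit on the selected wavelengths and the validation error would supply the fitness.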
NASA Astrophysics Data System (ADS)
Li, Xi; Lin, Qiu-han; Zhao, Xin-Ying; Han, Zhi-Wei; Wang, Bo-liang
2017-04-01
Thermal techniques (differential scanning calorimetry (DSC) and the vacuum stability test (VST)), according to STANAG 4147, and non-thermal techniques (Fourier transform infrared (FTIR) spectrometry and X-ray diffractometry (XRD)) were used to examine compatibility issues for 2,4,6,8,10,12-hexanitrohexaazaisowurtzitane (CL-20) with a selection of insensitive explosives, including nitroguanidine (NQ), 2,4,6-trinitrotoluene (TNT), 2,6-diamino-3,5-dinitropyridine-1-oxide (ANPyO), 2,4,6-triamino-1,3,5-trinitrobenzene (TATB), 3-nitro-1,2,4-triazol-5-one (NTO) and 2,6-diamino-3,5-dinitropyrazine-1-oxide (LLM-105). DSC measurements showed that ANPyO, TATB, NTO and LLM-105 were compatible with CL-20. The compatibility of CL-20/NQ, CL-20/TNT, CL-20/ANPyO, CL-20/TATB, CL-20/NTO and CL-20/LLM-105 mixtures was further explored using the VST, which revealed that all the selected insensitive explosives were compatible with CL-20. Possible chemical interactions were suspected for CL-20/TATB from the FTIR results and for CL-20/NTO from XRD analysis. In summary, ANPyO and LLM-105 demonstrated the optimal compatibility with CL-20.
Restrepo, S; Duque, M; Tohme, J; Verdier, V
1999-01-01
Xanthomonas axonopodis pv. manihotis (Xam) is the causative agent of cassava bacterial blight (CBB), a worldwide disease that is particularly destructive in South America and Africa. CBB is controlled essentially through the use of resistant varieties. To develop an appropriate disease management strategy, the genetic diversity of the pathogen's populations must be assessed. Until now, the genetic diversity of Xam was characterized by RFLP analyses using ribotyping, and plasmid and genomic Xam probes. We used AFLP (amplified fragment length polymorphism), a novel PCR-based technique, to characterize the genetic diversity of Colombian Xam isolates. Six Xam strains were tested with 65 AFLP primer combinations to identify the best selective primers. Eight primer combinations were selected according to their reproducibility, number of polymorphic bands and polymorphism detected between Xam strains. Forty-seven Xam strains, originating from different Colombian ecozones, were analysed with the selected combinations. Results obtained with AFLP are consistent with those obtained with RFLP, using plasmid DNA as a probe. Some primer combinations differentiated Xam strains that were not distinguished by RFLP analyses, thus AFLP fingerprinting allowed a better definition of the genetic relationships between Xam strains.
Propulsion Health Monitoring of a Turbine Engine Disk Using Spin Test Data
NASA Technical Reports Server (NTRS)
Abdul-Aziz, Ali; Woike, Mark R.; Oza, Nikunj; Matthews, Bryan; Baaklini, George Y.
2010-01-01
This paper considers data collected from an experimental study using high frequency capacitive sensor technology to capture blade tip clearance and tip timing measurements in a rotating turbine engine-like disk, in order to predict disk faults and assess structural integrity. The experimental results, collected at a range of rotational speeds from tests conducted at the NASA Glenn Research Center's Rotordynamics Laboratory, are evaluated using multiple data-driven anomaly detection techniques to identify abnormalities in the disk. Further, this study presents a select evaluation of an online health monitoring scheme for a rotating disk using high caliber sensors and tests the capability of the in-house spin system.
Design characteristics of a heat pipe test chamber
NASA Technical Reports Server (NTRS)
Baker, Karl W.; Jang, J. Hoon; Yu, Juin S.
1992-01-01
LeRC has designed a heat pipe test facility which will be used to provide data for validating heat pipe computer codes. A heat pipe test chamber that uses helium gas for enhancing heat transfer was investigated. The conceptual design employs the technique of guarded heating and guarded cooling to facilitate accurate measurements of heat transfer rates to the evaporator and from the condenser. The design parameters are selected for a baseline heat pipe made of stainless steel with an inner diameter of 38.10 mm and a wall thickness of 1.016 mm. The heat pipe operates at a design temperature of 1000 K with an evaporator radial heat flux of 53 W/sq. cm.
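As a quick consistency check on the stated design numbers, the heat input per unit evaporator length implied by the 53 W/sq. cm radial flux over the 38.10 mm inner bore can be worked out directly (the evaporator length itself is not given in the abstract, so only the per-centimetre figure is computed):

```python
import math

# Stated design values from the abstract:
d_inner_cm = 3.810   # 38.10 mm inner diameter, in cm
q_flux = 53.0        # evaporator radial heat flux, W/cm^2

# Heat input per unit evaporator length: flux times the inner circumference.
q_per_cm = q_flux * math.pi * d_inner_cm
print(round(q_per_cm))  # -> 634 W per cm of evaporator length
```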
Cazzoli, Riccardo; Buttitta, Fiamma; Di Nicola, Marta; Malatesta, Sara; Marchetti, Antonio; Pass, Harvey I.
2013-01-01
Introduction Lung cancer is currently the leading cause of mortality among tumor pathologies worldwide. There are no validated techniques for early detection of pulmonary cancer lesions other than low-dose helical CT scanning. Unfortunately, this method has some drawbacks. Recent studies have laid the basis for development of exosome-based techniques to screen/diagnose lung cancers. As the isolation of circulating exosomes is a minimally invasive procedure, this technique opens new possibilities for diagnostic applications. Methods We used a first set of 30 plasma samples from as many patients, including 10 patients affected by lung adenocarcinomas, 10 with lung granulomas and 10 healthy smokers matched for age and sex as negative controls. Wide-range microRNA analysis (742 microRNAs) was performed by quantitative RT-PCR. Data were compared by lesion characteristics using WEKA software for statistics and modeling. Subsequently, selected microRNAs were evaluated on an independent larger group of samples (105 specimens: 50 lung adenocarcinomas, 30 lung granulomas and 25 healthy smokers). Results This analysis led to the selection of 4 microRNAs for a screening test (miR-378a, miR-379, miR-139-5p and miR-200b-5p), used to divide the population into two groups: nodule (lung adenocarcinomas + carcinomas) and non-nodule (healthy former smokers). Six microRNAs (miR-151a-5p, miR-30a-3p, miR-200b-5p, miR-629, miR-100 and miR-154-3p) were selected for a second test on the "nodule" population to discriminate between lung adenocarcinoma and granuloma. Conclusions The screening test showed 97.5% sensitivity, 72.0% specificity and an AUC ROC of 90.8%. The diagnostic test showed 96.0% sensitivity, 60.0% specificity and an AUC ROC of 76.0%. Further evaluation is needed to confirm the predictive power of these models on larger cohorts of samples. PMID:23945385
Influence of different treatment techniques on radiation dose to the LAD coronary artery
Nieder, Carsten; Schill, Sabine; Kneschaurek, Peter; Molls, Michael
2007-01-01
Background The purpose of this proof-of-principle study was to test the ability of an intensity-modulated radiotherapy (IMRT) technique to reduce the radiation dose to the heart plus the left ventricle and a coronary artery. Radiation-induced heart disease might be a serious complication in long-term cancer survivors. Methods Planning CT scans from 6 female patients were available. They were part of a previous study of mediastinal IMRT for target volumes used in lymphoma treatment that included 8 patients, and represent all cases where the left anterior descending coronary artery (LAD) could be contoured. We compared 6 MV AP/PA opposed fields to a 3D conformal 4-field technique and an optimised 7-field step-and-shoot IMRT technique, and evaluated DVHs for several structures. The planning system was BrainSCAN 5.21 (BrainLAB, Heimstetten, Germany). Results IMRT maintained target volume coverage but achieved better dose reduction to the heart, left ventricle and LAD than the other techniques. Selective dose reduction could be accomplished, although not to the degree initially attempted. The median LAD dose was approximately 50% lower with IMRT. In 5 out of 6 patients, IMRT was the best technique with regard to heart sparing. Conclusion IMRT techniques are able to reduce the radiation dose to the heart. In addition to dose reduction to the whole heart, individualised dose distributions can be created which spare, e.g., one ventricle plus one of the coronary arteries. Certain patients with well-defined vessel pathology might profit from an approach of general heart sparing with further selective dose reduction, accounting for the individual aspects of pre-existing damage. PMID:17547777
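The DVHs used to compare the plans are simple to compute from sampled voxel doses. A minimal cumulative-DVH sketch follows, with invented toy doses for the LAD (not data from the study):

```python
def cumulative_dvh(doses, bin_edges):
    """Cumulative dose-volume histogram for one structure.

    doses: dose (Gy) sampled in each voxel of the structure.
    Returns, for each bin edge D, the fraction of the structure's volume
    receiving at least D (equal voxel volumes assumed).
    """
    n = len(doses)
    return [sum(1 for d in doses if d >= edge) / n for edge in bin_edges]

# Hypothetical LAD voxel doses under two plans (illustration only):
ap_pa = [30, 28, 25, 24, 20, 18]     # opposed fields: vessel in the beam
imrt  = [14, 12, 11, 10,  9,  8]     # modulated plan spares it
print(cumulative_dvh(ap_pa, [0, 10, 20, 30]))
print(cumulative_dvh(imrt, [0, 10, 20, 30]))
```

Plan comparison then reads directly off such curves, e.g. the toy AP/PA plan delivers at least 20 Gy to 5/6 of the structure, the toy IMRT plan to none of it.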
Kim, Seungjin; Krajmalnik-Brown, Rosa; Kim, Jong-Oh; Chung, Jinwook
2014-11-01
The application of effective remediation technologies can benefit from adequate preliminary testing, such as in lab-scale and pilot-scale systems. Bioremediation technologies have demonstrated tremendous potential with regard to cost, but they cannot be used for all contaminated sites due to limitations in biological activity. The purpose of this study was to develop a DNA diagnostic method that reduces the time needed to select contaminated sites that are good candidates for bioremediation. We applied an oligonucleotide microarray method to detect and monitor genes involved in aliphatic and aromatic degradation. Further, the bioremediation of a contaminated site, selected based on the results of the genetic diagnostic method, was achieved successfully by applying bioslurping in field tests. This gene-based diagnostic technique is a powerful tool to evaluate the potential for bioremediation of petroleum hydrocarbon contaminated soil. Copyright © 2014 Elsevier B.V. All rights reserved.
A support vector machine approach for classification of welding defects from ultrasonic signals
NASA Astrophysics Data System (ADS)
Chen, Yuan; Ma, Hong-Wei; Zhang, Guang-Ming
2014-07-01
Defect classification is an important issue in ultrasonic non-destructive evaluation. A layered multi-class support vector machine (LMSVM) classification system, which combines multiple SVM classifiers through a layered architecture, is proposed in this paper. The proposed LMSVM classification system is applied to the classification of welding defects from ultrasonic test signals. The measured ultrasonic defect echo signals are first decomposed into wavelet coefficients by the wavelet packet transform. The energy of the wavelet coefficients at different frequency channels are used to construct the feature vectors. The bees algorithm (BA) is then used for feature selection and SVM parameter optimisation for the LMSVM classification system. The BA-based feature selection optimises the energy feature vectors. The optimised feature vectors are input to the LMSVM classification system for training and testing. Experimental results of classifying welding defects demonstrate that the proposed technique is highly robust, precise and reliable for ultrasonic defect classification.
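The feature-extraction step, energies of wavelet packet channels, can be sketched with the Haar wavelet (the abstract does not state which wavelet family was used, so Haar here is purely illustrative):

```python
import math

def haar_step(signal):
    """One orthonormal Haar analysis step: (approximation, detail) at half length."""
    s = math.sqrt(2) / 2
    approx = [s * (signal[i] + signal[i + 1]) for i in range(0, len(signal), 2)]
    detail = [s * (signal[i] - signal[i + 1]) for i in range(0, len(signal), 2)]
    return approx, detail

def packet_energies(signal, levels):
    """Energy of every frequency channel of a full Haar wavelet packet tree.

    Each node is split into (approx, detail) at every level, so `levels`
    levels yield 2**levels channel energies -- the feature vector.
    """
    nodes = [signal]
    for _ in range(levels):
        nxt = []
        for node in nodes:
            a, d = haar_step(node)
            nxt.extend([a, d])
        nodes = nxt
    return [sum(x * x for x in node) for node in nodes]

# An 8-sample toy echo; 2 levels -> 4 channel energies as features.
echo = [0.0, 1.0, 0.0, -1.0, 0.5, 0.5, -0.5, -0.5]
features = packet_energies(echo, 2)
print([round(e, 3) for e in features])
```

Because the Haar step is orthonormal, the channel energies sum to the signal energy (Parseval), which makes them well-behaved inputs for the SVM layers and for the bees-algorithm feature selection.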
Autocorrelation of location estimates and the analysis of radiotracking data
Otis, D.L.; White, Gary C.
1999-01-01
The wildlife literature has been contradictory about the importance of autocorrelation in radiotracking data used for home range estimation and hypothesis tests of habitat selection. By definition, the concept of a home range involves autocorrelated movements, but estimates or hypothesis tests based on sampling designs that predefine a time frame of interest, and that generate representative samples of an animal's movement during this time frame, should not be affected by length of the sampling interval and autocorrelation. Intensive sampling of the individual's home range and habitat use during the time frame of the study leads to improved estimates for the individual, but use of location estimates as the sample unit to compare across animals is pseudoreplication. We therefore recommend against use of habitat selection analysis techniques that use locations instead of individuals as the sample unit. We offer a general outline for sampling designs for radiotracking studies.
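For a concrete sense of the autocorrelation at issue, the sample lag-1 autocorrelation of a coordinate series can be computed directly (a standard estimator; the example series are invented):

```python
def lag1_autocorrelation(xs):
    """Sample lag-1 autocorrelation of a sequence of location coordinates."""
    n = len(xs)
    mean = sum(xs) / n
    num = sum((xs[i] - mean) * (xs[i + 1] - mean) for i in range(n - 1))
    den = sum((x - mean) ** 2 for x in xs)
    return num / den

# Locations taken at short intervals barely move, so successive fixes are
# strongly correlated; widely spaced or oscillating fixes are not.
smooth = [0.0, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7]
alternating = [0.0, 1.0, 0.0, 1.0, 0.0, 1.0, 0.0, 1.0]
print(round(lag1_autocorrelation(smooth), 2))       # strongly positive
print(round(lag1_autocorrelation(alternating), 2))  # negative
```

The authors' point is that such autocorrelation in the raw fixes is not in itself a problem; the pseudoreplication arises when individual fixes, rather than animals, are treated as independent sample units.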
Water vapor diffusion membrane development
NASA Technical Reports Server (NTRS)
Tan, M. K.
1977-01-01
An application of the water vapor diffusion technique is examined whereby the permeated water vapor is vented to space vacuum to alleviate on-board waste storage and provide supplemental cooling. The work reported herein deals primarily with the vapor diffusion-heat rejection (VD-HR) as it applies to the Space Shuttle. A stack configuration was selected, designed and fabricated. An asymmetric cellulose acetate membrane, used in reverse osmosis application was selected and a special spacer was designed to enhance mixing and promote mass transfer. A skid-mount unit was assembled from components used in the bench unit although no attempt was made to render it flight-suitable. The operating conditions of the VD-HR were examined and defined and a 60-day continuous test was carried out. The membranes performed very well throughout the test; no membrane rupture and no unusual flux decay was observed. In addition, a tentative design for a flight-suitable VD-HR unit was made.
Minimum time search in uncertain dynamic domains with complex sensorial platforms.
Lanillos, Pablo; Besada-Portas, Eva; Lopez-Orozco, Jose Antonio; de la Cruz, Jesus Manuel
2014-08-04
The minimum time search in uncertain domains is a searching task, which appears in real world problems such as natural disasters and sea rescue operations, where a target has to be found, as soon as possible, by a set of sensor-equipped searchers. The automation of this task, where the time to detect the target is critical, can be achieved by new probabilistic techniques that directly minimize the Expected Time (ET) to detect a dynamic target using the observation probability models and actual observations collected by the sensors on board the searchers. The selected technique, described in algorithmic form in this paper for completeness, has only been previously partially tested with an ideal binary detection model, in spite of being designed to deal with complex non-linear/non-differential sensorial models. This paper covers the gap, testing its performance and applicability over different searching tasks with searchers equipped with different complex sensors. The sensorial models under test vary from stepped detection probabilities to continuous/discontinuous differentiable/non-differentiable detection probabilities dependent on distance, orientation, and structured maps. The analysis of the simulated results of several static and dynamic scenarios performed in this paper validates the applicability of the technique with different types of sensor models.
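The core quantity, the Expected Time (ET) to detection, can be sketched for a static target and a fixed search plan; this is a deliberate simplification of the paper's dynamic-target, multi-searcher setting, with an abstract per-look detection probability standing in for the complex sensor models:

```python
def expected_detection_time(belief, plan, p_detect):
    """Expected time to detect a static target.

    belief: {cell: prior probability the target is there}.
    plan: sequence of cells visited at t = 1, 2, ...
    p_detect: probability that a look at the target's cell actually sees it
              (an ideal binary sensor has p_detect = 1).
    Probability mass left undetected contributes the horizon time len(plan)+1.
    """
    remaining = dict(belief)
    et = 0.0
    for t, cell in enumerate(plan, start=1):
        caught = remaining.get(cell, 0.0) * p_detect
        et += t * caught
        if cell in remaining:
            remaining[cell] -= caught
    et += (len(plan) + 1) * sum(remaining.values())
    return et

belief = {"A": 0.6, "B": 0.3, "C": 0.1}
good = ["A", "B", "C"]   # visit the likeliest cells first
bad = ["C", "B", "A"]
print(expected_detection_time(belief, good, 1.0))  # -> 1.5
print(expected_detection_time(belief, bad, 1.0))   # -> 2.5
```

Minimizing this quantity over candidate plans, with the belief updated by actual sensor observations at each step, is the optimization the paper's probabilistic technique performs.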
Effect of different mixing methods on the physical properties of Portland cement.
Shahi, Shahriar; Ghasemi, Negin; Rahimi, Saeed; Yavari, Hamidreza; Samiei, Mohammad; Jafari, Farnaz
2016-12-01
Portland cement is a hydrophilic cement; as a result, the powder-to-liquid ratio affects the properties of the final mix. In addition, the mixing technique affects hydration. The aim of this study was to evaluate the effect of different mixing techniques (conventional, amalgamator and ultrasonic) on selected physical properties of Portland cement. The physical properties were evaluated using the ISO 6786:2001 specification. One hundred and sixty-two samples of Portland cement were prepared across the three mixing techniques and the physical properties tested (six samples per group). Data were analyzed using descriptive statistics, one-way ANOVA and post hoc Tukey tests. Statistical significance was set at P < 0.05. The mixing technique had no significant effect on the compressive strength, film thickness or flow of Portland cement (P > 0.05). Dimensional changes (shrinkage), solubility and pH increased significantly with the amalgamator and ultrasonic mixing techniques (P < 0.05). The ultrasonic technique significantly decreased the working time, and the amalgamator and ultrasonic techniques significantly decreased the setting time (P < 0.05). The mixing technique exerted no significant effect on the flow, film thickness or compressive strength of Portland cement samples. Key words: Physical properties, Portland cement, mixing methods.
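The one-way ANOVA used here reduces to comparing between-group and within-group variance. A self-contained sketch with invented setting-time data (the study's actual measurements are not reproduced in the abstract):

```python
def one_way_anova_F(groups):
    """F statistic for a one-way ANOVA across k independent groups."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    # Between-group sum of squares (df = k - 1):
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    # Within-group sum of squares (df = n - k):
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical setting-time measurements (minutes) for the three mixing
# techniques, six samples each -- numbers invented for illustration only.
conventional = [160, 158, 162, 161, 159, 160]
amalgamator  = [150, 149, 151, 152, 148, 150]
ultrasonic   = [140, 141, 139, 142, 138, 140]
F = one_way_anova_F([conventional, amalgamator, ultrasonic])
print(round(F, 1))  # -> 300.0 for these toy data: group means clearly differ
```

A large F (relative to the F distribution with k-1 and n-k degrees of freedom) triggers the post hoc Tukey comparisons to locate which pairs of techniques differ.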
Accurate monitoring leads to effective control and greater learning of patient education materials.
Rawson, Katherine A; O'Neil, Rochelle; Dunlosky, John
2011-09-01
Effective management of chronic diseases (e.g., diabetes) can depend on the extent to which patients can learn and remember disease-relevant information. In two experiments, we explored a technique motivated by theories of self-regulated learning for improving people's learning of information relevant to managing a chronic disease. Materials were passages from patient education booklets on diabetes from NIDDK. Session 1 included an initial study trial, Session 2 included self-regulated restudy, and Session 3 included a final memory test. The key manipulation concerned the kind of support provided for self-regulated learning during Session 2. In Experiment 1, participants either were prompted to self-test and then evaluate their learning before selecting passages to restudy, were shown the prompt questions but did not overtly self-test or evaluate learning prior to selecting passages, or were not shown any prompts and were simply given the menu for selecting passages to restudy. Participants who self-tested and evaluated learning during Session 2 had a small but significant advantage over the other groups on the final test. Secondary analyses provided evidence that the performance advantage may have been modest because of inaccurate monitoring. Experiment 2 included a group who also self-tested but who evaluated their learning using idea-unit judgments (i.e., by checking their responses against a list of key ideas from the correct response). Participants who self-tested and made idea-unit judgments exhibited a sizable advantage on final test performance. Secondary analyses indicated that the performance advantage was attributable in part to more accurate monitoring and more effective self-regulated learning. An important practical implication is that learning of patient education materials can be enhanced by including appropriate support for learners' self-regulatory processes. (c) 2011 APA, all rights reserved.
Investigation of Laser Welding of Ti Alloys for Cognitive Process Parameters Selection.
Caiazzo, Fabrizia; Caggiano, Alessandra
2018-04-20
Laser welding of titanium alloys is attracting increasing interest as an alternative to traditional joining techniques for industrial applications, with particular reference to the aerospace sector, where welded assemblies allow for the reduction of the buy-to-fly ratio, compared to other traditional mechanical joining techniques. In this research work, an investigation on laser welding of Ti–6Al–4V alloy plates is carried out through an experimental testing campaign, under different process conditions, in order to perform a characterization of the produced weld bead geometry, with the final aim of developing a cognitive methodology able to support decision-making about the selection of the suitable laser welding process parameters. The methodology is based on the employment of artificial neural networks able to identify correlations between the laser welding process parameters, with particular reference to the laser power, welding speed and defocusing distance, and the weld bead geometric features, on the basis of the collected experimental data.
Speaker-independent phoneme recognition with a binaural auditory image model
NASA Astrophysics Data System (ADS)
Francis, Keith Ivan
1997-09-01
This dissertation presents phoneme recognition techniques based on a binaural fusion of outputs of the auditory image model and subsequent azimuth-selective phoneme recognition in a noisy environment. Background information concerning speech variations, phoneme recognition, current binaural fusion techniques and auditory modeling issues is explained. The research is constrained to sources in the frontal azimuthal plane of a simulated listener. A new method based on coincidence detection of neural activity patterns from the auditory image model of Patterson is used for azimuth-selective phoneme recognition. The method is tested at various levels of noise, and the results are reported in contrast to binaural fusion methods based on various forms of correlation to demonstrate the potential of coincidence-based binaural phoneme recognition. This method overcomes the smearing of fine speech detail typical of correlation-based methods. At the same time, coincidence is able to measure the similarity of left and right inputs and fuse them into useful feature vectors for phoneme recognition in noise.
Targeted Proteomics Approach for Precision Plant Breeding.
Chawade, Aakash; Alexandersson, Erik; Bengtsson, Therese; Andreasson, Erik; Levander, Fredrik
2016-02-05
Selected reaction monitoring (SRM) is a targeted mass spectrometry technique that enables precise quantitation of hundreds of peptides in a single run. This technique provides new opportunities for multiplexed protein biomarker measurements. For precision plant breeding, DNA-based markers have been used extensively, but the potential of protein biomarkers has not been exploited. In this work, we developed an SRM marker panel with assays for 104 potato (Solanum tuberosum) peptides selected using univariate and multivariate statistics. Thereafter, using random forest classification, prediction markers were identified for Phytophthora infestans resistance in leaves, P. infestans resistance in tubers, and plant yield in potato leaf secretome samples. The results suggest that the marker panel has predictive potential for three traits, two of which have no commercial DNA markers so far. Furthermore, the marker panel was also tested and found to be applicable to potato clones not used during the marker development. The proposed workflow is thus a proof-of-concept for targeted proteomics as an efficient readout in accelerated breeding for complex and agronomically important traits.
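The marker-ranking step described above can be illustrated with scikit-learn's random forest on synthetic data. Here the peptide intensities are random and the "trait" is driven by two known peptides, so their importances should rank at the top; the sample sizes, the simulated trait and the use of `RandomForestClassifier` as a stand-in are assumptions for illustration only.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Synthetic sketch of marker selection by random forest feature importance.
rng = np.random.default_rng(1)
n_clones, n_peptides = 80, 104
X = rng.normal(size=(n_clones, n_peptides))          # mock SRM peptide intensities
y = (X[:, 0] + X[:, 3] > 0).astype(int)              # trait depends on peptides 0 and 3

clf = RandomForestClassifier(n_estimators=300, random_state=0).fit(X, y)
ranked = np.argsort(clf.feature_importances_)[::-1]  # candidate marker ranking
```

In a real workflow, `X` would hold normalized SRM peak areas and `y` the phenotype (e.g., resistant vs. susceptible), with importance-ranked peptides cross-validated on held-out clones.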
Training site statistics from Landsat and Seasat satellite imagery registered to a common map base
NASA Technical Reports Server (NTRS)
Clark, J.
1981-01-01
Landsat and Seasat satellite imagery and training site boundary coordinates were registered to a common Universal Transverse Mercator map base in the Newport Beach area of Orange County, California. The purpose was to establish a spatially-registered, multi-sensor data base which would test the use of Seasat synthetic aperture radar imagery to improve spectral separability of channels used for land use classification of an urban area. Digital image processing techniques originally developed for the digital mosaics of the California Desert and the State of Arizona were adapted to spatially register multispectral and radar data. Techniques included control point selection from imagery and USGS topographic quadrangle maps, control point cataloguing with the Image Based Information System, and spatial and spectral rectifications of the imagery. The radar imagery was pre-processed to reduce its tendency toward uniform data distributions, so that training site statistics for selected Landsat and pre-processed Seasat imagery indicated good spectral separation between channels.
Roe, M; Kishiyama, C; Davidson, K; Schaefer, L; Todd, J
1995-01-01
We directly compared three techniques for the diagnosis of group A streptococcal pharyngitis in 500 symptomatic children seen in the Emergency Department or Child Care Clinic of The Children's Hospital of Denver. Throats were vigorously swabbed with two rayon swabs, which were transported immediately to the Microbiology Laboratory. Each swab was cultured aerobically on Strep A Isolation Agar (Remel) and then tested for antigen: one swab by the Strep A OIA optical immunoassay (BioStar) and the other by the TestPack Plus Strep A (Abbott) technique. Each test was performed blind to the others. The refrigerated pledget was cultured in Todd-Hewitt broth if an antigen test was positive and both direct plate cultures were negative (the "gold standard" was any culture positive). All isolates were serologically grouped. Of 500 complete patient cultures, 151 (30%) were positive for group A streptococcal growth. The two antigen tests gave comparable results with an average sensitivity of 83%. Each was significantly (P < 0.02) less sensitive than its corresponding culture. The BioStar Strep A OIA optical immunoassay produced significantly (P < 0.003) more false-positive results than did the Abbott test. Rapid antigen testing is not sensitive enough to eliminate the need for backup cultures. PMID:7650184
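The arithmetic behind sensitivity and specificity figures like these comes straight from a 2 x 2 table against the culture gold standard. The cell counts below are illustrative assumptions chosen only to match the abstract's totals (151 culture-positives out of 500); they are not the study's data.

```python
# Illustrative 2x2 table for one antigen test vs. culture (assumed counts).
tp, fn = 125, 26        # antigen-positive / antigen-negative among 151 culture-positives
tn, fp = 340, 9         # among the 349 culture-negatives

sensitivity = tp / (tp + fn)    # ~0.83, matching the reported average sensitivity
specificity = tn / (tn + fp)
print(round(sensitivity, 2), round(specificity, 3))   # → 0.83 0.974
```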
Targeted training of the decision rule benefits rule-guided behavior in Parkinson's disease.
Ell, Shawn W
2013-12-01
The impact of Parkinson's disease (PD) on rule-guided behavior has received considerable attention in cognitive neuroscience. The majority of research has used PD as a model of dysfunction in frontostriatal networks, but very few attempts have been made to investigate the possibility of adapting common experimental techniques in an effort to identify the conditions that are most likely to facilitate successful performance. The present study investigated a targeted training paradigm designed to facilitate rule learning and application using rule-based categorization as a model task. Participants received targeted training in which there was no selective-attention demand (i.e., stimuli varied along a single, relevant dimension) or nontargeted training in which there was selective-attention demand (i.e., stimuli varied along a relevant dimension as well as an irrelevant dimension). Following training, all participants were tested on a rule-based task with selective-attention demand. During the test phase, PD patients who received targeted training performed similarly to control participants and outperformed patients who did not receive targeted training. As a preliminary test of the generalizability of the benefit of targeted training, a subset of the PD patients were tested on the Wisconsin card sorting task (WCST). PD patients who received targeted training outperformed PD patients who did not receive targeted training on several WCST performance measures. These data further characterize the contribution of frontostriatal circuitry to rule-guided behavior. Importantly, these data also suggest that PD patient impairment, on selective-attention-demanding tasks of rule-guided behavior, is not inevitable and highlight the potential benefit of targeted training.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Orlov, D.; Joshi, V.
Every year, the TMS Magnesium Committee carefully selects a special topic in magnesium (Mg) related research and development that is not only among the hottest subjects from both academic and industrial perspectives but also marked by major achievements. Following last year's topic on Mg microalloying [1], this year's focus is on in-situ methods and associated techniques in their broad definition, spanning from laboratory- to large-scale facilities to process monitoring. The applications of in-situ techniques cover a wide spectrum, from the analysis of melts and liquid-solid transitions to solid-state phenomena during thermo-mechanical processing and heat treatments to surface interactions with various environments. Therefore, such works are of significant interest to scientists working in the area of Mg alloy development as well as to a much broader audience from both academia and industry. This interest is primarily driven by challenges in the analysis of structure-property relationships in Mg alloys, and even a cursory glance at the literature reveals a sharp recent increase in publications relevant to this topic. For instance, the very high reactivity of Mg, as well as its well-known propensity to substantially alter its structure upon unloading in mechanical testing, makes it difficult to understand, and thus to simulate, the correlation between microstructures observed in post-mortem analysis and the physical processes at work during testing or fabrication. However, recent advances in in-situ analysis based on large-scale research facilities such as neutron scattering and synchrotron radiation sources, as well as microscopy-based, acoustic emission, and other more traditional techniques, have allowed significant achievements. Apart from the apparent development of the relevant experimental techniques, a significant part of this success should also be attributed to the increasing accessibility of the facilities and the simplification of their use from a user perspective.
The selection of articles in this special topic is far from a comprehensive review covering all aspects of in-situ methods available for metallic materials development, nor was that the intent. Instead, this selection presents several studies on Mg alloys and Mg-based composites that use in-situ techniques. In addition, it overviews two techniques, 'Acoustic Emission' (AE) and 'Ambient-Pressure X-ray Photoelectron Spectroscopy' (AP-XPS), that are rather novel to the Mg community. These studies are intended to give readers representative examples of the potential of such techniques and to help in navigating the spectrum of modern state-of-the-art analytical methods facilitating the development of Mg alloys. The articles are organized in the order of a typical Mg alloy lifecycle, from material preparation and solidification to thermo-mechanical processing and product fabrication to degradation.
Selection and static calibration of the Marsh J1678 pressure gauge
NASA Technical Reports Server (NTRS)
Oxendine, Charles R.; Smith, Howard W.
1993-01-01
During the experimental testing of the ultralight, it was determined that a pressure gauge would be required to monitor the simulated flight loads. After several factors were analyzed, as indicated in the discussion section of this report, the Marsh J1678 pressure gauge appeared to be the leading candidate for the task. Prior to final selection, the Marsh pressure gauge was calibrated twice by two different techniques; on the basis of those calibrations, it was selected as the measuring device for the structural testing of the ultralight. Commercial pressure gauges are available that would have been more efficient and accurate, but the price of a gauge rises steeply with the degree of accuracy, efficiency, precision, and the other features designed into it. Given the level of precision and accuracy actually required, a more expensive gauge would not have been a financial benefit to the outcome of the experiment.
Module generation for self-testing integrated systems
NASA Astrophysics Data System (ADS)
Vanriessen, Ronald Pieter
Hardware used for self-test in VLSI (Very Large Scale Integrated) systems is reviewed, and an architecture to control the test hardware in an integrated system is presented. With the growth of test times, self-test techniques have become practically and economically viable for VLSI systems. Besides reducing test times and costs, self-test also provides testing at operational speeds. Therefore, a suitable combination of scan-path and macro-specific (self) tests is required to reduce test times and costs. An expert system that can be used in a silicon compilation environment is presented. The approach requires a minimum of testability knowledge from the system designer. A user-friendly interface is described for specifying and modifying testability requirements by a testability expert. A reason-directed backtracking mechanism is used to resolve selection failures. Both the hierarchical testable architecture and the design-for-testability expert system are used in a self-test compiler, which is defined as a software tool that selects an appropriate test method for every macro in a design; the hardware to control each macro test is included in the design automatically. As an example, the integration of the self-test compiler in the silicon compilation system PIRAMID is described, along with the design of a demonstrator circuit consisting of two self-testable macros. Control of the self-test hardware is carried out via the test access port of the boundary scan standard.
Deep convolutional neural network based antenna selection in multiple-input multiple-output system
NASA Astrophysics Data System (ADS)
Cai, Jiaxin; Li, Yan; Hu, Ying
2018-03-01
Antenna selection of wireless communication system has attracted increasing attention due to the challenge of keeping a balance between communication performance and computational complexity in large-scale Multiple-Input Multiple-Output antenna systems. Recently, deep learning based methods have achieved promising performance for large-scale data processing and analysis in many application fields. This paper is the first attempt to introduce the deep learning technique into the field of Multiple-Input Multiple-Output antenna selection in wireless communications. First, the label of attenuation coefficients channel matrix is generated by minimizing the key performance indicator of training antenna systems. Then, a deep convolutional neural network that explicitly exploits the massive latent cues of attenuation coefficients is learned on the training antenna systems. Finally, we use the adopted deep convolutional neural network to classify the channel matrix labels of test antennas and select the optimal antenna subset. Simulation experimental results demonstrate that our method can achieve better performance than the state-of-the-art baselines for data-driven based wireless antenna selection.
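The label-generation step (no network involved) can be sketched as follows: for each channel matrix, the "label" is the index of the antenna subset that maximizes a performance indicator, here equal-power MIMO capacity. Antenna counts, subset size and the capacity criterion are illustrative assumptions, not the paper's exact setup.

```python
import numpy as np
from itertools import combinations

# Sketch of exhaustive label generation for CNN-based antenna selection.
rng = np.random.default_rng(0)
n_rx, n_tx, k = 6, 4, 2                 # choose k of n_rx receive antennas
subsets = list(combinations(range(n_rx), k))

def best_subset(H, snr=10.0):
    # capacity with equal power allocation across transmit antennas
    def capacity(rows):
        Hs = H[list(rows)]
        return np.log2(np.linalg.det(np.eye(k) + (snr / n_tx) * Hs @ Hs.conj().T).real)
    return max(subsets, key=capacity)

# one Rayleigh-fading channel realization and its class label
H = rng.normal(size=(n_rx, n_tx)) + 1j * rng.normal(size=(n_rx, n_tx))
label = subsets.index(best_subset(H))   # integer class fed to the CNN
```

Repeating this over many channel realizations yields the (matrix, label) pairs on which the convolutional classifier is trained.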
Viscosities of nonelectrolyte liquid mixtures. III. Selected binary and quaternary mixtures
NASA Astrophysics Data System (ADS)
Wakefield, D. L.
1988-05-01
This paper is the final in a series of three viscosity and density studies of pure n-alkanes and selected binary and quaternary mixtures. A standard U-tube viscometer was used for viscosity measurements, and a Pyrex flask-type pycnometer was used for density determinations. Results are given here for pure alkane and selected binary mixtures of n-tetradecane + n-octane, for selected quaternary mixtures of n-hexadecane + n-dodecane + n-decane + n-hexane, and for pure and selected quaternary mixtures of n-hexadecane + n-dodecane + n-nonane + n-heptane at 303.16 and 308.16 K. The principle of congruence was tested, as was the Grunberg and Nissan equation, since both have been shown to be useful prediction techniques for other n-alkane binary mixtures. Comparisons were made between the two groups of quaternary alkane mixtures and the binary n-tetradecane + n-octane mixtures of the same “pseudo” composition to understand better the dependence of mixture viscosities on the composition parameter.
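The Grunberg and Nissan equation tested above predicts a binary mixture viscosity from the pure-component viscosities via ln(eta_mix) = x1·ln(eta1) + x2·ln(eta2) + x1·x2·G12, where G12 is an adjustable interaction parameter. The sketch below uses illustrative viscosity values, not the paper's measurements.

```python
import math

# Grunberg-Nissan mixing rule for a binary liquid mixture.
def grunberg_nissan(x1, eta1, eta2, g12):
    """Mixture viscosity from mole fraction x1 and pure viscosities eta1, eta2."""
    x2 = 1.0 - x1
    return math.exp(x1 * math.log(eta1) + x2 * math.log(eta2) + x1 * x2 * g12)

# Illustrative values roughly typical of n-tetradecane (~2.1 mPa*s) and
# n-octane (~0.51 mPa*s); G12 = 0 reduces to the ideal (log-linear) rule.
print(round(grunberg_nissan(0.5, 2.1, 0.51, 0.0), 3))   # → 1.035
```

With G12 = 0 and x1 = 0.5, the rule reduces to the geometric mean of the two pure viscosities; fitting G12 to data quantifies the deviation from ideal mixing.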
Ffuzz: Towards full system high coverage fuzz testing on binary executables.
Zhang, Bin; Ye, Jiaxi; Bi, Xing; Feng, Chao; Tang, Chaojing
2018-01-01
Bugs and vulnerabilities in binary executables threaten cyber security. Current discovery methods, such as fuzz testing, symbolic execution and manual analysis, each have advantages and disadvantages when exercising the deeper code areas of binary executables to find more bugs. In this paper, we designed and implemented a hybrid automatic bug-finding tool, Ffuzz, on top of fuzz testing and selective symbolic execution. It targets full system software stack testing, including both user space and kernel space. Combining these two mainstream techniques enables us to achieve higher coverage and avoid getting stuck in either fuzz testing or symbolic execution. We also propose two key optimizations to improve the efficiency of full system testing. We evaluated the efficiency and effectiveness of our method on real-world binary software and on 844 memory corruption vulnerable programs in the Juliet test suite. The results show that Ffuzz can discover software bugs in the full system software stack effectively and efficiently.
Okoyo, Collins; Simiyu, Elses; Njenga, Sammy M; Mwandawiro, Charles
2018-04-11
The Kato-Katz technique has been the mainstay test for Schistosoma mansoni diagnosis in endemic areas. However, recent studies have documented its poor sensitivity in evaluating S. mansoni infection, especially in areas with lower rates of transmission. It is the primary diagnostic tool for monitoring the impact of the Kenya national school-based deworming program on infection transmission, but there is a need to consider a more sensitive technique as the prevalence declines. Therefore, this study explored the relationship between results of the stool-based Kato-Katz technique and the urine-based point-of-care circulating cathodic antigen (POC-CCA) test, with a view to informing the program's decision on changing from Kato-Katz to the POC-CCA test. We used two cross-sectional surveys conducted pre- and post-mass drug administration (MDA) with praziquantel in a representative random sample of children from 18 schools across 11 counties. A total of 1944 children were randomly sampled for the study. Stool and urine samples were tested for S. mansoni infection using the Kato-Katz and POC-CCA methods, respectively. S. mansoni prevalence for each technique was calculated, and 95% confidence intervals were obtained using a binomial regression model. Specificity (Sp) and sensitivity (Sn) were determined using 2 × 2 contingency tables and compared using McNemar's chi-square test. A total of 1899 and 1878 children were surveyed at pre- and post-treatment, respectively. S. mansoni infection prevalence was 26.5 and 21.4% at pre- and post-treatment, respectively, using the POC-CCA test, and 4.9 and 1.5%, respectively, using the Kato-Katz technique. Taking POC-CCA as the gold standard, Kato-Katz was found to have significantly lower sensitivity both at pre- and post-treatment (Sn = 12.5% and Sn = 5.2%, respectively; McNemar's χ² = 782.0, p < 0.001).
Overall, the results showed slight/poor agreement between the two methods (kappa index k = 0.11, p < 0.001; inter-rater agreement = 77.1%). The results showed the POC-CCA technique to be an effective, sensitive and accurate screening tool for Schistosoma mansoni infection in areas of low prevalence. It was up to 14-fold more accurate than Kato-Katz, which had extremely inadequate sensitivity. We recommend the use of POC-CCA alongside Kato-Katz examinations by schistosomiasis control programs in low-prevalence areas.
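The reported kappa index is Cohen's kappa computed from the 2 x 2 cross-classification of the two tests. The cell counts below are invented to resemble the study's prevalences (they are not the actual data), and the helper shows the standard observed-vs-chance-agreement arithmetic.

```python
# Cohen's kappa from a 2x2 agreement table:
#   a = both tests positive, b = test1+/test2-, c = test1-/test2+, d = both negative
def cohen_kappa(a, b, c, d):
    n = a + b + c + d
    po = (a + d) / n                                       # observed agreement
    pe = ((a + b) * (a + c) + (c + d) * (b + d)) / n ** 2  # chance agreement
    return (po - pe) / (1 - pe)

# Illustrative counts shaped like the pre-treatment survey (n = 1899):
k = cohen_kappa(63, 30, 440, 1366)
print(round(k, 3))
```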
A new time-frequency method for identification and classification of ball bearing faults
NASA Astrophysics Data System (ADS)
Attoui, Issam; Fergani, Nadir; Boutasseta, Nadir; Oudjani, Brahim; Deliou, Adel
2017-06-01
For fault diagnosis of ball bearings, which are among the most critical components of rotating machinery, this paper presents a time-frequency procedure incorporating a new feature extraction step that combines the classical wavelet packet decomposition energy distribution technique with a new feature extraction technique based on the selection of the most impulsive frequency bands. In the proposed procedure, as a pre-processing step, the most impulsive frequency bands are first selected at different bearing conditions using a combination of the Fast Fourier Transform (FFT) and Short-Frequency Energy (SFE) algorithms. Secondly, once the most impulsive frequency bands are selected, the measured machinery vibration signals are decomposed into different frequency sub-bands by the discrete Wavelet Packet Decomposition (WPD) technique to maximize the detection of their frequency contents, and subsequently the most useful sub-bands are represented in the time-frequency domain using the Short Time Fourier Transform (STFT) algorithm to determine exactly which frequency components are present in those sub-bands. Once the proposed feature vector is obtained, three feature dimensionality reduction techniques are employed: Linear Discriminant Analysis (LDA), a feedback wrapper method, and Locality Sensitive Discriminant Analysis (LSDA). Lastly, the Adaptive Neuro-Fuzzy Inference System (ANFIS) algorithm is used for instantaneous identification and classification of bearing faults. To evaluate the performance of the proposed method, different testing data sets are applied to the trained ANFIS model, using various conditions of healthy and faulty bearings under various load levels, fault severities and rotating speeds. The experimental results show that the proposed method can serve as an intelligent bearing fault diagnosis system.
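The band-selection pre-processing step can be sketched with plain FFT band energies: split the spectrum into sub-bands and rank them by energy, so the band carrying the impulsive resonance stands out. The synthetic signal, sampling rate and 750 Hz band layout below are illustrative assumptions, not the paper's SFE implementation.

```python
import numpy as np

# Synthetic vibration: a 50 Hz shaft tone plus short bursts exciting a
# resonance near 3200 Hz (a crude stand-in for bearing fault impulses).
fs, n = 12000, 4096
t = np.arange(n) / fs
sig = 0.3 * np.sin(2 * np.pi * 50 * t)
sig = sig + np.sin(2 * np.pi * 3200 * t) * (np.sin(2 * np.pi * 105 * t) > 0.95)
sig = sig + 0.05 * np.random.default_rng(0).normal(size=n)

# Energy per 750 Hz sub-band of the magnitude spectrum.
spec = np.abs(np.fft.rfft(sig)) ** 2
freqs = np.fft.rfftfreq(n, 1 / fs)
band_edges = np.arange(0, fs / 2 + 1, 750)            # 8 sub-bands
energies = [spec[(freqs >= lo) & (freqs < hi)].sum()
            for lo, hi in zip(band_edges[:-1], band_edges[1:])]

# Most energetic band above baseband: the one holding the burst resonance.
impulsive_band = int(np.argmax(energies[1:]) + 1)
print(impulsive_band)
```

In the full procedure this selected band would then be decomposed further with WPD and inspected with the STFT before feature extraction.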
Utilization of advanced calibration techniques in stochastic rock fall analysis of quarry slopes
NASA Astrophysics Data System (ADS)
Preh, Alexander; Ahmadabadi, Morteza; Kolenprat, Bernd
2016-04-01
In order to study rock fall dynamics, a research project was conducted by the Vienna University of Technology and the Austrian Central Labour Inspectorate (Federal Ministry of Labour, Social Affairs and Consumer Protection). Part of this project comprised 277 full-scale drop tests at three different quarries in Austria, recording key parameters of the rock fall trajectories. The tests involved a total of 277 boulders ranging from 0.18 to 1.8 m in diameter and from 0.009 to 8.1 Mg in mass. The geology of these sites included strong rock of igneous, metamorphic and volcanic types. In this paper the results of the tests are used for calibrating and validating a new stochastic computer model. It is demonstrated that the error of the model (i.e. the difference between observed and simulated results) has a lognormal distribution. With two parameters selected for calibration, advanced calibration techniques including the Markov chain Monte Carlo technique, maximum likelihood and root-mean-square error (RMSE) minimization are utilized to minimize the error. Validation of the model based on the cross-validation technique reveals that, in general, reasonable stochastic approximations of the rock fall trajectories are obtained in all dimensions, including runout, bounce heights and velocities. The approximations are compared to the measured data in terms of median, 95% and maximum values. The results of the comparisons indicate that approximate first-order predictions, using a single set of input parameters, are possible and can be used to aid practical hazard and risk assessment.
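The calibration idea, choosing two model parameters by minimizing the error between observed and simulated results on a log scale (consistent with lognormally distributed residuals), can be sketched with a toy power-law "simulator". The model form, the synthetic observations and the grid search standing in for MCMC are invented for illustration.

```python
import numpy as np

# Synthetic "observed" runouts generated from a known power law with
# multiplicative lognormal noise, then recovered by grid-search calibration.
rng = np.random.default_rng(0)
slope = np.linspace(20, 40, 200)                       # observed site condition
observed = 2.0 * slope ** 0.8 * rng.lognormal(0.0, 0.05, slope.size)

def simulate(a, b):
    return a * slope ** b                              # stand-in for the rock fall model

def log_rmse(a, b):
    # RMSE of log-residuals: the natural metric for lognormal error
    return np.sqrt(np.mean((np.log(observed) - np.log(simulate(a, b))) ** 2))

grid_a = np.linspace(1.0, 3.0, 41)                     # step 0.05
grid_b = np.linspace(0.5, 1.1, 31)                     # step 0.02
best_a, best_b = min(((a, b) for a in grid_a for b in grid_b),
                     key=lambda p: log_rmse(*p))
print(best_a, best_b)                                  # near the true (2.0, 0.8)
```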
NASA Astrophysics Data System (ADS)
Kim, Saejoon
2018-01-01
We consider the problem of low-volatility portfolio selection, which has been the subject of extensive research in the field of portfolio selection. To improve on currently existing techniques that rely purely on past information to select low-volatility portfolios, this paper investigates the use of time series regression techniques that forecast future volatility to select the portfolios. In particular, for the first time, the utility of support vector regression and its enhancements as portfolio selection techniques is demonstrated. It is shown that our regression-based portfolio selection delivers attractive outperformance compared to the benchmark index and the portfolio defined by a well-known strategy on datasets of the S&P 500 and the KOSPI 200.
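The core mechanic, forecasting each asset's next-period volatility with support vector regression and then holding the lowest-forecast assets, can be sketched with scikit-learn's `SVR` on synthetic data. Asset counts, lag structure and hyperparameters are illustrative assumptions, not the paper's configuration.

```python
import numpy as np
from sklearn.svm import SVR

# Synthetic per-asset volatility series: asset i has base level 0.01 + 0.02*i,
# so the "calmest" assets are 0, 1, 2 by construction.
rng = np.random.default_rng(0)
n_assets, n_periods, lags = 8, 30, 3
vol = (0.01 + 0.02 * np.arange(n_assets)[:, None]
       + np.abs(rng.normal(0, 0.002, (n_assets, n_periods))))

# Training pairs: three lagged volatilities -> next-period volatility.
X = np.concatenate([vol[:, t - lags:t] for t in range(lags, n_periods - 1)])
y = np.concatenate([vol[:, t] for t in range(lags, n_periods - 1)])

model = SVR(C=100.0, epsilon=0.0005).fit(X, y)
forecast = model.predict(vol[:, -lags:])        # one-step-ahead forecasts
low_vol_portfolio = np.argsort(forecast)[:3]    # hold the three calmest assets
print(sorted(low_vol_portfolio.tolist()))
```

A purely backward-looking strategy would instead rank assets on realized past volatility; the regression step is what lets forecastable volatility dynamics enter the selection.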
Laser-induced selective copper plating of polypropylene surface
NASA Astrophysics Data System (ADS)
Ratautas, K.; Gedvilas, M.; Stankevičienė, I.; Jagminienė, A.; Norkus, E.; Li Pira, N.; Sinopoli, S.; Emanuele, U.; Račiukaitis, G.
2016-03-01
Laser writing for selective plating of electro-conductive lines for electronics has several significant advantages compared to conventional printed circuit board technology. Firstly, this method is faster and cheaper at the prototyping stage. Secondly, material consumption is reduced, because it works selectively. The biggest merit of this method, however, is its potential to produce moulded interconnect devices, enabling electronics on complex 3D surfaces and thus saving space, materials and production cost. There are two basic techniques of laser writing for selective plating on plastics: laser-induced selective activation (LISA) and laser direct structuring (LDS). In the LISA method, pure plastics without any dopant (filler) can be used. In the LDS method, special fillers are mixed into the polymer matrix; these fillers are activated during the laser writing process, and, in the next processing step, the laser-modified area can be selectively plated with metals. In this work, both methods of laser writing for the selective plating of polymers were investigated and compared. For the LDS approach, a new material, polypropylene with carbon-based additives, was tested using picosecond and nanosecond laser pulses. Different laser processing parameters (laser pulse energy, scanning speed, number of scans, pulse duration, wavelength and overlapping of scanned lines) were applied in order to find the optimal activation regime. Areal selectivity tests showed a high plating resolution: the narrowest width of a copper-plated line was less than 23 μm. Finally, our material was applied to a prototype electronic circuit board on a 2D surface.
Subudhi, Badri Narayan; Thangaraj, Veerakumar; Sankaralingam, Esakkirajan; Ghosh, Ashish
2016-11-01
In this article, a statistical fusion based segmentation technique is proposed to identify different abnormalities in magnetic resonance images (MRI). The proposed scheme follows seed selection, region growing-merging and fusion of multiple image segments. In this process, an image is initially divided into a number of blocks, and for each block we compute the phase component of the Fourier transform. The phase component of each block reflects the gray-level variation within the block but contains a large correlation among the blocks. Hence a singular value decomposition (SVD) technique is applied to generate a singular value for each block. A thresholding procedure is then applied to these singular values to identify edgy and smooth regions, and some seed points are selected for segmentation. Considering each seed point, we perform a binary segmentation of the complete MRI, and hence with all seed points we obtain an equal number of binary images. A parcel based statistical fusion process is used to fuse all the binary images into multiple segments. The effectiveness of the proposed scheme is tested on identifying different abnormalities: prostatic carcinoma detection, tuberculous granuloma identification and intracranial neoplasm or brain tumor detection. The proposed technique is evaluated by comparing its results against seven state-of-the-art techniques with six performance evaluation measures. Copyright © 2016 Elsevier Inc. All rights reserved.
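The block-wise pre-processing can be sketched structurally: divide the image into blocks, take the FFT phase of each block, reduce it to its largest singular value, and threshold those values to nominate seed blocks. The synthetic image, block size and threshold below are illustrative assumptions; the paper's actual thresholding rule is not reproduced here.

```python
import numpy as np

# Synthetic image with a vertical step edge plus mild noise.
rng = np.random.default_rng(0)
img = np.zeros((64, 64))
img[:, 40:] = 1.0
img += rng.normal(0.0, 0.01, img.shape)

def block_top_singular(image, bs=16):
    """Largest singular value of the FFT phase of each bs x bs block."""
    vals = {}
    for i in range(0, image.shape[0], bs):
        for j in range(0, image.shape[1], bs):
            phase = np.angle(np.fft.fft2(image[i:i + bs, j:j + bs]))
            vals[(i, j)] = np.linalg.svd(phase, compute_uv=False)[0]
    return vals

vals = block_top_singular(img)
threshold = 0.5 * max(vals.values())          # illustrative threshold choice
seeds = [blk for blk, v in vals.items() if v > threshold]
```

Each selected seed would then drive a region growing-merging pass producing one binary segmentation, and the per-seed binary images are finally fused statistically.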
Target oriented dimensionality reduction of hyperspectral data by Kernel Fukunaga-Koontz Transform
NASA Astrophysics Data System (ADS)
Binol, Hamidullah; Ochilov, Shuhrat; Alam, Mohammad S.; Bal, Abdullah
2017-02-01
Principal component analysis (PCA) is a popular technique in remote sensing for dimensionality reduction. While PCA is suitable for data compression, it is not necessarily an optimal technique for feature extraction, particularly when the features are exploited in supervised learning applications (Cheriyadat and Bruce, 2003) [1]. Preserving features belonging to the target is crucial to the performance of target detection/recognition techniques. The Fukunaga-Koontz Transform (FKT) based supervised band reduction technique can meet this requirement. FKT achieves feature selection by transforming the data into a new space in which the feature classes have complementary eigenvectors. Analysis of these eigenvectors under two classes, target and background clutter, can be utilized for target-oriented band reduction, since the basis functions that best represent the target class carry the least information about the background class. By selecting a few eigenvectors that are most relevant to the target class, the dimension of hyperspectral data can be reduced, which presents significant advantages for near-real-time target detection applications. Nonlinear properties of the data can be extracted by a kernel approach, which provides better target features. Thus, we propose constructing a kernel FKT (KFKT) for target-oriented band reduction. The performance of the proposed KFKT based target-oriented dimensionality reduction algorithm has been tested on two real-world hyperspectral datasets, and the results are reported accordingly.
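The complementary-eigenvector property at the heart of the (linear) FKT can be verified numerically: whiten the summed class covariance, and the two transformed class covariances then share eigenvectors whose eigenvalues sum to one, so directions with eigenvalue near 1 for the target class carry almost no clutter information. The synthetic two-class data and the five-band dimension below are illustrative assumptions.

```python
import numpy as np

# Two synthetic classes with different per-band variances (5 "bands").
rng = np.random.default_rng(0)
target = rng.normal(size=(500, 5)) * np.array([3.0, 1.0, 1.0, 0.2, 0.2])
clutter = rng.normal(size=(500, 5)) * np.array([0.2, 1.0, 1.0, 3.0, 0.2])

R1 = target.T @ target / len(target)       # target covariance
R2 = clutter.T @ clutter / len(clutter)    # clutter covariance

# Whiten the sum: W.T @ (R1 + R2) @ W = I.
d, P = np.linalg.eigh(R1 + R2)
W = P @ np.diag(d ** -0.5)

S1 = W.T @ R1 @ W                          # transformed target covariance
lam, V = np.linalg.eigh(S1)                # eigenvalues in [0, 1]
lam2 = np.linalg.eigvalsh(W.T @ R2 @ W)    # clutter eigenvalues: 1 - lam

# Complementarity check: sorted target + reverse-sorted clutter eigenvalues = 1.
print(np.allclose(np.sort(lam) + np.sort(lam2)[::-1], 1.0))
```

The kernel variant (KFKT) applies the same construction to kernel Gram matrices instead of the raw band covariances, capturing nonlinear class structure.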