Sample records for testing large numbers

  1. Composite Overwrap Fragmentation Observations, Concerns, and Recommendations

    NASA Technical Reports Server (NTRS)

    Bangham, Mike; Hovater, Mary

    2017-01-01

    A series of test activities has raised some concerns about the generation of orbital debris caused by failures of composite overwrapped pressure vessels (COPVs). These tests have indicated that a large number of composite fragments can be produced by either pressure burst failures or by high-speed impacts. A review of prior high-speed tests with COPVs indicates that other tests have also produced large numbers of composite fragments. As was the case with the test referenced here, the tests tended to produce a large number of small composite fragments with relatively low velocities induced by the impact and/or gas expansion.

  2. The effect of peer-group size on the delivery of feedback in basic life support refresher training: a cluster randomized controlled trial.

    PubMed

    Cho, Youngsuk; Je, Sangmo; Yoon, Yoo Sang; Roh, Hye Rin; Chang, Chulho; Kang, Hyunggoo; Lim, Taeho

    2016-07-04

    Students largely provide feedback to one another when the instructor facilitates peer feedback rather than teaching directly in group training. The number of students in a group affects what each student learns from the training. We aimed to investigate whether a larger group size increases students' scores on a post-training test with instructor-facilitated peer feedback after video-guided basic life support (BLS) refresher training. Students' one-rescuer adult BLS skills were assessed by a 2-min checklist-based test 1 year after the initial training. A cluster randomized controlled trial was conducted to evaluate the effect of the number of students in a group on BLS refresher training. Participants included 115 final-year medical students undergoing their emergency medicine clerkship. The median number of students was 8 in the large groups and 4 in the standard groups. The primary outcome was the group difference in post-training test scores after video-guided BLS training. Secondary outcomes included feedback time, number of feedback topics, and results of end-of-training evaluation questionnaires. Scores on the post-training test increased over three consecutive tests with instructor-led peer feedback, but did not differ between the large and standard groups. Feedback time was longer and the number of feedback topics generated by students was higher in the standard groups than in the large groups on the first and second tests. The end-of-training questionnaire revealed that students in the large groups would have preferred a smaller group size than their actual group size. In this BLS refresher training, instructor-led group feedback increased test scores after tutorial video-guided BLS learning, irrespective of group size. A smaller group size allowed more participation in peer feedback.

  3. The cost of large numbers of hypothesis tests on power, effect size and sample size.

    PubMed

    Lazzeroni, L C; Ray, A

    2012-01-01

    Advances in high-throughput biology and computer science are driving an exponential increase in the number of hypothesis tests in genomics and other scientific disciplines. Studies using current genotyping platforms frequently include a million or more tests. In addition to the monetary cost, this increase imposes a statistical cost owing to the multiple testing corrections needed to avoid large numbers of false-positive results. To safeguard against the resulting loss of power, some have suggested sample sizes on the order of tens of thousands that can be impractical for many diseases or may lower the quality of phenotypic measurements. This study examines the relationship between the number of tests on the one hand and power, detectable effect size or required sample size on the other. We show that once the number of tests is large, power can be maintained at a constant level, with comparatively small increases in the effect size or sample size. For example at the 0.05 significance level, a 13% increase in sample size is needed to maintain 80% power for ten million tests compared with one million tests, whereas a 70% increase in sample size is needed for 10 tests compared with a single test. Relative costs are less when measured by increases in the detectable effect size. We provide an interactive Excel calculator to compute power, effect size or sample size when comparing study designs or genome platforms involving different numbers of hypothesis tests. The results are reassuring in an era of extreme multiple testing.
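
    The comparison quoted above can be reproduced approximately with a short calculation; the sketch below computes Bonferroni-corrected sample sizes for a two-sided z-test and is an illustrative stand-in for the authors' Excel calculator, with an arbitrary assumed effect size.

```python
# Hedged sketch: sample-size inflation under Bonferroni correction for a
# two-sided z-test at 80% power. The per-observation effect size is an
# arbitrary assumption; this is not the authors' calculator.
from scipy.stats import norm

def required_n(effect, alpha, power, m_tests):
    """Approximate n for a two-sided z-test at level alpha / m_tests."""
    z_alpha = norm.ppf(1 - (alpha / m_tests) / 2)  # corrected critical value
    z_beta = norm.ppf(power)                       # quantile for target power
    return ((z_alpha + z_beta) / effect) ** 2

effect, alpha, power = 0.1, 0.05, 0.80
ratio_genome = required_n(effect, alpha, power, 1e7) / required_n(effect, alpha, power, 1e6)
ratio_single = required_n(effect, alpha, power, 10) / required_n(effect, alpha, power, 1)
print(f"1e7 vs 1e6 tests: {100 * (ratio_genome - 1):.0f}% larger sample")  # about 13%
print(f"10 vs 1 test:     {100 * (ratio_single - 1):.0f}% larger sample")  # about 70%
```

    With these assumptions the output matches the roughly 13% and 70% increases cited in the abstract, since the ratio does not depend on the assumed effect size.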

  4. Development, Testing and Operation of a Large Suspended Ocean Measurement Structure for Deep-Ocean Use

    DTIC Science & Technology

    1992-05-01

    [Abstract garbled in extraction; only fragments of the report documentation page were recovered.] The report covers the development, testing, and operation of a large suspended ocean measurement structure for deep-ocean use, including a new lightweight cable system. Ocean Acoustics and Technology Directorate, Stennis Space Center, MS 39529-5004; report number PR 91:132:253.

  5. 77 FR 71574 - Submission for OMB Review; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-03

    ... Test. OMB Control Number: None. Form Number(s): The automated survey instrument has no form number. Type... have been developed and are now slated for a large-scale field test to evaluate the questions and the... reference period and timing of data collection. Qualitative research has...

  6. Dynamic test/analysis correlation using reduced analytical models

    NASA Technical Reports Server (NTRS)

    Mcgowan, Paul E.; Angelucci, A. Filippo; Javeed, Mehzad

    1992-01-01

    Test/analysis correlation is an important aspect of the verification of analysis models which are used to predict on-orbit response characteristics of large space structures. This paper presents results of a study using reduced analysis models for performing dynamic test/analysis correlation. The reduced test-analysis model (TAM) has the same number and orientation of degrees of freedom (DOF) as the test measurements. Two reduction methods, static (Guyan) reduction and the Improved Reduced System (IRS) reduction, are applied to the test/analysis correlation of a laboratory truss structure. Simulated test results and modal test data are used to examine the performance of each method. It is shown that selection of the DOF to be retained in the TAM is critical when large structural masses are involved. In addition, the use of modal test results may introduce difficulties in TAM accuracy even if a large number of DOF are retained in the TAM.
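
    As a concrete illustration of the static (Guyan) reduction named above, the sketch below condenses a small spring-mass model onto a subset of retained DOF; the matrices and retained DOF are arbitrary assumptions, not the laboratory truss model used in the study.

```python
# Hedged sketch of static (Guyan) reduction: x_omitted = -Koo^{-1} Koa x_kept,
# K_red = T^T K T, M_red = T^T M T. Values are illustrative only.
import numpy as np
from scipy.linalg import eigh

def guyan_reduce(K, M, keep):
    """Reduce stiffness K and mass M to the DOF listed in `keep`."""
    n = K.shape[0]
    omit = [i for i in range(n) if i not in keep]
    Kao = K[np.ix_(keep, omit)]
    Koo = K[np.ix_(omit, omit)]
    T = np.zeros((n, len(keep)))                 # maps retained DOF to the full set
    T[keep, :] = np.eye(len(keep))
    T[omit, :] = -np.linalg.solve(Koo, Kao.T)    # static condensation of omitted DOF
    return T.T @ K @ T, T.T @ M @ T

k = 1000.0
K = k * np.array([[ 2, -1,  0,  0],
                  [-1,  2, -1,  0],
                  [ 0, -1,  2, -1],
                  [ 0,  0, -1,  1]])
M = np.eye(4)
K_red, M_red = guyan_reduce(K, M, keep=[0, 2])
print(np.sqrt(eigh(K_red, M_red, eigvals_only=True)))  # approximate natural frequencies (rad/s)
```

    The IRS method mentioned in the abstract augments this transformation with a mass-dependent correction term; only the static step is shown here.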

  7. Comparison of jet Mach number decay data with a correlation and jet spreading contours for a large variety of nozzles

    NASA Technical Reports Server (NTRS)

    Groesbeck, D. E.; Huff, R. G.; Vonglahn, U. H.

    1977-01-01

    Small-scale circular, noncircular, single- and multi-element nozzles with flow areas as large as 122 sq cm were tested with cold airflow at exit Mach numbers from 0.28 to 1.15. The effects of multi-element nozzle shape and element spacing on jet Mach number decay were studied in an effort to reduce the noise caused by jet impingement on externally blown flap (EBF) STOL aircraft. The jet Mach number decay data are well represented by empirical relations. Jet spreading and Mach number decay contours are presented for all configurations tested.

  8. Some anomalies observed in wind-tunnel tests of a blunt body at transonic and supersonic speeds

    NASA Technical Reports Server (NTRS)

    Brooks, J. D.

    1976-01-01

    An investigation of anomalies observed in wind tunnel force tests of a blunt body configuration was conducted at Mach numbers from 0.20 to 1.35 in the Langley 8-foot transonic pressure tunnel and at Mach numbers of 1.50, 1.80, and 2.16 in the Langley Unitary Plan wind tunnel. At a Mach number of 1.35, large variations occurred in axial force coefficient at a given angle of attack. At transonic and low supersonic speeds, the total drag measured in the wind tunnel was much lower than that measured during earlier ballistic range tests. Accurate measurements of total drag for blunt bodies will require the use of models smaller than those tested thus far; however, it appears that accurate forebody drag results can be obtained by using relatively large models. Shock standoff distance is presented from experimental data over the Mach number range from 1.05 to 4.34. Theory accurately predicts the shock standoff distance at Mach numbers up to 1.75.

  9. Analysis of large system black box verification test data

    NASA Technical Reports Server (NTRS)

    Clapp, Kenneth C.; Iyer, Ravishankar Krishnan

    1993-01-01

    Issues regarding black box, large systems verification are explored. The study begins by collecting data from several testing teams. An integrated database containing test, fault, repair, and source file information is generated. Intuitive effectiveness measures are generated using conventional black box testing results analysis methods. Conventional analysis methods indicate that the testing was effective in the sense that as more tests were run, more faults were found. Average behavior and individual data points are analyzed. The data is categorized and average behavior shows a very wide variation in number of tests run and in pass rates (pass rates ranged from 71 percent to 98 percent). The 'white box' data contained in the integrated database is studied in detail. Conservative measures of effectiveness are discussed. Testing efficiency (ratio of repairs to number of tests) is measured at 3 percent, fault record effectiveness (ratio of repairs to fault records) is measured at 55 percent, and test script redundancy (ratio of number of failed tests to minimum number of tests needed to find the faults) ranges from 4.2 to 15.8. Error prone source files and subsystems are identified. A correlational mapping of test functional area to product subsystem is completed. A new adaptive testing process based on real-time generation of the integrated database is proposed.
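
    The effectiveness ratios defined above are simple to compute; the sketch below works through them with hypothetical counts chosen only to illustrate the definitions (they are not the study's raw data).

```python
# Hedged sketch: the three ratios defined in the abstract, computed on
# hypothetical counts chosen so the ratios fall near the reported values.
tests_run        = 5000   # hypothetical number of black box tests executed
failed_tests     = 630    # hypothetical number of failed tests
fault_records    = 270    # hypothetical fault records filed
repairs          = 150    # hypothetical repairs traced back to source files
min_tests_needed = 150    # hypothetical minimum tests needed to expose the faults

testing_efficiency         = repairs / tests_run        # repairs per test
fault_record_effectiveness = repairs / fault_records    # repairs per fault record
test_script_redundancy     = failed_tests / min_tests_needed

print(f"testing efficiency:         {testing_efficiency:.0%}")          # about 3%
print(f"fault record effectiveness: {fault_record_effectiveness:.0%}")  # about 56%
print(f"test script redundancy:     {test_script_redundancy:.1f}")      # about 4.2
```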

  10. [Tumor markers for bladder cancer: up-to-date study by the Kiel Tumor Bank].

    PubMed

    Hautmann, S; Eggers, J; Meyhoff, H; Melchior, D; Munk, A; Hamann, M; Naumann, M; Braun, P M; Jünemann, K P

    2007-11-01

    The number of noninvasive diagnostic tests for bladder cancer has increased tremendously in recent years, with a large number of experimental and commercial tests. Comparative analyses of tests for diagnosis, follow-up, and recurrence detection of bladder cancer were performed retrospectively as well as prospectively, unicentrically, and multicentrically. An analysis of multicentric studies with large patient numbers compared with our own Kiel Tumor Bank data is presented. The Kiel Tumor Bank data prospectively covered 106 consecutive bladder tumor patients from the year 2006. Special focus was put on urine cytology as a reference test, as well as the commercial NMP 22 Bladder Chek. The analysis of the NMP 22 Bladder Chek showed an overall sensitivity of 69% for all tumor grades and stages, with a specificity of 76%. Comparison to multicentric data with an overall sensitivity of 75% for all tumor grades and stages, with a specificity of 73%, showed results similar to those in the literature. Urine cytology showed a comparable overall sensitivity of 73% for all tumor grades and stages, with a specificity of 80%. A large number of noninvasive tests for bladder cancer follow-up with reasonable sensitivity and specificity can currently be used. Because of the limited number of prospective randomized multicentric studies, no single particular marker for bladder cancer screening can be recommended at this point in time.

  11. Design and test of a natural laminar flow/large Reynolds number airfoil with a high design cruise lift coefficient

    NASA Technical Reports Server (NTRS)

    Kolesar, C. E.

    1987-01-01

    Research activity on an airfoil designed for a large airplane capable of very long endurance times at a low Mach number of 0.22 is examined. Airplane mission objectives and design optimization resulted in requirements for a very high design lift coefficient and a large amount of laminar flow at high Reynolds number to increase the lift/drag ratio and reduce the loiter lift coefficient. Natural laminar flow was selected instead of distributed mechanical suction for the measurement technique. A design lift coefficient of 1.5 was identified as the highest which could be achieved with a large extent of laminar flow. A single element airfoil was designed using an inverse boundary layer solution and inverse airfoil design computer codes to create an airfoil section that would achieve performance goals. The design process and results, including airfoil shape, pressure distributions, and aerodynamic characteristics are presented. A two dimensional wind tunnel model was constructed and tested in a NASA Low Turbulence Pressure Tunnel which enabled testing at full scale design Reynolds number. A comparison is made between theoretical and measured results to establish accuracy and quality of the airfoil design technique.

  12. Force test of a 0.88 percent scale 142-inch diameter solid rocket booster (MSFC model number 461) in the NASA/MSFC high Reynolds number wind tunnel (SA13F)

    NASA Technical Reports Server (NTRS)

    Johnson, J. D.; Winkler, G. W.

    1976-01-01

    The results are presented of a force test of a 0.88 percent scale model of the 142 inch solid rocket booster without protuberances, conducted in the MSFC high Reynolds number wind tunnel. The objective of this test was to obtain aerodynamic force data over a large range of Reynolds numbers. The test was conducted over a Mach number range from 0.4 to 3.5. Reynolds numbers based on model diameter (1.25 inches) ranged from 0.75 million to 13.5 million. The angle of attack range was from 35 to 145 degrees.

  13. The EPA ToxCast Program: Developing Predictive Bioactivity Signatures for Chemicals

    EPA Science Inventory

    There are tens of thousands of chemicals used in the environment for which little or no toxicology information is known. Current testing paradigms that use large numbers of animals to perform in vivo toxicology are too slow and expensive to apply to this large number of chemicals...

  14. Of Small Beauties and Large Beasts: The Quality of Distractors on Multiple-Choice Tests Is More Important than Their Quantity

    ERIC Educational Resources Information Center

    Papenberg, Martin; Musch, Jochen

    2017-01-01

    In multiple-choice tests, the quality of distractors may be more important than their number. We therefore examined the joint influence of distractor quality and quantity on test functioning by providing a sample of 5,793 participants with five parallel test sets consisting of items that differed in the number and quality of distractors.…

  15. Composite Failures: A Comparison of Experimental Test Results and Computational Analysis Using XFEM

    DTIC Science & Technology

    2016-09-30

    NUWC-NPT Technical Report 12,218, 30 September 2016. [Abstract garbled in extraction; only a fragment was recovered.] ...availability of measurement techniques, experimental testing of composite materials has largely outpaced the computational modeling ability, forcing...

  16. Towards Risk-Based Test Protocols: Estimating the Contribution of Intensive Testing to the UK Bovine Tuberculosis Problem

    PubMed Central

    van Dijk, Jan

    2013-01-01

    Eradicating disease from livestock populations involves the balancing act of removing sufficient numbers of diseased animals without removing too many healthy individuals in the process. As ever more tests for bovine tuberculosis (BTB) are carried out on the UK cattle herd, and each positive herd test triggers more testing, the question arises whether ‘false positive’ results contribute significantly to the measured BTB prevalence. Here, this question is explored using simple probabilistic models of test behaviour. When the screening test is applied to the average UK herd, the estimated proportion of test-associated false positive new outbreaks is highly sensitive to small fluctuations in screening test specificity. Estimations of this parameter should be updated as a priority. Once outbreaks have been confirmed in screening-test positive herds, the following rounds of intensive testing with more sensitive, albeit less specific, tests are highly likely to remove large numbers of false positive animals from herds. Despite this, it is unlikely that significantly more truly infected animals are removed. BTB test protocols should become based on quantified risk in order to prevent the needless slaughter of large numbers of healthy animals. PMID:23717517
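
    The simple probabilistic reasoning described above can be sketched in a few lines; the herd size, sensitivity, and specificity below are illustrative assumptions rather than the paper's estimates, and per-animal independence of test results is also assumed.

```python
# Hedged sketch: expected reactors in a screened herd and the chance that the
# healthy animals in a herd still return at least one positive. All parameter
# values are illustrative assumptions, not the paper's estimates.
def herd_screen(n_animals, prevalence, sensitivity, specificity):
    infected = n_animals * prevalence
    healthy = n_animals - infected
    true_pos = sensitivity * infected               # expected true reactors
    false_pos = (1 - specificity) * healthy         # expected false reactors
    p_any_false = 1 - specificity ** healthy        # assumes independent animal-level results
    return true_pos, false_pos, p_any_false

tp, fp, p_any = herd_screen(n_animals=120, prevalence=0.01, sensitivity=0.80, specificity=0.998)
print(f"expected true positives:  {tp:.2f}")
print(f"expected false positives: {fp:.2f}")
print(f"P(>=1 false positive among healthy animals): {p_any:.2f}")
```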

  17. Prime Numbers Comparison using Sieve of Eratosthenes and Sieve of Sundaram Algorithm

    NASA Astrophysics Data System (ADS)

    Abdullah, D.; Rahim, R.; Apdilah, D.; Efendi, S.; Tulus, T.; Suwilo, S.

    2018-03-01

    Prime numbers appeal to researchers because of their complexity, and many algorithms, ranging from simple to computationally complex, can be used to generate them. The Sieve of Eratosthenes and the Sieve of Sundaram are two algorithms that can be used to generate prime numbers from randomly generated or sequentially numbered inputs. The testing in this study aims to find out which algorithm is better suited to large primes in terms of time complexity. The tests were supported by applications written in the Java language with code optimization and a fixed maximum memory usage, so that the tests could be run simultaneously and the results obtained would be objective.
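
    For reference, a minimal Python rendering of the two algorithms compared in the study is sketched below (the study's own test applications were written in Java); the timing loop is illustrative and not the paper's benchmarking setup.

```python
# Hedged sketch: both sieves generate all primes up to n; the timing here is
# only a rough illustration of the comparison the study performs.
import time

def sieve_of_eratosthenes(n):
    """Return all primes <= n by crossing out multiples of each prime."""
    is_prime = [True] * (n + 1)
    is_prime[0:2] = [False, False]
    for p in range(2, int(n ** 0.5) + 1):
        if is_prime[p]:
            for multiple in range(p * p, n + 1, p):
                is_prime[multiple] = False
    return [i for i, flag in enumerate(is_prime) if flag]

def sieve_of_sundaram(n):
    """Return all primes <= n; mark i + j + 2ij, then map k -> 2k + 1."""
    m = (n - 1) // 2
    marked = [False] * (m + 1)
    for i in range(1, m + 1):
        j = i
        while i + j + 2 * i * j <= m:
            marked[i + j + 2 * i * j] = True
            j += 1
    return ([2] if n >= 2 else []) + [2 * k + 1 for k in range(1, m + 1) if not marked[k]]

for sieve in (sieve_of_eratosthenes, sieve_of_sundaram):
    start = time.perf_counter()
    primes = sieve(1_000_000)
    print(f"{sieve.__name__}: {len(primes)} primes in {time.perf_counter() - start:.2f} s")
```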

  18. Real-time fast physical random number generator with a photonic integrated circuit.

    PubMed

    Ugajin, Kazusa; Terashima, Yuta; Iwakawa, Kento; Uchida, Atsushi; Harayama, Takahisa; Yoshimura, Kazuyuki; Inubushi, Masanobu

    2017-03-20

    Random number generators are essential for applications in information security and numerical simulations. Most optical-chaos-based random number generators produce random bit sequences by offline post-processing with large optical components. We demonstrate a real-time hardware implementation of a fast physical random number generator with a photonic integrated circuit and a field programmable gate array (FPGA) electronic board. We generate 1-Tbit random bit sequences and evaluate their statistical randomness using NIST Special Publication 800-22 and TestU01. All of the BigCrush tests in TestU01 are passed using 410-Gbit random bit sequences. A maximum real-time generation rate of 21.1 Gb/s is achieved for random bit sequences in binary format stored in a computer, which can be directly used for applications involving secret keys in cryptography and random seeds in large-scale numerical simulations.
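
    As a small illustration of the statistical evaluation mentioned above, the sketch below implements the first test from NIST SP 800-22, the frequency (monobit) test, applied here to a pseudorandom bit sequence; it is not the authors' full NIST/TestU01 pipeline.

```python
# Hedged sketch: NIST SP 800-22 frequency (monobit) test. A sequence passes at
# the 1% level if the p-value is at least 0.01. Applied here to Python's PRNG
# purely for illustration.
import math
import random

def monobit_test(bits):
    """p-value from the normalized +/-1 partial sum of the bit sequence."""
    s = sum(1 if b else -1 for b in bits)
    s_obs = abs(s) / math.sqrt(len(bits))
    return math.erfc(s_obs / math.sqrt(2))

bits = [random.getrandbits(1) for _ in range(1_000_000)]
p_value = monobit_test(bits)
print(f"p = {p_value:.3f} -> {'pass' if p_value >= 0.01 else 'fail'} at the 1% level")
```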

  19. Screening large-scale association study data: exploiting interactions using random forests.

    PubMed

    Lunetta, Kathryn L; Hayward, L Brooke; Segal, Jonathan; Van Eerdewegh, Paul

    2004-12-10

    Genome-wide association studies for complex diseases will produce genotypes on hundreds of thousands of single nucleotide polymorphisms (SNPs). A logical first approach to dealing with massive numbers of SNPs is to use some test to screen the SNPs, retaining only those that meet some criterion for further study. For example, SNPs can be ranked by p-value, and those with the lowest p-values retained. When SNPs have large interaction effects but small marginal effects in a population, they are unlikely to be retained when univariate tests are used for screening. However, model-based screens that pre-specify interactions are impractical for data sets with thousands of SNPs. Random forest analysis is an alternative method that produces a single measure of importance for each predictor variable that takes into account interactions among variables without requiring model specification. Interactions increase the importance for the individual interacting variables, making them more likely to be given high importance relative to other variables. We test the performance of random forests as a screening procedure to identify small numbers of risk-associated SNPs from among large numbers of unassociated SNPs using complex disease models with up to 32 loci, incorporating both genetic heterogeneity and multi-locus interaction. Keeping other factors constant, if risk SNPs interact, the random forest importance measure significantly outperforms the Fisher Exact test as a screening tool. As the number of interacting SNPs increases, the improvement in performance of random forest analysis relative to Fisher Exact test for screening also increases. Random forests perform similarly to the univariate Fisher Exact test as a screening tool when SNPs in the analysis do not interact. In the context of large-scale genetic association studies where unknown interactions exist among true risk-associated SNPs or SNPs and environmental covariates, screening SNPs using random forest analyses can significantly reduce the number of SNPs that need to be retained for further study compared to standard univariate screening methods.
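
    A toy version of the comparison described above is sketched below: simulated genotypes with a purely interacting pair of risk SNPs are ranked by random forest importance and by per-SNP Fisher exact tests. The sample size, effect sizes, and the carrier/non-carrier coding used to build 2x2 tables are illustrative assumptions, not the paper's simulation design.

```python
# Hedged sketch: random forest importance versus univariate Fisher exact
# screening on simulated SNP data where risk depends only on an interaction.
import numpy as np
from scipy.stats import fisher_exact
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n, p = 1000, 200
X = rng.integers(0, 3, size=(n, p))                  # genotypes coded 0/1/2
interaction = (X[:, 0] > 0) ^ (X[:, 1] > 0)          # SNPs 0 and 1 interact, no marginal effect
y = rng.binomial(1, np.where(interaction, 0.7, 0.3))

def fisher_p(j):
    """Fisher exact p-value for carrier (>0) status versus case/control."""
    carrier = X[:, j] > 0
    table = [[np.sum(carrier & (y == 1)), np.sum(carrier & (y == 0))],
             [np.sum(~carrier & (y == 1)), np.sum(~carrier & (y == 0))]]
    return fisher_exact(table)[1]

p_values = np.array([fisher_p(j) for j in range(p)])
rf = RandomForestClassifier(n_estimators=500, random_state=0).fit(X, y)

fisher_rank = 1 + np.argsort(np.argsort(p_values))                # 1 = smallest p-value
rf_rank = 1 + np.argsort(np.argsort(-rf.feature_importances_))    # 1 = largest importance
print("Fisher ranks of the risk SNPs:", fisher_rank[[0, 1]])
print("RF ranks of the risk SNPs:    ", rf_rank[[0, 1]])
```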

  20. Gas-Centered Swirl Coaxial Liquid Injector Evaluations

    NASA Technical Reports Server (NTRS)

    Cohn, A. K.; Strakey, P. A.; Talley, D. G.

    2005-01-01

    Development of liquid rocket engines is expensive, and extensive testing at large scales is usually required. A large number of tests is needed to verify engine lifetime, yet limited resources are available for development. Sub-scale cold-flow and hot-fire testing is extremely cost effective and could be a necessary (but not sufficient) condition for demonstrating long engine lifetime, reducing the overall costs and risk of large scale testing. Goal: determine what knowledge can be gained from sub-scale cold-flow and hot-fire evaluations of LRE injectors, and determine the relationships between cold-flow and hot-fire data.

  1. NOAO testing procedures for large optics

    NASA Astrophysics Data System (ADS)

    Stepp, Larry M.; Poczulp, Gary A.; Pearson, Earl T.; Roddier, Nicolas A.

    1992-03-01

    This paper describes optical testing procedures used at the National Optical Astronomy Observatories (NOAO) for testing large optics. It begins with a discussion of the philosophy behind the testing approach and then describes a number of different testing methods used at NOAO, including the wire test, full-aperture and sub-aperture Hartmann testing, and scatterplate interferometry. Specific innovations that enhance the testing capabilities are mentioned. NOAO data reduction software is described. Examples are given of specific output formats that are useful to the optician, using illustrations taken from recent testing of a 3.5- meter, f/1.75 borosilicate honeycomb mirror. Finally, we discuss some of the optical testing challenges posed by the large optics for the Gemini 8-meter Telescopes Project.

  2. Improving the Inventory of Large Lunar Basins: Using Lola Data to Test Previous Candidates and Search for New Ones

    NASA Technical Reports Server (NTRS)

    Frey, Herbert V.; Meyer, H. M.

    2012-01-01

    Topography and crustal thickness data from LOLA altimetry were used to test the validity of 98 candidate large lunar basins derived from photogeologic and earlier topographic and crustal thickness data, and to search for possible new candidates. We eliminate 23 previous candidates but find good evidence for 20 new candidates. The number of basins > 300 km diameter on the Moon is almost certainly a factor 2 (maybe 3?) larger than the number of named features having basin-like topography.

  3. Infants Use Different Mechanisms to Make Small and Large Number Ordinal Judgments

    ERIC Educational Resources Information Center

    vanMarle, Kristy

    2013-01-01

    Previous research has shown indirectly that infants may use two different mechanisms-an object tracking system and an analog magnitude mechanism--to represent small (less than 4) and large (greater than or equal to 4) numbers of objects, respectively. The current study directly tested this hypothesis in an ordinal choice task by presenting 10- to…

  4. School Exits in the Milwaukee Parental Choice Program: Evidence of a Marketplace?

    ERIC Educational Resources Information Center

    Ford, Michael

    2011-01-01

    This article examines whether the large number of school exits from the Milwaukee school voucher program is evidence of a marketplace. Two logistic regression and multinomial logistic regression models tested the relation between the inability to draw large numbers of voucher students and the ability for a private school to remain viable. Data on…

  5. The Theory about CD-CAT Based on FCA and Its Application

    ERIC Educational Resources Information Center

    Shuqun, Yang; Shuliang, Ding; Zhiqiang, Yao

    2009-01-01

    Cognitive diagnosis (CD) plays an important role in intelligent tutoring systems. Computerized adaptive testing (CAT) is adaptive, fair, and efficient, which makes it suitable for large-scale examinations. A traditional cognitive diagnostic test needs quite a large number of items; an efficient and tailored CAT could be a remedy for this, so the CAT with…

  6. Synthetic Modifications In the Frequency Domain for Finite Element Model Update and Damage Detection

    DTIC Science & Technology

    2017-09-01

    Sensitivity-based finite element model updating and structural damage detection has been limited by the number of modes available in a vibration test and...increase the number of modes and corresponding sensitivity data by artificially constraining the structure under test, producing a large number of... structural modifications to the measured data, including both springs-to-ground and mass modifications. This is accomplished with frequency domain

  7. Aquatic Plant Control Research Program. Large-Scale Operations Management Test of Use of the White Amur for Control of Problem Aquatic Plants. Reports 2 and 3. First and Second Year Poststocking Results. Volume 5. The Herpetofauna of Lake Conway, Florida: Community Analysis.

    DTIC Science & Technology

    1983-07-01

    [OCR residue of the report documentation page; no abstract text was recovered.] Technical Report A-78-2, Aquatic Plant Control Research Program, Large-Scale Operations Management Test, Waterways Experiment Station, P.O. Box 631, Vicksburg, Miss. 39180.

  8. Non-symbolic halving in an Amazonian indigene group

    PubMed Central

    McCrink, Koleen; Spelke, Elizabeth S.; Dehaene, Stanislas; Pica, Pierre

    2014-01-01

    Much research supports the existence of an Approximate Number System (ANS) that is recruited by infants, children, adults, and non-human animals to generate coarse, non-symbolic representations of number. This system supports simple arithmetic operations such as addition, subtraction, and ordering of amounts. The current study tests whether an intuition of a more complex calculation, division, exists in an indigene group in the Amazon, the Mundurucu, whose language includes no words for large numbers. Mundurucu children were presented with a video event depicting a division transformation of halving, in which pairs of objects turned into single objects, reducing the array's numerical magnitude. Then they were tested on their ability to calculate the outcome of this division transformation with other large-number arrays. The Mundurucu children effected this transformation even when non-numerical variables were controlled, performed above chance levels on the very first set of test trials, and exhibited performance similar to urban children who had access to precise number words and a surrounding symbolic culture. We conclude that a halving calculation is part of the suite of intuitive operations supported by the ANS. PMID:23587042

  9. Capuchin monkeys (Cebus apella) treat small and large numbers of items similarly during a relative quantity judgment task.

    PubMed

    Beran, Michael J; Parrish, Audrey E

    2016-08-01

    A key issue in understanding the evolutionary and developmental emergence of numerical cognition is to learn what mechanism(s) support perception and representation of quantitative information. Two such systems have been proposed, one for dealing with approximate representation of sets of items across an extended numerical range and another for highly precise representation of only small numbers of items. Evidence for the first system is abundant across species and in many tests with human adults and children, whereas the second system is primarily evident in research with children and in some tests with non-human animals. A recent paper (Choo & Franconeri, Psychonomic Bulletin & Review, 21, 93-99, 2014) with adult humans also reported "superprecise" representation of small sets of items in comparison to large sets of items, which would provide more support for the presence of a second system in human adults. We first presented capuchin monkeys with a test similar to that of Choo and Franconeri in which small or large sets with the same ratios had to be discriminated. We then presented the same monkeys with an expanded range of comparisons in the small number range (all comparisons of 1-9 items) and the large number range (all comparisons of 10-90 items in 10-item increments). Capuchin monkeys showed no increased precision for small over large sets in making these discriminations in either experiment. These data indicate a difference in the performance of monkeys to that of adult humans, and specifically that monkeys do not show improved discrimination performance for small sets relative to large sets when the relative numerical differences are held constant.

  10. Experience with specifications applicable to certification. [of photovoltaic modules for large-scale application

    NASA Technical Reports Server (NTRS)

    Ross, R. G., Jr.

    1982-01-01

    The Jet Propulsion Laboratory has developed a number of photovoltaic test and measurement specifications to guide the development of modules toward the requirements of future large-scale applications. Experience with these specifications and the extensive module measurement and testing that has accompanied their use is examined. Conclusions are drawn relative to three aspects of product certification: performance measurement, endurance testing and safety evaluation.

  11. The Impacts of Grade Retention: Benefits and Challenges Perceived by Retained Middle School Students

    ERIC Educational Resources Information Center

    Rand, Lauren E.

    2013-01-01

    No Child Left Behind legislation and high stakes testing have increased the pressure for public schools to ensure academic achievement for all students. Each year, a large number of students do not demonstrate adequate achievement and are retained to repeat the grade level. The large number of students retained is an indication that the system…

  12. Gender Differences in Unilateral Spatial Neglect within 24 Hours of Ischemic Stroke

    ERIC Educational Resources Information Center

    Kleinman, Jonathan T.; Gottesman, Rebecca F.; Davis, Cameron; Newhart, Melissa; Heidler-Gary, Jennifer; Hillis, Argye E.

    2008-01-01

    Hemispatial neglect is a common and disabling consequence of stroke. Previous reports examining the relationship between gender and the incidence of unilateral spatial neglect (USN) have included either a large number of patients with few neglect tests or small numbers of patients with multiple tests. To determine if USN was more common and/or…

  13. Experimental feasibility study of the application of magnetic suspension techniques to large-scale aerodynamic test facilities

    NASA Technical Reports Server (NTRS)

    Zapata, R. N.; Humphris, R. R.; Henderson, K. C.

    1974-01-01

    Based on the premises that (1) magnetic suspension techniques can play a useful role in large-scale aerodynamic testing and (2) superconductor technology offers the only practical hope for building large-scale magnetic suspensions, an all-superconductor three-component magnetic suspension and balance facility was built as a prototype and was tested successfully. Quantitative extrapolations of design and performance characteristics of this prototype system to larger systems compatible with existing and planned high Reynolds number facilities have been made and show that this experimental technique should be particularly attractive when used in conjunction with large cryogenic wind tunnels.

  14. Experimental feasibility study of the application of magnetic suspension techniques to large-scale aerodynamic test facilities. [cryogenic traonics wind tunnel

    NASA Technical Reports Server (NTRS)

    Zapata, R. N.; Humphris, R. R.; Henderson, K. C.

    1975-01-01

    Based on the premises that magnetic suspension techniques can play a useful role in large scale aerodynamic testing, and that superconductor technology offers the only practical hope for building large scale magnetic suspensions, an all-superconductor 3-component magnetic suspension and balance facility was built as a prototype and tested successfully. Quantitative extrapolations of design and performance characteristics of this prototype system to larger systems compatible with existing and planned high Reynolds number facilities at Langley Research Center were made and show that this experimental technique should be particularly attractive when used in conjunction with large cryogenic wind tunnels.

  15. Testing of small and large sign support systems FOIL test number : 92F011

    DOT National Transportation Integrated Search

    1994-07-01

    This test report contains the results of a crash test performed at the Federal Outdoor Impact Laboratory (FOIL) in McLean, Virginia. The test was performed on a small sign support system at 20 mi/h (8.9 m/s), test 92F011. The vehicle used for these t...

  16. Testing of small and large sign support systems FOIL test number : 92F036

    DOT National Transportation Integrated Search

    1994-07-01

    This test report contains the results of a crash test performed at the Federal Outdoor Impact Laboratory (FOIL) in McLean, Virginia. The test was performed on a small sign support system at 60 mi/h (96.6 km/h), test 92F036. The vehicle used for this ...

  17. Testing of small and large sign support systems FOIL test number : 92F016

    DOT National Transportation Integrated Search

    1994-07-01

    This test report contains the results of a crash test performed at the Federal Outdoor Impact Laboratory (FOIL) in McLean, Virginia. The test was performed on a small sign support system at 20 mi/h (8.9 m/s), test 92F016. The vehicle used for the...

  18. Testing of small and large sign support systems FOIL test number : 92F035

    DOT National Transportation Integrated Search

    1994-07-01

    This test report contains the results of a crash test performed at the Federal Outdoor Impact Laboratory (FOIL) in McLean, Virginia. The test was performed on a small sign support system at 20 mi/h (32.2 km/h), test 92F035. The vehicle used for this ...

  19. Testing of small and large sign support systems FOIL test number : 92F038

    DOT National Transportation Integrated Search

    1994-01-01

    This test report contains the results of a crash test performed at the Federal Outdoor Impact Laboratory (FOIL) in McLean, Virginia. The test was performed on a small sign support system at 60 mi/h (96.6 km/h), test 92F038. The vehicle used for this ...

  20. Testing of small and large sign support systems FOIL test number : 92F037

    DOT National Transportation Integrated Search

    1994-07-01

    This test report contains the results of a crash test performed at the Federal Outdoor Impact Laboratory (FOIL) in McLean, Virginia. The test was performed on a small sign support system at 20 mi/h (32.2 km/h), test 92F037. The vehicle used for this ...

  1. Testing of small and large sign support systems FOIL test number : 92F022

    DOT National Transportation Integrated Search

    1994-07-01

    This test report contains the results of a crash test performed at the Federal Outdoor Impact Laboratory (FOIL) in McLean, Virginia. The test was performed on a small sign support system at 20 mi/h (8.9 m/s), test 92F022. The vehicle used for this t...

  2. Testing of small and large sign support systems FOIL test number : 92F040

    DOT National Transportation Integrated Search

    1994-07-01

    This test report contains the results of a crash test performed at the Federal Outdoor Impact Laboratory (FOIL) in McLean, Virginia. The test was performed on a small sign support system at 60 mi/h (96.6 km/h), test 92F040. The vehicle used for this ...

  3. Testing of small and large sign support systems FOIL test number : 92F039

    DOT National Transportation Integrated Search

    1994-07-01

    This test report contains the results of a crash test performed at the Federal Outdoor Impact Laboratory (FOIL) in McLean, Virginia. The test was performed on a small sign support system at 20 mi/h (32.2 km/h), test 92F039. The vehicle used for this...

  4. Cryocooler based test setup for high current applications

    NASA Astrophysics Data System (ADS)

    Pradhan, Jedidiah; Das, Nisith Kr.; Roy, Anindya; Duttagupta, Anjan

    2018-04-01

    A cryo-cooler based cryogenic test setup has been designed, fabricated, and tested. The setup incorporates two cryo-coolers, one for sample cooling and the other for cooling the large magnet coil. The performance and versatility of the setup have been tested using large samples of high-temperature superconductor magnet coil as well as short samples carrying high current. Several uncalibrated temperature sensors have been calibrated using this system. This paper presents the details of the system along with the results of different performance tests.

  5. Sensitivity of a Riparian Large Woody Debris Recruitment Model to the Number of Contributing Banks and Tree Fall Pattern

    Treesearch

    Don C. Bragg; Jeffrey L. Kershner

    2004-01-01

    Riparian large woody debris (LWD) recruitment simulations have traditionally applied a random angle of tree fall from two well-forested stream banks. We used a riparian LWD recruitment model (CWD, version 1.4) to test the validity of these assumptions. Both the number of contributing forest banks and the predominant tree fall direction significantly influenced simulated...

  6. Potential utilization of the NASA/George C. Marshall Space Flight Center in earthquake engineering research

    NASA Technical Reports Server (NTRS)

    Scholl, R. E. (Editor)

    1979-01-01

    Earthquake engineering research capabilities of the National Aeronautics and Space Administration (NASA) facilities at George C. Marshall Space Flight Center (MSFC), Alabama, were evaluated. The results indicate that the NASA/MSFC facilities and supporting capabilities offer unique opportunities for conducting earthquake engineering research. Specific features that are particularly attractive for large scale static and dynamic testing of natural and man-made structures include the following: large physical dimensions of buildings and test bays; high loading capacity; wide range and large number of test equipment and instrumentation devices; multichannel data acquisition and processing systems; technical expertise for conducting large-scale static and dynamic testing; sophisticated techniques for systems dynamics analysis, simulation, and control; and capability for managing large-size and technologically complex programs. Potential uses of the facilities for near and long term test programs to supplement current earthquake research activities are suggested.

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Parthipun, A. A., E-mail: aneeta@hotmail.co.uk; Taylor, J.; Manyonda, I.

    The purpose of this study was to determine whether there is a correlation between large uterine fibroid diameter, uterine volume, number of vials of embolic agent used, and risk of complications from uterine artery embolisation (UAE). This was a prospective study involving 121 patients undergoing UAE for symptomatic uterine fibroids at a single institution. Patients were grouped according to the diameter of the largest fibroid and the uterine volume. Results were also stratified according to the number of vials of embolic agent used and the rate of complications. No statistical difference in complication rate was demonstrated between the two groups according to the diameter of the largest fibroid (large fibroids were classified as ≥10 cm; Fisher's exact test P = 1.00), and no statistical difference in complication rate was demonstrated according to uterine volume (large uterine volume was defined as ≥750 cm³; Fisher's exact test P = 0.70). 84 of the 121 patients had documentation of the number of vials used during the procedure. Patients were divided into two groups, with the use of ≥4 vials defined as a large quantity of embolic agent. There was no statistical difference between these two groups and no associated increased risk of developing complications. This study showed no increased incidence of complications in women with large fibroid diameters or uterine volumes as defined. In addition, there was no evidence of increased complications according to the quantity of embolic material used. Therefore, UAE should be offered to women with large fibroids and uterine volumes.

  8. Effect of number of probes and their orientation on the calculation of several compressor face distortion descriptors

    NASA Technical Reports Server (NTRS)

    Stoll, F.; Tremback, J. W.; Arnaiz, H. H.

    1979-01-01

    A study was performed to determine the effects of the number and position of total pressure probes on the calculation of five compressor face distortion descriptors. This study used three sets of 320 steady state total pressure measurements that were obtained with a special rotating rake apparatus in wind tunnel tests of a mixed-compression inlet. The inlet was a one third scale model of the inlet on a YF-12 airplane, and it was tested in the wind tunnel at representative flight conditions at Mach numbers above 2.0. The study shows that large errors resulted in the calculation of the distortion descriptors even with a number of probes that were considered adequate in the past. There were errors as large as 30 and -50 percent in several distortion descriptors for a configuration consisting of eight rakes with five equal-area-weighted probes on each rake.

  9. Non-overlap subaperture interferometric testing for large optics

    NASA Astrophysics Data System (ADS)

    Wu, Xin; Yu, Yingjie; Zeng, Wenhan; Qi, Te; Chen, Mingyi; Jiang, Xiangqian

    2017-08-01

    It has been shown that the number of subapertures and the amount of overlap have a significant influence on the stitching accuracy. In this paper, a non-overlap subaperture interferometric testing method (NOSAI) is proposed to inspect large optical components. This method would greatly reduce the number of subapertures and the influence of environmental interference while maintaining the accuracy of reconstruction. A general subaperture distribution pattern of NOSAI is also proposed for large rectangular surfaces. Square Zernike polynomials are employed to fit such a wavefront. The effect of the minimum number of fitting terms on the accuracy of NOSAI, and the sensitivities of NOSAI to subaperture alignment error, power systematic error, and random noise, are discussed. Experimental results validate the feasibility and accuracy of the proposed NOSAI in comparison with the wavefront obtained by a large-aperture interferometer and the surface stitched by the multi-aperture overlap-scanning technique (MAOST).
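
    The fitting step described above can be illustrated with a simplified stand-in: a least-squares fit of a low-order 2-D polynomial basis (used here in place of the square Zernike polynomials) to wavefront samples drawn from non-overlapping subapertures expressed in the full-aperture coordinate frame. All values are synthetic and illustrative.

```python
# Hedged sketch: recover a full-aperture wavefront model from non-overlapping
# subaperture samples by ordinary least squares. A generic polynomial basis
# stands in for the square Zernike polynomials used in the paper.
import numpy as np

def basis(x, y):
    """Low-order 2-D polynomial terms (piston, tilts, power, astigmatism, ...)."""
    return np.column_stack([np.ones_like(x), x, y, x * y,
                            x ** 2, y ** 2, x ** 3, y ** 3, x ** 2 * y, x * y ** 2])

rng = np.random.default_rng(1)
true_coeffs = rng.normal(size=10)

samples = []                      # four non-overlapping subapertures of a rectangular pupil
for cx in (-0.5, 0.5):
    for cy in (-0.5, 0.5):
        x = rng.uniform(cx - 0.45, cx + 0.45, 200)
        y = rng.uniform(cy - 0.45, cy + 0.45, 200)
        w = basis(x, y) @ true_coeffs + rng.normal(scale=1e-3, size=200)  # measurement noise
        samples.append((x, y, w))

A = np.vstack([basis(x, y) for x, y, _ in samples])
W = np.concatenate([w for _, _, w in samples])
fit, *_ = np.linalg.lstsq(A, W, rcond=None)
print("max coefficient error:", np.max(np.abs(fit - true_coeffs)))
```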

  10. Multiple damage identification on a wind turbine blade using a structural neural system

    NASA Astrophysics Data System (ADS)

    Kirikera, Goutham R.; Schulz, Mark J.; Sundaresan, Mannur J.

    2007-04-01

    A large number of sensors are required to perform real-time structural health monitoring (SHM) to detect acoustic emissions (AE) produced by damage growth on large complicated structures. This requires a large number of high sampling rate data acquisition channels to analyze high frequency signals. To overcome the cost and complexity of having such a large data acquisition system, a structural neural system (SNS) was developed. The SNS reduces the required number of data acquisition channels and predicts the location of damage within a sensor grid. The sensor grid uses interconnected sensor nodes to form continuous sensors. The combination of continuous sensors and the biomimetic parallel processing of the SNS tremendously reduces the complexity of SHM. A wave simulation algorithm (WSA) was developed to understand flexural wave propagation in composite structures and to utilize the code for developing the SNS. Simulation of AE responses in a plate and comparison with experimental results are shown in the paper. The SNS was recently tested by a team of researchers from the University of Cincinnati and North Carolina A&T State University during a quasi-static proof test of a 9 meter long wind turbine blade at the National Renewable Energy Laboratory (NREL) test facility in Golden, Colorado. Twelve piezoelectric sensor nodes were used to form four continuous sensors to monitor the condition of the blade during the test. The four continuous sensors are used as inputs to the SNS. There are only two analog output channels of the SNS, and these signals are digitized and analyzed in a computer to detect damage. In the test of the wind turbine blade, multiple damages were identified and later verified by sectioning of the blade. The results of damage identification using the SNS during this proof test are shown in this paper. Overall, the SNS is very sensitive and can detect damage on complex structures with ribs, joints, and different materials, and the system is relatively inexpensive and simple to implement on large structures.

  11. Recommendations for Developing Alternative Test Methods for Screening and Prioritization of Chemicals for Developmental Neurotoxicity

    EPA Science Inventory

    Developmental neurotoxicity testing (DNT) is perceived by many stakeholders to be an area in critical need of alternative methods to current animal testing protocols and guidelines. An immediate goal is to develop test methods that are capable of screening large numbers of chemic...

  12. Testing of small and large sign support systems FOIL test number : 92F017

    DOT National Transportation Integrated Search

    1994-07-01

    This test report contains the results of a crash test performed at the Federal Outdoor Impact Laboratory (FOIL) in McLean, Virginia. The test was performed on a small sign support system at 20 mi/h (8.9 m/s), test 92F017. The vehicle used for this te...

  13. Testing of small and large sign support systems FOIL test number : 92F026

    DOT National Transportation Integrated Search

    1994-07-01

    This test report contains the results of a crash test performed at the Federal Outdoor Impact Laboratory (FOIL) in McLean, Virginia. The test was performed on a small sign support system at 20 mi/h (8.9 m/s), test 92F026. The vehicle used for this te...

  14. Testing of small and large sign support systems FOIL test number : 92F018

    DOT National Transportation Integrated Search

    1994-07-01

    This test report contains the results of a crash test performed at the Federal Outdoor Impact Laboratory (FOIL) in McLean, Virginia. The test was performed on a small sign support system at 20 mi/h (8.9 m/s), test 92F018. The vehicle used for this te...

  15. Testing of small and large sign support systems FOIL test number : 92F015

    DOT National Transportation Integrated Search

    1994-07-01

    This test report contains the results of a crash test performed at the Federal Outdoor Impact Laboratory (FOIL) in McLean, Virginia. The test was performed on a small sign support system at 20 mi/h (8.9 m/s), test 92F015. The vehicle used for these t...

  16. Testing of small and large sign support systems FOIL test number : 92F012

    DOT National Transportation Integrated Search

    1994-07-01

    This test report contains the results of a crash test performed at the Federal Outdoor Impact Laboratory (FOIL) in McLean, Virginia. The test was performed on a small sign support system at 20 mi/h (8.9 m/s), test 92F012. The vehicle used for these t...

  17. Testing of small and large sign support systems FOIL test number : 92F023

    DOT National Transportation Integrated Search

    1994-07-01

    This test report contains the results of a crash test performed at the Federal Outdoor Impact Laboratory (FOIL) in McLean, Virginia. The test was performed on a small sign support system at 20 mi/h (8.9 m/s), test 92F023. The vehicle used for this te...

  18. Testing of small and large sign support systems FOIL test number : 92F019

    DOT National Transportation Integrated Search

    1994-07-01

    This test report contains the results of a crash test performed at the Federal Outdoor Impact Laboratory (FOIL) in McLean, Virginia. The test was performed on a small sign support system at 20 mi/h (8.9 m/s), test 92F019. The vehicle used for this te...

  19. Testing of small and large sign support systems FOIL test number : 92F014

    DOT National Transportation Integrated Search

    1994-07-01

    This test report contains the results of a crash test performed at the Federal Outdoor Impact Laboratory (FOIL) in McLean, Virginia. The test was performed on a small sign support system at 20 mi/h (8.9 m/s), test 92F014. The vehicle used for these t...

  20. Linking Parameter Estimates Derived from an Item Response Model through Separate Calibrations. Research Report. ETS RR-09-40

    ERIC Educational Resources Information Center

    Haberman, Shelby J.

    2009-01-01

    A regression procedure is developed to link simultaneously a very large number of item response theory (IRT) parameter estimates obtained from a large number of test forms, where each form has been separately calibrated and where forms can be linked on a pairwise basis by means of common items. An application is made to forms in which a…

  1. Antenatal diagnosis of Down syndrome: how good is state of the art.

    PubMed

    Mittal, Riju; Varghese, Raji Mathew; Puliyel, Jacob M

    2009-01-01

    A newborn with Down syndrome can be expected once in a thousand deliveries. Amniocentesis for karyotyping of foetal cells, or detection of foetal cells in the maternal circulation, i.e., fluorescent in-situ hybridisation (FISH) and karyotyping, are definitive methods of making the diagnosis antenatally. The cost of doing this routinely in all pregnancies is prohibitive. This has led to dependence on screening tests to select women more likely to be carrying a Down syndrome foetus, so that karyotyping can be offered in a more cost-efficient manner. Unfortunately, these screening criteria, namely maternal age, biochemical markers and ultrasound pointers, are rather insensitive and miss a large number of cases of Down syndrome. At the same time they are very non-specific, picking up a large number of false positive cases, resulting in undue anxiety and unnecessary alarm in a large number of mothers. Until a non-invasive, definitive test like FISH can be routinely used in all pregnancies at affordable cost, accurate antenatal diagnosis on a community basis will be a hit and miss affair.
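
    The arithmetic behind this concern is easy to make explicit; the sketch below computes the positive predictive value of a screen at a birth prevalence of about 1 in 1,000, using an assumed sensitivity and false positive rate that are illustrative rather than figures taken from this article.

```python
# Hedged sketch: positive predictive value of an antenatal screen at a
# prevalence of 1 in 1,000. Sensitivity and false positive rate are assumed.
prevalence = 1 / 1000
sensitivity = 0.75           # assumed detection rate of the combined screen
false_positive_rate = 0.05   # assumed screen-positive rate in unaffected pregnancies

true_pos = sensitivity * prevalence
false_pos = false_positive_rate * (1 - prevalence)
ppv = true_pos / (true_pos + false_pos)
missed = (1 - sensitivity) * prevalence

print(f"positive predictive value: {ppv:.1%}")                 # most positives are false alarms
print(f"affected pregnancies missed per 100,000: {missed * 100_000:.0f}")
```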

  2. Large space structures testing

    NASA Technical Reports Server (NTRS)

    Waites, Henry; Worley, H. Eugene

    1987-01-01

    There is considerable interest in the development of testing concepts and facilities that accurately simulate the pathologies believed to exist in future spacecraft. Both the Government and industry have participated in the development of facilities over the past several years. The progress and problems associated with the development of the Large Space Structure Test Facility at the Marshall Space Flight Center are presented. This facility has been in existence for a number of years, and its utilization has run the gamut from total in-house involvement and third-party contractor testing to the mutual participation of other government agencies in joint endeavors.

  3. Towards Intelligent Interpretation of Low Strain Pile Integrity Testing Results Using Machine Learning Techniques.

    PubMed

    Cui, De-Mi; Yan, Weizhong; Wang, Xiao-Quan; Lu, Lie-Min

    2017-10-25

    Low strain pile integrity testing (LSPIT), due to its simplicity and low cost, is one of the most popular NDE methods used in pile foundation construction. While performing LSPIT in the field is generally quite simple and quick, determining the integrity of the test piles by analyzing and interpreting the test signals (reflectograms) is still a manual process performed by experienced experts only. For foundation construction sites where the number of piles to be tested is large, it may take days before the expert can complete interpreting all of the piles and delivering the integrity assessment report. Techniques that can automate test signal interpretation, thus shortening LSPIT's turnaround time, are of great business value and in great demand. Motivated by this need, in this paper, we develop a computer-aided reflectogram interpretation (CARI) methodology that can interpret a large number of LSPIT signals quickly and consistently. The methodology, built on advanced signal processing and machine learning technologies, can be used to assist the experts in performing both qualitative and quantitative interpretation of LSPIT signals. Specifically, the methodology can ease experts' interpretation burden by screening all test piles quickly and identifying a small number of suspected piles for experts to perform manual, in-depth interpretation. We demonstrate the methodology's effectiveness using the LSPIT signals collected from a number of real-world pile construction sites. The proposed methodology can potentially enhance LSPIT and make it even more efficient and effective in quality control of deep foundation construction.

  4. An investigation of small scales of turbulence in a boundary layer at high Reynolds numbers

    NASA Technical Reports Server (NTRS)

    Wallace, James M.; Ong, L.; Balint, J.-L.

    1993-01-01

    The assumption that turbulence at large wave-numbers is isotropic and has universal spectral characteristics which are independent of the flow geometry, at least for high Reynolds numbers, has been a cornerstone of closure theories as well as of the most promising recent development in the effort to predict turbulent flows, viz. large eddy simulations. This hypothesis was first advanced by Kolmogorov based on the supposition that turbulent kinetic energy cascades down the scales (up the wave-numbers) of turbulence and that, if the number of these cascade steps is sufficiently large (i.e. the wave-number range is large), then the effects of anisotropies at the large scales are lost in the energy transfer process. Experimental attempts were repeatedly made to verify this fundamental assumption. However, Van Atta has recently suggested that an examination of the scalar and velocity gradient fields is necessary to definitively verify this hypothesis or prove it to be unfounded. Of course, this must be carried out in a flow with a sufficiently high Reynolds number to provide the necessary separation of scales in order unambiguously to provide the possibility of local isotropy at large wave-numbers. An opportunity to use our 12-sensor hot-wire probe to address this issue directly was made available at the 80'x120' wind tunnel at the NASA Ames Research Center, which is normally used for full-scale aircraft tests. An initial report on this high Reynolds number experiment and progress toward its evaluation is presented.

  5. Hybrid maize breeding with doubled haploids: I. One-stage versus two-stage selection for testcross performance.

    PubMed

    Longin, C Friedrich H; Utz, H Friedrich; Reif, Jochen C; Schipprack, Wolfgang; Melchinger, Albrecht E

    2006-03-01

    Optimum allocation of resources is of fundamental importance for the efficiency of breeding programs. The objectives of our study were to (1) determine the optimum allocation for the number of lines and test locations in hybrid maize breeding with doubled haploids (DHs) regarding two optimization criteria, the selection gain deltaG(k) and the probability P(k) of identifying superior genotypes, (2) compare both optimization criteria including their standard deviations (SDs), and (3) investigate the influence of production costs of DHs on the optimum allocation. For different budgets, number of finally selected lines, ratios of variance components, and production costs of DHs, the optimum allocation of test resources under one- and two-stage selection for testcross performance with a given tester was determined by using Monte Carlo simulations. In one-stage selection, lines are tested in field trials in a single year. In two-stage selection, optimum allocation of resources involves evaluation of (1) a large number of lines in a small number of test locations in the first year and (2) a small number of the selected superior lines in a large number of test locations in the second year, thereby maximizing both optimization criteria. Furthermore, to have a realistic chance of identifying a superior genotype, the probability P(k) of identifying superior genotypes should be greater than 75%. For budgets between 200 and 5,000 field plot equivalents, P(k) > 75% was reached only for genotypes belonging to the best 5% of the population. As the optimum allocation for P(k)(5%) was similar to that for deltaG(k), the choice of the optimization criterion was not crucial. The production costs of DHs had only a minor effect on the optimum number of locations and on values of the optimization criteria.
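
    The allocation question lends itself to a compact Monte Carlo sketch. The snippet below is not the paper's simulation: the variance components, the budget, and the particular one- and two-stage allocations are illustrative assumptions, and it only compares the mean genotypic value of the finally selected lines under the two schemes.

        # Minimal Monte Carlo sketch comparing selection gain for one-stage vs.
        # two-stage testcross evaluation of DH lines under a fixed budget of
        # field-plot equivalents. All numbers below are illustrative assumptions.
        import numpy as np

        rng = np.random.default_rng(42)
        sigma_g, sigma_err = 1.0, 2.0      # genotypic / plot-error std dev (assumed)
        k, n_rep = 5, 2000                 # lines finally selected, Monte Carlo runs

        def phenotype(g, n_loc):
            """Mean testcross performance of each line over n_loc locations."""
            return g + rng.normal(0.0, sigma_err / np.sqrt(n_loc), size=g.size)

        def gain_one_stage(n_lines, n_loc):
            g = rng.normal(0.0, sigma_g, n_lines)
            best = np.argsort(phenotype(g, n_loc))[-k:]
            return g[best].mean()

        def gain_two_stage(n1, loc1, n2, loc2):
            g = rng.normal(0.0, sigma_g, n1)
            stage1 = np.argsort(phenotype(g, loc1))[-n2:]       # keep the best n2 lines
            g2 = g[stage1]
            best = np.argsort(phenotype(g2, loc2))[-k:]         # retest them in more locations
            return g2[best].mean()

        # Both allocations consume 1000 field-plot equivalents (lines x locations).
        assert 200 * 5 == 400 * 2 + 40 * 5 == 1000
        one = np.mean([gain_one_stage(200, 5) for _ in range(n_rep)])
        two = np.mean([gain_two_stage(400, 2, 40, 5) for _ in range(n_rep)])
        print(f"selection gain, one-stage: {one:.3f}   two-stage: {two:.3f}  (in units of sigma_g)")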

  6. A Process for Reviewing and Evaluating Generated Test Items

    ERIC Educational Resources Information Center

    Gierl, Mark J.; Lai, Hollis

    2016-01-01

    Testing organizations need large numbers of high-quality items due to the proliferation of alternative test administration methods and modern test designs. But the current demand for items far exceeds the supply. Test items, as they are currently written, evoke a process that is both time-consuming and expensive because each item is written,…

  7. A Design Tool for Matching UAV Propeller and Power Plant Performance

    NASA Astrophysics Data System (ADS)

    Mangio, Arion L.

    A large body of knowledge is available for matching propellers to engines for large propeller driven aircraft. Small UAVs and model airplanes operate at much lower Reynolds numbers and use fixed-pitch propellers, so the information for large aircraft is not directly applicable. A design tool is needed that takes into account Reynolds number effects, allows for gear reduction, and the selection of a propeller optimized for the airframe. The tool developed in this thesis does this using propeller performance data generated from vortex theory or wind tunnel experiments and combines that data with an engine power curve. The thrust, steady state power, RPM, and tip Mach number vs. velocity curves are generated. The Reynolds number vs. non-dimensional radial station at an operating point is also found. The tool is then used to design a geared power plant for the SAE Aero Design competition. To measure the power plant performance, a purpose-built engine test stand was constructed. The characteristics of the engine test stand are also presented. The engine test stand was then used to characterize the geared power plant. The power plant uses a 26x16 propeller, 100/13 gear ratio, and an LRP 0.30 cubic inch engine turning at 28,000 RPM and producing 2.2 HP. Lastly, the measured power plant performance is presented. An important result is that 17 lbf of static thrust is produced.
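
    The core matching calculation such a tool performs can be outlined as follows: at each flight speed, find the propeller shaft speed at which the power required (from the propeller's non-dimensional coefficients) equals the power available from the engine through the gear reduction, then evaluate thrust there. The CT(J), CP(J), and engine-power curves in this sketch are made-up placeholders rather than the thesis data; only the diameter and gear ratio echo the values quoted above.

        # Sketch of the basic propeller/power-plant matching calculation: at each
        # flight speed, find the propeller speed where power required equals engine
        # power available through the gear reduction, then evaluate thrust there.
        # CT(J), CP(J), and the engine curve are placeholder fits, not measured data.
        import numpy as np

        rho, D, gear = 1.225, 0.66, 100.0 / 13.0    # air density, prop diameter (m), gear ratio

        def CT(J):    # thrust coefficient vs advance ratio (placeholder fit)
            return np.clip(0.12 - 0.10 * J, 0.0, None)

        def CP(J):    # power coefficient vs advance ratio (placeholder fit)
            return np.clip(0.06 - 0.03 * J, 1e-4, None)

        def engine_power(rpm_engine):    # engine shaft power available, W (placeholder curve)
            return 1600.0 * np.sin(np.pi * np.clip(rpm_engine, 0, 30000) / 30000.0)

        def operating_point(V):
            """Propeller rev/s at which required power matches available power, plus thrust."""
            n_grid = np.linspace(10.0, 120.0, 2000)             # propeller speed, rev/s
            J = V / (n_grid * D)
            P_req = CP(J) * rho * n_grid**3 * D**5
            P_avail = engine_power(n_grid * 60.0 * gear)        # prop rev/s -> engine RPM
            n = n_grid[np.argmin(np.abs(P_req - P_avail))]      # crude curve intersection
            T = CT(V / (n * D)) * rho * n**2 * D**4
            return n, T

        for V in (0.0, 10.0, 20.0):                             # flight speeds, m/s
            n, T = operating_point(V)
            print(f"V = {V:4.1f} m/s: prop {n * 60:6.0f} RPM, thrust {T:6.1f} N")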

  8. The Impact of Teach for America on Non-Test Academic Outcomes

    ERIC Educational Resources Information Center

    Backes, Ben; Hansen, Michael

    2018-01-01

    Recent evidence on teacher productivity suggests that teachers meaningfully influence non-test academic student outcomes that are commonly overlooked by narrowly focusing on test scores. Despite a large number of studies investigating the Teach For America (TFA) effect on math and English achievement, little is known about non-tested academic…

  9. Bayesian Item Selection in Constrained Adaptive Testing Using Shadow Tests

    ERIC Educational Resources Information Center

    Veldkamp, Bernard P.

    2010-01-01

    Application of Bayesian item selection criteria in computerized adaptive testing might result in improvement of bias and MSE of the ability estimates. The question remains how to apply Bayesian item selection criteria in the context of constrained adaptive testing, where large numbers of specifications have to be taken into account in the item…

  10. Evaluation of Flush-Mounted, S-Duct Inlets With Large Amounts of Boundary Layer Ingestion

    NASA Technical Reports Server (NTRS)

    Berrier, Bobby L.; Morehouse, Melissa B.

    2003-01-01

    A new high Reynolds number test capability for boundary layer ingesting inlets has been developed for the NASA Langley Research Center 0.3-Meter Transonic Cryogenic Tunnel. Using this new capability, an experimental investigation of four S-duct inlet configurations with large amounts of boundary layer ingestion (nominal boundary layer thickness of about 40% of inlet height) was conducted at realistic operating conditions (high subsonic Mach numbers and full-scale Reynolds numbers). The objectives of this investigation were to 1) develop a new high Reynolds number, boundary-layer ingesting inlet test capability, 2) evaluate the performance of several boundary layer ingesting S-duct inlets, 3) provide a database for CFD tool validation, and 4) provide a baseline inlet for future inlet flow-control studies. Tests were conducted at Mach numbers from 0.25 to 0.83, Reynolds numbers (based on duct exit diameter) from 5.1 million to a full-scale value of 13.9 million, and inlet mass-flow ratios from 0.39 to 1.58 depending on Mach number. Results of this investigation indicate that inlet pressure recovery generally decreased and inlet distortion generally increased with increasing Mach number. Except at low Mach numbers, increasing inlet mass-flow increased pressure recovery and increased distortion. Increasing the amount of boundary layer ingestion (by decreasing inlet throat height and increasing inlet throat width) or ingesting a boundary layer with a distorted profile decreased pressure recovery and increased distortion. Finally, increasing Reynolds number had almost no effect on inlet distortion but increased inlet recovery by about one-half percent at a Mach number near cruise.

  11. Testing of small and large sign support systems FOIL test numbers : 92F009 and 92F010

    DOT National Transportation Integrated Search

    1994-07-01

    This test report contains the results of two crash tests performed at the Federal Outdoor Impact Laboratory (FOIL) in McLean, Virginia. The tests were performed on a small sign support system at 20 mi/h (8.9 m/s), test 92F009, and 60 mi/h (26.8 m/s),...

  12. Testing of small and large sign support systems FOIL test numbers : 92F024 and 92F025

    DOT National Transportation Integrated Search

    1994-07-01

    This test report contains the results of two crash tests performed at the Federal Outdoor Impact Laboratory (FOIL) in McLean, Virginia. The tests were performed on a small sign support system at 20 mi/h (8.9 m/s), test 92F024, and 60 mi/h (26.8 m/s),...

  13. In Vitro Testing of Engineered Nanomaterials in the EPA’s ToxCast Program (WC9)

    EPA Science Inventory

    High-throughput and high-content screens are attractive approaches for prioritizing nanomaterial hazards and informing targeted testing due to the impracticality of using traditional toxicological testing on the large numbers and varieties of nanomaterials. The ToxCast program a...

  14. A Reynolds Number Study of Wing Leading-Edge Effects on a Supersonic Transport Model at Mach 0.3

    NASA Technical Reports Server (NTRS)

    Williams, M. Susan; Owens, Lewis R., Jr.; Chu, Julio

    1999-01-01

    A representative supersonic transport design was tested in the National Transonic Facility (NTF) in its original configuration with small-radius leading-edge flaps and also with modified large-radius inboard leading-edge flaps. Aerodynamic data were obtained over a range of Reynolds numbers at a Mach number of 0.3 and angles of attack up to 16 deg. Increasing the radius of the inboard leading-edge flap delayed nose-up pitching moment to a higher lift coefficient. Deflecting the large-radius leading-edge flap produced an overall decrease in lift coefficient and delayed nose-up pitching moment to even higher angles of attack as compared with the undeflected large-radius leading-edge flap. At angles of attack corresponding to the maximum untrimmed lift-to-drag ratio, lift and drag coefficients decreased while lift-to-drag ratio increased with increasing Reynolds number. At an angle of attack of 13.5 deg., the pitching-moment coefficient was nearly constant with increasing Reynolds number for both the small-radius leading-edge flap and the deflected large-radius leading-edge flap. However, the pitching moment coefficient increased with increasing Reynolds number for the undeflected large-radius leading-edge flap above a chord Reynolds number of about 35 x 10(exp 6).

  15. Mach Number effects on turbulent superstructures in wall bounded flows

    NASA Astrophysics Data System (ADS)

    Kaehler, Christian J.; Bross, Matthew; Scharnowski, Sven

    2017-11-01

    Planar and three-dimensional flow field measurements along a flat plate boundary layer in the Trisonic Wind Tunnel Munich (TWM) are examined with the aim of characterizing the scaling, spatial organization, and topology of large scale turbulent superstructures in compressible flow. This facility is ideal for this investigation as the ratio of boundary layer thickness to test section spanwise extent is around 1/25, ensuring minimal sidewall and corner effects on turbulent structures in the center of the test section. A major difficulty in the experimental investigation of large scale features is the size of the superstructures, which can extend over many boundary layer thicknesses. Using multiple PIV systems, it was possible to capture the full spatial extent of large-scale structures over a range of Mach numbers from Ma = 0.3 - 3. To calculate the average large-scale structure length and spacing, the acquired vector fields were analyzed by statistical multi-point methods that show large scale structures with a correlation length of around 10 boundary layer thicknesses over the range of Mach numbers investigated. Furthermore, the average spacing between high and low momentum structures is on the order of a boundary layer thickness. This work is supported by the Priority Programme SPP 1881 Turbulent Superstructures of the Deutsche Forschungsgemeinschaft.
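
    The multi-point statistics referred to above are not reproduced here, but the basic two-point correlation used to assign a structure length can be sketched with a synthetic velocity record: correlate streamwise fluctuations at increasing streamwise separations and report the separation, in boundary-layer thicknesses, at which the correlation falls below a chosen threshold. The grid spacing, boundary-layer thickness, and threshold are assumptions.

        # Sketch of a two-point correlation analysis for sizing large-scale structures.
        # The velocity record is synthetic; in practice PIV vector fields would be used.
        import numpy as np

        rng = np.random.default_rng(3)
        nx, dx, delta = 4000, 0.005, 0.1       # grid points, spacing (m), BL thickness (m)

        # Stand-in fluctuating velocity with an imposed long-wavelength component.
        x = np.arange(nx) * dx
        u = np.sin(2 * np.pi * x / (10 * delta)) + 0.5 * rng.standard_normal(nx)
        u -= u.mean()

        def two_point_corr(u, max_shift):
            """Normalized correlation of u with itself at increasing streamwise shifts."""
            R = np.empty(max_shift)
            for s in range(max_shift):
                a, b = u[: u.size - s], u[s:]
                R[s] = np.mean(a * b) / (u.std() ** 2)
            return R

        R = two_point_corr(u, max_shift=1000)
        threshold = 0.05
        first_below = np.argmax(R < threshold)      # first separation below the threshold
        L_corr = first_below * dx
        print(f"correlation length ~ {L_corr / delta:.1f} boundary-layer thicknesses")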

  16. TESTING PRACTICES AND VOLUME OF NON-LYME TICKBORNE DISEASES IN THE UNITED STATES

    PubMed Central

    Connally, Neeta P.; Hinckley, Alison F.; Feldman, Katherine A.; Kemperman, Melissa; Neitzel, David; Wee, Siok-Bi; White, Jennifer L.; Mead, Paul S.; Meek, James I.

    2015-01-01

    Large commercial laboratories in the United States were surveyed regarding the number of specimens tested for eight tickborne diseases in 2008. Seven large commercial laboratories reported testing a total of 2,927,881 specimens nationally (including Lyme disease). Of these, 495,585 specimens (17 percent) were tested for tickborne diseases other than Lyme disease. In addition to large commercial laboratories, another 1,051 smaller commercial, hospital, and government laboratories in four states (CT, MD, MN, and NY) were surveyed regarding tickborne disease testing frequency, practices, and results. Ninety-two of these reported testing a total of 10,091 specimens for four tickborne diseases other than Lyme disease. We estimate the cost of laboratory diagnostic testing for non-Lyme disease tickborne diseases in 2008 to be $9.6 million. These data provide a baseline to evaluate trends in tickborne disease test utilization and insight into the burden of these diseases. PMID:26565931

  17. In-Flight Boundary-Layer Transition of a Large Flat Plate at Supersonic Speeds

    NASA Technical Reports Server (NTRS)

    Banks, D. W.; Frederick, M. A.; Tracy, R. R.; Matisheck, J. R.; Vanecek, N. D.

    2012-01-01

    A flight experiment was conducted to investigate the pressure distribution, local-flow conditions, and boundary-layer transition characteristics on a large flat plate in flight at supersonic speeds up to Mach 2.00. The tests used a NASA testbed aircraft with a bottom centerline mounted test fixture. The primary objective of the test was to characterize the local flow field in preparation for future tests of a high Reynolds number natural laminar flow test article. A second objective was to determine the boundary-layer transition characteristics on the flat plate and the effectiveness of using a simplified surface coating. Boundary-layer transition was captured in both analog and digital formats using an onboard infrared imaging system. Surface pressures were measured on the surface of the flat plate. Flow field measurements near the leading edge of the test fixture revealed the local flow characteristics including downwash, sidewash, and local Mach number. Results also indicated that the simplified surface coating did not provide sufficient insulation from the metallic structure, which likely had a substantial effect on boundary-layer transition compared with that of an adiabatic surface. Cold wall conditions were predominant during the acceleration to maximum Mach number, and warm wall conditions were evident during the subsequent deceleration.

  18. Some Measurement and Instruction Related Considerations Regarding Computer Assisted Testing.

    ERIC Educational Resources Information Center

    Oosterhof, Albert C.; Salisbury, David F.

    The Assessment Resource Center (ARC) at Florida State University provides computer assisted testing (CAT) for approximately 4,000 students each term. Computer capabilities permit a small proctoring staff to administer tests simultaneously to large numbers of students. Programs provide immediate feedback for students and generate a variety of…

  19. EPA’s ToxCast Program for Predicting Toxicity and Prioritizing Chemicals for Further Screening and Testing

    EPA Science Inventory

    Testing of environmental and industrial chemicals for toxicity potential is a daunting task because of the wide range of possible toxicity mechanisms. Although animal testing is one means of achieving broad toxicity coverage, evaluation of large numbers of chemicals is challengin...

  20. EPA's ToxCast Program for Predicting Toxicity and Prioritizing Chemicals for Further Screening and Testing

    EPA Science Inventory

    Testing of environmental and industrial chemicals for toxicity potential is a daunting task because of the wide range of possible toxicity mechanisms. Although animal testing is one means of achieving broad toxicity coverage, evaluation of large numbers of chemicals is challengin...

  1. Anxiety in Language Testing: The APTIS Case

    ERIC Educational Resources Information Center

    Valencia Robles, Jeannette de Fátima

    2017-01-01

    The requirement of holding a diploma which certifies proficiency level in a foreign language is constantly increasing in academic and working environments. Computer-based testing has become a prevailing tendency for these and other educational purposes. Each year large numbers of students take online language tests everywhere in the world. In…

  2. The Generalized Higher Criticism for Testing SNP-Set Effects in Genetic Association Studies

    PubMed Central

    Barnett, Ian; Mukherjee, Rajarshi; Lin, Xihong

    2017-01-01

    It is of substantial interest to study the effects of genes, genetic pathways, and networks on the risk of complex diseases. These genetic constructs each contain multiple SNPs, which are often correlated and function jointly, and might be large in number. However, only a sparse subset of SNPs in a genetic construct is generally associated with the disease of interest. In this article, we propose the generalized higher criticism (GHC) to test for the association between an SNP set and a disease outcome. The higher criticism is a test traditionally used in high-dimensional signal detection settings when marginal test statistics are independent and the number of parameters is very large. However, these assumptions do not always hold in genetic association studies, due to linkage disequilibrium among SNPs and the finite number of SNPs in an SNP set in each genetic construct. The proposed GHC overcomes the limitations of the higher criticism by allowing for arbitrary correlation structures among the SNPs in an SNP-set, while performing accurate analytic p-value calculations for any finite number of SNPs in the SNP-set. We obtain the detection boundary of the GHC test. We compared empirically using simulations the power of the GHC method with existing SNP-set tests over a range of genetic regions with varied correlation structures and signal sparsity. We apply the proposed methods to analyze the CGEM breast cancer genome-wide association study. Supplementary materials for this article are available online. PMID:28736464
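
    For orientation, the classical higher-criticism statistic on independent p-values can be written in a few lines. The GHC proposed in the paper additionally accounts for linkage-disequilibrium correlation among SNPs and computes analytic p-values, neither of which is reproduced in this sketch; the p-values below are simulated.

        # Illustration of the classical higher-criticism (HC) statistic on a set of
        # SNP p-values. This is the independence-based version only, not the GHC.
        import numpy as np

        def higher_criticism(pvalues):
            """HC statistic: maximum standardized exceedance of the ordered p-values."""
            p = np.sort(np.asarray(pvalues))
            n = p.size
            i = np.arange(1, n + 1)
            hc = np.sqrt(n) * (i / n - p) / np.sqrt(p * (1.0 - p) + 1e-300)
            return np.max(hc[: n // 2])      # conventionally restricted to the smaller half

        rng = np.random.default_rng(7)
        null_p = rng.uniform(size=1000)                          # no signal
        sparse_p = null_p.copy()
        sparse_p[:10] = rng.uniform(0, 1e-4, size=10)            # a few strongly associated SNPs

        print("HC under the null     :", round(higher_criticism(null_p), 2))
        print("HC with sparse signal :", round(higher_criticism(sparse_p), 2))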

  3. Measures of Strength and Fitness for Older Populations.

    ERIC Educational Resources Information Center

    Osness, Wayne H.; Hiebert, Lujean M.

    Assessing the overall strength of the musculature does not require testing large numbers of muscle groups and can be accomplished with three or four tests. Small batteries of strength tests have been devised to predict total strength. The best combination of tests for males is thigh flexors, leg extensors, arm flexors, and pectoralis major. The battery…

  4. Individualizing the Teaching of Reading through Test Management Systems.

    ERIC Educational Resources Information Center

    Fry, Edward

    Test management systems are suggested for individualizing the teaching of reading in the elementary classroom. Test management systems start with a list of objectives or specific goals which cover all or some major areas of the learning to read process. They then develop a large number of criterion referenced tests which match the skill areas at…

  5. Acute carbon tetrachloride feeding induces damage of large but not small cholangiocytes from BDL rat liver.

    PubMed

    LeSage, G D; Glaser, S S; Marucci, L; Benedetti, A; Phinizy, J L; Rodgers, R; Caligiuri, A; Papa, E; Tretjak, Z; Jezequel, A M; Holcomb, L A; Alpini, G

    1999-05-01

    Bile duct damage and/or loss is limited to a range of duct sizes in cholangiopathies. We tested the hypothesis that CCl4 damages only large ducts. CCl4 or mineral oil was given to bile duct-ligated (BDL) rats, and 1, 2, and 7 days later small and large cholangiocytes were purified and evaluated for apoptosis, proliferation, and secretion. In situ, we measured apoptosis by morphometric and TUNEL analysis and the number of small and large ducts by morphometry. Two days after CCl4 administration, we found an increased number of small ducts and reduced number of large ducts. In vitro apoptosis was observed only in large cholangiocytes, and this was accompanied by loss of proliferation and secretion in large cholangiocytes and loss of choleretic effect of secretin. Small cholangiocytes de novo express the secretin receptor gene and secretin-induced cAMP response. Consistent with damage of large ducts, we detected cytochrome P-4502E1 (which CCl4 converts to its radicals) only in large cholangiocytes. CCl4 induces selective apoptosis of large ducts associated with loss of large cholangiocyte proliferation and secretion.

  6. Heat transfer with very high free stream turbulence

    NASA Technical Reports Server (NTRS)

    Moffat, Robert J.; Maciejewski, Paul K.

    1985-01-01

    Stanton numbers as much as 350 percent above the accepted correlations for flat plate turbulent boundary layer heat transfer have been found in experiments on a low velocity air flow with very high turbulence (up to 50 percent). These effects are far larger than those previously reported, and the data do not correlate as well in boundary layer coordinates (Stanton number and Reynolds number) as they do in simpler coordinates: h vs. X. The very high relative turbulence levels were achieved by placing the test plate in different positions in the margin of a large diameter free jet. The large increases may be due to organized structures of large scale which are present in the marginal flowfield around a free jet.

  7. Risk of co-occurring psychopathology: testing a prediction of expectancy theory.

    PubMed

    Capron, Daniel W; Norr, Aaron M; Schmidt, Norman B

    2013-01-01

    Despite the high impact of anxiety sensitivity (AS; a fear of anxiety related sensations) research, almost no research attention has been paid to its parent theory, Reiss' expectancy theory (ET). ET has gone largely unexamined to this point, including the prediction that AS is a better predictor of number of fears than current anxiety. To test Reiss' prediction, we used a large (N = 317) clinical sample of anxiety outpatients. Specifically, we examined whether elevated AS predicted number of comorbid anxiety and non-anxiety disorder diagnoses in this sample. Consistent with ET, findings indicated that AS predicted number of comorbid anxiety disorder diagnoses above and beyond current anxiety symptoms. Also, AS did not predict the number of comorbid non-anxiety diagnoses when current anxiety symptoms were accounted for. These findings represent an important examination of a prediction of Reiss' ET and are consistent with the idea that AS may be a useful transdiagnostic treatment target. Copyright © 2012 Elsevier Ltd. All rights reserved.

  8. [Research on adaptive quasi-linear viscoelastic model for nonlinear viscoelastic properties of in vivo soft tissues].

    PubMed

    Wang, Heng; Sang, Yuanjun

    2017-10-01

    Modeling the mechanical behavior of human soft biological tissues is a key issue for a large number of medical applications, such as surgery simulation, surgery planning, diagnosis, etc. To develop a biomechanical model of human soft tissues under large deformation for surgery simulation, the adaptive quasi-linear viscoelastic (AQLV) model was proposed and applied to human forearm soft tissues by means of indentation tests. An incremental ramp-and-hold test was carried out to calibrate the model parameters. To verify the predictive ability of the AQLV model, the incremental ramp-and-hold test, a single large amplitude ramp-and-hold test and a sinusoidal cyclic test at large strain amplitude were adopted in this study. Results showed that the AQLV model could predict the test results under the three kinds of load conditions. It is concluded that the AQLV model is suitable for describing the nonlinear viscoelastic properties of in vivo soft tissues under large deformation. It is promising that this model can be selected as one of the soft tissue models in software design for surgery simulation or diagnosis.

  9. Towards Intelligent Interpretation of Low Strain Pile Integrity Testing Results Using Machine Learning Techniques

    PubMed Central

    Cui, De-Mi; Wang, Xiao-Quan; Lu, Lie-Min

    2017-01-01

    Low strain pile integrity testing (LSPIT), due to its simplicity and low cost, is one of the most popular NDE methods used in pile foundation construction. While performing LSPIT in the field is generally quite simple and quick, determining the integrity of the test piles by analyzing and interpreting the test signals (reflectograms) is still a manual process performed by experienced experts only. For foundation construction sites where the number of piles to be tested is large, it may take days before the expert can complete interpreting all of the piles and delivering the integrity assessment report. Techniques that can automate test signal interpretation, thus shortening the LSPIT’s turnaround time, are of great business value and are in great need. Motivated by this need, in this paper, we develop a computer-aided reflectogram interpretation (CARI) methodology that can interpret a large number of LSPIT signals quickly and consistently. The methodology, built on advanced signal processing and machine learning technologies, can be used to assist the experts in performing both qualitative and quantitative interpretation of LSPIT signals. Specifically, the methodology can ease experts’ interpretation burden by screening all test piles quickly and identifying a small number of suspected piles for experts to perform manual, in-depth interpretation. We demonstrate the methodology’s effectiveness using the LSPIT signals collected from a number of real-world pile construction sites. The proposed methodology can potentially enhance LSPIT and make it even more efficient and effective in quality control of deep foundation construction. PMID:29068431

  10. Large-scale Advanced Prop-fan (LAP) high speed wind tunnel test report

    NASA Technical Reports Server (NTRS)

    Campbell, William A.; Wainauski, Harold S.; Arseneaux, Peter J.

    1988-01-01

    High Speed Wind Tunnel testing of the SR-7L Large Scale Advanced Prop-Fan (LAP) is reported. The LAP is a 2.74 meter (9.0 ft) diameter, 8-bladed tractor type rated at 4475 kW (6000 SHP) at 1698 rpm. It was designed and built by Hamilton Standard under contract to the NASA Lewis Research Center. The LAP employs thin swept blades to provide efficient propulsion at flight speeds up to Mach .85. Testing was conducted in the ONERA S1-MA Atmospheric Wind Tunnel in Modane, France. The test objectives were to confirm that the LAP is free from high speed classical flutter, determine the structural and aerodynamic response to angular inflow, measure blade surface pressures (static and dynamic) and evaluate the aerodynamic performance at various blade angles, rotational speeds and Mach numbers. The measured structural and aerodynamic performance of the LAP correlated well with analytical predictions, thereby providing confidence in the computer prediction codes used for the design. There were no signs of classical flutter throughout all phases of the test up to and including the 0.84 maximum Mach number achieved. Steady and unsteady blade surface pressures were successfully measured for a wide range of Mach numbers, inflow angles, rotational speeds and blade angles. No barriers were discovered that would prevent proceeding with the PTA (Prop-Fan Test Assessment) Flight Test Program scheduled for early 1987.

  11. Evaluation of the limulus amoebocyte lysate test in conjunction with a gram negative bacterial plate count for detecting irradiation of chicken

    NASA Astrophysics Data System (ADS)

    Scotter, Susan L.; Wood, Roger; McWeeny, David J.

    A study to evaluate the potential of the Limulus amoebocyte lysate (LAL) test in conjunction with a Gram negative bacteria (GNB) plate count for detecting the irradiation of chicken is described. Preliminary studies demonstrated that chickens irradiated at an absorbed dose of 2.5 kGy could be differentiated from unirradiated birds by measuring levels of endotoxin and numbers of GNB on chicken skin. Irradiated birds were found to have endotoxin levels similar to those found in unirradiated birds but significantly lower numbers of GNB. In a limited study the test was found to be applicable to birds from different processors. The effect of temperature abuse on the microbiological profile, and thus the efficacy of the test, was also investigated. After temperature abuse, the irradiated birds were identifiable at worst up to 3 days after irradiation treatment at the 2.5 kGy level and at best some 13 days after irradiation. Temperature abuse at 15°C resulted in rapid recovery of surviving micro-organisms which made differentiation of irradiated and unirradiated birds using this test unreliable. The microbiological quality of the bird prior to irradiation treatment also affected the test, as large numbers of GNB present on the bird prior to irradiation treatment resulted in larger numbers of survivors. In addition, monitoring the developing flora after irradiation treatment and during subsequent chilled storage also aided differentiation of irradiated and unirradiated birds. Large numbers of yeasts and Gram positive cocci were isolated from irradiated carcasses whereas Gram negative oxidative rods were the predominant spoilage flora on unirradiated birds.

  12. Development of laboratory testing facility for evaluation of base-soil behavior under repeated loading : phase 1 : feasibility study.

    DOT National Transportation Integrated Search

    2005-02-01

    Accelerated load testing of paved and unpaved roads is the application of a large number of load repetitions in a short period of time. This type of testing is an economic way to determine the behavior of roads and compare different materials, struct...

  13. Evaluating ToxCast™ High-Throughput Assays For Their Ability To Detect Direct-Acting Genotoxicants

    EPA Science Inventory

    A standard battery of tests has been in use for several decades to screen chemicals for genotoxicity. However, the large number of environmental and industrial chemicals that need to be tested overwhelms our ability to test them. ToxCast™ is a multi-year effort to develop a ...

  14. Measuring What Really Matters

    ERIC Educational Resources Information Center

    Wei, Ruth Chung; Pecheone, Raymond L.; Wilczak, Katherine L.

    2015-01-01

    Since the passage of No Child Left Behind, large-scale assessments have come to play a central role in federal and state education accountability systems. Teachers and parents have expressed a number of concerns about their state testing programs, such as too much time devoted to testing and the high-stakes use of testing for teacher evaluation.…

  15. Application of molecular target homology-based approaches to predict species sensitivities to two pesticides, permethrin and propiconozole

    EPA Science Inventory

    In the U.S., registration of pesticide active ingredients requires a battery of intensive and costly in vivo toxicity tests which utilize large numbers of test animals. These tests use a limited array of model species from various aquatic and terrestrial taxa to represent all pla...

  16. Investigation of estimators of probability density functions

    NASA Technical Reports Server (NTRS)

    Speed, F. M.

    1972-01-01

    Four research projects are summarized which include: (1) the generation of random numbers on the IBM 360/44, (2) statistical tests used to check out random number generators, (3) Specht density estimators, and (4) use of estimators of probability density functions in analyzing large amounts of data.
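
    The summary does not list the specific checks used, but a typical statistical test for a uniform random number generator is a chi-square equidistribution test on binned deviates, sketched below with an arbitrary generator and bin count; the report's actual test suite for the IBM 360/44 generator is not specified here.

        # A standard check applied to a random number generator: a chi-square
        # equidistribution test on binned uniform deviates. Generator and bin count
        # are illustrative choices.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        samples = rng.uniform(size=100_000)

        n_bins = 100
        observed, _ = np.histogram(samples, bins=n_bins, range=(0.0, 1.0))
        expected = np.full(n_bins, samples.size / n_bins)

        chi2, pvalue = stats.chisquare(observed, expected)
        verdict = "no evidence against" if pvalue > 0.05 else "reject"
        print(f"chi-square = {chi2:.1f}, p = {pvalue:.3f}  ({verdict} uniformity)")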

  17. High-Throughput/High-Content Screening Assays with Engineered Nanomaterials in ToxCast

    EPA Science Inventory

    High-throughput and high-content screens are attractive approaches for prioritizing nanomaterial hazards and informing targeted testing due to the impracticality of using traditional toxicological testing on the large numbers and varieties of nanomaterials. The ToxCast program a...

  18. Species hybridization in the genus Pinus

    Treesearch

    Peter W. Garrett

    1979-01-01

    Results of a breeding program in which a large number of pine species were tested indicate that a number of species and hybrids may be useful in the northeastern United States. Austrian black pine x Japanese black pine and hybrids containing Japanese red pine all had good growth rates. While none of the soft pines grew faster than eastern white pine, a number of...

  19. Modification of the Mantel-Haenszel and Logistic Regression DIF Procedures to Incorporate the SIBTEST Regression Correction

    ERIC Educational Resources Information Center

    DeMars, Christine E.

    2009-01-01

    The Mantel-Haenszel (MH) and logistic regression (LR) differential item functioning (DIF) procedures have inflated Type I error rates when there are large mean group differences, short tests, and large sample sizes. When there are large group differences in mean score, groups matched on the observed number-correct score differ on true score,…
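
    For context, the quantity both procedures build on can be sketched: the Mantel-Haenszel common odds ratio for one studied item, pooled over number-correct score strata. The SIBTEST regression correction discussed in the article is not applied here, and the data are hypothetical.

        # Sketch of the standard Mantel-Haenszel DIF calculation: for one studied item,
        # form a 2x2 table (group x correct/incorrect) within each number-correct score
        # stratum and pool the tables into the MH common odds ratio.
        import numpy as np

        def mh_odds_ratio(item, group, matching_score):
            """item: 0/1 responses; group: 0 = reference, 1 = focal; matching_score: totals."""
            num, den = 0.0, 0.0
            for s in np.unique(matching_score):
                m = matching_score == s
                a = np.sum((group[m] == 0) & (item[m] == 1))   # reference correct
                b = np.sum((group[m] == 0) & (item[m] == 0))   # reference incorrect
                c = np.sum((group[m] == 1) & (item[m] == 1))   # focal correct
                d = np.sum((group[m] == 1) & (item[m] == 0))   # focal incorrect
                n = a + b + c + d
                if n == 0:
                    continue
                num += a * d / n
                den += b * c / n
            return num / den if den > 0 else np.nan

        # Hypothetical data: 2000 examinees with a group label, total score, and item score.
        rng = np.random.default_rng(11)
        group = rng.integers(0, 2, 2000)
        score = rng.integers(0, 31, 2000)
        item = (rng.uniform(size=2000) < 0.3 + 0.015 * score).astype(int)

        alpha_mh = mh_odds_ratio(item, group, score)
        print(f"MH common odds ratio = {alpha_mh:.2f} (values near 1 indicate little DIF)")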

  20. Are You Afraid of Taking an Online Foreign Language Test?

    ERIC Educational Resources Information Center

    Garcia Laborda, Jesus; Robles, Valencia

    2017-01-01

    Computer based testing has become a prevailing tendency in education. Each year, a large number of students take online language tests everywhere in the world. In fact, there is a tendency to make these tests more and more used due to their low cost of delivery. However, many students are forced to take them despite their interests, feelings and…

  1. Wind-Tunnel Tests of Seven Static-Pressure Probes at Transonic Speeds

    NASA Technical Reports Server (NTRS)

    Capone, Francis J.

    1961-01-01

    Wind-tunnel tests have been conducted to determine the errors of seven static-pressure probes mounted very close to the nose of a body of revolution simulating a missile forebody. The tests were conducted at Mach numbers from 0.80 to 1.08 and at angles of attack from -1.7 deg to 8.4 deg. The test Reynolds number per foot varied from 3.35 x 10(exp 6) to 4.05 x 10(exp 6). For three 4-vane, gimbaled probes, the static-pressure errors remained constant throughout the test angle-of-attack range for all Mach numbers except 1.02. For two single-vane, self-rotating probes having two orifices at +/-37.5 deg. from the plane of symmetry on the lower surface of the probe body, the static-pressure error varied as much as 1.5 percent of free-stream static pressure through the test angle-of-attack range for all Mach numbers. For two fixed, cone-cylinder probes of short length and large diameter, the static-pressure error varied over the test angle-of-attack range at constant Mach numbers as much as 8 to 10 percent of free-stream static pressure.

  2. TES Detector Noise Limited Readout Using SQUID Multiplexers

    NASA Technical Reports Server (NTRS)

    Staguhn, J. G.; Benford, D. J.; Chervenak, J. A.; Khan, S. A.; Moseley, S. H.; Shafer, R. A.; Deiker, S.; Grossman, E. N.; Hilton, G. C.; Irwin, K. D.

    2004-01-01

    The availability of superconducting Transition Edge Sensors (TES) with large numbers of individual detector pixels requires multiplexers for efficient readout. The use of multiplexers reduces the number of wires needed between the cryogenic electronics and the room temperature electronics and cuts the number of required cryogenic amplifiers. We are using an 8 channel SQUID multiplexer to read out one-dimensional TES arrays which are used for submillimeter astronomical observations. We present results from test measurements which show that the low noise level of the SQUID multiplexers allows accurate measurements of the TES Johnson noise, and that in operation, the readout noise is dominated by the detector noise. Multiplexers for large numbers of channels require a large bandwidth for the multiplexed readout signal. We discuss the resulting implications for the noise performance of these multiplexers which will be used for the readout of two-dimensional TES arrays in next-generation instruments.

  3. Segment-Wise Genome-Wide Association Analysis Identifies a Candidate Region Associated with Schizophrenia in Three Independent Samples

    PubMed Central

    Rietschel, Marcella; Mattheisen, Manuel; Breuer, René; Schulze, Thomas G.; Nöthen, Markus M.; Levinson, Douglas; Shi, Jianxin; Gejman, Pablo V.; Cichon, Sven; Ophoff, Roel A.

    2012-01-01

    Recent studies suggest that variation in complex disorders (e.g., schizophrenia) is explained by a large number of genetic variants with small effect size (Odds Ratio∼1.05–1.1). The statistical power to detect these genetic variants in Genome Wide Association (GWA) studies with large numbers of cases and controls (∼15,000) is still low. As it will be difficult to further increase sample size, we decided to explore an alternative method for analyzing GWA data in a study of schizophrenia, dramatically reducing the number of statistical tests. The underlying hypothesis was that at least some of the genetic variants related to a common outcome are collocated in segments of chromosomes at a wider scale than single genes. Our approach was therefore to study the association between relatively large segments of DNA and disease status. An association test was performed for each SNP and the number of nominally significant tests in a segment was counted. We then performed a permutation-based binomial test to determine whether this region contained significantly more nominally significant SNPs than expected under the null hypothesis of no association, taking linkage into account. Genome Wide Association data of three independent schizophrenia case/control cohorts with European ancestry (Dutch, German, and US) using segments of DNA with variable length (2 to 32 Mbp) was analyzed. Using this approach we identified a region at chromosome 5q23.3-q31.3 (128–160 Mbp) that was significantly enriched with nominally associated SNPs in three independent case-control samples. We conclude that considering relatively wide segments of chromosomes may reveal reliable relationships between the genome and schizophrenia, suggesting novel methodological possibilities as well as raising theoretical questions. PMID:22723893
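
    The counting step at the heart of the segment-wise analysis can be sketched as follows, using a plain binomial reference distribution for the number of nominally significant SNPs in a segment. The paper instead calibrates this count by permutation so that linkage disequilibrium among SNPs is respected; the p-values below are simulated.

        # Core of the segment-wise idea: count SNPs in a segment whose single-SNP
        # test is nominally significant and ask whether that count exceeds chance.
        # A plain binomial reference is used here instead of the paper's permutations.
        import math
        import numpy as np

        def segment_enrichment_p(snp_pvalues, alpha=0.05):
            """P(at least the observed count of nominally significant SNPs | no association)."""
            n = len(snp_pvalues)
            k = int(np.sum(np.asarray(snp_pvalues) < alpha))
            tail = sum(math.comb(n, j) * alpha**j * (1 - alpha) ** (n - j)
                       for j in range(k, n + 1))
            return k, tail

        rng = np.random.default_rng(5)
        null_segment = rng.uniform(size=400)                        # 400 SNPs, no signal
        enriched_segment = np.concatenate([rng.uniform(size=360),
                                           rng.uniform(0, 0.01, size=40)])  # some associated SNPs

        for name, seg in [("null segment", null_segment), ("enriched segment", enriched_segment)]:
            k, p = segment_enrichment_p(seg)
            print(f"{name}: {k} of {len(seg)} SNPs nominal at 0.05, binomial p = {p:.2e}")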

  4. Cooperative system and method using mobile robots for testing a cooperative search controller

    DOEpatents

    Byrne, Raymond H.; Harrington, John J.; Eskridge, Steven E.; Hurtado, John E.

    2002-01-01

    A test system for testing a controller provides a way to use large numbers of miniature mobile robots to test a cooperative search controller in a test area, where each mobile robot has a sensor, a communication device, a processor, and a memory. A method of using a test system provides a way for testing a cooperative search controller using multiple robots sharing information and communicating over a communication network.

  5. Large-scale translocation strategies for reintroducing red-cockaded woodpeckers

    Treesearch

    Daniel Saenz; Kristen A. Baum; Richard N. Conner; D. Craig Rudolph; Ralph Costa

    2002-01-01

    Translocation of wild birds is a potential conservation strategy for the endangered red-cockaded woodpecker (Picoides borealis). We developed and tested 8 large-scale translocation strategy models for a regional red-cockaded woodpecker reintroduction program. The purpose of the reintroduction program is to increase the number of red-cockaded...

  6. Role of optometry school in single day large scale school vision testing

    PubMed Central

    Anuradha, N; Ramani, Krishnakumar

    2015-01-01

    Background: School vision testing aims at identification and management of refractive errors. Large-scale school vision testing using conventional methods is time-consuming and demands a lot of chair time from the eye care professionals. A new strategy involving a school of optometry in single day large scale school vision testing is discussed. Aim: The aim was to describe a new approach to performing vision testing of school children on a large scale in a single day. Materials and Methods: A single day vision testing strategy was implemented wherein 123 members (20 teams comprising optometry students and headed by optometrists) conducted vision testing for children in 51 schools. School vision testing included basic vision screening, refraction, frame measurements, frame choice and referrals for other ocular problems. Results: A total of 12448 children were screened, among whom 420 (3.37%) were identified to have refractive errors. Of these, 28 (1.26%) belonged to the primary, 163 (9.80%) to the middle, 129 (4.67%) to the secondary, and 100 (1.73%) to the higher secondary levels of education, respectively. 265 (2.12%) children were referred for further evaluation. Conclusion: Single day large scale school vision testing can be adopted by schools of optometry to reach a higher number of children within a short span. PMID:25709271

  7. An Exploratory Investigation of the Effects of a Thin Plastic Film Cover on the Profile Drag of an Aircraft Wing Panel

    NASA Technical Reports Server (NTRS)

    Beasley, W. D.; Mcghee, R. J.

    1977-01-01

    Exploratory wind tunnel tests were conducted on a large chord aircraft wing panel to evaluate the potential for drag reduction resulting from the application of a thin plastic film cover. The tests were conducted at a Mach number of 0.15 over a Reynolds number range from about 7 x 10 to the 6th power to 63 x 10 to the 6th power.

  8. Neuronal models for evaluation of proliferation in vitro using high content screening

    EPA Science Inventory

    In vitro test methods can provide a rapid approach for the screening of large numbers of chemicals for their potential to produce toxicity (hazard identification). In order to identify potential developmental neurotoxicants, a battery of in vitro tests for neurodevelopmental proc...

  9. The characteristics of 78 related airfoil sections from tests in the variable-density wind tunnel

    NASA Technical Reports Server (NTRS)

    Jacobs, Eastman N; Ward, Kenneth E; Pinkerton, Robert M

    1933-01-01

    An investigation of a large group of related airfoils was made in the NACA variable-density wind tunnel at a large value of the Reynolds number. The tests were made to provide data that may be directly employed for a rational choice of the most suitable airfoil section for a given application. The variation of the aerodynamic characteristics with variations in thickness and mean-line form were systematically studied. (author)

  10. Human Papillomavirus (HPV) Genotyping: Automation and Application in Routine Laboratory Testing

    PubMed Central

    Torres, M; Fraile, L; Echevarria, JM; Hernandez Novoa, B; Ortiz, M

    2012-01-01

    A large number of assays designed for genotyping human papillomaviruses (HPV) have been developed in recent years. They perform within a wide range of analytical sensitivity and specificity values for the different viral types, and are used for diagnosis, epidemiological studies, evaluation of vaccines, and implementation and monitoring of vaccination programs. Methods for specific genotyping of HPV-16 and HPV-18 are also useful for the prevention of cervical cancer in screening programs. Some commercial tests are, in addition, fully or partially automated. Automation of HPV genotyping presents advantages such as the simplicity of the testing procedure for the operator, the ability to process a large number of samples in a short time, and the reduction of human errors from manual operations, allowing better quality assurance and a reduction in cost. The present review collects information about the current HPV genotyping tests, with special attention to practical aspects influencing their use in clinical laboratories. PMID:23248734

  11. Detecting a Weak Association by Testing its Multiple Perturbations: a Data Mining Approach

    NASA Astrophysics Data System (ADS)

    Lo, Min-Tzu; Lee, Wen-Chung

    2014-05-01

    Many risk factors/interventions in epidemiologic/biomedical studies have minuscule effects. To detect such weak associations, one needs a study with a very large sample size (the number of subjects, n). The n of a study can be increased but unfortunately only to an extent. Here, we propose a novel method which hinges on increasing sample size in a different direction: the total number of variables (p). We construct a p-based 'multiple perturbation test', and conduct power calculations and computer simulations to show that it can achieve a very high power to detect weak associations when p can be made very large. As a demonstration, we apply the method to analyze a genome-wide association study on age-related macular degeneration and identify two novel genetic variants that are significantly associated with the disease. The p-based method may set a stage for a new paradigm of statistical tests.

  12. A Multilayer Secure Biomedical Data Management System for Remotely Managing a Very Large Number of Diverse Personal Healthcare Devices.

    PubMed

    Park, KeeHyun; Lim, SeungHyeon

    2015-01-01

    In this paper, a multilayer secure biomedical data management system for managing a very large number of diverse personal health devices is proposed. The system has the following characteristics: the system supports international standard communication protocols to achieve interoperability. The system is integrated in the sense that both a PHD communication system and a remote PHD management system work together as a single system. Finally, the system proposed in this paper provides user/message authentication processes to securely transmit biomedical data measured by PHDs based on the concept of a biomedical signature. Some experiments, including the stress test, have been conducted to show that the system proposed/constructed in this study performs very well even when a very large number of PHDs are used. For a stress test, up to 1,200 threads are made to represent the same number of PHD agents. The loss ratio of the ISO/IEEE 11073 messages in the normal system is as high as 14% when 1,200 PHD agents are connected. On the other hand, no message loss occurs in the multilayered system proposed in this study, which demonstrates the superiority of the multilayered system to the normal system with regard to heavy traffic.
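
    The general shape of such a stress test can be sketched with ordinary threads posting into a bounded in-process queue that stands in for the manager, counting overflow as message loss. The agent count mirrors the 1,200 threads mentioned above, but the transport, the ISO/IEEE 11073 messages, and the multilayer design itself are not modelled.

        # Shape of a thread-based stress test: many simulated device agents post
        # messages concurrently and the receiver's loss ratio is measured. The
        # bounded in-process queue stands in for the real manager system.
        import queue
        import threading

        N_AGENTS, MSGS_PER_AGENT = 1200, 50
        inbox = queue.Queue(maxsize=5000)       # bounded buffer: overflow counts as loss
        lost = [0]
        lock = threading.Lock()

        def agent(agent_id):
            for seq in range(MSGS_PER_AGENT):
                try:
                    inbox.put_nowait((agent_id, seq))   # non-blocking send from the agent
                except queue.Full:
                    with lock:
                        lost[0] += 1                    # message dropped by the full buffer

        threads = [threading.Thread(target=agent, args=(i,)) for i in range(N_AGENTS)]
        for t in threads:
            t.start()
        for t in threads:
            t.join()

        total = N_AGENTS * MSGS_PER_AGENT
        print(f"sent {total}, buffered {inbox.qsize()}, loss ratio {lost[0] / total:.1%}")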

  13. TOXCAST, A TOOL FOR CATEGORIZATION AND ...

    EPA Pesticide Factsheets

    Across several EPA Program Offices (e.g., OPPTS, OW, OAR), there is a clear need to develop strategies and methods to screen large numbers of chemicals for potential toxicity, and to use the resulting information to prioritize the use of testing resources towards those entities and endpoints that present the greatest likelihood of risk to human health and the environment. This need could be addressed using the experience of the pharmaceutical industry in the use of advanced modern molecular biology and computational chemistry tools for the development of new drugs, with appropriate adjustment to the needs and desires of environmental toxicology. A conceptual approach named ToxCast has been developed to address the needs of EPA Program Offices in the area of prioritization and screening. Modern computational chemistry and molecular biology tools bring enabling technologies forward that can provide information about the physical and biological properties of large numbers of chemicals. The essence of the proposal is to conduct a demonstration project based upon a rich toxicological database (e.g., registered pesticides, or the chemicals tested in the NTP bioassay program), select a fairly large number (50-100 or more chemicals) representative of a number of differing structural classes and phenotypic outcomes (e.g., carcinogens, reproductive toxicants, neurotoxicants), and evaluate them across a broad spectrum of information domains that modern technology has pro

  14. A Multilayer Secure Biomedical Data Management System for Remotely Managing a Very Large Number of Diverse Personal Healthcare Devices

    PubMed Central

    Lim, SeungHyeon

    2015-01-01

    In this paper, a multilayer secure biomedical data management system for managing a very large number of diverse personal health devices is proposed. The system has the following characteristics: the system supports international standard communication protocols to achieve interoperability. The system is integrated in the sense that both a PHD communication system and a remote PHD management system work together as a single system. Finally, the system proposed in this paper provides user/message authentication processes to securely transmit biomedical data measured by PHDs based on the concept of a biomedical signature. Some experiments, including the stress test, have been conducted to show that the system proposed/constructed in this study performs very well even when a very large number of PHDs are used. For a stress test, up to 1,200 threads are made to represent the same number of PHD agents. The loss ratio of the ISO/IEEE 11073 messages in the normal system is as high as 14% when 1,200 PHD agents are connected. On the other hand, no message loss occurs in the multilayered system proposed in this study, which demonstrates the superiority of the multilayered system to the normal system with regard to heavy traffic. PMID:26247034

  15. Satellite battery testing status

    NASA Astrophysics Data System (ADS)

    Haag, R.; Hall, S.

    1986-09-01

    Because of the large numbers of satellite cells currently being tested and anticipated at the Naval Weapons Support Center (NAVWPNSUPPCEN) Crane, Indiana, satellite cell testing is being integrated into the Battery Test Automation Project (BTAP). The BTAP, designed to meet the growing needs for battery testing at the NAVWPNSUPPCEN Crane, will consist of several Automated Test Stations (ATSs) which monitor batteries under test. Each ATS will interface with an Automation Network Controller (ANC) which will collect test data for reduction.

  16. Satellite battery testing status

    NASA Technical Reports Server (NTRS)

    Haag, R.; Hall, S.

    1986-01-01

    Because of the large numbers of satellite cells currently being tested and anticipated at the Naval Weapons Support Center (NAVWPNSUPPCEN) Crane, Indiana, satellite cell testing is being integrated into the Battery Test Automation Project (BTAP). The BTAP, designed to meet the growing needs for battery testing at the NAVWPNSUPPCEN Crane, will consist of several Automated Test Stations (ATSs) which monitor batteries under test. Each ATS will interface with an Automation Network Controller (ANC) which will collect test data for reduction.

  17. Accurate, high-throughput typing of copy number variation using paralogue ratios from dispersed repeats

    PubMed Central

    Armour, John A. L.; Palla, Raquel; Zeeuwen, Patrick L. J. M.; den Heijer, Martin; Schalkwijk, Joost; Hollox, Edward J.

    2007-01-01

    Recent work has demonstrated an unexpected prevalence of copy number variation in the human genome, and has highlighted the part this variation may play in predisposition to common phenotypes. Some important genes vary in number over a high range (e.g. DEFB4, which commonly varies between two and seven copies), and have posed formidable technical challenges for accurate copy number typing, so that there are no simple, cheap, high-throughput approaches suitable for large-scale screening. We have developed a simple comparative PCR method based on dispersed repeat sequences, using a single pair of precisely designed primers to amplify products simultaneously from both test and reference loci, which are subsequently distinguished and quantified via internal sequence differences. We have validated the method for the measurement of copy number at DEFB4 by comparison of results from >800 DNA samples with copy number measurements by MAPH/REDVR, MLPA and array-CGH. The new Paralogue Ratio Test (PRT) method can require as little as 10 ng genomic DNA, appears to be comparable in accuracy to the other methods, and for the first time provides a rapid, simple and inexpensive method for copy number analysis, suitable for application to typing thousands of samples in large case-control association studies. PMID:17175532
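
    The wet-lab assay itself cannot be shown, but the downstream arithmetic of a paralogue-ratio measurement is simple to sketch: scale the test/reference peak ratio by a calibration factor derived from samples of known copy number and round to an integer call. The ratios and calibrators below are hypothetical numbers, not data from the study.

        # Downstream arithmetic of a paralogue-ratio measurement: convert the
        # test/reference peak-area ratio into an integer copy-number call using
        # calibrators of known copy number. All values are hypothetical.
        import numpy as np

        # Calibrators: measured test/reference ratios for samples of known copy number.
        calib_ratios = np.array([0.98, 1.51, 2.04, 2.55])
        calib_copies = np.array([2, 3, 4, 5])
        scale = np.mean(calib_copies / calib_ratios)    # copies per unit ratio (~2 here)

        def copy_number_call(test_ratio):
            estimate = test_ratio * scale
            return estimate, int(round(estimate))

        for r in (0.95, 1.48, 3.10):
            est, call = copy_number_call(r)
            print(f"ratio {r:.2f} -> estimated {est:.2f} copies, call = {call}")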

  18. Testing practices and volume of non-Lyme tickborne diseases in the United States.

    PubMed

    Connally, Neeta P; Hinckley, Alison F; Feldman, Katherine A; Kemperman, Melissa; Neitzel, David; Wee, Siok-Bi; White, Jennifer L; Mead, Paul S; Meek, James I

    2016-02-01

    Large commercial laboratories in the United States were surveyed regarding the number of specimens tested for eight tickborne diseases in 2008. Seven large commercial laboratories reported testing a total of 2,927,881 specimens nationally (including Lyme disease). Of these, 495,585 specimens (17%) were tested for tickborne diseases other than Lyme disease. In addition to large commercial laboratories, another 1051 smaller commercial, hospital, and government laboratories in four states (CT, MD, MN, and NY) were surveyed regarding tickborne disease testing frequency, practices, and results. Ninety-two of these reported testing a total of 10,091 specimens for four tickborne diseases other than Lyme disease. We estimate the cost of laboratory diagnostic testing for non-Lyme disease tickborne diseases in 2008 to be $9.6 million. These data provide a baseline to evaluate trends in tickborne disease test utilization and insight into the burden of these diseases. Copyright © 2015 Elsevier GmbH. All rights reserved.

  19. Micronucleus assay in aquatic animals.

    PubMed

    Bolognesi, Claudia; Hayashi, Makoto

    2011-01-01

    Aquatic pollutants produce multiple consequences at the organism, population, community and ecosystem levels, affecting organ function, reproductive status, population size, species survival and thus biodiversity. Among these, carcinogenic and mutagenic compounds are the most dangerous, as their effects may extend beyond the individual and may remain active through several generations. The application of genotoxicity biomarkers in sentinel organisms allows for the assessment of mutagenic hazards and/or for the identification of the sources and fate of the contaminants. The micronucleus (MN) test, as an index of genetic damage accumulated during the lifespan of the cells, is one of the most suitable techniques to identify an integrated response to the complex mixture of contaminants. The MN assay is today widely applied in a large number of wild and transplanted aquatic species. The large majority of studies or programmes on the genotoxic effect of the polluted water environment have been carried out with the use of bivalves and fish. Haemocytes and gill cells are the target tissues most frequently considered for the MN determination in bivalves. The MN test was widely validated and was successfully applied in a large number of field studies using bivalves from the genus Mytilus. MN in fish can be visualised in different cell types: erythrocytes and gill, kidney, hepatic and fin cells. Peripheral erythrocytes are the most widely used because they avoid complex cell preparation and killing of the animals. The MN test in fish erythrocytes was validated in the laboratory with different species after exposure to a large number of genotoxic agents. The erythrocyte MN test in fish was also widely and frequently applied for genotoxicity assessment of freshwater and marine environments in situ using native or caged animals following different periods of exposure. Large interspecies differences in sensitivity for MN induction were observed. Further validation studies are needed in order to better characterise the different types of nuclear alterations and to clarify the role of biotic and abiotic factors in interspecies and inter-individual variability.

  20. Number Frequency in L1 Differentially Affects Immediate Serial Recall of Numbers in L2 Between Beginning and Intermediate Learners.

    PubMed

    Sumioka, Norihiko; Williams, Atsuko; Yamada, Jun

    2016-12-01

    A list number recall test in English (L2) was administered to both Japanese (L1) students with beginning-level English proficiency who attended evening high school and Japanese college students with intermediate-level English proficiency. The major findings were that, only for the high school group, the small numbers 1 and 2 in middle positions of lists were recalled better than the large numbers 8 and 9, and that there was a significant correlation between number frequency in Japanese and recall performance. Equally intriguing was that, in both groups, for adjacent transposition errors, smaller numbers tended to appear in the first position and large numbers in the second; also, omission errors were commonly seen for larger numbers. These phenomena are interpreted as reflecting frequency and/or frequency-related effects. Briefly discussed were the bilingual short-term memory system, effects of number value, generality and implications of the findings, and weaknesses of the study.

  1. Testing Mediation in Structural Equation Modeling: The Effectiveness of the Test of Joint Significance

    ERIC Educational Resources Information Center

    Leth-Steensen, Craig; Gallitto, Elena

    2016-01-01

    A large number of approaches have been proposed for estimating and testing the significance of indirect effects in mediation models. In this study, four sets of Monte Carlo simulations involving full latent variable structural equation models were run in order to contrast the effectiveness of the currently popular bias-corrected bootstrapping…

  2. Flow through collapsible tubes at low Reynolds numbers. Applicability of the waterfall model.

    PubMed

    Lyon, C K; Scott, J B; Wang, C Y

    1980-07-01

    The applicability of the waterfall model was tested using the Starling resistor and different viscosities of fluids to vary the Reynolds number. The waterfall model proved adequate to describe flow in the Starling resistor model only at very low Reynolds numbers (Reynolds number less than 1). Blood flow characterized by such low Reynolds numbers occurs only in the microvasculature. Thus, it is inappropriate to apply the waterfall model indiscriminately to flow through large collapsible veins.
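
    As a rough illustration of why only microvascular flow meets the Re < 1 criterion cited above, the sketch below evaluates Re = rho*v*D/mu for two vessel scales. The density, viscosity, velocity, and diameter values are assumed order-of-magnitude estimates, not data from the study.

      def reynolds_number(density, velocity, diameter, viscosity):
          # Re = rho * v * D / mu (dimensionless)
          return density * velocity * diameter / viscosity

      RHO_BLOOD = 1060.0   # kg/m^3, assumed
      MU_BLOOD = 3.5e-3    # Pa*s, assumed whole-blood viscosity

      # (vessel, mean velocity m/s, diameter m) -- rough, assumed magnitudes
      for name, v, d in [("large vein", 0.1, 5e-3), ("capillary", 1e-3, 8e-6)]:
          print(f"{name}: Re ~ {reynolds_number(RHO_BLOOD, v, d, MU_BLOOD):.3g}")

    With these assumed values the capillary comes out at Re on the order of 10^-3 while the large vein is well above 100, consistent with the conclusion that the waterfall model applies only in the microvasculature.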

  3. Evaluation of Alternative Altitude Scaling Methods for Thermal Ice Protection System in NASA Icing Research Tunnel

    NASA Technical Reports Server (NTRS)

    Lee, Sam; Addy, Harold; Broeren, Andy P.; Orchard, David M.

    2017-01-01

    A test was conducted at the NASA Icing Research Tunnel to evaluate altitude scaling methods for thermal ice protection systems. Two scaling methods based on the Weber number were compared against a method based on the Reynolds number. The results generally agreed with a previous set of tests conducted in the NRCC Altitude Icing Wind Tunnel. The Weber-number-based scaling methods resulted in smaller runback ice mass than the Reynolds-number-based scaling method. The ice accretions from the Weber-number-based scaling method also formed farther upstream. However, there were large differences in the accreted ice mass between the two Weber-number-based scaling methods, and the difference became greater as the speed was increased. This indicates that there may be some Reynolds number effects that are not fully accounted for and warrant further study.
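
    For orientation, the two similarity parameters compared in this test scale differently with speed: the Weber number We = rho*v^2*L/sigma grows with the square of velocity, while the Reynolds number Re = rho*v*L/mu grows only linearly, which is consistent with the scaling differences widening at higher speed. The sketch below simply evaluates both numbers; the fluid properties, length scale, and speeds are assumed placeholders, not the tunnel's actual scaling conditions.

      RHO_WATER = 1000.0    # kg/m^3, assumed
      MU_WATER = 1.0e-3     # Pa*s, assumed
      SIGMA_WATER = 0.072   # N/m, assumed surface tension

      def weber(rho, v, length, sigma):
          # We = rho * v^2 * L / sigma
          return rho * v**2 * length / sigma

      def reynolds(rho, v, length, mu):
          # Re = rho * v * L / mu
          return rho * v * length / mu

      L = 0.5e-3  # assumed water-film/droplet length scale, m
      for v in (50.0, 100.0):  # assumed airspeed-like magnitudes, m/s
          print(f"v = {v:5.1f} m/s   We = {weber(RHO_WATER, v, L, SIGMA_WATER):9.0f}"
                f"   Re = {reynolds(RHO_WATER, v, L, MU_WATER):9.0f}")

    Doubling the speed quadruples the Weber number but only doubles the Reynolds number, so methods matched on one parameter drift apart on the other as speed increases.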

  4. A study of high-lift airfoils at high Reynolds numbers in the Langley low-turbulence pressure tunnel

    NASA Technical Reports Server (NTRS)

    Morgan, Harry L., Jr.; Ferris, James C.; Mcghee, Robert J.

    1987-01-01

    An experimental study was conducted in the Langley Low-Turbulence Pressure Tunnel to determine the effects of Reynolds number and Mach number on the two-dimensional aerodynamic performance of two supercritical-type airfoils, one equipped with a conventional flap system and the other with an advanced high-lift flap system. The conventional flap system consisted of a leading-edge slat and a double-slotted, trailing-edge flap with a small-chord vane and a large-chord aft flap. The advanced flap system consisted of a leading-edge slat and a double-slotted, trailing-edge flap with a large-chord vane and a small-chord aft flap. Both models were tested with all elements nested to form the cruise airfoil and with the leading-edge slat and a single- or double-slotted, trailing-edge flap deflected to form the high-lift airfoils. The experimental tests were conducted over a Reynolds number range from 2.8 × 10^6 to 20.9 × 10^6 and a Mach number range from 0.10 to 0.35. Lift and pitching-moment data were obtained. Summaries of the test results are presented, and comparisons are made between the observed aerodynamic performance trends for the two models. Results showing the effect of leading-edge frost and glaze ice formation are also given.

  5. A Statistical Procedure for Testing Unusually Frequent Exactly Matching Responses and Nearly Matching Responses. Research Report. ETS RR-17-23

    ERIC Educational Resources Information Center

    Haberman, Shelby J.; Lee, Yi-Hsuan

    2017-01-01

    In investigations of unusual testing behavior, a common question is whether a specific pattern of responses occurs unusually often within a group of examinees. In many current tests, modern communication techniques can permit quite large numbers of examinees to share keys, or common response patterns, to the entire test. To address this issue,…

  6. Relationship between Health Education on Mathematics Standardized Testing Performance and Academic Indicators for 11th Grade Students

    ERIC Educational Resources Information Center

    Keosybounheuang, Sunnin Bree

    2017-01-01

    High stakes testing is an increasingly powerful force in education with curricular decisions as one of the many factors influenced by this trend. A large number of districts and school leaders feel an increase in curriculum time for tested areas is the answer to America's academic shortcomings. The increase in instructional time for tested areas,…

  7. Zebrafish as a systems toxicology model for developmental neurotoxicity testing.

    PubMed

    Nishimura, Yuhei; Murakami, Soichiro; Ashikawa, Yoshifumi; Sasagawa, Shota; Umemoto, Noriko; Shimada, Yasuhito; Tanaka, Toshio

    2015-02-01

    The developing brain is extremely sensitive to many chemicals. Exposure to neurotoxicants during development has been implicated in various neuropsychiatric and neurological disorders, including autism spectrum disorder, attention deficit hyperactive disorder, schizophrenia, Parkinson's disease, and Alzheimer's disease. Although rodents have been widely used for developmental neurotoxicity testing, experiments using large numbers of rodents are time-consuming, expensive, and raise ethical concerns. Using alternative non-mammalian animal models may relieve some of these pressures by allowing testing of large numbers of subjects while reducing expenses and minimizing the use of mammalian subjects. In this review, we discuss some of the advantages of using zebrafish in developmental neurotoxicity testing, focusing on central nervous system development, neurobehavior, toxicokinetics, and toxicodynamics in this species. We also describe some important examples of developmental neurotoxicity testing using zebrafish combined with gene expression profiling, neuroimaging, or neurobehavioral assessment. Zebrafish may be a systems toxicology model that has the potential to reveal the pathways of developmental neurotoxicity and to provide a sound basis for human risk assessments. © 2014 Japanese Teratology Society.

  8. Integration of chemical-specific exposure and pharmacokinetic considerations with the chemical-agnostic adverse outcome pathway framework

    EPA Science Inventory

    Traditional toxicity testing provides insight into the mechanisms underlying toxicological responses but requires a high investment in large numbers of resources. The new paradigm of testing approaches involves rapid screening of thousands of chemicals across hundreds of biologic...

  9. A Roadmap for the Development of Alternative (Non-Animal) Methods for Systemic Toxicity Testing

    EPA Science Inventory

    Systemic toxicity testing forms the cornerstone for the safety evaluation of substances. Pressures to move from traditional animal models to novel technologies arise from various concerns, including: the need to evaluate large numbers of previously untested chemicals and new prod...

  10. Evaluation of Compatibility of ToxCast High-Throughput/High-Content Screening Assays with Engineered Nanomaterials

    EPA Science Inventory

    High-throughput and high-content screens are attractive approaches for prioritizing nanomaterial hazards and informing targeted testing due to the impracticality of using traditional toxicological testing on the large numbers and varieties of nanomaterials. The ToxCast program a...

  11. Developmental neurotoxicity testing in vitro: Models for assessing chemical effects on neurite outgrowth

    EPA Science Inventory

    In vitro models may be useful for the rapid toxicological screening of large numbers of chemicals for their potential to produce toxicity. Such screening could facilitate prioritization of resources needed for in vivo toxicity testing towards those chemicals most likely to resul...

  12. Modeling Reproductive Toxicity for Chemical Prioritization into an Integrated Testing Strategy

    EPA Science Inventory

    The EPA ToxCast research program uses a high-throughput screening (HTS) approach for predicting the toxicity of large numbers of chemicals. Phase-I tested 309 well-characterized chemicals in over 500 assays of different molecular targets, cellular responses and cell-states. Of th...

  13. Automated Simultaneous Assembly for Multistage Testing

    ERIC Educational Resources Information Center

    Breithaupt, Krista; Ariel, Adelaide; Veldkamp, Bernard P.

    2005-01-01

    This article offers some solutions used in the assembly of the computerized Uniform Certified Public Accountancy (CPA) licensing examination as practical alternatives for operational programs producing large numbers of forms. The Uniform CPA examination was offered as an adaptive multistage test (MST) beginning in April of 2004. Examples of…

  14. Problems of air traffic management. II., Prediction of success in air traffic controller school.

    DOT National Transportation Integrated Search

    1962-02-01

    An analysis of scores for an extensive battery of psychological tests administered to a large number of air traffic controller (ATC) trainees indicated that such tests can make a useful contribution in the selection of personnel for ATC training. Fiv...

  15. Identifying Metabolically Active Chemicals Using a Consensus Quantitative Structure Activity Relationship Model for Estrogen Receptor Binding

    EPA Science Inventory

    Traditional toxicity testing provides insight into the mechanisms underlying toxicological responses but requires a high investment in a large number of resources. The new paradigm of testing approaches involves rapid screening studies able to evaluate thousands of chemicals acro...

  16. Heat and fluid flow characteristics of an oval fin-and-tube heat exchanger with large diameters for textile machine dryer

    NASA Astrophysics Data System (ADS)

    Bae, Kyung Jin; Cha, Dong An; Kwon, Oh Kyung

    2016-11-01

    The objective of this paper is to develop heat transfer and pressure drop correlations for an oval fin-and-tube heat exchanger with large tube diameters (larger than 20 mm) used in a textile machine dryer. Numerical tests using ANSYS CFX were performed for four different parameters: tube size, fin pitch, transverse tube pitch and longitudinal tube pitch. The numerical results for the Nusselt number and the friction factor were within -16.2 to +3.1 % and -7.7 to +3.9 %, respectively, of the experimental results. It was found that the Nusselt number increased linearly with increasing Reynolds number, whereas the friction factor decreased slightly with increasing Reynolds number. It was also found that variation of the longitudinal tube pitch had less effect on the Nusselt number and friction factor than the other parameters (below 2.0 and 2.5 %, respectively). This study proposes new Nusselt number and friction factor correlations for oval fin-and-tube heat exchangers with large diameters for textile machine dryers.
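
    Correlations of this kind are typically obtained by a least-squares power-law fit of the form Nu = C*Re^m on log-log data. The sketch below shows only that standard fitting step; the data points are synthetic placeholders rather than the paper's CFD results, and the paper's actual correlation form may include additional geometric parameters.

      import numpy as np

      # synthetic placeholder data points (Reynolds number, Nusselt number)
      re = np.array([2000.0, 4000.0, 8000.0, 16000.0])
      nu = np.array([25.0, 41.0, 68.0, 112.0])

      # least-squares fit of log(Nu) = log(C) + m*log(Re)
      m, log_c = np.polyfit(np.log(re), np.log(nu), 1)
      c = np.exp(log_c)
      print(f"Nu ~ {c:.3f} * Re^{m:.3f}")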

  17. HammerCloud: A Stress Testing System for Distributed Analysis

    NASA Astrophysics Data System (ADS)

    van der Ster, Daniel C.; Elmsheuser, Johannes; Úbeda García, Mario; Paladin, Massimo

    2011-12-01

    Distributed analysis of LHC data is an I/O-intensive activity which places large demands on the internal network, storage, and local disks at remote computing facilities. Commissioning and maintaining a site to provide an efficient distributed analysis service is therefore a challenge which can be aided by tools to help evaluate a variety of infrastructure designs and configurations. HammerCloud is one such tool; it is a stress testing service which is used by central operations teams, regional coordinators, and local site admins to (a) submit an arbitrary number of analysis jobs to a number of sites, (b) maintain a predefined number of jobs running at steady state at the sites under test, (c) produce web-based reports summarizing the efficiency and performance of the sites under test, and (d) present a web interface for historical test results to both evaluate progress and compare sites. HammerCloud was built around the distributed analysis framework Ganga, exploiting its API for grid job management. HammerCloud has been employed by the ATLAS experiment for continuous testing of many sites worldwide, and also during large-scale computing challenges such as STEP'09 and UAT'09, where the scale of the tests exceeded 10,000 concurrently running jobs and 1,000,000 total jobs over multi-day periods. In addition, HammerCloud is being adopted by the CMS experiment; the plugin structure of HammerCloud allows the execution of CMS jobs using their official tool (CRAB).

  18. Reference Material Kydex®-100 Test Data Message for Flammability Testing

    NASA Technical Reports Server (NTRS)

    Engel, Carl D.; Richardson, Erin; Davis, Eddie

    2003-01-01

    The Marshall Space Flight Center (MSFC) Materials and Processes Technical Information System (MAPTIS) database contains, as an engineering resource, a large amount of material test data carefully obtained and recorded over a number of years. Flammability test data obtained using Test 1 of NASA-STD-6001 is a significant component of this database. NASA-STD-6001 recommends that Kydex 100 be used as a reference material for testing certification and for comparison between test facilities in the round-robin certification testing that occurs every 2 years. As a result of these regular activities, a large volume of test data is recorded within the MAPTIS database. The activity described in this technical report was undertaken to mine the database, recover flammability (Test 1) Kydex 100 data, and review the lessons learned from analysis of these data.

  19. Fracture Tests of Etched Components Using a Focused Ion Beam Machine

    NASA Technical Reports Server (NTRS)

    Kuhn, Jonathan, L.; Fettig, Rainer K.; Moseley, S. Harvey; Kutyrev, Alexander S.; Orloff, Jon; Powers, Edward I. (Technical Monitor)

    2000-01-01

    Many optical MEMS device designs involve large arrays of thin (0.5 to 1 micron) components subjected to high stresses due to cyclic loading. These devices are fabricated from a variety of materials, and their properties depend strongly on size and processing. Our objective is to develop standard and convenient test methods that can be used to measure the properties of large numbers of witness samples for every device we build. In this work we explore a variety of fracture test configurations for 0.5 micron thick silicon nitride membranes machined using the Reactive Ion Etching (RIE) process. Testing was completed using an FEI 620 dual focused ion beam milling machine. Static loads were applied using a probe, and dynamic loads were applied through a piezo-electric stack mounted at the base of the probe. Results from the tests are presented and compared, and applications for predicting the fracture probability of large arrays of devices are considered.

  20. Highlights of experience with a flexible walled test section in the NASA Langley 0.3-meter transonic cryogenic tunnel

    NASA Technical Reports Server (NTRS)

    Wolf, Stephen W. D.; Ray, Edward J.

    1988-01-01

    The unique combination of adaptive wall technology with a continuous-flow cryogenic wind tunnel is described. This powerful combination allows wind tunnel users to carry out 2-D tests at flight Reynolds numbers with wall interference essentially eliminated. Validation testing was conducted to support this claim using well-tested symmetrical and cambered airfoils at transonic speeds and high Reynolds numbers. The test section hardware has four solid walls, with the floor and ceiling flexible. The method of adapting/shaping the floor and ceiling to eliminate top and bottom wall interference at its source is outlined. Data comparisons are made for different size models tested here and in several other sophisticated 2-D wind tunnels. In addition, the effects of Reynolds number, testing at high lift with the associated large flexible wall movements, the uniqueness of the adapted wall shapes, and the effects of sidewall boundary layer control are examined. The 0.3-m TCT is now the most advanced 2-D research facility anywhere.

  1. LVQ and backpropagation neural networks applied to NASA SSME data

    NASA Technical Reports Server (NTRS)

    Doniere, Timothy F.; Dhawan, Atam P.

    1993-01-01

    Feedforward neural networks with backpropagation learning have been used as function approximators for modeling space shuttle main engine (SSME) sensor signals. The modeling of these sensor signals is aimed at the development of a sensor fault detection system that can be used during ground test firings. The generalization capability of a neural network based function approximator depends on the training vectors, which in this application may be derived from a number of SSME ground test firings. This yields a large number of training vectors. Large training sets can make the time required to train the network very long, and the network may not generalize well for large training sets. To reduce the size of the training sets, the SSME test-firing data are reduced using a learning vector quantization (LVQ) based technique. Different compression ratios were used to obtain compressed data for training the neural network model. The performance of the neural model trained using reduced sets of training patterns is presented and compared with the performance of the model trained using the complete data. The LVQ can also be used as a function approximator; its performance as a function approximator using reduced training sets is presented and compared with the performance of the backpropagation network.
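
    The data-reduction idea, compressing a large set of training vectors into a much smaller codebook of prototype vectors, can be sketched as below. Plain k-means clustering is used here as a stand-in; the paper's method is LVQ-based and differs in detail, and the array sizes are arbitrary.

      import numpy as np

      def compress_training_set(vectors, n_prototypes, n_iter=20, seed=0):
          """Reduce a training set to n_prototypes representative vectors (k-means codebook)."""
          rng = np.random.default_rng(seed)
          prototypes = vectors[rng.choice(len(vectors), n_prototypes, replace=False)]
          for _ in range(n_iter):
              # assign every training vector to its nearest prototype
              distances = np.linalg.norm(vectors[:, None, :] - prototypes[None, :, :], axis=2)
              labels = distances.argmin(axis=1)
              # move each prototype to the centroid of its assigned vectors
              for k in range(n_prototypes):
                  members = vectors[labels == k]
                  if len(members):
                      prototypes[k] = members.mean(axis=0)
          return prototypes

      # e.g. compress 2000 synthetic 8-channel "sensor" vectors to 40 prototypes (50:1 reduction)
      data = np.random.default_rng(1).normal(size=(2000, 8))
      codebook = compress_training_set(data, 40)
      print(codebook.shape)  # (40, 8)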

  2. Dual-Level Method for Estimating Multistructural Partition Functions with Torsional Anharmonicity.

    PubMed

    Bao, Junwei Lucas; Xing, Lili; Truhlar, Donald G

    2017-06-13

    For molecules with multiple torsions, an accurate evaluation of the molecular partition function requires consideration of multiple structures and their torsional-potential anharmonicity. We previously developed a method called MS-T for this problem; it requires an exhaustive conformational search with frequency calculations for all the distinguishable conformers, which can become expensive for molecules with a large number of torsions (and hence a large number of structures) if it is carried out with high-level methods. In the present work, we propose a cost-effective method to approximate the MS-T partition function when there are a large number of structures, and we test it on a transition state that has eight torsions. This new method is a dual-level method that combines an exhaustive conformer search carried out by a low-level electronic structure method (for instance, AM1, which is very inexpensive) with selected calculations by a higher-level electronic structure method (for example, density functional theory with a functional that is suitable for conformational analysis and thermochemistry). To provide a severe test of the new method, we consider a transition state structure that has 8 torsional degrees of freedom; this transition state is formed along one of the reaction pathways of the hydrogen abstraction reaction (at carbon-1) of ketohydroperoxide (KHP; its IUPAC name is 4-hydroperoxy-2-pentanone) by the OH radical. We find that the proposed dual-level method significantly reduces the computational cost of computing MS-T partition functions for this test case with its large numbers of torsions and conformers, because high-level calculations are carried out for only a fraction of the distinguishable conformers found by the low-level method. In the example studied here, the dual-level method with 40 high-level optimizations (1.8% of the number of optimizations in a coarse-grained full search and 0.13% of the number of optimizations in a fine-grained full search) reproduces the full high-level partition function within a factor of 1.0 to 2.0 from 200 to 1000 K. The error in the dual-level method can be further reduced to factors of 0.6 to 1.1 over the whole temperature interval from 200 to 2400 K by optimizing 128 structures (5.9% of the number of optimizations in a coarse-grained full search and 0.41% of the number of optimizations in a fine-grained full search). These factor-of-two or better errors are small compared to errors of up to a factor of 1.0 × 10^3 if one neglects multistructural effects for the case under study.
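
    A heavily simplified sketch of the dual-level selection step is given below: conformers are found and ranked at a cheap level, only those within an assumed energy window receive higher-level treatment, and a Boltzmann-weighted conformational sum is formed from the selected set. The real MS-T partition function also includes rovibrational and torsional-anharmonicity factors that are omitted here, and the energies and cutoff are placeholders.

      import math

      R_GAS = 0.0019872041  # kcal/(mol*K), gas constant (Boltzmann constant per mole)

      def conformational_partition_function(rel_energies_kcal, temperature_k):
          """Boltzmann sum over selected conformers, energies relative to the lowest one."""
          return sum(math.exp(-e / (R_GAS * temperature_k)) for e in rel_energies_kcal)

      # placeholder low-level relative conformer energies (kcal/mol), not real data
      low_level_energies = [0.0, 0.3, 0.5, 0.9, 1.4, 2.2, 3.0, 4.1]

      # dual-level step: pass only low-energy conformers to the expensive level
      energy_window = 1.5  # kcal/mol, assumed selection cutoff
      selected = [e for e in low_level_energies if e <= energy_window]

      print(len(selected), conformational_partition_function(selected, 298.15))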

  3. Analysis of Fluctuating Static Pressure Measurements in a Large High Reynolds Number Transonic Cryogenic Wind Tunnel. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Igoe, William B.

    1991-01-01

    Dynamic measurements of fluctuating static pressure levels were made using flush-mounted high-frequency-response pressure transducers at eleven locations in the circuit of the National Transonic Facility (NTF) over the complete operating range of this wind tunnel. Measurements were made at test section Mach numbers from 0.2 to 1.2, at pressures from 1 to 8.6 atmospheres, and at temperatures from ambient to -250 F, resulting in dynamic flow disturbance measurements at the highest Reynolds numbers available in a transonic ground test facility. Tests were also made independently at variable Mach number, variable Reynolds number, and variable drive power, each time keeping the other two variables constant, thus allowing, for the first time, a distinct separation of these three important variables. Included are a description of the NTF emphasizing its flow quality features, details on the calibration of the instrumentation, results of measurements with the test section slots covered and with a downstream choke, effects of liquid nitrogen injection and gaseous nitrogen venting, comparisons between air and nitrogen, isolation of the effects of Mach number, Reynolds number, and fan drive power, and identification of the sources of significant flow disturbances. The results indicate that the primary sources of flow disturbance in the NTF may be edge tones generated by test section sidewall re-entry flaps and the venting of nitrogen gas from the return leg of the tunnel circuit between turns 3 and 4 in the cryogenic mode of operation. The tests to isolate the effects of Mach number, Reynolds number, and drive power indicate that Mach number effects predominate. A comparison with other transonic wind tunnels shows that the NTF has low levels of test section fluctuating static pressure, especially in the high subsonic Mach number range from 0.7 to 0.9.

  4. The effects of leading-edge serrations on reducing flow unsteadiness about airfoils, an experimental and analytical investigation

    NASA Technical Reports Server (NTRS)

    Schwind, R. G.; Allen, H. J.

    1973-01-01

    High frequency surface pressure measurements were obtained from wind-tunnel tests over the Reynolds number range 1.2 million to 6.2 million on a rectangular wing of NACA 63-009 airfoil section. Measurements were also obtained with a wide selection of leading-edge serrations added to the basic airfoil. Under a two-dimensional laminar bubble very close to the leading edge of the basic airfoil there is a large spatial peak in rms pressure. Frequency analysis of the pressure signals in this region shows a large, high-frequency energy peak which is interpreted as an oscillation in size and position of the bubble. The serrations divide the bubble into segments and reduce the peak rms pressures. A low Reynolds number flow visualization test on a hydrofoil in water was also conducted. A von Karman vortex street was found trailing from the rear of the foil. Its frequency is at a much lower Strouhal number than in the high Reynolds number experiment, and is related to the trailing-edge and boundary-layer thicknesses.

  5. Small-scale behavior in distorted turbulent boundary layers at low Reynolds number

    NASA Technical Reports Server (NTRS)

    Saddoughi, Seyed G.

    1994-01-01

    During the last three years we have conducted high- and low-Reynolds-number experiments, including hot-wire measurements of the velocity fluctuations, in the test-section-ceiling boundary layer of the 80- by 120-foot Full-Scale Aerodynamics Facility at NASA Ames Research Center, to test the local-isotropy predictions of Kolmogorov's universal equilibrium theory. This hypothesis, which states that at sufficiently high Reynolds numbers the small-scale structures of turbulent motions are independent of large-scale structures and mean deformations, has been used in theoretical studies of turbulence and computational methods such as large-eddy simulation; however, its range of validity in shear flows has been a subject of controversy. The present experiments were planned to enhance our understanding of the local-isotropy hypothesis. Our experiments were divided into two sets. First, measurements were taken at different Reynolds numbers in a plane boundary layer, which is a 'simple' shear flow. Second, experiments were designed to address this question: will our criteria for the existence of local isotropy hold for 'complex' nonequilibrium flows in which extra rates of mean strain are added to the basic mean shear?

  6. Use of Existing Data Bases in a Large Scale Correlational/Regression Study. for Period January 1977-January 1978.

    ERIC Educational Resources Information Center

    Greene, Jennifer C.; Kellogg, Theodore

    Statewide assessment data available from two school years, two grade levels, and five sources (achievement tests; student, principal, and teacher questionnaires; and principal interviews), were aggregated to more closely investigate the relationship between student/school characteristics and student achievement. To organize this large number of…

  7. Sparse Measurement Systems: Applications, Analysis, Algorithms and Design

    ERIC Educational Resources Information Center

    Narayanaswamy, Balakrishnan

    2011-01-01

    This thesis deals with "large-scale" detection problems that arise in many real world applications such as sensor networks, mapping with mobile robots and group testing for biological screening and drug discovery. These are problems where the values of a large number of inputs need to be inferred from noisy observations and where the…

  8. The large scale microelectronics Computer-Aided Design and Test (CADAT) system

    NASA Technical Reports Server (NTRS)

    Gould, J. M.

    1978-01-01

    The CADAT system consists of a number of computer programs written in FORTRAN that provide the capability to simulate, lay out, analyze, and create the artwork for large scale microelectronics. The function of each software component of the system is described with references to specific documentation for each software component.

  9. Large amplitude forcing of a high speed 2-dimensional jet

    NASA Technical Reports Server (NTRS)

    Bernal, L.; Sarohia, V.

    1984-01-01

    The effect of large amplitude forcing on the growth of a high speed two dimensional jet was investigated experimentally. Two forcing techniques were utilized: mass flow oscillations and a mechanical system. The mass flow oscillation tests were conducted at Strouhal numbers from 0.00052 to 0.045, and peak to peak amplitudes up to 50 percent of the mean exit velocity. The exit Mach number was varied in the range 0.15 to 0.8. The corresponding Reynolds numbers were 8,400 and 45,000. The results indicate no significant change of the jet growth rate or centerline velocity decay compared to the undisturbed free jet. The mechanical forcing system consists of two counter rotating hexagonal cylinders located parallel to the span of the nozzle. Forcing frequencies up to 1,500 Hz were tested. Both symmetric and antisymmetric forcing can be implemented. The results for antisymmetric forcing showed a significant (75 percent) increase of the jet growth rate at an exit Mach number of 0.25 and a Strouhal number of 0.019. At higher rotational speeds, the jet deflected laterally. A deflection angle of 39 deg with respect to the centerline was measured at the maximum rotational speed.

  10. Robust Detection of Examinees with Aberrant Answer Changes

    ERIC Educational Resources Information Center

    Belov, Dmitry I.

    2015-01-01

    The statistical analysis of answer changes (ACs) has uncovered multiple testing irregularities on large-scale assessments and is now routinely performed at testing organizations. However, AC data has an uncertainty caused by technological or human factors. Therefore, existing statistics (e.g., number of wrong-to-right ACs) used to detect examinees…

  11. Interference of Wing and Fuselage from Tests of 209 Combinations in the NACA Variable-Density Tunnel

    NASA Technical Reports Server (NTRS)

    Jacobs, Eastman N; Ward, Kenneth E

    1936-01-01

    This report presents the results of tests of 209 simple wing-fuselage combinations made in the NACA variable-density wind tunnel to provide information regarding the effects of aerodynamic interference between wings and fuselages at a large value of Reynolds number.

  12. Learner Washback Variability in Standardized Exit Tests

    ERIC Educational Resources Information Center

    Pan, Yi-Ching

    2014-01-01

    In much of the world, the issue of accountability and measurement of educational outcomes is highly controversial. Exit testing is part of the movement to ascertain what students have learned and hold institutions and teachers to account. However, compared to the large number of teacher washback studies, learner washback research is lacking…

  13. Maxi CAI with a Micro.

    ERIC Educational Resources Information Center

    Gerhold, George; And Others

    This paper describes an effective microprocessor-based CAI system which has been repeatedly tested by a large number of students and edited accordingly. Tasks not suitable for microprocessor based systems (authoring, testing, and debugging) were handled on larger multi-terminal systems. This approach requires that the CAI language used on the…

  14. Mathematics Placement at Cottey College.

    ERIC Educational Resources Information Center

    Callahan, Susan

    In response to the large numbers of students who were failing or dropping out of basic algebra and calculus classes, Cottey College, in Missouri, developed a math placement program in 1982 using Basic Algebra (BA) and Calculus Readiness (CR) tests from the Mathematical Association of America's Placement Testing Program. Cut off scores for the…

  15. Evaluation of Flush-Mounted, S-Duct Inlets with Large Amounts of Boundary Layer Ingestion

    NASA Technical Reports Server (NTRS)

    Berrier, Bobby L.; Morehouse, Melissa B.

    2003-01-01

    A new high Reynolds number test capability for boundary layer ingesting inlets has been developed for the NASA Langley Research Center 0.3-Meter Transonic Cryogenic Tunnel. Using this new capability, an experimental investigation of four S-duct inlet configurations with large amounts of boundary layer ingestion (nominal boundary layer thickness of about 40% of inlet height) was conducted at realistic operating conditions (high subsonic Mach numbers and full-scale Reynolds numbers). The objectives of this investigation were to 1) provide a database for CFD tool validation on boundary layer ingesting inlets operating at realistic conditions and 2) provide a baseline inlet for future inlet flow-control studies. Tests were conducted at Mach numbers from 0.25 to 0.83, Reynolds numbers (based on duct exit diameter) from 5.1 million to a full-scale value of 13.9 million, and inlet mass-flow ratios from 0.39 to 1.58 depending on Mach number. Results of this investigation indicate that inlet pressure recovery generally decreased and inlet distortion generally increased with increasing Mach number. Except at low Mach numbers, increasing inlet mass-flow increased pressure recovery and increased distortion. Increasing the amount of boundary layer ingestion (by decreasing inlet throat height) or ingesting a boundary layer with a distorted (adverse) profile decreased pressure recovery and increased distortion. Finally, increasing Reynolds number had almost no effect on inlet distortion but increased inlet recovery by about one-half percent at a Mach number near cruise.

  16. Factor Structure of the Comprehensive Trail Making Test in Children and Adolescents with Brain Dysfunction

    ERIC Educational Resources Information Center

    Allen, Daniel N.; Thaler, Nicholas S.; Barchard, Kimberly A.; Vertinski, Mary; Mayfield, Joan

    2012-01-01

    The Comprehensive Trail Making Test (CTMT) is a relatively new version of the Trail Making Test that has a number of appealing features, including a large normative sample that allows raw scores to be converted to standard "T" scores adjusted for age. Preliminary validity information suggests that CTMT scores are sensitive to brain…

  17. The Legacy Continues: "The Test" and Denying Access to a Challenging Mathematics Education for Historically Marginalized Students

    ERIC Educational Resources Information Center

    Kitchen, Richard; Ridder, Sarah Anderson; Bolz, Joseph

    2016-01-01

    Research is needed to understand the impact of high-stakes testing on teachers' practices and consequently on their students, particularly at schools that serve large numbers of low-income students and students of color. In this research study, we examined how a state's annual high-stakes test and administrative mandates influenced the assessment…

  18. Drug testing and flow cytometry analysis on a large number of uniform sized tumor spheroids using a microfluidic device

    NASA Astrophysics Data System (ADS)

    Patra, Bishnubrata; Peng, Chien-Chung; Liao, Wei-Hao; Lee, Chau-Hwang; Tung, Yi-Chung

    2016-02-01

    Three-dimensional (3D) tumor spheroids possess great potential as an in vitro model to improve the predictive capacity of pre-clinical drug testing. In this paper, we combine the advantages of flow cytometry and microfluidics to perform drug testing and analysis on a large number (5000) of uniform-sized tumor spheroids. The spheroids are formed, cultured, and treated with drugs inside a microfluidic device and can then be harvested from the device without tedious operations. Due to the ample cell numbers, the spheroids can be dissociated into single cells for flow cytometry analysis. Flow cytometry provides statistical information at single-cell resolution, making it feasible to better investigate drug effects on cells in a more in vivo-like 3D formation. In the experiments, human hepatocellular carcinoma cells (HepG2) are used to form tumor spheroids within the microfluidic device, and three anti-cancer drugs, Cisplatin, Resveratrol, and Tirapazamine (TPZ), and their combinations are tested on tumor spheroids of two different sizes. The experimental results suggest that the cell culture format (2D monolayer vs. 3D spheroid) and the spheroid size play critical roles in drug responses, and they also demonstrate the advantages of bridging the two techniques in pharmaceutical drug screening applications.

  19. NASA Multipoint LDI Development

    NASA Technical Reports Server (NTRS)

    Tacina, Robert

    2001-01-01

    Multipoint Lean-Direct-Injection (LDI) is a combustor concept in which a large number of fuel injectors and fuel-air mixers are used to quickly and uniformly mix the fuel and air so that ultralow levels of NOx are produced. Each fuel injector has an air swirler associated with it for fuel-air mixing and to establish a small recirculation and burning zone. A concept in which there are 36 fuel injectors in the space of a conventional single fuel injector has been tested in a flame tube. A greater than 80 percent reduction in NOx at high power conditions (400 psia, 1000 °F inlet) was achieved. Alternate concepts with 9, 25, 36, or 49 fuel injectors are being investigated in flame tube tests for their low-NOx potential and with fuel staging to improve the turn-down ratio at low power conditions. A preliminary sector concept of a large engine design has been successfully tested at inlet conditions of 700 psia and 1100 °F. This concept had one half the number of fuel injectors per square inch as the flame tube configuration with 36 fuel injectors, and the NOx reduction was 65 percent of the ICAO standard. Future regional engine size sector tests are planned for the 2nd quarter of FY02 and large engine size sector tests for the 1st quarter of FY03.

  20. Estimation of rates-across-sites distributions in phylogenetic substitution models.

    PubMed

    Susko, Edward; Field, Chris; Blouin, Christian; Roger, Andrew J

    2003-10-01

    Previous work has shown that it is often essential to account for the variation in rates at different sites in phylogenetic models in order to avoid phylogenetic artifacts such as long branch attraction. In most current models, the gamma distribution is used for the rates-across-sites distributions and is implemented as an equal-probability discrete gamma. In this article, we introduce discrete distribution estimates with large numbers of equally spaced rate categories allowing us to investigate the appropriateness of the gamma model. With large numbers of rate categories, these discrete estimates are flexible enough to approximate the shape of almost any distribution. Likelihood ratio statistical tests and a nonparametric bootstrap confidence-bound estimation procedure based on the discrete estimates are presented that can be used to test the fit of a parametric family. We applied the methodology to several different protein data sets, and found that although the gamma model often provides a good parametric model for this type of data, rate estimates from an equal-probability discrete gamma model with a small number of categories will tend to underestimate the largest rates. In cases when the gamma model assumption is in doubt, rate estimates coming from the discrete rate distribution estimate with a large number of rate categories provide a robust alternative to gamma estimates. An alternative implementation of the gamma distribution is proposed that, for equal numbers of rate categories, is computationally more efficient during optimization than the standard gamma implementation and can provide more accurate estimates of site rates.
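
    For reference, the equal-probability discrete gamma model mentioned above assigns probability 1/K to each of K rate categories drawn from a mean-1 gamma distribution. The sketch below represents each category by its median and rescales the rates to a mean of 1; implementations that use the category mean differ slightly, and the shape parameter chosen here is arbitrary.

      from scipy.stats import gamma

      def discrete_gamma_rates(alpha, n_categories):
          """Equal-probability discrete gamma rates (category medians, rescaled to mean 1)."""
          # shape = alpha, scale = 1/alpha gives a mean-1 continuous gamma distribution
          midpoints = [(2 * k + 1) / (2 * n_categories) for k in range(n_categories)]
          rates = [gamma.ppf(q, alpha, scale=1.0 / alpha) for q in midpoints]
          mean_rate = sum(rates) / n_categories
          return [r / mean_rate for r in rates]

      print(discrete_gamma_rates(alpha=0.5, n_categories=4))

    With many more categories, the same construction approaches the flexible discrete estimates described in the article, which is what makes those estimates useful for checking the gamma assumption.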

  1. Externally blown flap noise research

    NASA Technical Reports Server (NTRS)

    Dorsch, R. G.

    1974-01-01

    The Lewis Research Center cold-flow model externally blown flap (EBF) noise research test program is summarized. Both engine under-the-wing and over-the-wing EBF wing section configurations were studied. Ten large scale and nineteen small scale EBF models were tested. A limited number of forward airspeed effect and flap noise suppression tests were also run. The key results and conclusions drawn from the flap noise tests are summarized and discussed.

  2. In Vitro Evaluation of the Size, Knot Holding Capacity, and Knot Security of the Forwarder Knot Compared to Square and Surgeon's Knots Using Large Gauge Suture.

    PubMed

    Gillen, Alex M; Munsterman, Amelia S; Hanson, R Reid

    2016-11-01

    To investigate the strength, size, and holding capacity of the self-locking forwarder knot compared to surgeon's and square knots using large gauge suture. In vitro mechanical study. Knotted suture. Forwarder, surgeon's, and square knots were tested on a universal testing machine under linear tension using 2 and 3 USP polyglactin 910 and 2 USP polydioxanone. Knot holding capacity (KHC) and mode of failure were recorded, and relative knot security (RKS) was calculated as a percentage of KHC. Knot volume and weight were assessed by digital micrometer and balance, respectively. ANOVA and post hoc testing were used to compare strength between number of throws, suture type, suture size, and knot type. P<.05 was considered significant. Forwarder knots had a higher KHC and RKS than surgeon's or square knots for all suture types and numbers of throws. No forwarder knots unraveled, but a proportion of square and surgeon's knots with <6 throws did unravel. Forwarder knots had a smaller volume and weight than surgeon's and square knots with an equal number of throws. The forwarder knot of 4 throws using 3 USP polyglactin 910 had the highest KHC and RKS and the smallest size and weight. Forwarder knots may be an alternative for commencing continuous patterns in large gauge suture without sacrificing knot integrity, but further in vivo and ex vivo testing is required to assess the effects of this sliding knot on tissue perfusion before clinical application. © Copyright 2016 by The American College of Veterinary Surgeons.

  3. The aerodynamic characteristics of eight very thick airfoils from tests in the variable density wind tunnel

    NASA Technical Reports Server (NTRS)

    Jacobs, Eastman N

    1932-01-01

    Report presents the results of wind tunnel tests on a group of eight very thick airfoils having sections of the same thickness as those used near the roots of tapered airfoils. The tests were made to study certain discontinuities in the characteristic curves that had been obtained from previous tests of these airfoils, and to compare the characteristics of the different sections at values of the Reynolds number comparable with those attained in flight. The discontinuities were found to disappear as the Reynolds number was increased. Of the airfoils tested, the symmetrical airfoil having a thickness ratio of 21 per cent showed the best general characteristics.

  4. Utilizing the ultrasensitive Schistosoma up-converting phosphor lateral flow circulating anodic antigen (UCP-LF CAA) assay for sample pooling strategies.

    PubMed

    Corstjens, Paul L A M; Hoekstra, Pytsje T; de Dood, Claudia J; van Dam, Govert J

    2017-11-01

    Methodological applications of the high-sensitivity genus-specific Schistosoma CAA strip test, which allows detection of single-worm active infections (ultimate sensitivity), are discussed for efficient utilization in sample pooling strategies. Besides relevant cost reduction, pooling of samples rather than individual testing can provide valuable data for large-scale mapping, surveillance, and monitoring. The laboratory-based CAA strip test utilizes luminescent quantitative up-converting phosphor (UCP) reporter particles and a rapid user-friendly lateral flow (LF) assay format. The test includes a sample preparation step that permits virtually unlimited sample concentration with urine, reaching ultimate sensitivity (single-worm detection) at 100% specificity. This facilitates testing large urine pools from many individuals with minimal loss of sensitivity and specificity. The test determines the average CAA level of the individuals in the pool, thus indicating overall worm burden and prevalence. When test results are required at the individual level, smaller pools need to be analysed, with the pool size based on the expected prevalence or, when that is unknown, on the average CAA level of a larger group; CAA-negative pools do not require individual test results and thus reduce the number of tests. Straightforward pooling strategies indicate that at the sub-population level the CAA strip test is an efficient assay for general mapping, identification of hotspots, determination of stratified infection levels, and accurate monitoring of mass drug administrations (MDA). At the individual level, the number of tests can be reduced, e.g. in low-endemicity settings, because the pool size can be increased as prevalence decreases. At the sub-population level, average CAA concentrations determined in urine pools can be an appropriate measure of worm burden. Pooling strategies allowing this type of large-scale testing are feasible with the various CAA strip test formats and do not affect sensitivity and specificity. This allows cost-efficient stratified testing and monitoring of worm burden at the sub-population level, ideally for large-scale surveillance generating hard data on the performance of MDA programs and for strategic planning when moving towards transmission-stop and elimination.
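
    The arithmetic behind "bigger pools at lower prevalence" can be illustrated with simple two-stage (Dorfman-style) pooling, where a positive pool triggers individual retesting. Under an idealized perfectly sensitive and specific test, the expected number of tests per person is 1/n + 1 - (1 - p)^n for pool size n and prevalence p. This is only a generic sketch; the CAA strategies described above also exploit the quantitative pool-level CAA concentration, which this calculation does not capture.

      def expected_tests_per_person(prevalence, pool_size):
          # one pool test shared by n people, plus n individual retests if the pool is positive
          p_pool_positive = 1.0 - (1.0 - prevalence) ** pool_size
          return 1.0 / pool_size + p_pool_positive

      for p in (0.01, 0.05, 0.20):
          best_n = min(range(2, 51), key=lambda n: expected_tests_per_person(p, n))
          print(f"prevalence {p:.0%}: pool size ~{best_n}, "
                f"~{expected_tests_per_person(p, best_n):.2f} tests per person")

    The optimal pool size grows and the tests-per-person figure shrinks as prevalence drops, which is the effect the abstract describes for low-endemicity settings.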

  5. Implicit Large Eddy Simulation of a wingtip vortex at Rec = 1.2×10^6

    NASA Astrophysics Data System (ADS)

    Lombard, Jean-Eloi; Moxey, Dave; Sherwin, Spencer; SherwinLab Team

    2015-11-01

    We present recent developments in numerical methods for performing a Large Eddy Simulation (LES) of the formation and evolution of a wingtip vortex. The development of these vortices in the near wake, in combination with the large Reynolds numbers present in these cases, makes these types of test cases particularly challenging to investigate numerically. To demonstrate the method's viability, we present results from numerical simulations of flow over a NACA 0012 profile wingtip at Rec = 1.2×10^6 and compare them against experimental data; this is, to date, the highest Reynolds number achieved for an LES that has been correlated with experiments for this test case. Our model correlates favorably with experiment, both for the characteristic jetting in the primary vortex and for the pressure distribution on the wing surface. The proposed method is of general interest for the modeling of transitioning vortex-dominated flows over complex geometries. McLaren Racing/Royal Academy of Engineering Research Chair.

  6. Large Capacity SMES for Voltage Dip Compensation

    NASA Astrophysics Data System (ADS)

    Iwatani, Yu; Saito, Fusao; Ito, Toshinobu; Shimada, Mamoru; Ishida, Satoshi; Shimanuki, Yoshio

    Voltage dips in power grids caused by lightning strikes, snow damage, and so on can cause serious damage to production lines for precision instruments such as semiconductors. In recent years, uninterruptible power supply (UPS) systems have been used to solve this problem. UPS units, however, have small capacity, so a great number of them are needed in large factories. We have therefore manufactured a superconducting magnetic energy storage (SMES) system for voltage dip compensation that can collectively protect loads with large capacity. SMES has advantages such as space conservation, long lifetime and others. In field tests conducted in cooperation with CHUBU Electric Power Co., Inc., we demonstrated that SMES is valuable for compensating voltage dips. Since 2007, a 10 MVA SMES improved from the field test machines has been running in a domestic liquid crystal display plant, and in 2008 it protected plant loads from a number of voltage dips. In this paper, we report the operating principle and components of the improved SMES for voltage dip compensation, and examples of waveforms when the 10 MVA SMES compensated voltage dips.

  7. Experimental Investigation of Reynolds Number Effects on Test Quality in a Hypersonic Expansion Tube

    NASA Astrophysics Data System (ADS)

    Rossmann, Tobias; Devin, Alyssa; Shi, Wen; Verhoog, Charles

    2017-11-01

    Reynolds number effects on test time and on the temporal and spatial flow quality in a hypersonic expansion tube are explored using high-speed pressure, infrared optical, and Schlieren imaging measurements. Boundary layer models for shock tube flows are fairly well established for determining test time and core-flow dimensions at typical high-enthalpy test conditions. However, the application of these models needs to be more fully explored because of the unsteady expansion of turbulent boundary layers and the contact regions separating dissimilar gases present in expansion tube flows. Additionally, expansion tubes rely on the development of a steady jet with a large enough core-flow region at the exit of the acceleration tube to create a constant-velocity region inside the test section. High-speed measurements of pressure and Mach number at several locations within the expansion tube allow the determination of an experimental x-t diagram. Comparison of the experimentally determined x-t diagram with theory highlights the Reynolds-number-dependent effects on expansion tube operation. Additionally, spatially resolved measurements of the Reynolds-number-dependent steady core flow in the expansion tube viewing section are shown. NSF MRI CBET #1531475, Lafayette College, McCutcheon Foundation.

  8. Macromolecular Origins of Harmonics Higher than the Third in Large-Amplitude Oscillatory Shear Flow

    NASA Astrophysics Data System (ADS)

    Giacomin, Alan; Jbara, Layal; Gilbert, Peter; Chemical Engineering Department Team

    2016-11-01

    In 1935, Andrew Gemant conceived of the complex viscosity, a rheological material function measured by "jiggling" an elastic liquid in oscillatory shear. This test reveals information about both the viscous and elastic properties of the liquid, and about how these properties depend on frequency. The test gained popularity with chemists when John Ferry perfected instruments for measuring both the real and imaginary parts of the complex viscosity. In 1958, Cox and Merz discovered that the steady shear viscosity curve is easily deduced from the magnitude of the complex viscosity, and today oscillatory shear is the single most popular rheological property measurement. In oscillatory shear, we can control two things: the frequency (Deborah number) and the shear rate amplitude (Weissenberg number). When the Weissenberg number is large, elastic liquids respond with a shear stress containing a series of odd multiples of the test frequency. In this lecture we will explore recent attempts to deepen our understanding of the physics of these higher harmonics, especially harmonics higher than the third. Canada Research Chairs program of the Government of Canada for the Natural Sciences and Engineering Research Council of Canada (NSERC) Tier 1 Canada Research Chair in Rheology.
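
    The harmonic content described above is normally quantified by Fourier analysis of the measured stress signal. The sketch below extracts the amplitudes of the odd harmonics from a synthetic stress waveform whose harmonic amplitudes are invented for illustration; it is not an analysis of any measured data.

      import numpy as np

      f0 = 1.0                                             # assumed test frequency, Hz
      t = np.linspace(0.0, 10.0, 10000, endpoint=False)    # ten full cycles
      omega = 2.0 * np.pi * f0

      # synthetic stress response with invented 1st, 3rd and 5th harmonic amplitudes
      stress = (1.00 * np.sin(omega * t)
                + 0.10 * np.sin(3.0 * omega * t)
                + 0.02 * np.sin(5.0 * omega * t))

      amplitude = np.abs(np.fft.rfft(stress)) / (len(t) / 2.0)  # single-sided amplitude spectrum
      freqs = np.fft.rfftfreq(len(t), d=t[1] - t[0])            # Hz

      for harmonic in (1, 3, 5, 7):
          idx = np.argmin(np.abs(freqs - harmonic * f0))        # bin nearest the n-th harmonic
          print(f"harmonic {harmonic}: amplitude ~ {amplitude[idx]:.3f}")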

  9. Aerodynamic Design of a Dual-Flow Mach 7 Hypersonic Inlet System for a Turbine-Based Combined-Cycle Hypersonic Propulsion System

    NASA Technical Reports Server (NTRS)

    Sanders, Bobby W.; Weir, Lois J.

    2008-01-01

    A new hypersonic inlet for a turbine-based combined-cycle (TBCC) engine has been designed. This split-flow inlet is designed to provide flow to an over-under propulsion system with turbofan and dual-mode scramjet engines for flight from takeoff to Mach 7. It utilizes a variable-geometry ramp, high-speed cowl lip rotation, and a rotating low-speed cowl that serves as a splitter to divide the flow between the low-speed turbofan and the high-speed scramjet and to isolate the turbofan at high Mach numbers. The low-speed inlet was designed for Mach 4, the maximum mode transition Mach number. Integration of the Mach 4 inlet into the Mach 7 inlet imposed significant constraints on the low-speed inlet design, including a large amount of internal compression. The inlet design was used to develop mechanical designs for two inlet mode transition test models: small-scale (IMX) and large-scale (LIMX) research models. The large-scale model is designed to facilitate multi-phase testing including inlet mode transition and inlet performance assessment, controls development, and integrated systems testing with turbofan and scramjet engines.

  10. Significant lexical relationships

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pedersen, T.; Kayaalp, M.; Bruce, R.

    Statistical NLP inevitably deals with a large number of rare events. As a consequence, NLP data often violates the assumptions implicit in traditional statistical procedures such as significance testing. We describe a significance test, an exact conditional test, that is appropriate for NLP data and can be performed using freely available software. We apply this test to the study of lexical relationships and demonstrate that the results obtained using this test are both theoretically more reliable and different from the results obtained using previously applied tests.
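
    As a generic illustration of an exact test applied to lexical co-occurrence counts, the sketch below runs Fisher's exact test on a 2×2 contingency table for a hypothetical bigram; the counts are invented, and the exact conditional test used in the paper may condition on a different statistic.

      from scipy.stats import fisher_exact

      # rows: first word of interest present / absent; columns: second word present / absent
      table = [[8, 12],        # invented counts for the bigram of interest
               [20, 9600]]     # invented background counts

      odds_ratio, p_value = fisher_exact(table, alternative="greater")
      print(f"odds ratio = {odds_ratio:.1f}, p = {p_value:.2e}")

    Exact tests of this kind avoid the large-sample approximations that break down for the rare events typical of NLP data.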

  11. HPV Testing of Head and Neck Cancer in Clinical Practice.

    PubMed

    Robinson, Max

    The pathology laboratory has a central role in providing human papillomavirus (HPV) tests for patients with head and neck cancer. There is an extensive literature around HPV testing and a large number of proprietary HPV tests, which makes the field difficult to navigate. This review provides a concise contemporary overview of the evidence around HPV testing in head and neck cancer and signposts key publications, guideline documents and the most commonly used methods in clinical practice.

  12. Electric and hybrid vehicles

    NASA Technical Reports Server (NTRS)

    1979-01-01

    Report characterizes state-of-the-art electric and hybrid (combined electric and heat engine) vehicles. Performance data for representative number of these vehicles were obtained from track and dynamometer tests. User experience information was obtained from fleet operators and individual owners of electric vehicles. Data on performance and physical characteristics of large number of vehicles were obtained from manufacturers and available literature.

  13. Testing Peer Effects among College Students: Evidence from an Unusual Admission Policy Change in China

    ERIC Educational Resources Information Center

    Lu, Fangwen

    2014-01-01

    This paper studies a natural experiment due to an unusual change in the college admission policy at a Chinese university, which brought a large number of low-score students into several academic departments in the university. Exploiting large variations in peer characteristics and strong interactions among peer groups, the analysis finds that…

  14. Episodic, generalized, and semantic memory tests: switching and strength effects.

    PubMed

    Humphreys, Michael S; Murray, Krista L

    2011-09-01

    We continue the process of investigating the probabilistic paired associate paradigm in an effort to understand the memory access control processes involved and to determine whether the memory structure produced is in transition between episodic and semantic memory. In this paradigm two targets are probabilistically paired with a cue across a large number of short lists. Participants can recall the target paired with the cue in the most recent list (list specific test), produce the first of the two targets that have been paired with that cue to come to mind (generalised test), and produce a free association response (semantic test). Switching between a generalised test and a list specific test did not produce a switching cost indicating a general similarity in the control processes involved. In addition, there was evidence for a dissociation between two different strength manipulations (amount of study time and number of cue-target pairings) such that number of pairings influenced the list specific, generalised and the semantic test but amount of study time only influenced the list specific and generalised test. © 2011 Canadian Psychological Association

  15. Application of identifying transmission spheres for spherical surface testing

    NASA Astrophysics Data System (ADS)

    Han, Christopher B.; Ye, Xin; Li, Xueyuan; Wang, Quanzhao; Tang, Shouhong; Han, Sen

    2017-06-01

    We developed a new application on Microsoft Foundation Classes (MFC) to identify the correct transmission spheres (TS) for Spherical Surface Testing (SST). Spherical surfaces are important optical surfaces, and their wide application and high production rate necessitate an accurate and highly reliable measuring device. A Fizeau interferometer is an appropriate tool for SST due to its subnanometer accuracy. It measures the contour of a spherical surface using a common path, which is insensitive to the surrounding environment. The Fizeau interferometer transmits a wide laser beam, creating interference fringes from the re-converging light from the transmission sphere and the test surface. To make a successful measurement, the application calculates and determines the appropriate transmission sphere for the test surface. Three main inputs from the test surface are used to determine the optimal sizes and F-numbers of the transmission spheres: (1) the curvature (concave or convex), (2) the radius of curvature (ROC), and (3) the aperture size. The application first calculates the F-number of the test surface (i.e. ROC divided by aperture), second determines the correct aperture size for a convex surface, third verifies that the ROC of the test surface is shorter than the ROC of the transmission sphere's reference surface, and lastly calculates the percentage of the test surface area that will be measured. The number of interferometers and transmission spheres should also be optimized when measuring large spherical surfaces, to avoid requiring a large set of interferometers and transmission spheres for each test surface. Current measuring practices involve tedious and potentially inaccurate calculations. This application eliminates human calculation errors, optimizes the selection of transmission spheres (including the least number required) and interferometer sizes, and increases efficiency.
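
    A minimal sketch of the selection arithmetic described above follows, using a hypothetical transmission sphere catalogue. The f-number computation and the ROC check follow the abstract; the "slowest sphere that still covers the full aperture" heuristic and the measured-area estimate are simplifying assumptions, not the application's exact logic.

      def f_number(roc_mm, aperture_mm):
          # f-number of the test part: radius of curvature divided by aperture
          return abs(roc_mm) / aperture_mm

      def pick_transmission_sphere(test_roc_mm, test_aperture_mm, catalogue):
          """catalogue: list of (name, ts_f_number, ts_reference_roc_mm) tuples (hypothetical)."""
          test_f = f_number(test_roc_mm, test_aperture_mm)
          # ROC check from the abstract: the part's ROC must be shorter than the TS reference ROC
          usable = [c for c in catalogue if abs(test_roc_mm) < abs(c[2])]
          if not usable:
              return None
          full = [c for c in usable if c[1] <= test_f]  # fast enough to fill the part's aperture
          if full:
              # assumed heuristic: the slowest sphere that still gives full coverage wastes the least beam
              name, ts_f, _ = max(full, key=lambda c: c[1])
              return name, 100.0
          # otherwise accept partial coverage with the fastest sphere that passes the ROC check
          name, ts_f, _ = min(usable, key=lambda c: c[1])
          return name, round(100.0 * (test_f / ts_f) ** 2, 1)  # rough measured-area fraction

      catalogue = [("TS f/1.5", 1.5, 300.0), ("TS f/3.3", 3.3, 650.0), ("TS f/7.1", 7.1, 1200.0)]
      print(pick_transmission_sphere(test_roc_mm=200.0, test_aperture_mm=50.0, catalogue=catalogue))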

  16. Automated design of paralogue ratio test assays for the accurate and rapid typing of copy number variation

    PubMed Central

    Veal, Colin D.; Xu, Hang; Reekie, Katherine; Free, Robert; Hardwick, Robert J.; McVey, David; Brookes, Anthony J.; Hollox, Edward J.; Talbot, Christopher J.

    2013-01-01

    Motivation: Genomic copy number variation (CNV) can influence susceptibility to common diseases. High-throughput measurement of gene copy number on large numbers of samples is a challenging, yet critical, stage in confirming observations from sequencing or array Comparative Genome Hybridization (CGH). The paralogue ratio test (PRT) is a simple, cost-effective method of accurately determining copy number by quantifying the amplification ratio between a target and reference amplicon. PRT has been successfully applied to several studies analyzing common CNV. However, its use has not been widespread because of difficulties in assay design. Results: We present PRTPrimer (www.prtprimer.org) software for automated PRT assay design. In addition to stand-alone software, the web site includes a database of pre-designed assays for the human genome at an average spacing of 6 kb and a web interface for custom assay design. Other reference genomes can also be analyzed through local installation of the software. The usefulness of PRTPrimer was tested within known CNV, and showed reproducible quantification. This software and database provide assays that can rapidly genotype CNV, cost-effectively, on a large number of samples and will enable the widespread adoption of PRT. Availability: PRTPrimer is available in two forms: a Perl script (version 5.14 and higher) that can be run from the command line on Linux systems and as a service on the PRTPrimer web site (www.prtprimer.org). Contact: cjt14@le.ac.uk Supplementary Information: Supplementary data are available at Bioinformatics online. PMID:23742985

  17. A Correction for Recruitment Bias in Norms Derived from Meta-Analysis

    ERIC Educational Resources Information Center

    Williams, J. Michael; Cottle, Cindy C.

    2011-01-01

    Normative comparisons are an integral component of neuropsychological test interpretation and provide the basis for an inference of abnormal function and impairment. In order to remedy a deficit of normative standards for a large number of neuropsychology tests, Mitrushina, Boone, Razani, and D'Elia (2005) used the meta-analysis of studies that…

  18. Pay for Grades

    ERIC Educational Resources Information Center

    Johnston, Howard

    2008-01-01

    The practice of paying students to earn good grades either in class or on standardized achievement tests has touched off a storm of controversy. Praised by some educators as a way of linking economic rewards to school performance, it is being tested in a number of large cities, such as New York, Baltimore and Chicago, as well as some smaller…

  19. Using Internet-Based Language Testing Capacity to the Private Sector

    ERIC Educational Resources Information Center

    Garcia Laborda, Jesus

    2009-01-01

    Language testing has a large number of commercial applications in both the institutional and the private sectors. Some jobs in the health services sector or the public services sector require foreign language skills and these skills require continuous and efficient language assessments. Based on an experience developed through the cooperation of…

  20. Using the Larval Zebrafish Locomotor Assay in Functional Neurotoxicity Screening: Light Brightness and the Order of Stimulus Presentation Affect the Outcome

    EPA Science Inventory

    We are evaluating methods to screen/prioritize large numbers of chemicals using 6 day old zebrafish (Danio rerio) as an alternative model for detecting neurotoxic effects. Our behavioral testing paradigm simultaneously tests individual larval zebrafish under sequential light and...

  1. Selected Test Items in American History. Bulletin Number 6, Fifth Edition.

    ERIC Educational Resources Information Center

    Anderson, Howard R.; Lindquist, E. F.

    Designed for high school students, this bulletin provides an extensive file of 1,062 multiple-choice questions in American history. Taken largely from the Iowa Every-Pupil Program and the Cooperative Test Service standardized examinations, the questions are chronologically divided into 16 topic areas. They include exploration and discovery;…

  2. Larval Behavioral Toxicity Screening: Light Intensity and the Order of Stimulus Presentation Affect the Outcome

    EPA Science Inventory

    The U.S. Environmental Protection Agency is screening large numbers of chemicals using 6 day old zebrafish (Danio rerio). We use a behavioral testing paradigm that simultaneously tests individual zebrafish under both light and dark conditions in a 96-well plate using a video tr...

  3. A study of airplane engine tests

    NASA Technical Reports Server (NTRS)

    Gage, Victor R

    1920-01-01

    This report is a study of the results obtained from a large number of tests of a Hispano-Suiza airplane engine in the altitude laboratory of the Bureau of Standards. It was originally undertaken to determine the heat distribution in such an engine, but many other factors are also considered as bearing on this matter.

  4. Initial Development and Validation of the BullyHARM: The Bullying, Harassment, and Aggression Receipt Measure

    ERIC Educational Resources Information Center

    Hall, William J.

    2016-01-01

    This article describes the development and preliminary validation of the Bullying, Harassment, and Aggression Receipt Measure (BullyHARM). The development of the BullyHARM involved a number of steps and methods, including a literature review, expert review, cognitive testing, readability testing, data collection from a large sample, reliability…

  5. Testing of transition-region models: Test cases and data

    NASA Technical Reports Server (NTRS)

    Singer, Bart A.; Dinavahi, Surya; Iyer, Venkit

    1991-01-01

    Mean flow quantities in the laminar turbulent transition region and in the fully turbulent region are predicted with different models incorporated into a 3-D boundary layer code. The predicted quantities are compared with experimental data for a large number of different flows and the suitability of the models for each flow is evaluated.

  6. Using the Larval Zebrafish Locomotor Assay in Functional Neurotoxicity Screening: Light Intensity and the Order of Stimulus Presentation Affect the Outcome

    EPA Science Inventory

    The U.S. Environmental Protection Agency is evaluating methods to screen and prioritize large numbers of chemicals using 6 day old zebrafish (Danio rerio) as an alternative test model for detecting neurotoxic chemicals. We use a behavioral testing paradigm that simultaneously tes...

  7. Testing for Additivity in Chemical Mixtures Using a Fixed-Ratio Ray Design and Statistical Equivalence Testing Methods

    EPA Science Inventory

    Fixed-ratio ray designs have been used for detecting and characterizing interactions of large numbers of chemicals in combination. Single chemical dose-response data are used to predict an “additivity curve” along an environmentally relevant ray. A “mixture curve” is estimated fr...

  8. A performance comparison of two small rocket nozzles

    NASA Technical Reports Server (NTRS)

    Arrington, Lynn A.; Reed, Brian D.; Rivera, Angel, Jr.

    1996-01-01

    An experimental study was conducted on two small rockets (110 N thrust class) to directly compare a standard conical nozzle with a bell nozzle optimized for maximum thrust using the Rao method. In large rockets, with throat Reynolds numbers of greater than 1 x 10(exp 5), bell nozzles outperform conical nozzles. In rockets with throat Reynolds numbers below 1 x 10(exp 5), however, test results have been ambiguous. An experimental program was conducted to test two small nozzles at two different fuel film cooling percentages and three different chamber pressures. Test results showed that for the throat Reynolds number range from 2 x 10(exp 4) to 4 x 10(exp 4), the bell nozzle outperformed the conical nozzle. Thrust coefficients for the bell nozzle were approximately 4 to 12 percent higher than those obtained with the conical nozzle. As expected, testing showed that lowering the fuel film cooling increased performance for both nozzle types.

  9. Testing and checkout experiences in the National Transonic Facility since becoming operational

    NASA Technical Reports Server (NTRS)

    Bruce, W. E., Jr.; Gloss, B. B.; Mckinney, L. W.

    1988-01-01

    The U.S. National Transonic Facility, constructed by NASA to meet the national needs for High Reynolds Number Testing, has been operational in a checkout and test mode since the operational readiness review (ORR) in late 1984. During this time, there have been problems centered around the effect of large temperature excursions on the mechanical movement of large components, the reliable performance of instrumentation systems, and an unexpected moisture problem with dry insulation. The more significant efforts since the ORR are reviewed and NTF status concerning hardware, instrumentation and process controls systems, operating constraints imposed by the cryogenic environment, and data quality and process controls is summarized.

  10. Wind-turbine-performance assessment

    NASA Astrophysics Data System (ADS)

    Vachon, W. A.

    1982-06-01

    An updated summary of recent test data and experiences is reported from both federally and privately funded large wind turbine (WT) development and test programs, and from key WT programs in Europe. Progress and experiences on the cluster of three MOD-2 2.5-MW WT's, the MOD-1 2-MW WT, and other WT installations are described. An examination of recent test experiences and plans from approximately five privately funded large WT programs in the United States indicates that, during machine checkout and startup, a number of technical problems are identified which will require design changes and create program delays.

  11. Automatic control of cryogenic wind tunnels

    NASA Technical Reports Server (NTRS)

    Balakrishna, S.

    1989-01-01

    Inadequate Reynolds number similarity in testing of scaled models affects the quality of aerodynamic data from wind tunnels. This is due to scale effects of boundary-layer shock wave interaction which is likely to be severe at transonic speeds. The idea of operation of wind tunnels using test gas cooled to cryogenic temperatures has yielded a quantum jump in the ability to realize full scale Reynolds number flow similarity in small transonic tunnels. In such tunnels, the basic flow control problem consists of obtaining and maintaining the desired test section flow parameters. Mach number, Reynolds number, and dynamic pressure are the three flow parameters that are usually required to be kept constant during the period of model aerodynamic data acquisition. The series of activities involved in modeling, control law development, mechanization of the control laws on a microcomputer, and the performance of a globally stable automatic control system for the 0.3-m Transonic Cryogenic Tunnel (TCT) is discussed. A lumped multi-variable nonlinear dynamic model of the cryogenic tunnel, generation of a set of linear control laws for small perturbation, and nonlinear control strategy for large set point changes including tunnel trajectory control are described. The details of mechanization of the control laws on a 16 bit microcomputer system, the software features, operator interface, the display and safety are discussed. The controller is shown to provide globally stable and reliable temperature control to + or - 0.2 K, pressure to + or - 0.07 psi and Mach number to + or - 0.002 of the set point value. This performance is obtained both during large set point commands as for a tunnel cooldown, and during aerodynamic data acquisition with intrusive activity like geometrical changes in the test section such as angle of attack changes, drag rake movements, wall adaptation and sidewall boundary-layer removal. Feasibility of the use of an automatic Reynolds number control mode with fixed Mach number control is demonstrated.
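
    To make the control task concrete, the sketch below shows the simplest possible setpoint-regulation structure: three independent PI loops holding temperature, pressure and Mach number. The pairing of actuators to variables (LN2 injection for temperature, vent valve for pressure, fan speed for Mach number) and all gain values are illustrative assumptions; the actual NASA controller described above uses a coupled nonlinear multivariable model rather than decoupled loops.

    ```python
    class PILoop:
        """Minimal proportional-integral regulator for one flow parameter."""
        def __init__(self, kp, ki, dt):
            self.kp, self.ki, self.dt, self.integral = kp, ki, dt, 0.0

        def update(self, setpoint, measurement):
            error = setpoint - measurement
            self.integral += error * self.dt
            return self.kp * error + self.ki * self.integral

    # One loop per controlled flow parameter, executed every control cycle (gains invented).
    temperature_loop = PILoop(kp=0.8, ki=0.05, dt=0.1)   # drives LN2 injection valve
    pressure_loop    = PILoop(kp=0.5, ki=0.02, dt=0.1)   # drives vent/pressure valve
    mach_loop        = PILoop(kp=2.0, ki=0.10, dt=0.1)   # drives fan drive speed

    ln2_cmd  = temperature_loop.update(setpoint=110.0, measurement=110.3)  # kelvin
    vent_cmd = pressure_loop.update(setpoint=30.0, measurement=29.9)       # psia
    fan_cmd  = mach_loop.update(setpoint=0.800, measurement=0.795)
    print(ln2_cmd, vent_cmd, fan_cmd)
    ```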

  12. Stagnation Region Heat Transfer Augmentation at Very High Turbulence Levels

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ames, Forrest; Kingery, Joseph E.

    A database for stagnation region heat transfer has been extended to include heat transfer measurements acquired downstream from a new high intensity turbulence generator. This work was motivated by gas turbine industry heat transfer designers who deal with heat transfer environments with increasing Reynolds numbers and very high turbulence levels. The new mock aero-combustor turbulence generator produces turbulence levels which average 17.4%, which is 37% higher than the older turbulence generator. The increased level of turbulence is caused by the reduced contraction ratio from the liner to the exit. Heat transfer measurements were acquired on two large cylindrical leading edge test surfaces having a four to one range in leading edge diameter (40.64 cm and 10.16 cm). Gandvarapu and Ames [1] previously acquired heat transfer measurements for six turbulence conditions including three grid conditions, two lower turbulence aero-combustor conditions, and a low turbulence condition. The data are documented and tabulated for an eight to one range in Reynolds numbers for each test surface with Reynolds numbers ranging from 62,500 to 500,000 for the large leading edge and 15,625 to 125,000 for the smaller leading edge. The data show augmentation levels of up to 136% in the stagnation region for the large leading edge. This heat transfer rate is an increase over the previous aero-combustor turbulence generator which had augmentation levels up to 110%. Note, the rate of increase in heat transfer augmentation decreases for the large cylindrical leading edge inferring only a limited level of turbulence intensification in the stagnation region. The smaller cylindrical leading edge shows more consistency with earlier stagnation region heat transfer results correlated on the TRL (Turbulence, Reynolds number, Length scale) parameter. The downstream regions of both test surfaces continue to accelerate the flow but at a much lower rate than the leading edge. Bypass transition occurs in these regions providing a useful set of data to ground the prediction of transition onset and length over a wide range of Reynolds numbers and turbulence intensity and scales.

  13. Investigation of flow fields within large scale hypersonic inlet models

    NASA Technical Reports Server (NTRS)

    Gnos, A. V.; Watson, E. C.; Seebaugh, W. R.; Sanator, R. J.; Decarlo, J. P.

    1973-01-01

    Analytical and experimental investigations were conducted to determine the internal flow characteristics in model passages representative of hypersonic inlets for use at Mach numbers to about 12. The passages were large enough to permit measurements to be made in both the core flow and boundary layers. The analytical techniques for designing the internal contours and predicting the internal flow-field development accounted for coupling between the boundary layers and inviscid flow fields by means of a displacement-thickness correction. Three large-scale inlet models, each having a different internal compression ratio, were designed to provide high internal performance with an approximately uniform static-pressure distribution at the throat station. The models were tested in the Ames 3.5-Foot Hypersonic Wind Tunnel at a nominal free-stream Mach number of 7.4 and a unit free-stream Reynolds number of 8.86 X one million per meter.

  14. Large increase in opportunistic testing for chlamydia during a pilot project in a primary health organisation.

    PubMed

    Azariah, Sunita; McKernon, Stephen; Werder, Suzanne

    2013-06-01

    The Auckland chlamydia pilot project was one of three funded by the Ministry of Health to trial implementation of the 2008 Chlamydia Management Guidelines. Chlamydia is the most commonly notified sexually transmitted infection in New Zealand. To increase opportunistic testing in under-25-year-olds and to improve documentation of partner notification in primary care. A four-month pilot was initiated in Total Healthcare Otara using a nurse-led approach. Laboratory testing data was analysed to assess whether the pilot had any impact on chlamydia testing volumes in the target age-group. Data entered in the practice management system was used to assess follow-up and management of chlamydia cases. During the pilot there was a 300% increase in the number of chlamydia tests in the target age group from 812 to 2410 and the number of male tests increased by nearly 500%. Twenty-four percent of people tested were positive for chlamydia, with no significant difference in prevalence by ethnicity. The pilot resulted in better documentation of patient follow-up in the patient management system. There was a large increase in chlamydia testing during the pilot with a high prevalence found in the population tested. Chlamydia remains an important health problem in New Zealand. The cost benefit of increased chlamydia screening at a population level has yet to be established.

  15. Using infective mosquitoes to challenge monkeys with Plasmodium knowlesi in malaria vaccine studies.

    PubMed

    Murphy, Jittawadee R; Weiss, Walter R; Fryauff, David; Dowler, Megan; Savransky, Tatyana; Stoyanov, Cristina; Muratova, Olga; Lambert, Lynn; Orr-Gonzalez, Sachy; Zeleski, Katie Lynn; Hinderer, Jessica; Fay, Michael P; Joshi, Gyan; Gwadz, Robert W; Richie, Thomas L; Villasante, Eileen Franke; Richardson, Jason H; Duffy, Patrick E; Chen, Jingyang

    2014-06-03

    When rhesus monkeys (Macaca mulatta) are used to test malaria vaccines, animals are often challenged by the intravenous injection of sporozoites. However, natural exposure to malaria comes via mosquito bite, and antibodies can neutralize sporozoites as they traverse the skin. Thus, intravenous injection may not fairly assess humoral immunity from anti-sporozoite malaria vaccines. To better assess malaria vaccines in rhesus, a method to challenge large numbers of monkeys by mosquito bite was developed. Several species and strains of mosquitoes were tested for their ability to produce Plasmodium knowlesi sporozoites. Donor monkey parasitaemia effects on oocyst and sporozoite numbers and mosquito mortality were documented. Methylparaben added to mosquito feed was tested to improve mosquito survival. To determine the number of bites needed to infect a monkey, animals were exposed to various numbers of P. knowlesi-infected mosquitoes. Finally, P. knowlesi-infected mosquitoes were used to challenge 17 monkeys in a malaria vaccine trial, and the effect of number of infectious bites on monkey parasitaemia was documented. Anopheles dirus, Anopheles crascens, and Anopheles dirus X (a cross between the two species) produced large numbers of P. knowlesi sporozoites. Mosquito survival to day 14, when sporozoites fill the salivary glands, averaged only 32% when donor monkeys had a parasitaemia above 2%. However, when donor monkey parasitaemia was below 2%, mosquitoes survived twice as well and contained ample sporozoites in their salivary glands. Adding methylparaben to sugar solutions did not improve survival of infected mosquitoes. Plasmodium knowlesi was very infectious, with all monkeys developing blood stage infections if one or more infected mosquitoes successfully fed. There was also a dose-response, with monkeys that received higher numbers of infected mosquito bites developing malaria sooner. Anopheles dirus, An. crascens and a cross between these two species all were excellent vectors for P. knowlesi. High donor monkey parasitaemia was associated with poor mosquito survival. A single infected mosquito bite is likely sufficient to infect a monkey with P. knowlesi. It is possible to efficiently challenge large groups of monkeys by mosquito bite, which will be useful for P. knowlesi vaccine studies.

  16. Three-Dimensional Flow Field Measurements in a Transonic Turbine Cascade

    NASA Technical Reports Server (NTRS)

    Giel, P. W.; Thurman, D. R.; Lopez, I.; Boyle, R. J.; VanFossen, G. J.; Jett, T. A.; Camperchioli, W. P.; La, H.

    1996-01-01

    Three-dimensional flow field measurements are presented for a large scale transonic turbine blade cascade. Flow field total pressures and pitch and yaw flow angles were measured at an inlet Reynolds number of 1.0 x 10(exp 6) and at an isentropic exit Mach number of 1.3 in a low turbulence environment. Flow field data was obtained on five pitchwise/spanwise measurement planes, two upstream and three downstream of the cascade, each covering three blade pitches. Three-hole boundary layer probes and five-hole pitch/yaw probes were used to obtain data at over 1200 locations in each of the measurement planes. Blade and endwall static pressures were also measured at an inlet Reynolds number of 0.5 x 10(exp 6) and at an isentropic exit Mach number of 1.0. Tests were conducted in a linear cascade at the NASA Lewis Transonic Turbine Blade Cascade Facility. The test article was a turbine rotor with 136 deg of turning and an axial chord of 12.7 cm. The flow field in the cascade is highly three-dimensional as a result of thick boundary layers at the test section inlet and because of the high degree of flow turning. The large scale allowed for very detailed measurements of both flow field and surface phenomena. The intent of the work is to provide benchmark quality data for CFD code and model verification.

  17. Testing the QCD string at large Nc from the thermodynamics of the hadronic phase

    NASA Astrophysics Data System (ADS)

    Cohen, Thomas D.

    2007-02-01

    It is generally believed that in the limit of a large number of colors (Nc) the description of confinement via flux tubes becomes valid and QCD can be modeled accurately via a hadronic string theory—at least for highly excited states. QCD at large Nc also has a well-defined deconfinement transition at a temperature Tc. In this talk it is shown how the thermodynamics of the metastable hadronic phase of QCD (above Tc) at large Nc can be related directly to properties of the effective QCD string. The key points in the derivation are the weakly interacting nature of hadrons at large Nc and the existence of a Hagedorn temperature TH for the effective string theory. From this it can be seen that at large Nc and near TH, the energy density and pressure of the hadronic phase scale as E ~ (TH - T)^(-(D⊥-6)/2) (for D⊥ < 6) and P ~ (TH - T)^(-(D⊥-4)/2) (for D⊥ < 4), where D⊥ is the effective number of transverse dimensions of the string theory. This behavior for D⊥ < 6 is qualitatively different from typical models in statistical mechanics and, if observed on the lattice, would provide a direct test of the stringy nature of large Nc QCD. However, since TH > Tc, this behavior is of relevance only to the metastable phase. The prospect of using this result to extract D⊥ via lattice simulations of the metastable hadronic phase at moderately large Nc is discussed.

  18. At Birth, Humans Associate "Few" with Left and "Many" with Right.

    PubMed

    de Hevia, Maria Dolores; Veggiotti, Ludovica; Streri, Arlette; Bonn, Cory D

    2017-12-18

    Humans use spatial representations to structure abstract concepts [1]. One of the most well-known examples is the "mental number line"-the propensity to imagine numbers oriented in space [2, 3]. Human infants [4, 5], children [6, 7], adults [8], and nonhuman animals [9, 10] associate small numbers with the left side of space and large numbers with the right. In humans, cultural artifacts, such as the direction of reading and writing, modulate the directionality of this representation, with right-to-left reading cultures associating small numbers with right and large numbers with left [11], whereas the opposite association permeates left-to-right reading cultures [8]. Number-space mapping plays a central role in human mathematical concepts [12], but its origins remain unclear: is it the result of an innate bias or does it develop after birth? Infant humans are passively exposed to a spatially coded environment, so experience and culture could underlie the mental number line. To rule out this possibility, we tested neonates' responses to small or large auditory quantities paired with geometric figures presented on either the left or right sides of the screen. We show that 0- to 3-day-old neonates associate a small quantity with the left and a large quantity with the right when the multidimensional stimulus contains discrete numerical information, providing evidence that representations of number are associated to an oriented space at the start of postnatal life, prior to experience with language, culture, or with culture-specific biases. Copyright © 2017 Elsevier Ltd. All rights reserved.

  19. High Reynolds Number Investigation of a Flush-Mounted, S-Duct Inlet With Large Amounts of Boundary Layer Ingestion

    NASA Technical Reports Server (NTRS)

    Berrier, Bobby L.; Carter, Melissa B.; Allan, Brian G.

    2005-01-01

    An experimental investigation of a flush-mounted, S-duct inlet with large amounts of boundary layer ingestion has been conducted at Reynolds numbers up to full scale. The study was conducted in the NASA Langley Research Center 0.3-Meter Transonic Cryogenic Tunnel. In addition, a supplemental computational study on one of the inlet configurations was conducted using the Navier-Stokes flow solver, OVERFLOW. Tests were conducted at Mach numbers from 0.25 to 0.83, Reynolds numbers (based on aerodynamic interface plane diameter) from 5.1 million to 13.9 million (full-scale value), and inlet mass-flow ratios from 0.29 to 1.22, depending on Mach number. Results of the study indicated that increasing Mach number, increasing boundary layer thickness (relative to inlet height) or ingesting a boundary layer with a distorted profile decreased inlet performance. At Mach numbers above 0.4, increasing inlet airflow increased inlet pressure recovery but also increased distortion. Finally, inlet distortion was found to be relatively insensitive to Reynolds number, but pressure recovery increased slightly with increasing Reynolds number.

  20. Wind-tunnel investigation of aerodynamic loading on a 0.237-scale model of a remotely piloted research vehicle with a thick, high-aspect-ratio supercritical wing

    NASA Technical Reports Server (NTRS)

    Byrdsong, T. A.; Brooks, C. W., Jr.

    1983-01-01

    Wind-tunnel measurements were made of the wing-surface static-pressure distributions on a 0.237 scale model of a remotely piloted research vehicle equipped with a thick, high-aspect-ratio supercritical wing. Data are presented for two model configurations (with and without a ventral pod) at Mach numbers from 0.70 to 0.92 at angles of attack from -4 deg to 8 deg. Large variations of wing-surface local pressure distributions were developed; however, the characteristic supercritical-wing pressure distribution occurred near the design condition of 0.80 Mach number and 2 deg angle of attack. The significant variations of the local pressure distributions indicated pronounced shock-wave movements that were highly sensitive to angle of attack and Mach number. The effect of the ventral pod varied with test conditions; however, at the higher Mach numbers, the effects on wing flow characteristics were significant at semispan stations as far outboard as 0.815. There were large variations of the wing loading in the range of test conditions; both model configurations exhibited a well-defined peak value of normal-force coefficient at the cruise angle of attack (2 deg) and Mach number (0.80).

  1. Side forces on forebodies at high angles of attack and Mach numbers from 0.1 to 0.7: two tangent ogives, paraboloid and cone

    NASA Technical Reports Server (NTRS)

    Keener, E. R.; Chapman, G. T.; Taleghani, J.; Cohen, L.

    1977-01-01

    An experimental investigation was conducted in the Ames 12-Foot Wind Tunnel to determine the subsonic aerodynamic characteristics of four forebodies at high angles of attack. The forebodies tested were a tangent ogive with fineness ratio of 5, a paraboloid with fineness ratio of 3.5, a 20 deg cone, and a tangent ogive with an elliptic cross section. The investigation included the effects of nose bluntness and boundary-layer trips. The tangent-ogive forebody was also tested in the presence of a short afterbody and with the afterbody attached. Static longitudinal and lateral/directional stability data were obtained. The investigation was conducted to examine the existence of large side forces and yawing moments at high angles of attack and zero sideslip. It was found that all of the forebodies experience steady side forces that start at angles of attack from 20 deg to 35 deg and exist to as high as 80 deg, depending on forebody shape. The side force is as large as 1.6 times the normal force and is generally repeatable with increasing and decreasing angle of attack and, also, from test to test. The side force is very sensitive to the nature of the boundary layer, as indicated by large changes with boundary-layer trips. The maximum side force varies considerably with Reynolds number and tends to decrease with increasing Mach number. The direction of the side force is sensitive to the body geometry near the nose. The angle of attack of onset of side force is not strongly influenced by Reynolds number or Mach number but varies with forebody shape. Maximum normal force often occurs at angles of attack near 60 deg. The effect of the elliptic cross section is to reduce the angle of onset by about 10 deg compared to that of an equivalent circular forebody with the same fineness ratio. The short afterbody reduces the angle of onset by about 5 deg.

  2. Earthquake number forecasts testing

    NASA Astrophysics Data System (ADS)

    Kagan, Yan Y.

    2017-10-01

    We study the distributions of earthquake numbers in two global earthquake catalogues: Global Centroid-Moment Tensor and Preliminary Determinations of Epicenters. The properties of these distributions are especially required to develop the number test for our forecasts of future seismic activity rate, tested by the Collaboratory for Study of Earthquake Predictability (CSEP). A common assumption, as used in the CSEP tests, is that the numbers are described by the Poisson distribution. It is clear, however, that the Poisson assumption for the earthquake number distribution is incorrect, especially for the catalogues with a lower magnitude threshold. In contrast to the one-parameter Poisson distribution so widely used to describe earthquake occurrences, the negative-binomial distribution (NBD) has two parameters. The second parameter can be used to characterize the clustering or overdispersion of a process. We also introduce and study a more complex three-parameter beta negative-binomial distribution. We investigate the dependence of parameters for both Poisson and NBD distributions on the catalogue magnitude threshold and on temporal subdivision of catalogue duration. First, we study whether the Poisson law can be statistically rejected for various catalogue subdivisions. We find that for most cases of interest, the Poisson distribution can be shown to be rejected statistically at a high significance level in favour of the NBD. Thereafter, we investigate whether these distributions fit the observed distributions of seismicity. For this purpose, we study upper statistical moments of earthquake numbers (skewness and kurtosis) and compare them to the theoretical values for both distributions. Empirical values for the skewness and the kurtosis increase for the smaller magnitude threshold and increase with even greater intensity for small temporal subdivision of catalogues. The Poisson distribution for large rate values approaches the Gaussian law, therefore its skewness and kurtosis both tend to zero for large earthquake rates: for the Gaussian law, these values are identically zero. A calculation of the NBD skewness and kurtosis levels based on the values of the first two statistical moments of the distribution, shows rapid increase of these upper moments levels. However, the observed catalogue values of skewness and kurtosis are rising even faster. This means that for small time intervals, the earthquake number distribution is even more heavy-tailed than the NBD predicts. Therefore for small time intervals, we propose using empirical number distributions appropriately smoothed for testing forecasted earthquake numbers.
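
    The moment comparison above is easy to reproduce numerically. The sketch below, a minimal illustration with invented parameter values, shows that for a fixed mean rate the Poisson skewness and excess kurtosis shrink toward the Gaussian values (zero), while a negative-binomial distribution with the same mean keeps them large when its dispersion parameter is small.

    ```python
    # Compare Poisson and negative-binomial skewness/kurtosis at the same mean rate.
    from scipy import stats

    mean_rate = 50.0                              # expected earthquake count per interval (illustrative)
    for r in (2.0, 10.0, 100.0):                  # NBD dispersion ("size") parameter
        p = r / (r + mean_rate)                   # scipy nbinom(n, p) has mean n*(1-p)/p = mean_rate
        nb_skew, nb_kurt = stats.nbinom.stats(r, p, moments="sk")
        print(f"NBD  r={r:6.1f}: skew={float(nb_skew):.3f}  excess kurtosis={float(nb_kurt):.3f}")

    po_skew, po_kurt = stats.poisson.stats(mean_rate, moments="sk")
    print(f"Poisson mu={mean_rate}: skew={float(po_skew):.3f}  excess kurtosis={float(po_kurt):.3f}")
    ```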

  3. The Effect of Number of Ability Intervals on the Stability of Item Bias Detection.

    ERIC Educational Resources Information Center

    Loyd, Brenda

    The chi-square procedure has been suggested as a viable index of test bias because it provides the best agreement with the three parameter item characteristic curve without the large sample requirement, computer complexity, and cost. This study examines the effect of using different numbers of ability intervals on the reliability of chi-square…

  4. Trapping spotted wing drosophila, Drosophila suzukii (Matsumura)(Diptera: Drosophilidae) with combinations of vinegar and wine, and acetic acid and ethanol

    USDA-ARS?s Scientific Manuscript database

    Recommendations for monitoring spotted wing drosophila (SWD) Drosophila suzukii, (Matsumura) are to use either vinegar or wine as a bait for traps. Traps baited with vinegar and traps baited with wine, in field tests in northern Oregon, captured large numbers of male and female SWD flies. Numbers of...

  5. Problems of Applying Communication/Behavior Theories to a Program of Smoking Reduction.

    ERIC Educational Resources Information Center

    Becker, Samuel L.; And Others

    Because the use of tobacco contributes to a large number of deaths each year in the United States, a current research project at the University of Iowa tests the application of a number of theoretical ideas--including social bonding, diffusion, and the spiral of silence--and attempts to develop new ideas in an effort to reduce smoking. The…

  6. Novel Processing of Boron Carbide (B4C): Plasma Synthesized Nano Powders and Pressureless Sintering Forming of Complex Shapes

    DTIC Science & Technology

    2008-12-01

    Commercial B4C shows a large number of lenticular graphitic inclusions, which act as crack initiation points in flexure testing (Figure 5: SEM micrograph of lenticular graphitic inclusions in commercial B4C; Figure 4: B4C plates formed via hot pressing with a curved shape).

  7. Use of Item Models in a Large-Scale Admissions Test: A Case Study

    ERIC Educational Resources Information Center

    Sinharay, Sandip; Johnson, Matthew S.

    2008-01-01

    "Item models" (LaDuca, Staples, Templeton, & Holzman, 1986) are classes from which it is possible to generate items that are equivalent/isomorphic to other items from the same model (e.g., Bejar, 1996, 2002). They have the potential to produce large numbers of high-quality items at reduced cost. This article introduces data from an…

  8. Comparing Human and Automated Essay Scoring for Prospective Graduate Students with Learning Disabilities and/or ADHD

    ERIC Educational Resources Information Center

    Buzick, Heather; Oliveri, Maria Elena; Attali, Yigal; Flor, Michael

    2016-01-01

    Automated essay scoring is a developing technology that can provide efficient scoring of large numbers of written responses. Its use in higher education admissions testing provides an opportunity to collect validity and fairness evidence to support current uses and inform its emergence in other areas such as K-12 large-scale assessment. In this…

  9. Large scale landslide susceptibility assessment using the statistical methods of logistic regression and BSA - study case: the sub-basin of the small Niraj (Transylvania Depression, Romania)

    NASA Astrophysics Data System (ADS)

    Roşca, S.; Bilaşco, Ş.; Petrea, D.; Fodorean, I.; Vescan, I.; Filip, S.; Măguţ, F.-L.

    2015-11-01

    The existence of a large number of GIS models for the identification of landslide occurrence probability makes the selection of a specific one difficult. The present study focuses on the application of two quantitative models: the logistic and the BSA models. The comparative analysis of the results aims at identifying the most suitable model. The territory corresponding to the Niraj Mic Basin (87 km2) is an area characterised by a wide variety of landforms with their morphometric, morphographical and geological characteristics, as well as by a high complexity of land use types where active landslides exist. This is the reason why it represents the test area for applying the two models and for comparing the results. The large complexity of input variables is illustrated by 16 factors which were represented as 72 dummy variables, analysed on the basis of their importance within the model structures. The testing of the statistical significance corresponding to each variable reduced the number of dummy variables to 12 which were considered significant for the test area within the logistic model, whereas for the BSA model all the variables were employed. The predictive skill of the models was tested through the area under the ROC curve, which indicated a good accuracy (AUROC = 0.86 for the testing area) and predictability of the logistic model (AUROC = 0.63 for the validation area).
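
    A schematic version of the logistic workflow described in this abstract is sketched below: fit landslide presence/absence on dummy-coded predisposing factors, then score predictive skill with the area under the ROC curve. The column names and toy data are invented placeholders; only the general workflow mirrors the study.

    ```python
    import pandas as pd
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import train_test_split

    # Invented raster-cell table: two categorical factors and a landslide flag.
    cells = pd.DataFrame({
        "slope_class": ["0-5", "5-15", "15-25", "5-15", "15-25", "0-5"] * 50,
        "land_use":    ["forest", "pasture", "arable", "arable", "pasture", "forest"] * 50,
        "landslide":   [0, 0, 1, 1, 1, 0] * 50,
    })

    X = pd.get_dummies(cells[["slope_class", "land_use"]], drop_first=True)  # dummy variables
    y = cells["landslide"]
    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

    model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
    auroc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
    print(f"AUROC on held-out cells: {auroc:.2f}")
    ```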

  10. Compact Quantum Random Number Generator with Silicon Nanocrystals Light Emitting Device Coupled to a Silicon Photomultiplier

    NASA Astrophysics Data System (ADS)

    Bisadi, Zahra; Acerbi, Fabio; Fontana, Giorgio; Zorzi, Nicola; Piemonte, Claudio; Pucker, Georg; Pavesi, Lorenzo

    2018-02-01

    A small-sized photonic quantum random number generator, easy to implement in small electronic devices for secure data encryption and other applications, is in high demand nowadays. Here, we propose a compact configuration with a Silicon nanocrystals large area light emitting device (LED) coupled to a Silicon photomultiplier to generate random numbers. The random number generation methodology is based on the photon arrival time and is robust against the non-idealities of the detector and the source of quantum entropy. The raw data show high quality of randomness and pass all the statistical tests in the National Institute of Standards and Technology (NIST) suite without a post-processing algorithm. The highest bit rate is 0.5 Mbps with an efficiency of 4 bits per detected photon.
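
    One common way to turn photon arrival times into bits is to compare successive inter-arrival intervals and discard ties. The sketch below illustrates only that general idea on simulated detector clicks; it is not necessarily the exact extraction scheme used in this paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    # Simulated detector click times (Poisson process); a real device would read
    # these timestamps from the photomultiplier electronics.
    arrival_times = np.cumsum(rng.exponential(scale=1.0, size=100_000))

    intervals = np.diff(arrival_times)            # inter-arrival times
    t1, t2 = intervals[0::2], intervals[1::2]
    n = min(len(t1), len(t2))
    t1, t2 = t1[:n], t2[:n]

    mask = t1 != t2                               # discard (rare) ties
    bits = (t1 < t2)[mask].astype(np.uint8)       # one bit per interval pair
    print(f"{bits.size} bits, fraction of ones = {bits.mean():.3f}")
    ```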

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jaech, J.L.

    The use of a pooling technique in leak testing Plutonium Recycle Test Reactor fuel elements to reduce the number of tests is discussed. Since the proportion of defectives in this case is small, application of the method would suggest that the group size be large. It was suggested that additional savings might be introduced by subgrouping the originally grouped items in the event of a positive result, rather than testing them individually. An investigation was made to determine optimum subgrouping sizes. (M.C.G.)
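
    The pooling argument in this record can be illustrated with the classical two-stage (Dorfman) calculation: with a small defective rate p and a perfect test, the expected number of tests per item for pools of size k is 1/k + 1 - (1-p)^k, which is minimized by a large group size when p is small. The sketch below uses an invented defective rate purely for illustration.

    ```python
    def tests_per_item(k: int, p: float) -> float:
        """Expected tests per item for pools of size k (perfect test, independent defects)."""
        return 1.0 / k + 1.0 - (1.0 - p) ** k

    p = 0.01                                  # assumed proportion of leaking elements (illustrative)
    best_k = min(range(2, 101), key=lambda k: tests_per_item(k, p))
    print(f"best group size ~ {best_k}, "
          f"tests per item ~ {tests_per_item(best_k, p):.3f} (vs 1.0 individually)")
    ```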

  12. Wind tunnel investigation of aerodynamic characteristics of scale models of three rectangular shaped cargo containers

    NASA Technical Reports Server (NTRS)

    Laub, G. H.; Kodani, H. M.

    1972-01-01

    Wind tunnel tests were conducted on scale models of three rectangular shaped cargo containers to determine the aerodynamic characteristics of these typical externally-suspended helicopter cargo configurations. Tests were made over a large range of pitch and yaw attitudes at a nominal Reynolds number per unit length of 1.8 x one million. The aerodynamic data obtained from the tests are presented.

  13. Pull-out testing facility for geosynthetics.

    DOT National Transportation Integrated Search

    1992-11-01

    The considerable increase in using geosynthetics in soil reinforcement made it necessary to develop methods of measuring the interaction properties and modeling load transfer in reinforced-soil structures. The large number of factors that influence t...

  14. Designing Tailor-Made Academic Paths for University Language Students

    ERIC Educational Resources Information Center

    Beseghi, Micol; Bertolotti, Greta

    2013-01-01

    The Language Centre of the University of Parma is responsible for the organization and administration of foreign language tests to a large number of university students. In order to reduce the high rate of test failures, the Language Centre has recently devised a pilot programme as an alternative to more established modes of language learning,…

  15. Reference test methods for total water in lint cotton by Karl Fischer Titration and low temperature distillation

    USDA-ARS?s Scientific Manuscript database

    In a study of comparability of total water contents (%) of conditioned cottons by Karl Fischer Titration (KFT) and Low Temperature Distillation (LTD) reference methods, we demonstrated a match of averaged results based on a large number of replications and weighing the test specimens at the same tim...

  16. Breaking of scale invariance in the time dependence of correlation functions in isotropic and homogeneous turbulence

    NASA Astrophysics Data System (ADS)

    Tarpin, Malo; Canet, Léonie; Wschebor, Nicolás

    2018-05-01

    In this paper, we present theoretical results on the statistical properties of stationary, homogeneous, and isotropic turbulence in incompressible flows in three dimensions. Within the framework of the non-perturbative renormalization group, we derive a closed renormalization flow equation for a generic n-point correlation (and response) function for large wave-numbers with respect to the inverse integral scale. The closure is obtained from a controlled expansion and relies on extended symmetries of the Navier-Stokes field theory. It yields the exact leading behavior of the flow equation at large wave-numbers |p_i| and for arbitrary time differences t_i in the stationary state. Furthermore, we obtain the form of the general solution of the corresponding fixed point equation, which yields the analytical form of the leading wave-number and time dependence of n-point correlation functions, for large wave-numbers and both for small t_i and in the limit t_i → ∞. At small t_i, the leading contribution at large wave-numbers is logarithmically equivalent to -α (εL)^(2/3) |Σ_i t_i p_i|^2, where α is a non-universal constant, L is the integral scale, and ε is the mean energy injection rate. For the 2-point function, the (tp)^2 dependence is known to originate from the sweeping effect. The derived formula embodies the generalization of the effect of sweeping to n-point correlation functions. At large wave-numbers and large t_i, we show that the t_i^2 dependence in the leading order contribution crosses over to a |t_i| dependence. The expression of the correlation functions in this regime was not derived before, even for the 2-point function. Both predictions can be tested in direct numerical simulations and in experiments.

  17. Space Shuttle wind tunnel testing program

    NASA Technical Reports Server (NTRS)

    Whitnah, A. M.; Hillje, E. R.

    1984-01-01

    A major phase of the Space Shuttle Vehicle (SSV) Development Program was the acquisition of data through the space shuttle wind tunnel testing program. It became obvious that the large number of configuration/environment combinations would necessitate an extremely large wind tunnel testing program. To make the most efficient use of available test facilities and to assist the prime contractor for orbiter design and space shuttle vehicle integration, a unique management plan was devised for the design and development phase. The space shuttle program is reviewed together with the evolutional development of the shuttle configuration. The wind tunnel testing rationale and the associated test program management plan and its overall results is reviewed. Information is given for the various facilities and models used within this program. A unique posttest documentation procedure and a summary of the types of test per disciplines, per facility, and per model are presented with detailed listing of the posttest documentation.

  18. Hierarchical group testing for multiple infections.

    PubMed

    Hou, Peijie; Tebbs, Joshua M; Bilder, Christopher R; McMahan, Christopher S

    2017-06-01

    Group testing, where individuals are tested initially in pools, is widely used to screen a large number of individuals for rare diseases. Triggered by the recent development of assays that detect multiple infections at once, screening programs now involve testing individuals in pools for multiple infections simultaneously. Tebbs, McMahan, and Bilder (2013, Biometrics) recently evaluated the performance of a two-stage hierarchical algorithm used to screen for chlamydia and gonorrhea as part of the Infertility Prevention Project in the United States. In this article, we generalize this work to accommodate a larger number of stages. To derive the operating characteristics of higher-stage hierarchical algorithms with more than one infection, we view the pool decoding process as a time-inhomogeneous, finite-state Markov chain. Taking this conceptualization enables us to derive closed-form expressions for the expected number of tests and classification accuracy rates in terms of transition probability matrices. When applied to chlamydia and gonorrhea testing data from four states (Region X of the United States Department of Health and Human Services), higher-stage hierarchical algorithms provide, on average, an estimated 11% reduction in the number of tests when compared to two-stage algorithms. For applications with rarer infections, we show theoretically that this percentage reduction can be much larger. © 2016, The International Biometric Society.
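
    A rough illustration of why adding stages helps is sketched below: it computes the expected number of tests for a simple S-stage halving hierarchy on one master pool, assuming independent infections with prevalence p and a perfect (error-free) assay. This ignores test error and multiple infections, so it is far simpler than the Markov-chain treatment in the paper; the pool size and prevalence are invented.

    ```python
    def expected_tests(pool_size: int, p: float, stages: int) -> float:
        """Expected tests for a halving hierarchy: a sub-pool is tested iff its parent pool
        contains at least one positive (perfect assay, independent infections)."""
        def descendants(n: int, s: int) -> float:
            # Expected tests spent below a pool of n individuals with s stages remaining.
            if s <= 1 or n <= 1:
                return 0.0
            parent_positive = 1.0 - (1.0 - p) ** n   # probability the pool of n is positive
            if s == 2:
                kids = [1] * n                        # final stage: individual tests
            else:
                half = (n + 1) // 2                   # split the pool roughly in half
                kids = [half, n - half]
            return sum(parent_positive + descendants(c, s - 1) for c in kids)

        return 1.0 + descendants(pool_size, stages)

    for stages in (2, 3, 4):
        e = expected_tests(16, p=0.02, stages=stages)
        print(f"{stages}-stage hierarchy on a pool of 16: {e:.2f} expected tests")
    ```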

  19. Hierarchical group testing for multiple infections

    PubMed Central

    Hou, Peijie; Tebbs, Joshua M.; Bilder, Christopher R.; McMahan, Christopher S.

    2016-01-01

    Summary Group testing, where individuals are tested initially in pools, is widely used to screen a large number of individuals for rare diseases. Triggered by the recent development of assays that detect multiple infections at once, screening programs now involve testing individuals in pools for multiple infections simultaneously. Tebbs, McMahan, and Bilder (2013, Biometrics) recently evaluated the performance of a two-stage hierarchical algorithm used to screen for chlamydia and gonorrhea as part of the Infertility Prevention Project in the United States. In this article, we generalize this work to accommodate a larger number of stages. To derive the operating characteristics of higher-stage hierarchical algorithms with more than one infection, we view the pool decoding process as a time-inhomogeneous, finite-state Markov chain. Taking this conceptualization enables us to derive closed-form expressions for the expected number of tests and classification accuracy rates in terms of transition probability matrices. When applied to chlamydia and gonorrhea testing data from four states (Region X of the United States Department of Health and Human Services), higher-stage hierarchical algorithms provide, on average, an estimated 11 percent reduction in the number of tests when compared to two-stage algorithms. For applications with rarer infections, we show theoretically that this percentage reduction can be much larger. PMID:27657666

  20. High Resolution Mapping of Genetic Factors Affecting Abdominal Bristle Number in Drosophila Melanogaster

    PubMed Central

    Long, A. D.; Mullaney, S. L.; Reid, L. A.; Fry, J. D.; Langley, C. H.; Mackay, TFC.

    1995-01-01

    Factors responsible for selection response for abdominal bristle number and correlated responses in sternopleural bristle number were mapped to the X and third chromosome of Drosophila melanogaster. Lines divergent for high and low abdominal bristle number were created by 25 generations of artificial selection from a large base population, with an intensity of 25 individuals of each sex selected from 100 individuals of each sex scored per generation. Isogenic chromosome substitution lines in which the high (H) X or third chromosome was placed in an isogenic low (L) background were derived from the selection lines, as were the 93 recombinant isogenic (RI) HL X and 67 RI chromosome 3 lines constructed from them. Highly polymorphic neutral roo transposable elements were hybridized in situ to the polytene chromosomes of the RI lines to create a set of cytogenetic markers. These techniques yielded a dense map with an average spacing of 4 cM between informative markers. Factors affecting bristle number, and relative viability of the chromosome 3 RI lines, were mapped using a multiple regression interval mapping approach, conditioning on all markers >/=10 cM from the tested interval. Two factors with large effects on abdominal bristle number were mapped on the X chromosome and five factors on the third chromosome. One factor with a large effect on sternopleural bristle number was mapped to the X and two were mapped to the third chromosome; all factors with sternopleural effects corresponded to those with effects on abdominal bristle number. Two of the chromosome 3 factors with large effects on abdominal bristle number were also associated with reduced viability. Significant sex-specific effects and epistatic interactions between mapped factors of the same order of magnitude as the additive effects were observed. All factors mapped to the approximate positions of likely candidate loci (ASC, bb, emc, h, mab, Dl and E(spl)), previously characterized by mutations with large effects on bristle number. PMID:7768438

  1. Approximate number and approximate time discrimination each correlate with school math abilities in young children.

    PubMed

    Odic, Darko; Lisboa, Juan Valle; Eisinger, Robert; Olivera, Magdalena Gonzalez; Maiche, Alejandro; Halberda, Justin

    2016-01-01

    What is the relationship between our intuitive sense of number (e.g., when estimating how many marbles are in a jar), and our intuitive sense of other quantities, including time (e.g., when estimating how long it has been since we last ate breakfast)? Recent work in cognitive, developmental, comparative psychology, and computational neuroscience has suggested that our representations of approximate number, time, and spatial extent are fundamentally linked and constitute a "generalized magnitude system". But, the shared behavioral and neural signatures between number, time, and space may alternatively be due to similar encoding and decision-making processes, rather than due to shared domain-general representations. In this study, we investigate the relationship between approximate number and time in a large sample of 6-8 year-old children in Uruguay by examining how individual differences in the precision of number and time estimation correlate with school mathematics performance. Over four testing days, each child completed an approximate number discrimination task, an approximate time discrimination task, a digit span task, and a large battery of symbolic math tests. We replicate previous reports showing that symbolic math abilities correlate with approximate number precision and extend those findings by showing that math abilities also correlate with approximate time precision. But, contrary to approximate number and time sharing common representations, we find that each of these dimensions uniquely correlates with formal math: approximate number correlates more strongly with formal math compared to time and continues to correlate with math even when precision in time and individual differences in working memory are controlled for. These results suggest that there are important differences in the mental representations of approximate number and approximate time and further clarify the relationship between quantity representations and mathematics. Copyright © 2015 Elsevier B.V. All rights reserved.
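
    The "unique correlation" analysis described above amounts to correlating math scores with number precision after regressing out time precision and working memory. The sketch below shows that residualization approach on invented data; the variable names and effect sizes are placeholders, not the study's data or exact analysis.

    ```python
    import numpy as np

    def residualize(y, covariates):
        """Residuals of y after ordinary least-squares regression on the covariates."""
        X = np.column_stack([np.ones(len(y))] + covariates)
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        return y - X @ beta

    rng = np.random.default_rng(1)
    n = 120
    number_w = rng.normal(size=n)                    # approximate-number precision (invented)
    time_w = 0.4 * number_w + rng.normal(size=n)     # correlated approximate-time precision
    span = rng.normal(size=n)                        # digit span (working memory)
    math_score = -0.5 * number_w + 0.2 * span + rng.normal(size=n)

    r_partial = np.corrcoef(residualize(math_score, [time_w, span]),
                            residualize(number_w, [time_w, span]))[0, 1]
    print(f"math ~ number precision, controlling for time and span: r = {r_partial:.2f}")
    ```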

  2. Fast physical-random number generation using laser diode's frequency noise: influence of frequency discriminator

    NASA Astrophysics Data System (ADS)

    Matsumoto, Kouhei; Kasuya, Yuki; Yumoto, Mitsuki; Arai, Hideaki; Sato, Takashi; Sakamoto, Shuichi; Ohkawa, Masashi; Ohdaira, Yasuo

    2018-02-01

    Not so long ago, pseudo random numbers generated by numerical formulae were considered to be adequate for encrypting important data files, because of the time needed to decode them. With today's ultra high-speed processors, however, this is no longer true. So, in order to thwart ever-more advanced attempts to breach our systems' protections, cryptologists have devised a method that is considered to be virtually impossible to decode, and uses what is a limitless number of physical random numbers. This research describes a method whereby a laser diode's frequency noise generates large quantities of physical random numbers. Using two types of photo detectors (APD and PIN-PD), we tested the abilities of two types of lasers (FP-LD and VCSEL) to generate random numbers. In all instances, an etalon served as frequency discriminator, the examination pass rates were determined using the NIST FIPS 140-2 test at each bit, and the Random Number Generation (RNG) speed was noted.

  3. Low speed tests of a fixed geometry inlet for a tilt nacelle V/STOL airplane

    NASA Technical Reports Server (NTRS)

    Syberg, J.; Koncsek, J. L.

    1977-01-01

    Test data were obtained with a 1/4 scale cold flow model of the inlet at freestream velocities from 0 to 77 m/s (150 knots) and angles of attack from 45 deg to 120 deg. A large scale model was tested with a high bypass ratio turbofan in the NASA/ARC wind tunnel. A fixed geometry inlet is a viable concept for a tilt nacelle V/STOL application. Comparison of data obtained with the two models indicates that flow separation at high angles of attack and low airflow rates is strongly sensitive to Reynolds number and that the large scale model has a significantly improved range of separation-free operation.

  4. In-Flight Boundary-Layer Transition on a Large Flat Plate at Supersonic Speeds

    NASA Technical Reports Server (NTRS)

    Banks, Daniel W.; Fredericks, Michael Alan; Tracy, Richard R.; Matisheck, Jason R.; Vanecek, Neal D.

    2012-01-01

    A flight experiment was conducted to investigate the pressure distribution, local flow conditions, and boundary-layer transition characteristics on a large flat plate in flight at supersonic speeds up to Mach 2.0. The primary objective of the test was to characterize the local flow field in preparation for future tests of a high Reynolds number natural laminar flow test article. The tests used a F-15B testbed aircraft with a bottom centerline mounted test fixture. A second objective was to determine the boundary-layer transition characteristics on the flat plate and the effectiveness of using a simplified surface coating for future laminar flow flight tests employing infrared thermography. Boundary-layer transition was captured using an onboard infrared imaging system. The infrared imagery was captured in both analog and digital formats. Surface pressures were measured with electronically scanned pressure modules connected to 60 surface-mounted pressure orifices. The local flow field was measured with five 5-hole conical probes mounted near the leading edge of the test fixture. Flow field measurements revealed the local flow characteristics including downwash, sidewash, and local Mach number. Results also indicated that the simplified surface coating did not provide sufficient insulation from the metallic structure, which likely had a substantial effect on boundary-layer transition compared with that of an adiabatic surface. Cold wall conditions were predominant during the acceleration to maximum Mach number, and warm wall conditions were evident during the subsequent deceleration. The infrared imaging system was able to capture shock wave impingement on the surface of the flat plate in addition to indicating laminar-to-turbulent boundary-layer transition.

  5. Forward Collision Warning Systems (CWS)

    DOT National Transportation Integrated Search

    2005-07-01

    The Federal Motor Carrier Safety Administration's (FMCSA's) safety goal is to reduce the number and severity of large truck fatalities and crashes. During the last several years, FMCSA has collaborated with the trucking industry to test and evalu...

  6. Influence of finger and mouth action observation on random number generation: an instance of embodied cognition for abstract concepts.

    PubMed

    Grade, Stéphane; Badets, Arnaud; Pesenti, Mauro

    2017-05-01

    Numerical magnitude and specific grasping action processing have been shown to interfere with each other because some aspects of numerical meaning may be grounded in sensorimotor transformation mechanisms linked to finger grip control. However, how specific these interactions are to grasping actions is still unknown. The present study tested the specificity of the number-grip relationship by investigating how the observation of different closing-opening stimuli that might or might not refer to prehension-releasing actions was able to influence a random number generation task. Participants had to randomly produce numbers after they observed action stimuli representing either closure or aperture of the fingers, the hand or the mouth, or a colour change used as a control condition. Random number generation was influenced by the prior presentation of finger grip actions, whereby observing a closing finger grip led participants to produce small rather than large numbers, whereas observing an opening finger grip led them to produce large rather than small numbers. Hand actions had reduced or no influence on number production; mouth action influence was restricted to opening, with an overproduction of large numbers. Finally, colour changes did not influence number generation. These results show that some characteristics of observed finger, hand and mouth grip actions automatically prime number magnitude, with the strongest effect for finger grasping. The findings are discussed in terms of the functional and neural mechanisms shared between hand actions and number processing, but also between hand and mouth actions. The present study provides converging evidence that part of number semantics is grounded in sensory-motor mechanisms.

  7. A user-defined data type for the storage of time series data allowing efficient similarity screening.

    PubMed

    Sorokin, Anatoly; Selkov, Gene; Goryanin, Igor

    2012-07-16

    The volume of the experimentally measured time series data is rapidly growing, while storage solutions offering better data types than simple arrays of numbers or opaque blobs for keeping series data are sorely lacking. A number of indexing methods have been proposed to provide efficient access to time series data, but none has so far been integrated into a tried-and-proven database system. To explore the possibility of such integration, we have developed a data type for time series storage in PostgreSQL, an object-relational database system, and equipped it with an access method based on SAX (Symbolic Aggregate approXimation). This new data type has been successfully tested in a database supporting a large-scale plant gene expression experiment, and it was additionally tested on a very large set of simulated time series data. Copyright © 2011 Elsevier B.V. All rights reserved.
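
    SAX itself is a short transformation: z-normalize the series, average it down to a few segments (Piecewise Aggregate Approximation, PAA), and map each segment mean to a letter using breakpoints of the standard normal distribution. The Python sketch below is a generic illustration of that idea, not the PostgreSQL data type described in the record; the four-letter alphabet and quartile breakpoints are common SAX choices.

      import math

      def sax(series, n_segments=8, breakpoints=(-0.6745, 0.0, 0.6745)):
          # Symbolic Aggregate approXimation: z-normalize, PAA-reduce, discretize.
          n = len(series)
          mean = sum(series) / n
          std = math.sqrt(sum((x - mean) ** 2 for x in series) / n) or 1.0
          z = [(x - mean) / std for x in series]
          seg = n / n_segments
          # Piecewise Aggregate Approximation: mean of each equal-width segment.
          paa = [sum(z[int(i * seg):int((i + 1) * seg)]) / (int((i + 1) * seg) - int(i * seg))
                 for i in range(n_segments)]
          # Map each PAA value to a symbol using N(0,1) quartile breakpoints (alphabet of 4).
          alphabet = "abcd"
          return "".join(alphabet[sum(v > b for b in breakpoints)] for v in paa)

      print(sax([1, 2, 3, 4, 4, 3, 2, 1, 0, 1, 2, 3, 4, 5, 6, 7]))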

  8. [Antibacterial prevention of suppurative complications after operations on the large intestine].

    PubMed

    Kuzin, M I; Pomelov, V S; Vandiaev, G K; Ialgashev, T Ia; Blatun, L A

    1983-05-01

    Data from a comparative study of complications after operations on the large intestine are presented. During the preoperative period, 62 patients of the control group were treated with phthalylsulfathiazole, nevigramon and nystatin. Thirty-nine patients of the test group were treated with metronidazole and kanamycin monosulfate. Kanamycin monosulfate was used 3 days before the operation in a dose of 0.5 g orally 4 times a day, and metronidazole in a dose of 0.5 g 3 times a day. The last doses of the drugs were administered 4-5 hours before the operation. After the operations the patients were treated with kanamycin sulfate for 3-5 days in a daily dose of 2 g intramuscularly. The number of postoperative suppurative complications decreased from 22 to 5 per cent. No lethal outcomes were registered in the test group. The number of lethal outcomes in the control group amounted to 8 per cent.

  9. THE LARGE HIGH PRESSURE ARC PLASMA GENERATOR: A FACILITY FOR SIMULATING MISSILE AND SATELLITE RE-ENTRY. Research Report 56

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rose, P.; Powers, W.; Hritzay, D.

    1959-06-01

    The development of an arc wind tunnel capable of stagnation pressures in excess of twenty atmospheres and using as much as fifteen megawatts of electrical power is described. The calibration of this facility shows that it is capable of reproducing the aerodynamic environment encountered by vehicles flying at velocities as great as satellite velocity. Its use as a missile re-entry material test facility is described. The large power capacity of this facility allows one to make material tests on specimens of size sufficient to be useful for material development yet at realistic energy and Reynolds number values. By the addition of a high-capacity vacuum system, this facility can be used to produce the low density, high Mach number environment needed for simulating satellite re-entry, as well as hypersonic flight at extreme altitudes.

  10. Flight Investigation at Low Angles of Attack to Determine the Longitudinal Stability and Control Characteristics of a Cruciform Canard Missile Configuration with a Low-Aspect-Ratio Wing and Blunt Nose at Mach Numbers from 1.2 to 2.1

    NASA Technical Reports Server (NTRS)

    Brown, Clarence A., Jr.

    1957-01-01

    A full-scale rocket-powered model of a cruciform canard missile configuration with a low-aspect-ratio wing and blunt nose has been flight tested by the Langley Pilotless Aircraft Research Division. Static and dynamic longitudinal stability and control derivatives of this interdigitated canard-wing missile configuration were determined by using the pulsed-control technique at low angles of attack and for a Mach number range of 1.2 to 2.1. The lift-curve slope showed only small nonlinearities with changes in control deflection or angle of attack but indicated a difference in lift-curve slope of approximately 7 percent for the two control deflections of delta = 3.0 deg and delta = -0.3 deg. The large tail length of the missile tested was effective in producing damping in pitch throughout the Mach number range tested. The aerodynamic-center location was nearly constant with Mach number for the two control deflections but was shown to be less stable with the larger control deflection. The increment of lift produced by the controls was small and positive throughout the Mach number range tested, whereas the pitching moment produced by the controls exhibited a normal trend of reduced effectiveness with increasing Mach number. The effectiveness of the controls in producing angle of attack, lift, and pitching moment was good at all Mach numbers tested.

  11. Flight Investigation at Low Angles of Attack to Determine the Longitudinal Stability and Control Characteristics of a Cruciform Canard Missile Configuration with a Low-Aspect-Ratio Wing and Blunt Nose at Mach Numbers from 1.2 to 2.1

    NASA Technical Reports Server (NTRS)

    Brown, C. A., Jr.

    1957-01-01

    A full-scale rocket-powered model of a cruciform canard missile configuration with a low-aspect-ratio wing and blunt nose has been flight tested by the Langley Pilotless Aircraft Research Division. Static and dynamic longitudinal stability and control derivatives of this interdigitated canard-wing missile configuration were determined by using the pulsed-control technique at low angles of attack and for a Mach number range of 1.2 to 2.1. The lift-curve slope showed only small nonlinearities with changes in control deflection or angle of attack but indicated a difference in lift-curve slope of approximately 7 percent for the two control deflections of delta = 3.0 deg and delta = -0.3 deg. The large tail length of the missile tested was effective in producing damping in pitch throughout the Mach number range tested. The aerodynamic-center location was nearly constant with Mach number for the two control deflections but was shown to be less stable with the larger control deflection. The increment of lift produced by the controls was small and positive throughout the Mach number range tested, whereas the pitching moment produced by the controls exhibited a normal trend of reduced effectiveness with increasing Mach number. The effectiveness of the controls in producing angle of attack, lift, and pitching moment was good at all Mach numbers tested.

  12. Residual Ductility and Microstructural Evolution in Continuous-Bending-under-Tension of AA-6022-T4

    PubMed Central

    Zecevic, Milovan; Roemer, Timothy J.; Knezevic, Marko; Korkolis, Yannis P.; Kinsey, Brad L.

    2016-01-01

    A ubiquitous experiment to characterize the formability of sheet metal is the simple tension test. Past research has shown that if the material is repeatedly bent and unbent during this test (i.e., Continuous-Bending-under-Tension, CBT), the percent elongation at failure can significantly increase. In this paper, this phenomenon is evaluated in detail for AA-6022-T4 sheets using a custom-built CBT device. In particular, the residual ductility of specimens that are subjected to CBT processing is investigated. This is achieved by subjecting a specimen to CBT processing and then creating subsize tensile test and microstructural samples from the specimens after varying numbers of CBT cycles. Interestingly, the engineering stress initially increases after CBT processing to a certain number of cycles, but then decreases with less elongation achieved for increasing numbers of CBT cycles. Additionally, a detailed microstructure and texture characterization is performed using standard scanning electron microscopy and electron backscattered diffraction imaging. The results show that the material under CBT preserves high integrity to large plastic strains due to a uniform distribution of damage formation and evolution in the material. The ability to delay ductile fracture during the CBT process to large plastic strains results in the formation of a strong <111> fiber texture throughout the material. PMID:28773257

  13. Large Payload Transportation and Test Considerations

    NASA Technical Reports Server (NTRS)

    Rucker, Michelle A.; Pope, James C.

    2011-01-01

    Ironically, the limiting factor to a national heavy lift strategy may not be the rocket technology needed to throw a heavy payload, but rather the terrestrial infrastructure - roads, bridges, airframes, and buildings - necessary to transport, acceptance test, and process large spacecraft. Failure to carefully consider how large spacecraft are designed, and where they are manufactured, tested, or launched, could result in unforeseen cost to modify/develop infrastructure, or incur additional risk due to increased handling or elimination of key verifications. During test and verification planning for the Altair project, a number of transportation and test issues related to the large payload diameter were identified. Although the entire Constellation Program - including Altair - was canceled in the 2011 NASA budget, issues identified by the Altair project serve as important lessons learned for future payloads that may be developed to support national "heavy lift" strategies. A feasibility study performed by the Constellation Ground Operations (CxGO) project found that neither the Altair Ascent nor Descent Stage would fit inside available transportation aircraft. Ground transportation of a payload this large over extended distances is generally not permitted by most states, so overland transportation alone would not have been an option. Limited ground transportation to the nearest waterway may be permitted, but water transportation could take as long as 66 days per production unit, depending on point of origin and acceptance test facility; transportation from the western United States would require transit through the Panama Canal to access the Kennedy Space Center launch site. Large payloads also pose acceptance test and ground processing challenges. Although propulsion, mechanical vibration, and reverberant acoustic test facilities at NASA's Plum Brook Station have been designed to accommodate large spacecraft, special handling and test work-arounds may be necessary, which could increase cost, schedule, and technical risk. Once at the launch site, there are no facilities currently capable of accommodating the combination of large payload size and hazardous processing (which includes hypergolic fuels, pyrotechnic devices, and high pressure gases).

  14. Rapid, topology-based particle tracking for high-resolution measurements of large complex 3D motion fields.

    PubMed

    Patel, Mohak; Leggett, Susan E; Landauer, Alexander K; Wong, Ian Y; Franck, Christian

    2018-04-03

    Spatiotemporal tracking of tracer particles or objects of interest can reveal localized behaviors in biological and physical systems. However, existing tracking algorithms are most effective for relatively low numbers of particles that undergo displacements smaller than their typical interparticle separation distance. Here, we demonstrate a single particle tracking algorithm to reconstruct large complex motion fields with large particle numbers, orders of magnitude larger than previously tractably resolvable, thus opening the door for attaining very high Nyquist spatial frequency motion recovery in the images. Our key innovations are feature vectors that encode nearest neighbor positions, a rigorous outlier removal scheme, and an iterative deformation warping scheme. We test this technique for its accuracy and computational efficacy using synthetically and experimentally generated 3D particle images, including non-affine deformation fields in soft materials, complex fluid flows, and cell-generated deformations. We augment this algorithm with additional particle information (e.g., color, size, or shape) to further enhance tracking accuracy for high gradient and large displacement fields. These applications demonstrate that this versatile technique can rapidly track unprecedented numbers of particles to resolve large and complex motion fields in 2D and 3D images, particularly when spatial correlations exist.
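
    A minimal sketch of the neighborhood-feature-vector idea is given below; it is a Python/NumPy illustration under simplifying assumptions, not the authors' algorithm, and it omits the outlier removal and iterative deformation warping steps. Each particle is described by the offsets to its k nearest neighbors, and particles are matched across frames by comparing those descriptors.

      import numpy as np

      def neighbor_features(points, k=3):
          # Encode each particle by the offsets to its k nearest neighbors.
          d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
          np.fill_diagonal(d, np.inf)
          idx = np.argsort(d, axis=1)[:, :k]
          return (points[idx] - points[:, None, :]).reshape(len(points), -1)

      def match(frame_a, frame_b, k=3):
          # Greedy matching by similarity of the neighborhood descriptors.
          fa, fb = neighbor_features(frame_a, k), neighbor_features(frame_b, k)
          cost = np.linalg.norm(fa[:, None, :] - fb[None, :, :], axis=-1)
          return cost.argmin(axis=1)        # index in frame_b matched to each particle in frame_a

      rng = np.random.default_rng(0)
      a = rng.random((30, 3))
      b = a + 0.01                          # small uniform drift between frames
      print(match(a, b))                    # ideally the identity mapping [0, 1, 2, ...]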

  15. Animals and the 3Rs in toxicology research and testing: The way forward.

    PubMed

    Stokes, W S

    2015-12-01

    Despite efforts to eliminate the use of animals in testing and the availability of many accepted alternative methods, animals are still widely used for toxicological research and testing. While research using in vitro and computational models has dramatically increased in recent years, such efforts have not yet measurably impacted animal use for regulatory testing and are not likely to do so for many years or even decades. Until regulatory authorities have accepted test methods that can totally replace animals and these are fully implemented, large numbers of animals will continue to be used and many will continue to experience significant pain and distress. In order to positively impact the welfare of these animals, accepted alternatives must be implemented, and efforts must be directed at eliminating pain and distress and reducing animal numbers. Animal pain and distress can be reduced by earlier predictive humane endpoints, pain-relieving medications, and supportive clinical care, while sequential testing and routine use of integrated testing and decision strategies can reduce animal numbers. Applying advances in science and technology to the development of scientifically sound alternative testing models and strategies can improve animal welfare and further reduce and replace animal use. © The Author(s) 2015.

  16. An entropy-based statistic for genomewide association studies.

    PubMed

    Zhao, Jinying; Boerwinkle, Eric; Xiong, Momiao

    2005-07-01

    Efficient genotyping methods and the availability of a large collection of single-nucleotide polymorphisms provide valuable tools for genetic studies of human disease. The standard chi2 statistic for case-control studies, which uses a linear function of allele frequencies, has limited power when the number of marker loci is large. We introduce a novel test statistic for genetic association studies that uses Shannon entropy and a nonlinear function of allele frequencies to amplify the differences in allele and haplotype frequencies to maintain statistical power with large numbers of marker loci. We investigate the relationship between the entropy-based test statistic and the standard chi2 statistic and show that, in most cases, the power of the entropy-based statistic is greater than that of the standard chi2 statistic. The distribution of the entropy-based statistic and the type I error rates are validated using simulation studies. Finally, we apply the new entropy-based test statistic to two real data sets, one for the COMT gene and schizophrenia and one for the MMP-2 gene and esophageal carcinoma, to evaluate the performance of the new method for genetic association studies. The results show that the entropy-based statistic obtained smaller P values than did the standard chi2 statistic.
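
    The contrast between a linear-in-frequencies statistic and an entropy-based one can be illustrated on a 2 x 2 table of allele counts. The sketch below compares Pearson's chi-square with a relative-entropy (G-test-like) statistic; it is a generic illustration, not necessarily the authors' exact formulation.

      import math

      def chi2_allele(case_counts, control_counts):
          # Standard Pearson chi-square on a 2 x 2 table of allele counts.
          table = [case_counts, control_counts]
          row = [sum(r) for r in table]
          col = [sum(c) for c in zip(*table)]
          total = sum(row)
          return sum((table[i][j] - row[i] * col[j] / total) ** 2 / (row[i] * col[j] / total)
                     for i in range(2) for j in range(2))

      def entropy_statistic(case_counts, control_counts):
          # Relative-entropy (G-test-like) statistic: 2 * sum obs * log(obs / expected).
          table = [case_counts, control_counts]
          row = [sum(r) for r in table]
          col = [sum(c) for c in zip(*table)]
          total = sum(row)
          g = 0.0
          for i in range(2):
              for j in range(2):
                  expected = row[i] * col[j] / total
                  if table[i][j] > 0:
                      g += 2 * table[i][j] * math.log(table[i][j] / expected)
          return g

      print(chi2_allele([60, 40], [45, 55]), entropy_statistic([60, 40], [45, 55]))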

  17. Response of a 2-story test-bed structure for the seismic evaluation of nonstructural systems

    NASA Astrophysics Data System (ADS)

    Soroushian, Siavash; Maragakis, E. "Manos"; Zaghi, Arash E.; Rahmanishamsi, Esmaeel; Itani, Ahmad M.; Pekcan, Gokhan

    2016-03-01

    A full-scale, two-story, two-by-one bay, steel braced-frame was subjected to a number of unidirectional ground motions using three shake tables at the UNR-NEES site. The test-bed frame was designed to study the seismic performance of nonstructural systems including steel-framed gypsum partition walls, suspended ceilings and fire sprinkler systems. The frame can be configured to perform as an elastic or inelastic system to generate large floor accelerations or large inter-story drift, respectively. In this study, the dynamic performance of the linear and nonlinear test-beds was comprehensively studied. The seismic performance of nonstructural systems installed in the linear and nonlinear test-beds was assessed during extreme excitations. In addition, the dynamic interactions of the test-bed and installed nonstructural systems are investigated.

  18. Statistical significance test for transition matrices of atmospheric Markov chains

    NASA Technical Reports Server (NTRS)

    Vautard, Robert; Mo, Kingtse C.; Ghil, Michael

    1990-01-01

    Low-frequency variability of large-scale atmospheric dynamics can be represented schematically by a Markov chain of multiple flow regimes. This Markov chain contains useful information for the long-range forecaster, provided that the statistical significance of the associated transition matrix can be reliably tested. Monte Carlo simulation yields a very reliable significance test for the elements of this matrix. The results of this test agree with previously used empirical formulae when each cluster of maps identified as a distinct flow regime is sufficiently large and when they all contain a comparable number of maps. Monte Carlo simulation provides a more reliable way to test the statistical significance of transitions to and from small clusters. It can determine the most likely transitions, as well as the most unlikely ones, with a prescribed level of statistical significance.
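
    The Monte Carlo idea is to compare each observed transition count against its distribution under a null of no serial dependence. A minimal permutation-based sketch in Python is shown below; it illustrates the approach rather than the authors' exact procedure, and the regime sequence is made up for the example.

      import random

      def transition_counts(labels, n_regimes):
          c = [[0] * n_regimes for _ in range(n_regimes)]
          for a, b in zip(labels, labels[1:]):
              c[a][b] += 1
          return c

      def mc_pvalues(labels, n_regimes, n_sim=2000, seed=0):
          # One-sided Monte Carlo p-values for each transition count, against a null
          # of no serial dependence (regime labels randomly permuted).
          rng = random.Random(seed)
          observed = transition_counts(labels, n_regimes)
          exceed = [[0] * n_regimes for _ in range(n_regimes)]
          for _ in range(n_sim):
              shuffled = labels[:]
              rng.shuffle(shuffled)
              sim = transition_counts(shuffled, n_regimes)
              for i in range(n_regimes):
                  for j in range(n_regimes):
                      if sim[i][j] >= observed[i][j]:
                          exceed[i][j] += 1
          return [[(e + 1) / (n_sim + 1) for e in row] for row in exceed]

      # Toy regime sequence with some persistence in regime 0.
      seq = [0, 0, 0, 1, 0, 0, 2, 1, 0, 0, 0, 2, 0, 0, 1, 0, 0, 0, 2, 0]
      print(mc_pvalues(seq, 3))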

  19. Geometrical optimization of sensors for eddy currents nondestructive testing and evaluation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thollon, F.; Burais, N.

    1995-05-01

    Design of Nondestructive Testing (NDT) and Nondestructive Evaluation (NDE) sensors is possible by solving Maxwell's relations with FEM or BIM. But the large number of geometrical and electrical parameters of the sensor and the tested material implies many results that do not necessarily yield a well-adapted sensor. The authors have used a genetic algorithm for automatic optimization. After having tested this algorithm against an analytical solution of Maxwell's relations for cladding thickness measurement, the method has been implemented in a finite element package.

  20. Slicing of Silicon into Sheet Material: Silicon Sheet Growth Development for the Large Area Silicon Sheet Task of the Low Cost Silicon Solar Array Project

    NASA Technical Reports Server (NTRS)

    Fleming, J. R.

    1979-01-01

    Testing of low cost low suspension power slurry vehicles is presented. Cutting oils are unlikely to work, but a mineral oil with additives should be workable. Two different abrasives were tested. A cheaper silicon carbide from Norton gave excellent results except for excessive kerf loss: the particles were too big. An abrasive treated for lubricity showed no lubricity improvement in mineral oil vehicle. The bounce fixture was tested for the first time under constant cut rate conditions (rather than constant force). Although the cut was not completed before the blades broke, the blade lifetime of thin (100 micrometer) blades was 120 times the lifetime without the fixture. The large prototype saw completed a successful run, producing 90% cutting yield (849 wafers) at 20 wafers/cm. Although inexperience with large numbers of wafers caused cleaning breakage to reduce this yield to 74%, the yield was high enough that the concept of the large saw is proven workable.

  1. Cost-effectiveness of finding new HIV diagnoses using rapid HIV testing in community-based organizations.

    PubMed

    Shrestha, Ram K; Clark, Hollie A; Sansom, Stephanie L; Song, Binwei; Buckendahl, Holly; Calhoun, Cindy B; Hutchinson, Angela B; Heffelfinger, James D

    2008-01-01

    We assessed the cost-effectiveness of determining new human immunodeficiency virus (HIV) diagnoses using rapid HIV testing performed by community-based organizations (CBOs) in Kansas City, Missouri, and Detroit, Michigan. The CBOs performed rapid HIV testing during April 2004 through March 2006. In Kansas City, testing was performed in a clinic and in outreach settings. In Detroit, testing was performed in outreach settings only. Both CBOs used mobile testing vans. Measures of effectiveness were the number of HIV tests performed and the number of people notified of new HIV diagnoses, based on rapid tests. We retrospectively collected program costs, including those for personnel, test kits, mobile vans, and facility space. The CBO in Kansas City tested a mean of 855 people a year in its clinic and 703 people a year in outreach settings. The number of people notified of new HIV diagnoses was 19 (2.2%) in the clinic and five (0.7%) in outreach settings. The CBO in Detroit tested 976 people a year in outreach settings, and the number notified of new HIV diagnoses was 15 (1.5%). In Kansas City, the cost per person notified of a new HIV diagnosis was $3,637 in the clinic and $16,985 in outreach settings. In the Detroit outreach settings, the cost per notification was $13,448. The cost of providing a new HIV diagnosis was considerably higher in the outreach settings than in the clinic. The variation can be largely explained by differences in the number of undiagnosed infections among the people tested and by the costs of purchasing and operating a mobile van.

  2. Cable tester

    NASA Astrophysics Data System (ADS)

    Rammage, Robert L.

    1990-10-01

    A device for sequentially testing the plurality of connectors in a wiring harness is disclosed. The harness is attached to the tester by means of adapter cables and a rotary switch is used to sequentially, individually test the connectors by passing a current through the connector. If the connector is unbroken, a light will flash to show it is electrically sound. The adapters allow a large number of cable configurations to be tested using a single tester configuration.

  3. Experiences with a high-blockage model tested in the NASA Ames 12-foot pressure wind tunnel

    NASA Technical Reports Server (NTRS)

    Coder, D. W.

    1984-01-01

    Representation of the flow around full-scale ships was sought in subsonic wind tunnels in order to attain Reynolds numbers as high as possible. As part of the quest to attain the largest possible Reynolds number, large models with high blockage are used, which result in significant wall interference effects. Some experiences with such a high-blockage model tested in the NASA Ames 12-foot pressure wind tunnel are summarized. The main results of the experiment relating to wind tunnel wall interference effects are also presented.

  4. Longitudinal afterbody grooves and shoulder radiusing for low-speed bluff body drag reduction

    NASA Technical Reports Server (NTRS)

    Howard, F. G.; Quass, B. F.; Weinstein, L. M.; Bushnell, D. M.

    1981-01-01

    A new low-speed drag reduction approach is proposed which employs longitudinal surface V-shaped grooves cutting through the afterbody shoulder region. The test Reynolds number range was from 20,000 to 200,000 based on undisturbed free-stream flow and a body diameter of 6.08 cm. The V-grooves are shown to be most effective in reducing drag when the afterbody shoulder radius is zero. Reductions in drag of up to 33% have been measured for this condition. For large shoulder radius, the grooves are only effective at the lower Reynolds numbers of the test.

  5. Using Systematic Item Selection Methods to Improve Universal Design of Assessments. Policy Directions. Number 18

    ERIC Educational Resources Information Center

    Johnstone, Christopher; Thurlow, Martha; Moore, Michael; Altman, Jason

    2006-01-01

    The No Child Left Behind Act of 2001 (NCLB) and other recent changes in federal legislation have placed greater emphasis on accountability in large-scale testing. Included in this emphasis are regulations that require assessments to be accessible. States are accountable for the success of all students, and tests should be designed in a way that…

  6. Opportunities to Develop Mathematical Proficiency: How Teachers Structure Participation in the Elementary Mathematics Classroom

    ERIC Educational Resources Information Center

    Freund, Deanna Patrice Nichols

    2011-01-01

    The opportunity to learn for African American and Latino children is extremely limited in a large number of US classrooms. Many societal issues are to blame, but high-stakes testing has exacerbated this problem. The pressure to increase test scores has caused a narrowing of the curriculum, particularly in low-performing schools, most of which are…

  7. Do You Catch Undersized Fish? Let's Go Fishing to Learn Some Important Concepts in Multiple Testing

    ERIC Educational Resources Information Center

    Zheng, Qiujie; Lu, Yonggang

    2016-01-01

    In the era of Big Data, because of diminishing cost of data collection and storage, a large number of statistical tests may even possibly be conducted all together by a high school student to seek for some "exciting" new scientific findings. In this article, we propose an interesting approach to introduce students to some important…

  8. Constraints in Teacher Training for Computer Assisted Language Testing Implementation

    ERIC Educational Resources Information Center

    Garcia Laborda, Jesus; Litzler, Mary Frances

    2011-01-01

    Many ELT examinations have gone online in the last few years and a large number of educational institutions have also started considering the possibility of implementing their own tests. This paper deals with the training of a group of 24 ELT teachers in the Region of Valencia (Spain). In 2007, the Ministry of Education provided funds to determine…

  9. The potential for increased power from combining P-values testing the same hypothesis.

    PubMed

    Ganju, Jitendra; Julie Ma, Guoguang

    2017-02-01

    The conventional approach to hypothesis testing for formal inference is to prespecify a single test statistic thought to be optimal. However, we usually have more than one test statistic in mind for testing the null hypothesis of no treatment effect but we do not know which one is the most powerful. Rather than relying on a single p-value, combining p-values from prespecified multiple test statistics can be used for inference. Combining functions include Fisher's combination test and the minimum p-value. Using randomization-based tests, the increase in power can be remarkable when compared with a single test and Simes's method. The versatility of the method is that it also applies when the number of covariates exceeds the number of observations. The increase in power is large enough to prefer combined p-values over a single p-value. The limitation is that the method does not provide an unbiased estimator of the treatment effect and does not apply to situations when the model includes treatment by covariate interaction.
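
    For independent tests, Fisher's combination statistic is -2 times the sum of log p-values, referred to a chi-square distribution with 2k degrees of freedom, while the minimum-p approach compares the smallest p-value with a Bonferroni-adjusted level. The sketch below illustrates both under an independence assumption; the randomization-based reference distribution used in the paper is not reproduced here.

      import math

      def fisher_combination(pvalues):
          # Fisher's statistic: -2 * sum(log p); chi-square with 2k df under independence.
          return -2.0 * sum(math.log(p) for p in pvalues)

      def chi2_sf(x, df):
          # Survival function of a chi-square with even df = 2k, using the closed-form
          # series exp(-x/2) * sum_{i<k} (x/2)^i / i!.
          k = df // 2
          term, s = 1.0, 1.0
          for i in range(1, k):
              term *= (x / 2.0) / i
              s += term
          return math.exp(-x / 2.0) * s

      pvals = [0.04, 0.20, 0.11]
      stat = fisher_combination(pvals)
      print(stat, chi2_sf(stat, 2 * len(pvals)), min(pvals) * len(pvals))  # Fisher vs Bonferroni min-p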

  10. Chromosomal microarray analysis as the first-tier test for the identification of pathogenic copy number variants in chromosome 9 pericentric regions and its challenge.

    PubMed

    Wang, Jia-Chi; Boyar, Fatih Z

    2016-01-01

    Chromosomal microarray analysis (CMA) has been recommended and is practiced routinely in large U.S. reference laboratories as the first-tier test for the postnatal evaluation of individuals with intellectual disability, autism spectrum disorders, and/or multiple congenital anomalies. Using CMA as a diagnostic tool without a routine setting of fluorescence in situ hybridization with labeled bacterial artificial chromosome probes (BAC-FISH) in these large reference laboratories makes characterization of the chromosome 9 pericentric region a challenge. This region has a very complex genomic structure and contains a variety of heterochromatic and euchromatic polymorphic variants. These variants were usually studied by G-banding, C-banding and BAC-FISH analysis. CMA was not recommended for them since it may lead to false positive results. Here, we present a cohort of four cases in which high-resolution CMA was used as the first-tier test, or simultaneously with G-banding analysis on the proband, to identify pathogenic copy number variants (CNVs) in the whole genome. CMA revealed large pathogenic CNVs from chromosome 9 in 3 cases, which also showed different G-banding patterns between the two chromosome 9 homologues. Although we demonstrated that high-resolution CMA played an important role in the identification of pathogenic copy number variants in chromosome 9 pericentric regions, the lack of BAC-FISH analysis or other useful tools still poses significant challenges in the characterization of these regions. None; it is not a clinical trial, and the cases were retrospectively collected and analyzed.

  11. Toxicity assessment of industrial chemicals and airborne contaminants: transition from in vivo to in vitro test methods: a review.

    PubMed

    Bakand, S; Winder, C; Khalil, C; Hayes, A

    2005-12-01

    Exposure to occupational and environmental contaminants is a major contributor to human health problems. Inhalation of gases, vapors, aerosols, and mixtures of these can cause a wide range of adverse health effects, ranging from simple irritation to systemic diseases. Despite significant achievements in the risk assessment of chemicals, the toxicological database, particularly for industrial chemicals, remains limited. Considering there are approximately 80,000 chemicals in commerce, and an extremely large number of chemical mixtures, in vivo testing of this large number is unachievable from both economical and practical perspectives. While in vitro methods are capable of rapidly providing toxicity information, regulatory agencies in general are still cautious about the replacement of whole-animal methods with new in vitro techniques. Although studying the toxic effects of inhaled chemicals is a complex subject, recent studies demonstrate that in vitro methods may have significant potential for assessing the toxicity of airborne contaminants. In this review, current toxicity test methods for risk evaluation of industrial chemicals and airborne contaminants are presented. To evaluate the potential applications of in vitro methods for studying respiratory toxicity, more recent models developed for toxicity testing of airborne contaminants are discussed.

  12. Some anomalies between wind tunnel and flight transition results

    NASA Technical Reports Server (NTRS)

    Harvey, W. D.; Bobbitt, P. J.

    1981-01-01

    A review of environmental disturbance influence and boundary-layer transition measurements on a large collection of reference sharp cone tests in wind tunnels, and of recent transonic-supersonic cone flight results, has previously demonstrated the dominance of free-stream disturbance level on the transition process from beginning to end. The variation with Mach number of the ratio of transition Reynolds number at onset to that at the end has been shown to be consistently different between flight and wind tunnels. Previous correlations of the end of transition with disturbance level give good results for flight and for a large number of tunnels; however, anomalies occur for a similar correlation based on transition onset. Present cone results with a tunnel sonic throat reduced the disturbance level by an order of magnitude, with transition values comparable to flight.

  13. Acrolein Microspheres Are Bonded To Large-Area Substrates

    NASA Technical Reports Server (NTRS)

    Rembaum, Alan; Yen, Richard C. K.

    1988-01-01

    Reactive cross-linked microspheres produced under influence of ionizing radiation in aqueous solutions of unsaturated aldehydes, such as acrolein, with sodium dodecyl sulfate. Diameters of spheres depend on concentrations of ingredients. If polystyrene, polymethylmethacrylate, or polypropylene object immersed in solution during irradiation, microspheres become attached to surface. Resulting modified surface has grainy coating with reactivity similar to free microspheres. Aldehyde-substituted-functional microspheres react under mild conditions with number of organic reagents and with most proteins. Microsphere-coated macrospheres or films used to immobilize high concentrations of proteins, enzymes, hormones, viruses, cells, and large number of organic compounds. Applications include separation techniques, clinical diagnostic tests, catalytic processes, and battery separators.

  14. Fighting for independence.

    PubMed

    Saxon, Emma

    2016-01-19

    Male crickets (Gryllus bimaculatus) establish dominance hierarchies within a population by fighting with one another. Larger males win fights more frequently than their smaller counterparts, and a previous study found that males recognise one another primarily through sensory input from the antennae. This study therefore investigated whether the success of larger crickets is influenced by sensory input from the antennae, in part by assessing the number of fights that large 'antennectomized' crickets won against small crickets, compared with the number that large, intact crickets won. The success rate was significantly lower in antennectomized males, though they still won the majority of fights (73/100 versus 58/100, Fisher's exact test P < 0.05); the authors thus conclude that sensory input from the antennae affects the fighting success of large males, but that other size-related factors also play a part.

  15. RF Testing Of Microwave Integrated Circuits

    NASA Technical Reports Server (NTRS)

    Romanofsky, R. R.; Ponchak, G. E.; Shalkhauser, K. A.; Bhasin, K. B.

    1988-01-01

    Fixtures and techniques are undergoing development. Four test fixtures and two advanced techniques developed in continuing efforts to improve RF characterization of MMIC's. Finline/waveguide test fixture developed to test submodules of 30-GHz monolithic receiver. Universal commercially manufactured coaxial test fixture modified to enable characterization of various microwave solid-state devices in frequency range of 26.5 to 40 GHz. Probe/waveguide fixture is compact, simple, and designed for nondestructive testing of large numbers of MMIC's. Nondestructive-testing fixture includes cosine-tapered ridge to match impedance of waveguide to microstrip. First advanced technique is microwave-wafer probing. Second advanced technique is electro-optical sampling.

  16. An investigation of the measurement properties of the Spot-the-Word test in a community sample.

    PubMed

    Mackinnon, Andrew; Christensen, Helen

    2007-12-01

    Intellectual ability is assessed with the Spot-the-Word (STW) test (A. Baddeley, H. Emslie, & I. Nimmo Smith, 1993) by asking respondents to identify a word in a word-nonword item pair. Results in moderate-sized samples suggest this ability is resistant to decline due to dementia. The authors used a 3-parameter item response theory model to investigate the measurement properties of the STW in a large community-dwelling sample (n=2,480) 60 to 64 years of age. A number of poorly performing items were identified. Substantial guessing was present; however, the number of words correctly identified was found to be an accurate index of ability. Performance was moderately related to a number of tests of cognitive performance and was effectively unrelated to visual acuity and to physical or mental health status. The STW is a promising test of ability that, in the future, may be refined by the deletion or replacement of poorly functioning items.
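
    The 3-parameter (logistic) item response model referred to above gives the probability of a correct response as a guessing floor plus a logistic function of ability. A minimal sketch follows; the item parameters are hypothetical, with the guessing floor set near 0.5 to reflect the two-alternative word/nonword format.

      import math

      def p_correct_3pl(theta, a, b, c):
          # Three-parameter logistic IRT model: guessing floor c, discrimination a,
          # difficulty b, latent ability theta.
          return c + (1.0 - c) / (1.0 + math.exp(-a * (theta - b)))

      # Hypothetical STW-like item: two-alternative format, so the guessing floor is near 0.5.
      for theta in (-2.0, 0.0, 2.0):
          print(theta, round(p_correct_3pl(theta, a=1.2, b=0.0, c=0.5), 3))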

  17. Invertebrate Paleontology.

    ERIC Educational Resources Information Center

    Feldmann, Rodney M.

    1983-01-01

    Indicates that, although no broad conceptual notions in invertebrate paleontology were proposed during 1982, a large number of excellent papers focusing on testing, modifying, and documenting earlier speculations were published or presented at professional meetings. Highlights of papers, conferences, and research studies are provided (including…

  18. Predicting developmental neurotoxicity in rodents from larval zebrafish - - and vice versa

    EPA Science Inventory

    The complexity of standard mammalian developmental neurotoxicity tests limits evaluation of large numbers of chemicals. Less complex, more rapid assays using larval zebrafish are gaining popularity for evaluating the developmental neurotoxicity of chemicals; there remains, howeve...

  19. Scale Effect on Clark Y Airfoil Characteristics from NACA Full-Scale Wind-Tunnel Tests

    NASA Technical Reports Server (NTRS)

    Silverstein, Abe

    1935-01-01

    This report presents the results of wind tunnel tests conducted to determine the aerodynamic characteristics of the Clark Y airfoil over a large range of Reynolds numbers. Three airfoils of aspect ratio 6 and with 4, 6, and 8 foot chords were tested at velocities between 25 and 118 miles per hour, and the characteristics were obtained for Reynolds numbers (based on the airfoil chord) in the range between 1,000,000 and 9,000,000 at the low angles of attack, and between 1,000,000 and 6,000,000 at maximum lift. With increasing Reynolds number the airfoil characteristics are affected in the following manner: the drag at zero lift decreases, the maximum lift increases, the slope of the lift curve increases, the angle of zero lift occurs at smaller negative angles, and the pitching moment at zero lift does not change appreciably.

  20. A two-stage design for multiple testing in large-scale association studies.

    PubMed

    Wen, Shu-Hui; Tzeng, Jung-Ying; Kao, Jau-Tsuen; Hsiao, Chuhsing Kate

    2006-01-01

    Modern association studies often involve a large number of markers and hence may encounter the problem of testing multiple hypotheses. Traditional procedures are usually over-conservative and with low power to detect mild genetic effects. From the design perspective, we propose a two-stage selection procedure to address this concern. Our main principle is to reduce the total number of tests by removing clearly unassociated markers in the first-stage test. Next, conditional on the findings of the first stage, which uses a less stringent nominal level, a more conservative test is conducted in the second stage using the augmented data and the data from the first stage. Previous studies have suggested using independent samples to avoid inflated errors. However, we found that, after accounting for the dependence between these two samples, the true discovery rate increases substantially. In addition, the cost of genotyping can be greatly reduced via this approach. Results from a study of hypertriglyceridemia and simulations suggest the two-stage method has a higher overall true positive rate (TPR) with a controlled overall false positive rate (FPR) when compared with single-stage approaches. We also report the analytical form of its overall FPR, which may be useful in guiding study design to achieve a high TPR while retaining the desired FPR.
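
    The mechanics of such a two-stage screen can be illustrated with simulated z-statistics: every marker faces a lenient first-stage threshold on the first sample, and only survivors face a stringent threshold on the combined data. The Python sketch below is an illustration under simplifying assumptions (independent normal statistics, equal stage sizes), not the authors' procedure, which additionally accounts for the dependence between the two stages.

      import math, random

      def two_stage_hits(z1, z2, n1=1000, n2=1000):
          # Stage 1: lenient one-sided threshold (z = 1.645, roughly alpha = 0.05) on sample 1.
          # Stage 2: stringent threshold (z = 3.719, roughly alpha = 1e-4) on the combined data.
          hits = []
          for m, (a, b) in enumerate(zip(z1, z2)):
              if a > 1.645:
                  combined = (math.sqrt(n1) * a + math.sqrt(n2) * b) / math.sqrt(n1 + n2)
                  if combined > 3.719:
                      hits.append(m)
          return hits

      rng = random.Random(1)
      n_markers, n_true, shift = 10000, 5, 4.0          # 5 truly associated markers
      z1 = [rng.gauss(shift if m < n_true else 0.0, 1.0) for m in range(n_markers)]
      z2 = [rng.gauss(shift if m < n_true else 0.0, 1.0) for m in range(n_markers)]
      print(two_stage_hits(z1, z2))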

  1. Factors affecting levels of circulating cell-free fetal DNA in maternal plasma and their implications for noninvasive prenatal testing.

    PubMed

    Kinnings, Sarah L; Geis, Jennifer A; Almasri, Eyad; Wang, Huiquan; Guan, Xiaojun; McCullough, Ron M; Bombard, Allan T; Saldivar, Juan-Sebastian; Oeth, Paul; Deciu, Cosmin

    2015-08-01

    Sufficient fetal DNA in a maternal plasma sample is required for accurate aneuploidy detection via noninvasive prenatal testing, thus highlighting a need to understand the factors affecting fetal fraction. The MaterniT21™ PLUS test uses massively parallel sequencing to analyze cell-free fetal DNA in maternal plasma and detect chromosomal abnormalities. We assess the impact of a variety of factors, both maternal and fetal, on the fetal fraction across a large number of samples processed by Sequenom Laboratories. The rate of increase in fetal fraction with increasing gestational age varies across the duration of the testing period and is also influenced by fetal aneuploidy status. Maternal weight trends inversely with fetal fraction, and we find no added benefit from analyzing body mass index or blood volume instead of weight. Strong correlations exist between fetal fractions from aliquots taken from the same patient at the same blood draw and also at different blood draws. While a number of factors trend with fetal fraction across the cohort as a whole, they are not the sole determinants of fetal fraction. In this study, the variability for any one patient does not appear large enough to justify postponing testing to a later gestational age. © 2015 John Wiley & Sons, Ltd.

  2. The positive and negative consequences of multiple-choice testing.

    PubMed

    Roediger, Henry L; Marsh, Elizabeth J

    2005-09-01

    Multiple-choice tests are commonly used in educational settings but with unknown effects on students' knowledge. The authors examined the consequences of taking a multiple-choice test on a later general knowledge test in which students were warned not to guess. A large positive testing effect was obtained: Prior testing of facts aided final cued-recall performance. However, prior testing also had negative consequences. Prior reading of a greater number of multiple-choice lures decreased the positive testing effect and increased production of multiple-choice lures as incorrect answers on the final test. Multiple-choice testing may inadvertently lead to the creation of false knowledge.

  3. Ordering pattern and performance of biochemical tests for diagnosing pheochromocytoma between 2000 and 2008.

    PubMed

    Yu, Run

    2009-01-01

    To examine what tests are ordered by physicians for pheochromocytoma diagnosis and how those tests perform in modern clinical practice. In this case series, electronic medical records of patients seen between January 2000 and July 2008 at a large academic hospital in Los Angeles, California, were queried, and patients older than 15 years who underwent any 1 of 5 tests for pheochromocytoma (measurement of plasma catecholamines, plasma fractionated metanephrines, urinary catecholamines, urinary metanephrines, or urinary vanillylmandelic acid) were identified. Because testing was performed in various reference laboratories, test results were classified into 1 of 3 categories: (a) markedly elevated, (b) moderately elevated, or (c) normal. Patient demographics, clinical history, test results, imaging study findings, and pathology records were reviewed. A total of 3980 tests were ordered for 1898 patients. Pretest probability was 2.2% (based on 681 patients in whom pheochromocytoma was confirmed or excluded), and hypertension was the most common indication for testing. The number of patients tested and the number of tests ordered increased over the years. The ordering pattern stabilized since 2006 when urinary metanephrines, urinary catecholamines, and plasma metanephrines were ordered more frequently. Sensitivity was highest for urinary metanephrines and vanillylmandelic acid, specificity was highest for vanillylmandelic acid and urinary catecholamines, and positive likelihood ratio was highest for vanillylmandelic acid. Positive predictive value for markedly elevated test results was 39% to 83%, while that for moderately elevated test results was only 2% to 14%. Ordering pattern and test performance differ significantly from those recommended and reported by large centers. The best testing strategy should incorporate local experience. Categorizing test results as markedly elevated, moderately elevated, and normal is important for result interpretation.
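
    The link between pre-test probability, likelihood ratio and post-test probability (roughly the positive predictive value in this setting) is Bayes' rule on the odds scale. The sketch below uses the 2.2% pre-test probability reported in the series together with hypothetical likelihood ratios to show why moderately elevated results carry a low predictive value.

      def post_test_probability(pretest_p, likelihood_ratio):
          # Convert pre-test probability to post-test probability via odds * LR (Bayes' rule).
          odds = pretest_p / (1.0 - pretest_p) * likelihood_ratio
          return odds / (1.0 + odds)

      # Illustrative only: 2.2% pre-test probability with hypothetical likelihood ratios.
      for lr in (5, 30):
          print(lr, round(post_test_probability(0.022, lr), 3))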

  4. On the influences of key modelling constants of large eddy simulations for large-scale compartment fires predictions

    NASA Astrophysics Data System (ADS)

    Yuen, Anthony C. Y.; Yeoh, Guan H.; Timchenko, Victoria; Cheung, Sherman C. P.; Chan, Qing N.; Chen, Timothy

    2017-09-01

    An in-house large eddy simulation (LES) based fire field model has been developed for large-scale compartment fire simulations. The model incorporates four major components, including subgrid-scale turbulence, combustion, soot and radiation models, which are fully coupled. It is designed to simulate the temporal and fluid dynamical effects of turbulent reacting flow for non-premixed diffusion flames. Parametric studies were performed based on a large-scale fire experiment carried out in a 39-m long test hall facility. Several turbulent Prandtl and Schmidt numbers ranging from 0.2 to 0.5, and Smagorinsky constants ranging from 0.18 to 0.23, were investigated. It was found that the temperature and flow field predictions were most accurate with turbulent Prandtl and Schmidt numbers both set to 0.3 and a Smagorinsky constant of 0.2. In addition, by utilising a set of numerically verified key modelling parameters, the smoke filling process was successfully captured by the present LES model.
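
    For context, the constants being varied enter the subgrid model in a simple way: the Smagorinsky constant scales the subgrid viscosity, and the turbulent Prandtl and Schmidt numbers convert that viscosity into subgrid heat and species diffusivities. The sketch below shows only this generic relationship with illustrative numbers; it is not the in-house model described in the record.

      def smagorinsky_nu_t(c_s, delta, strain_rate_mag):
          # Smagorinsky subgrid viscosity: nu_t = (C_s * delta)**2 * |S|.
          return (c_s * delta) ** 2 * strain_rate_mag

      def subgrid_diffusivities(nu_t, pr_t=0.3, sc_t=0.3):
          # Subgrid heat and species diffusivities from the turbulent Prandtl and Schmidt numbers.
          return nu_t / pr_t, nu_t / sc_t

      nu_t = smagorinsky_nu_t(c_s=0.2, delta=0.05, strain_rate_mag=40.0)   # illustrative values
      print(nu_t, subgrid_diffusivities(nu_t))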

  5. Glass sample preparation and performance investigations

    NASA Astrophysics Data System (ADS)

    Johnson, R. Barry

    1992-04-01

    This final report details the work performed under this delivery order from April 1991 through April 1992. The currently available capabilities for integrated optical performance modeling at MSFC for large and complex systems such as AXAF were investigated. The Integrated Structural Modeling (ISM) program developed by Boeing for the U.S. Air Force was obtained and installed on two DECstations 5000 at MSFC. The structural, thermal and optical analysis programs available in ISM were evaluated. As part of the optomechanical engineering activities, technical support was provided in the design of support structure, mirror assembly, filter wheel assembly and material selection for the Solar X-ray Imager (SXI) program. As part of the fabrication activities, a large number of zerodur glass samples were prepared in different sizes and shapes for acid etching, coating and polishing experiments to characterize the subsurface damage and stresses produced by the grinding and polishing operations. Various optical components for AXAF video microscope and the x-ray test facility were also fabricated. A number of glass fabrication and test instruments such as a scatter plate interferometer, a gravity feed saw and some phenolic cutting blades were fabricated, integrated and tested.

  6. Referral and Diagnosis of Developmental Auditory Processing Disorder in a Large, United States Hospital-Based Audiology Service.

    PubMed

    Moore, David R; Sieswerda, Stephanie L; Grainger, Maureen M; Bowling, Alexandra; Smith, Nicholette; Perdew, Audrey; Eichert, Susan; Alston, Sandra; Hilbert, Lisa W; Summers, Lynn; Lin, Li; Hunter, Lisa L

    2018-05-01

    Children referred to audiology services with otherwise unexplained academic, listening, attention, language, or other difficulties are often found to be audiometrically normal. Some of these children receive further evaluation for auditory processing disorder (APD), a controversial construct that assumes neural processing problems within the central auditory nervous system. This study focuses on the evaluation of APD and how it relates to diagnosis in one large pediatric audiology facility. To analyze electronic records of children receiving a central auditory processing evaluation (CAPE) at Cincinnati Children's Hospital, with a broad goal of understanding current practice in APD diagnosis and the test information which impacts that practice. A descriptive, cross-sectional analysis of APD test outcomes in relation to final audiologist diagnosis for 1,113 children aged 5-19 yr receiving a CAPE between 2009 and 2014. Children had a generally high level of performance on the tests used, resulting in marked ceiling effects on about half the tests. Audiologists developed the diagnostic category "Weakness" because of the large number of referred children who clearly had problems, but who did not fulfill the AAA/ASHA criteria for diagnosis of a "Disorder." A "right-ear advantage" was found in all tests for which each ear was tested, irrespective of whether the tests were delivered monaurally or dichotically. However, neither the side nor size of the ear advantage predicted the ultimate diagnosis well. Cooccurrence of CAPE with other learning problems was nearly universal, but neither the number nor the pattern of cooccurring problems was a predictor of APD diagnosis. The diagnostic patterns of individual audiologists were quite consistent. The number of annual assessments decreased dramatically during the study period. A simple diagnosis of APD based on current guidelines is neither realistic, given the current tests used, nor appropriate, as judged by the audiologists providing the service. Methods used to test for APD must recognize that any form of hearing assessment probes both sensory and cognitive processing. Testing must embrace modern methods, including digital test delivery, adaptive testing, referral to normative data, appropriate testing for young children, validated screening questionnaires, and relevant objective (physiological) methods, as appropriate. Audiologists need to collaborate with other specialists to understand more fully the behaviors displayed by children presenting with listening difficulties. To achieve progress, it is essential for clinicians and researchers to work together. As new understanding and methods become available, it will be necessary to sort out together what works and what doesn't work in the clinic, both from a theoretical and a practical perspective. American Academy of Audiology.

  7. The benefits of adaptive parametrization in multi-objective Tabu Search optimization

    NASA Astrophysics Data System (ADS)

    Ghisu, Tiziano; Parks, Geoffrey T.; Jaeggi, Daniel M.; Jarrett, Jerome P.; Clarkson, P. John

    2010-10-01

    In real-world optimization problems, large design spaces and conflicting objectives are often combined with a large number of constraints, resulting in a highly multi-modal, challenging, fragmented landscape. The local search at the heart of Tabu Search, while being one of its strengths in highly constrained optimization problems, requires a large number of evaluations per optimization step. In this work, a modification of the pattern search algorithm is proposed: this modification, based on a Principal Components' Analysis of the approximation set, allows both a re-alignment of the search directions, thereby creating a more effective parametrization, and also an informed reduction of the size of the design space itself. These changes make the optimization process more computationally efficient and more effective - higher quality solutions are identified in fewer iterations. These advantages are demonstrated on a number of standard analytical test functions (from the ZDT and DTLZ families) and on a real-world problem (the optimization of an axial compressor preliminary design).
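
    The re-alignment step amounts to a principal components' analysis of the current approximation set: the eigenvectors of its covariance matrix become the new pattern-search directions, and small eigenvalues flag directions along which the design space can be shrunk. A generic NumPy sketch of that step is given below; it is not the authors' implementation.

      import numpy as np

      def pca_directions(approximation_set):
          # Principal directions of the current approximation set; eigenvalues indicate
          # which directions carry most of the spread and which can be reduced.
          x = np.asarray(approximation_set, dtype=float)
          centered = x - x.mean(axis=0)
          cov = np.cov(centered, rowvar=False)
          eigvals, eigvecs = np.linalg.eigh(cov)
          order = np.argsort(eigvals)[::-1]
          return eigvals[order], eigvecs[:, order]      # columns are the new search directions

      # Toy approximation set lying mostly along one diagonal of a 3-D design space.
      rng = np.random.default_rng(0)
      t = rng.random(50)
      pts = np.column_stack([t, t + 0.02 * rng.standard_normal(50), 0.05 * rng.standard_normal(50)])
      vals, dirs = pca_directions(pts)
      print(vals.round(4))
      print(dirs.round(2))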

  8. Transition of the Laminar Boundary Layer on a Delta Wing with 74 degree Sweep in Free Flight at Mach Numbers from 2.8 to 5.3

    NASA Technical Reports Server (NTRS)

    Chapman, Gary T.

    1961-01-01

    The tests were conducted at Mach numbers from 2.8 to 5.3, with model surface temperatures small compared to boundary-layer recovery temperature. The effects of Mach number, temperature ratio, unit Reynolds number, leading-edge diameter, and angle of attack were investigated in an exploratory fashion. The effect of heat-transfer condition (i.e., wall temperature to total temperature ratio) and Mach number cannot be separated explicitly in free-flight tests. However, the data of the present report, as well as those of NACA TN 3473, were found to be more consistent when plotted versus temperature ratio. Decreasing temperature ratio increased the transition Reynolds number. The effect of unit Reynolds number was small, as was the effect of leading-edge diameter within the range tested. At small values of angle of attack, transition moved forward on the windward surface and rearward on the leeward surface. This trend was reversed at high angles of attack (6 deg to 18 deg). Possible reasons for this are the reduction of crossflow on the windward side and the influence of the lifting vortices on the leeward surface. When the transition results on the 74 deg delta wing were compared to data at similar test conditions for an unswept leading edge, the results bore out earlier research at nearly zero heat transfer; namely, sweep causes a large reduction in the transition Reynolds number.

  9. Reynolds Number Effects on the Stability and Control Characteristics of a Supersonic Transport

    NASA Technical Reports Server (NTRS)

    Owens, L. R.; Wahls, R. A.; Elzey, M. B.; Hamner, M. P.

    2002-01-01

    A High Speed Civil Transport (HSCT) configuration was tested in the National Transonic Facility at the NASA Langley Research Center as part of NASA's High Speed Research Program. A series of tests included longitudinal and lateral/directional studies at transonic and low speed, high-lift conditions across a range of Reynolds numbers from that available in conventional wind tunnels to near flight conditions. Results presented focus on Reynolds number sensitivities of the stability and control characteristics at Mach 0.30 and 0.95 for a complete HSCT aircraft configuration including empennage. The angle of attack where the pitching-moment departure occurred increased with higher Reynolds numbers for both the landing and transonic configurations. The stabilizer effectiveness increased with Reynolds number for both configurations. The directional stability also increased with Reynolds number for both configurations. The landing configuration without forebody chines exhibited a large yawing-moment departure at high angles of attack and zero sideslip that varied with increasing Reynolds numbers. This departure characteristic nearly disappeared when forebody chines were added. The landing configuration's rudder effectiveness also exhibited sensitivities to changes in Reynolds number.

  10. Experience with helium leak and thermal shocks test of SST-1 cryo components

    NASA Astrophysics Data System (ADS)

    Sharma, Rajiv; Nimavat, Hiren; Srikanth, G. L. N.; Bairagi, Nitin; Shah, Pankil; Tanna, V. L.; Pradhan, S.

    2012-11-01

    A steady state superconducting tokamak, SST-1, is presently in its assembly stage at the Institute for Plasma Research. The SST-1 machine has a family of superconducting (SC) coils for both the toroidal field and the poloidal field. An ultra-high-vacuum-compatible vacuum vessel, placed in the bore of the TF coils, houses the plasma facing components. A high vacuum cryostat encloses all the SC coils and the vacuum vessel. A liquid nitrogen (LN2) cooled thermal shield is placed between the vacuum vessel and the SC coils as well as between the cryostat and the SC coils. There are a number of crucial cryogenic components, such as electrical isolators, the 80 K thermal shield, and cryogenic flexible hoses, which have to pass performance validation tests as part of the stringent QA/QC requirements before being incorporated in the main assembly. Individual leak tests of the components at room temperature, as well as after a thermal cycle from 300 K to 77 K, ensure that the final overall system is leak tight. These components include large numbers of electrical isolators for helium as well as LN2 services, flexible bellows and hoses for helium as well as LN2 services, and large numbers of 80 K bubble shields subjected to thermal shock tests. In order to validate the helium leak tightness of these components, we have used a calibrated mass spectrometer leak detector (MSLD) at 300 K, 77 K and 4.2 K. Since it is very difficult to locate leaks that appear only at lower temperatures, e.g. less than 20 K, we have developed different approaches to resolve the issue of such leaks. This paper describes the design of the cryogenic flexible hose, the assembly and couplings for leak testing, the test methods and techniques for thermal cycle tests at 77 K under flow conditions, and leak testing aspects of different cryogenic components. The test results, the problems encountered, and the solution techniques are discussed.

  11. Does Regular Online Testing Enhance Student Learning in the Numerical Sciences? Robust Evidence from a Large Data Set

    ERIC Educational Resources Information Center

    Angus, Simon D.; Watson, Judith

    2009-01-01

    While a number of studies have been conducted on the impact of online assessment and teaching methods on student learning, the field does not seem settled around the promised benefits of such approaches. It is argued that the reason for this state of affairs is that few studies have been able to control for a number of confounding factors in…

  12. The Shock and Vibration Digest. Volume 14, Number 12

    DTIC Science & Technology

    1982-12-01

    Abstract fragments (extraction residue): evaluation of the uses of statistical energy analysis for determining sound transmission performance; measurement and comparison of coupling loss factors; measurements for artificial cracks in mild-steel test pieces; and an improvement of the method of statistical energy analysis using a large number of free-response time histories simultaneously in one analysis.

  13. Application of the stepwise focusing method to optimize the cost-effectiveness of genome-wide association studies with limited research budgets for genotyping and phenotyping.

    PubMed

    Ohashi, J; Clark, A G

    2005-05-01

    The recent cataloguing of a large number of SNPs enables us to perform genome-wide association studies for detecting common genetic variants associated with disease. Such studies, however, generally have limited research budgets for genotyping and phenotyping. It is therefore necessary to optimize the study design by determining the most cost-effective numbers of SNPs and individuals to analyze. In this report we applied the stepwise focusing method, a two-stage design developed by Satagopan et al. (2002) and Saito & Kamatani (2002), to optimize the cost-effectiveness of a genome-wide direct association study using a transmission/disequilibrium test (TDT). The stepwise focusing method consists of two steps: a large number of SNPs are examined in the first focusing step, and then all the SNPs showing a significant P-value are tested again using a larger set of individuals in the second focusing step. In the optimization framework, the numbers of SNPs and families and the significance levels in the first and second steps were treated as the variables to be optimized. Our results showed that the stepwise focusing method achieves a distinct gain of power compared to a conventional method with the same research budget.
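
    As an illustration of the budget bookkeeping behind such an optimization, the following sketch gives a simplified genotyping-cost model for a two-stage design (the trio-family assumption, the per-genotype cost, and the omission of phenotyping costs are simplifications; the paper's exact formulation is not reproduced here):

```python
def two_stage_genotyping_cost(m_snps, n_families_stage1, alpha1,
                              n_families_stage2, cost_per_genotype):
    """Expected genotyping cost of a stepwise-focusing (two-stage) TDT design.

    Stage 1: all m_snps are typed in n_families_stage1 trio families.
    Stage 2: only SNPs significant at alpha1 (roughly m_snps * alpha1
    under the null) are re-typed in n_families_stage2 additional families.
    Each trio family contributes 3 genotypes per SNP.
    """
    stage1 = m_snps * n_families_stage1 * 3 * cost_per_genotype
    stage2 = m_snps * alpha1 * n_families_stage2 * 3 * cost_per_genotype
    return stage1 + stage2

# Example (hypothetical numbers): 500,000 SNPs screened in 200 families at
# alpha1 = 0.01, with surviving SNPs followed up in 800 further families.
print(two_stage_genotyping_cost(500_000, 200, 0.01, 800, 0.01))
```

    Scanning such a cost function, together with a power calculation, over candidate values of alpha1 and the stage sizes is the kind of search the optimization in the paper performs under a fixed budget.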

  14. Comparison of Methods for Xenomonitoring in Vectors of Lymphatic Filariasis in Northeastern Tanzania

    PubMed Central

    Irish, Seth R.; Stevens, William M. B.; Derua, Yahya A.; Walker, Thomas; Cameron, Mary M.

    2015-01-01

    Monitoring Wuchereria bancrofti infection in mosquitoes (xenomonitoring) can play an important role in determining when lymphatic filariasis has been eliminated, or in focusing control efforts. As mosquito infection rates can be low, a method for collecting large numbers of mosquitoes is necessary. Gravid traps collected large numbers of Culex quinquefasciatus in Tanzania, and a collection method that targets mosquitoes that have already fed could result in increased sensitivity in detecting W. bancrofti-infected mosquitoes. The aim of this experiment was to test this hypothesis by comparing U.S. Centers for Disease Control and Prevention (CDC) light traps with CDC gravid traps in northeastern Tanzania, where Cx. quinquefasciatus is a vector of lymphatic filariasis. After an initial study where small numbers of mosquitoes were collected, a second study collected 16,316 Cx. quinquefasciatus in 60 gravid trap-nights and 240 light trap-nights. Mosquitoes were pooled and tested for presence of W. bancrofti DNA. Light and gravid traps collected similar numbers of mosquitoes per trap-night, but the physiological status of the mosquitoes was different. The estimated infection rate in mosquitoes collected in light traps was considerably higher than in mosquitoes collected in gravid traps, so light traps can be a useful tool for xenomonitoring work in Tanzania. PMID:26350454
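
    Because mosquitoes were tested in pools, a per-mosquito infection rate is typically back-calculated from the proportion of positive pools; the sketch below shows the standard equal-pool-size maximum-likelihood estimator (illustrative only; the study's actual pool sizes and estimator may differ):

```python
def pooled_infection_rate(n_positive_pools, n_pools, pool_size):
    """Estimate the per-mosquito infection rate from pooled testing,
    assuming equal-sized pools and independent infections.

    Uses the standard maximum-likelihood estimate
    p = 1 - (1 - P)**(1/k), where P is the proportion of positive pools
    and k is the pool size.
    """
    prop_positive = n_positive_pools / n_pools
    return 1.0 - (1.0 - prop_positive) ** (1.0 / pool_size)

# Hypothetical example: 12 positive pools out of 300 pools of 25 mosquitoes.
print(pooled_infection_rate(12, 300, 25))   # roughly 0.0016 per mosquito
```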

  15. Concurrent Programming Using Actors: Exploiting Large-Scale Parallelism,

    DTIC Science & Technology

    1985-10-07

    Report-form fragment (optical-scan residue): Massachusetts Institute of Technology, Artificial Intelligence Laboratory, 545 Technology Square, Cambridge; G. Agha et al., Concurrent Programming Using Actors: Exploiting Large-Scale Parallelism.

  16. In Vitro Models of Human Toxicity Pathways

    EPA Science Inventory

    For toxicity testing and assessment programs to address the large numbers of substances of potential concern, a paradigm shift in the assessment of chemical hazard and risk is needed that takes advantage of advances in molecular toxicology, computational sciences, and information...

  17. Quantitative Assessment of Neurite Outgrowth in PC12 Cells

    EPA Science Inventory

    In vitro test methods can provide a rapid approach for the screening of large numbers of chemicals for their potential to produce toxicity. In order to identify potential developmental neurotoxicants, assessment of critical neurodevelopmental processes such as neuronal differenti...

  18. INTEGRATED CHEMICAL INFORMATION TECHNOLOGIES APPLIED TO TOXICOLOGY

    EPA Science Inventory

    A central regulatory mandate of the Environmental Protection Agency, spanning many Program Offices and issues, is to assess the potential health and environmental risks of large numbers of chemicals released into the environment, often in the absence of relevant test data. Model...

  19. Integral criteria for large-scale multiple fingerprint solutions

    NASA Astrophysics Data System (ADS)

    Ushmaev, Oleg S.; Novikov, Sergey O.

    2004-08-01

    We propose the definition and analysis of the optimal integral similarity score criterion for large-scale multimodal civil ID systems. First, the general properties of score distributions for genuine and impostor matches for different systems and input devices are investigated. The empirical statistics were taken from real biometric tests. We then carry out the analysis of simultaneous score distributions for a number of combined biometric tests, primarily for multiple fingerprint solutions. Explicit and approximate relations for the optimal integral score, which provides the least value of the FRR while the FAR is predefined, have been obtained. The results of a real multiple fingerprint test show good correspondence with the theoretical results over a wide range of false acceptance and false rejection rates.
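
    A minimal empirical sketch of the threshold-setting idea described above (the analytical relations derived in the paper are not reproduced; the score arrays, the "higher score = more similar" convention, and the quantile-based threshold are illustrative assumptions):

```python
import numpy as np

def frr_at_fixed_far(genuine_scores, impostor_scores, target_far=0.001):
    """Empirical FRR when the threshold is set so the FAR equals target_far.

    Scores are assumed 'higher = more similar'. The threshold is the
    (1 - target_far) quantile of the impostor score distribution.
    """
    genuine = np.asarray(genuine_scores, dtype=float)
    impostor = np.asarray(impostor_scores, dtype=float)
    threshold = np.quantile(impostor, 1.0 - target_far)
    frr = np.mean(genuine < threshold)        # genuine matches rejected
    return threshold, frr

# Example with synthetic, purely illustrative score distributions.
rng = np.random.default_rng(0)
thr, frr = frr_at_fixed_far(rng.normal(10, 2, 10_000),
                            rng.normal(2, 2, 100_000))
print(thr, frr)
```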

  20. The future of large old trees in urban landscapes.

    PubMed

    Le Roux, Darren S; Ikin, Karen; Lindenmayer, David B; Manning, Adrian D; Gibbons, Philip

    2014-01-01

    Large old trees are disproportionate providers of structural elements (e.g. hollows, coarse woody debris), which are crucial habitat resources for many species. The decline of large old trees in modified landscapes is of global conservation concern. Once large old trees are removed, they are difficult to replace in the short term due to typically prolonged time periods needed for trees to mature (i.e. centuries). Few studies have investigated the decline of large old trees in urban landscapes. Using a simulation model, we predicted the future availability of native hollow-bearing trees (a surrogate for large old trees) in an expanding city in southeastern Australia. In urban greenspace, we predicted that the number of hollow-bearing trees is likely to decline by 87% over 300 years under existing management practices. Under a worst case scenario, hollow-bearing trees may be completely lost within 115 years. Conversely, we predicted that the number of hollow-bearing trees will likely remain stable in semi-natural nature reserves. Sensitivity analysis revealed that the number of hollow-bearing trees perpetuated in urban greenspace over the long term is most sensitive to the: (1) maximum standing life of trees; (2) number of regenerating seedlings ha(-1); and (3) rate of hollow formation. We tested the efficacy of alternative urban management strategies and found that the only way to arrest the decline of large old trees requires a collective management strategy that ensures: (1) trees remain standing for at least 40% longer than currently tolerated lifespans; (2) the number of seedlings established is increased by at least 60%; and (3) the formation of habitat structures provided by large old trees is accelerated by at least 30% (e.g. artificial structures) to compensate for short term deficits in habitat resources. Immediate implementation of these recommendations is needed to avert long term risk to urban biodiversity.

  1. The Future of Large Old Trees in Urban Landscapes

    PubMed Central

    Le Roux, Darren S.; Ikin, Karen; Lindenmayer, David B.; Manning, Adrian D.; Gibbons, Philip

    2014-01-01

    Large old trees are disproportionate providers of structural elements (e.g. hollows, coarse woody debris), which are crucial habitat resources for many species. The decline of large old trees in modified landscapes is of global conservation concern. Once large old trees are removed, they are difficult to replace in the short term due to typically prolonged time periods needed for trees to mature (i.e. centuries). Few studies have investigated the decline of large old trees in urban landscapes. Using a simulation model, we predicted the future availability of native hollow-bearing trees (a surrogate for large old trees) in an expanding city in southeastern Australia. In urban greenspace, we predicted that the number of hollow-bearing trees is likely to decline by 87% over 300 years under existing management practices. Under a worst case scenario, hollow-bearing trees may be completely lost within 115 years. Conversely, we predicted that the number of hollow-bearing trees will likely remain stable in semi-natural nature reserves. Sensitivity analysis revealed that the number of hollow-bearing trees perpetuated in urban greenspace over the long term is most sensitive to the: (1) maximum standing life of trees; (2) number of regenerating seedlings ha−1; and (3) rate of hollow formation. We tested the efficacy of alternative urban management strategies and found that the only way to arrest the decline of large old trees requires a collective management strategy that ensures: (1) trees remain standing for at least 40% longer than currently tolerated lifespans; (2) the number of seedlings established is increased by at least 60%; and (3) the formation of habitat structures provided by large old trees is accelerated by at least 30% (e.g. artificial structures) to compensate for short term deficits in habitat resources. Immediate implementation of these recommendations is needed to avert long term risk to urban biodiversity. PMID:24941258

  2. Dimensional indicators of generalized anxiety disorder severity for DSM-V.

    PubMed

    Niles, Andrea N; Lebeau, Richard T; Liao, Betty; Glenn, Daniel E; Craske, Michelle G

    2012-03-01

    For DSM-V, simple dimensional measures of disorder severity will accompany diagnostic criteria. The current studies examine convergent validity and test-retest reliability of two potential dimensional indicators of worry severity for generalized anxiety disorder (GAD): percent of the day worried and number of worry domains. In study 1, archival data from diagnostic interviews from a community sample of individuals diagnosed with one or more anxiety disorders (n = 233) were used to assess correlations between percent of the day worried and number of worry domains with other measures of worry severity (clinical severity rating (CSR), age of onset, number of comorbid disorders, Penn state worry questionnaire (PSWQ)) and DSM-IV criteria (excessiveness, uncontrollability and number of physical symptoms). Both measures were significantly correlated with CSR and number of comorbid disorders, and with all three DSM-IV criteria. In study 2, test-retest reliability of percent of the day worried and number of worry domains were compared to test-retest reliability of DSM-IV diagnostic criteria in a non-clinical sample of undergraduate students (n = 97) at a large west coast university. All measures had low test-retest reliability except percent of the day worried, which had moderate test-retest reliability. Findings suggest that these two indicators capture worry severity, and percent of the day worried may be the most reliable existing indicator. These measures may be useful as dimensional measures for DSM-V. Copyright © 2012 Elsevier Ltd. All rights reserved.

  3. Genetic variability and resistance of cultivars of cowpea [Vigna unguiculata (L.) Walp] to cowpea weevil (Callosobruchus maculatus Fabr.).

    PubMed

    Vila Nova, M X; Leite, N G A; Houllou, L M; Medeiros, L V; Lira Neto, A C; Hsie, B S; Borges-Paluch, L R; Santos, B S; Araujo, C S F; Rocha, A A; Costa, A F

    2014-03-31

    The cowpea weevil (Callosobruchus maculatus Fabr.) is the most destructive pest of the cowpea bean; it reduces seed quality. To control this pest, resistance testing combined with genetic analysis using molecular markers has been widely applied in research. Among the markers that show reliable results, the inter-simple sequence repeats (ISSRs) (microsatellites) are noteworthy. This study was performed to evaluate the resistance of 27 cultivars of cowpea bean to cowpea weevil. We tested the resistance related to the genetic variability of these cultivars using ISSR markers. To analyze the resistance of cultivars to weevil, a completely randomized test design with 4 replicates and 27 treatments was adopted. Five pairs of the insect were placed in 30 grains per replicate. Analysis of variance showed that the number of eggs and emerged insects were significantly different in the treatments, and the means were compared by statistical tests. The analysis of the large genetic variability in all cultivars resulted in the formation of different groups. The test of resistance showed that the cultivar Inhuma was the most sensitive to both number of eggs and number of emerged adults, while the TE96-290-12-G and MNC99-537-F4 (BRS Tumucumaque) cultivars were the least sensitive to the number of eggs and the number of emerged insects, respectively.

  4. The NASA Glenn Research Center's Hypersonic Tunnel Facility. Chapter 16

    NASA Technical Reports Server (NTRS)

    Woike, Mark R.; Willis, Brian P.

    2001-01-01

    The NASA Glenn Research Center's Hypersonic Tunnel Facility (HTF) is a blow-down, freejet wind tunnel that provides true enthalpy flight conditions for Mach numbers of 5, 6, and 7. The Hypersonic Tunnel Facility is unique due to its large scale and use of non-vitiated (clean air) flow. A 3MW graphite core storage heater is used to heat the test medium of gaseous nitrogen to the high stagnation temperatures required to produce true enthalpy conditions. Gaseous oxygen is mixed into the heated test flow to generate the true air simulation. The freejet test section is 1.07m (42 in.) in diameter and 4.3m (14 ft) in length. The facility is well suited for the testing of large scale airbreathing propulsion systems. In this chapter, a brief history and detailed description of the facility are presented along with a discussion of the facility's application towards hypersonic airbreathing propulsion testing.

  5. Wall interference correction improvements for the ONERA main wind tunnels

    NASA Technical Reports Server (NTRS)

    Vaucheret, X.

    1982-01-01

    This paper describes improved methods of calculating wall interference corrections for the ONERA large windtunnels. The mathematical description of the model and its sting support have become more sophisticated. An increasing number of singularities is used until an agreement between theoretical and experimental signatures of the model and sting on the walls of the closed test section is obtained. The singularity decentering effects are calculated when the model reaches large angles of attack. The porosity factor cartography on the perforated walls deduced from the measured signatures now replaces the reference tests previously carried out in larger tunnels. The porosity factors obtained from the blockage terms (signatures at zero lift) and from the lift terms are in good agreement. In each case (model + sting + test section), wall corrections are now determined, before the tests, as a function of the fundamental parameters M, CS, CZ. During the windtunnel tests, the corrections are quickly computed from these functions.

  6. Highlights of Conference on Using Student Test Scores to Measure Teacher Performance: The State of the Art in Research and Practice

    ERIC Educational Resources Information Center

    Guarino, Cassandra; Reckase, Mark D.; Wooldridge, Jeffrey M.

    2013-01-01

    The push for accountability in public schooling has extended to the measurement of teacher performance, accelerated by federal efforts through Race to the Top. Currently, a large number of states and districts across the country are computing measures of teacher performance based on the standardized test scores of their students and using them to…

  7. Blue Whale Behavioral Response Study and Field Testing of the New Bioacoustic Probe

    DTIC Science & Technology

    2011-09-30

    Abstract fragment (extraction residue): Distribution Statement A, approved for public release, distribution is unlimited. Contact: jhildebrand@ucsd.edu; Award Number N000140811221. Long-term goals, Task 1 (Blue Whale Behavioral Response Study): the behavioral response of large whales to commercial shipping and other low-frequency anthropogenic sound is not well understood; the report refers to the PCAD model (NRC 2005).

  8. Feeding damage to plants increases with plant size across 21 Brassicaceae species.

    PubMed

    Schlinkert, Hella; Westphal, Catrin; Clough, Yann; Ludwig, Martin; Kabouw, Patrick; Tscharntke, Teja

    2015-10-01

    Plant size is a major predictor of ecological functioning. We tested the hypothesis that feeding damage to plants increases with plant size, as the conspicuousness of large plants makes resource finding and colonisation easier. Further, large plants can be attractive to herbivores, as they offer greater amounts and ranges of resources and niches, but direct evidence from experiments testing size effects on feeding damage and consequently on plant fitness is so far missing. We established a common garden experiment with a plant size gradient (10-130 cm height) using 21 annual Brassicaceae species, and quantified plant size, biomass and number of all aboveground components (flowers, fruits, leaves, stems) and their proportional feeding damage. Plant reproductive fitness was measured using seed number, 1000 seed weight and total seed weight. Feeding damage to the different plant components increased with plant size or component biomass, with mean damage levels being approximately 30 % for flowers, 5 % for fruits and 1 % for leaves and stems. Feeding damage affected plant reproductive fitness depending on feeding damage type, with flower damage having the strongest effect, shown by greatly reduced seed number, 1000 seed weight and total seed weight. Finally, we found an overall negative effect of plant size on 1000 seed weight, but not on seed number and total seed weight. In conclusion, being conspicuous and attractive to herbivores causes greater flower damage leading to higher fitness costs for large plants, which might be partly counterbalanced by benefits such as enhanced competitive/compensatory abilities or more mutualistic pollinator visits.

  9. Blockage Testing in the NASA Glenn 225 Square Centimeter Supersonic Wind Tunnel

    NASA Technical Reports Server (NTRS)

    Sevier, Abigail; Davis, David; Schoenenberger, Mark

    2017-01-01

    A feasibility study is in progress at NASA Glenn Research Center to implement a magnetic suspension and balance system in the 225 sq cm Supersonic Wind Tunnel for the purpose of testing the dynamic stability of blunt bodies. An important area of investigation in this study was determining the optimum size of the model and the iron spherical core inside of it. In order to minimize the required magnetic field and thus the size of the magnetic suspension system, it was determined that the test model should be as large as possible. Blockage tests were conducted to determine the largest possible model that would allow for tunnel start at Mach 2, 2.5, and 3. Three different forebody model geometries were tested at different Mach numbers, axial locations in the tunnel, and in both a square and axisymmetric test section. Experimental results showed that different model geometries produced more varied results at higher Mach numbers. It was also shown that testing closer to the nozzle allowed larger models to start compared with testing near the end of the test section. Finally, allowable model blockage was larger in the axisymmetric test section compared with the square test section at the same Mach number. This testing answered key questions posed by the feasibility study and will be used in the future to dictate model size and performance required from the magnetic suspension system.

  10. Comparing Standard and Selective Degradation DNA Extraction Methods: Results from a Field Experiment with Sexual Assault Kits.

    PubMed

    Campbell, Rebecca; Pierce, Steven J; Sharma, Dhruv B; Shaw, Jessica; Feeney, Hannah; Nye, Jeffrey; Schelling, Kristin; Fehler-Cabral, Giannina

    2017-01-01

    A growing number of U.S. cities have large numbers of untested sexual assault kits (SAKs) in police property facilities. Testing older kits and maintaining current case work will be challenging for forensic laboratories, creating a need for more efficient testing methods. We evaluated selective degradation methods for DNA extraction using actual case work from a sample of previously unsubmitted SAKs in Detroit, Michigan. We randomly assigned 350 kits to either standard or selective degradation testing methods and then compared DNA testing rates and CODIS entry rates between the two groups. Continuation-ratio modeling showed no significant differences, indicating that the selective degradation method had no decrement in performance relative to customary methods. Follow-up equivalence tests indicated that CODIS entry rates for the two methods could differ by more than ±5%. Selective degradation methods required less personnel time for testing and scientific review than standard testing. © 2016 American Academy of Forensic Sciences.
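
    The abstract does not give the details of the equivalence test; purely as an illustration of the general idea, the sketch below applies two one-sided z-tests (TOST) to the difference of two proportions with a +/-5% margin (a normal approximation, not the continuation-ratio model used in the study; the counts in the example are hypothetical):

```python
import math
from statistics import NormalDist

def tost_two_proportions(x1, n1, x2, n2, margin=0.05):
    """Two one-sided z-tests for equivalence of two proportions.

    Returns the larger of the two one-sided p-values; equivalence within
    +/- margin is concluded when this value is below the chosen alpha.
    """
    p1, p2 = x1 / n1, x2 / n2
    diff = p1 - p2
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    z_lower = (diff + margin) / se     # H0: diff <= -margin
    z_upper = (diff - margin) / se     # H0: diff >= +margin
    p_lower = 1 - NormalDist().cdf(z_lower)
    p_upper = NormalDist().cdf(z_upper)
    return max(p_lower, p_upper)

# Hypothetical example: CODIS entry in 60 of 175 kits vs 55 of 175 kits.
print(tost_two_proportions(60, 175, 55, 175))
```

    A large TOST p-value, as in this illustrative example, means equivalence within the chosen margin cannot be claimed, which mirrors the abstract's statement that the two methods' CODIS entry rates could still differ by more than +/-5%.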

  11. Modification of NASA Langley 8 foot high temperature tunnel to provide a unique national research facility for hypersonic air-breathing propulsion systems

    NASA Technical Reports Server (NTRS)

    Kelly, H. N.; Wieting, A. R.

    1984-01-01

    A planned modification of the NASA Langley 8-Foot High Temperature Tunnel to make it a unique national research facility for hypersonic air-breathing propulsion systems is described, and some of the ongoing supporting research for that modification is discussed. The modification involves: (1) the addition of an oxygen-enrichment system which will allow the methane-air combustion-heated test stream to simulate air for propulsion testing; and (2) supplemental nozzles to expand the test simulation capability from the current nominal Mach number of 7.0 to include Mach numbers 3.0, 4.5, and 5.0. Detailed design of the modifications is currently underway, and the modified facility is scheduled to be available for tests of large scale propulsion systems by mid-1988.

  12. Full-scale flammability test data for validation of aircraft fire mathematical models

    NASA Technical Reports Server (NTRS)

    Kuminecz, J. F.; Bricker, R. W.

    1982-01-01

    Twenty-five large scale aircraft flammability tests were conducted in a Boeing 737 fuselage at the NASA Johnson Space Center (JSC). The objective of this test program was to provide a data base on the propagation of large scale aircraft fires to support the validation of aircraft fire mathematical models. Variables in the test program included cabin volume, amount of fuel, fuel pan area, fire location, airflow rate, and cabin materials. A number of tests were conducted with jet A-1 fuel only, while others were conducted with various Boeing 747 type cabin materials. These included urethane foam seats, passenger service units, stowage bins, and wall and ceiling panels. Two tests were also included using special urethane foam and polyimide foam seats. Tests were conducted with each cabin material individually, with various combinations of these materials, and finally, with all materials in the cabin. The data include information obtained from approximately 160 locations inside the fuselage.

  13. Psychometrics behind Computerized Adaptive Testing.

    PubMed

    Chang, Hua-Hua

    2015-03-01

    The paper provides a survey of 18 years' progress that my colleagues, students (both former and current) and I made in a prominent research area in Psychometrics: Computerized Adaptive Testing (CAT). We start with a historical review of the establishment of a large-sample foundation for CAT. It is worth noting that the asymptotic results were derived under the framework of Martingale Theory, a very theoretical area of Probability Theory, which may seem unrelated to educational and psychological testing. In addition, we address a number of issues that emerged from large-scale implementation and show how theoretical work can help solve these problems. Finally, we propose that CAT technology can be very useful to support individualized instruction on a mass scale. We show that even paper-and-pencil based tests can be made adaptive to support classroom teaching.

  14. Marine Hydrokinetic (MHK) Energy Conversion Research at UNH: From Fundamental Studies of Hydrofoil Sections, to Moderate Reynolds Number Turbine Tests in a Tow Tank, to Open Water Deployments at Tidal Energy Test Sites (Invited)

    NASA Astrophysics Data System (ADS)

    Wosnik, M.; Bachant, P.; Nedyalkov, I.; Rowell, M.; Dufresne, N.; Lyon, V.

    2013-12-01

    We report on research related to MHK turbines at the Center for Ocean Renewable Energy (CORE) at the University of New Hampshire (UNH). The research projects span various scales, levels of complexity, and environments: from fundamental studies of hydrofoil sections in a high speed water tunnel, to moderate Reynolds number turbine tests with inflow and wake studies in a large cross-section tow tank, to deployments of highly instrumented process models at tidal energy test sites in New England. A concerted effort over the past few years has brought significant new research infrastructure for marine hydrokinetic energy conversion online at UNH-CORE. It includes: a high-speed cavitation tunnel with independent control of velocity and pressure; a highly accurate tow mechanism, turbine test bed and wake traversing system for the 3.7m x 2.4m cross-section UNH tow tank; and a 10.7m x 3.0m tidal energy test platform which can accommodate turbines up to 1.5m in diameter, for deployments at the UNH-CORE Tidal Energy Test Site in Great Bay Estuary, NH, a sheltered 'nursery site' suitable for intermediate scale tidal energy conversion device testing with peak currents typically above 2 m/s during each tidal cycle. Further, a large boundary layer wind tunnel, the new UNH Flow Physics Facility (W 6.0 m x H 2.7 m x L 72 m), is being used for detailed turbine wake studies, producing data and insight also applicable to MHK turbines in low Froude number deployments. Bi-directional hydrofoils, which perform equally well in either flow direction and could avoid the use of complex and maintenance-intensive yaw or blade pitch mechanisms, are being investigated theoretically, numerically and experimentally. For selected candidate shapes, lift, drag, wake, and cavitation inception/desinence are measured. When combined with a cavitation inception model for MHK turbines, this information can be used to prescribe turbine design/operational parameters. Experiments were performed with a 1m diameter and 1m tall three-bladed cross-flow axis turbine (UNH RVAT) in a tow tank. For cross-flow axis turbines, hydrofoil performance remains Reynolds number dependent at intermediate scales due to the large range of angles of attack encountered during turbine rotation. The experiments, with turbine diameter Reynolds numbers ReD = 0.5 x 10^5 to 2.0 x 10^6, were aimed at providing detailed data for model comparison at significantly higher Reynolds numbers than previously available. Measurements include rotor power, thrust, tip speed ratio, and detailed maps of mean flow and turbulence components in the near-wake. Mechanical exergy efficiency was calculated from power and drag measurements using an actuator disk approach. The spatial and temporal resolutions of different flow measurement techniques (ADCP, ADV, PIV) were systematically characterized. Finally, Reynolds-averaged Navier-Stokes (RANS) simulations were performed to assess their ability to predict the experimental results. A scaled version of a mixer-ejector hydrokinetic turbine, with a specially designed shroud that promotes wake mixing to enable increased mass flow through the turbine rotor, was evaluated experimentally at the UNH Tidal Energy Test Site in Great Bay Estuary, NH and in Muskeget Channel, MA. State-of-the-art instrumentation was used to measure the tidal energy resource and turbine wake flow velocities, turbine power extraction, test platform loadings and platform motion induced by sea state.

  15. Examination of ceramic restoration adhesive coverage in cusp-replacement premolar using acoustic emission under fatigue testing.

    PubMed

    Chang, Yen-Hsiang; Yu, Jin-Jie; Lin, Chun-Li

    2014-12-13

    This study investigates CAD/CAM ceramic cusp-replacing restoration resistance with and without buccal cusp replacement under static and dynamic cyclic loads, monitored using the acoustic emission (AE) technique. The cavity was designed in the shape of a typical MODP (mesial-occlusal-distal-palatal) restoration failure in which the palatal cusp has been lost. Two ceramic restorations [without coverage (WOC) and with buccal cuspal coverage (WC), with a 2.0 mm reduction in cuspal height] were prepared to perform the fracture and fatigue tests with normal (200 N) and high (600 N) occlusal forces. The load versus AE signals in the fracture and fatigue tests were recorded to evaluate the restored tooth failure resistance. The results showed that non-significant differences in load value in the fracture test and the accumulated number of AE signals under normal occlusal force (200 N) in the fatigue test were found between restorations with and without buccal cuspal coverage. The first AE activity occurring for the WOC restoration was lower than that for the WC restoration in the fracture test. The number of AE signals increased with the cyclic load number. The accumulated number of AE signals for the WOC restoration was 187, higher than that (85) for the WC restoration under 600 N in the fatigue test. The AE technique and fatigue tests employed in this study were used as an assessment tool to evaluate the resistances in large CAD/CAM ceramic restorations. Non-significant differences in the tested fracture loads and accumulated number of AE signals under normal occlusal force (200 N) between different restorations indicated that aggressive treatment (with coverage preparation) in palatal cusp-replacing ceramic premolars requires more attention for preserving and protecting the remaining tooth.

  16. Screech tones from free and ducted supersonic jets

    NASA Technical Reports Server (NTRS)

    Tam, C. K. W.; Ahuja, K. K.; Jones, R. R., III

    1993-01-01

    The dependence of the instability wave spectrum on azimuthal mode number, the jet to ambient gas temperature ratio, and the jet Mach number is studied. It is shown that the switch of the dominant screech mode (axisymmetric to helical/flapping) as Mach number increases is due to the switch in dominance of the corresponding mode of instability waves. Super-resonance can occur when the feedback loop is powered by the most amplified instability wave. It is suggested that the large amplitude pressure fluctuations and tone in the test cells are generated by super-resonance.

  17. Development of concrete QC/QA specifications for highway construction in Kentucky.

    DOT National Transportation Integrated Search

    2001-08-01

    There is a growing trend toward quality-based specifications in highway construction. A large number of quality control/quality assurance (QC/QA) specifications shift the responsibility of day-to-day testing from the state DOH to the contractor. This...

  18. Idaho storm warning system operational test

    DOT National Transportation Integrated Search

    2000-12-01

    The Storm Warning Project was initiated in 1993 as a result of a large number of serious traffic crashes that occurred during periods of low visibility on I-84 in southeastern Idaho between 1988 and 1993. The purpose of the project was to determine i...

  19. Overview of ToxCast™

    EPA Science Inventory

    In 2007, EPA launched ToxCast™ in order to develop a cost-effective approach for prioritizing the toxicity testing of large numbers of chemicals in a short period of time. Using data from state-of-the-art high throughput screening (HTS) bioassays developed in the pharmaceutical i...

  20. Experimental Investigation of Gauge Widening and Rail Restraint Characteristics

    DOT National Transportation Integrated Search

    1984-11-01

    Gauge widening resulting from a loss of adequate rail restraint is one of the major track failure modes and the cause of a large number of derailments. A recent field and laboratory test program conducted by the Transportation Systems Center aimed at...

  1. Phenotypic screening for developmental neurotoxicity: mechanistic data at the level of the cell

    EPA Science Inventory

    There are large numbers of environmental chemicals with little or no available information on their toxicity, including developmental neurotoxicity. Because of the resource-intensive nature of traditional animal tests, high-throughput (HTP) methods that can rapidly evaluate chemi...

  2. Nuclear Misinformation

    ERIC Educational Resources Information Center

    Ford, Daniel F.; Kendall, Henry W.

    1975-01-01

    Many scientists feel that research into nuclear safety has been diverted or distorted, and the results of the research concealed or inaccurately reported on a large number of occasions. Of particular concern have been the emergency cooling systems which have not, as yet, been adequately tested. (Author/MA)

  3. Measurements of the Absorption by Auditorium SEATING—A Model Study

    NASA Astrophysics Data System (ADS)

    BARRON, M.; COLEMAN, S.

    2001-01-01

    One of several problems with seat absorption is that only small numbers of seats can be tested in standard reverberation chambers. One method proposed for reverberation chamber measurements involves extrapolation when the absorption coefficient results are applied to actual auditoria. Model seat measurements in an effectively large model reverberation chamber have allowed the validity of this extrapolation to be checked. The alternative barrier method for reverberation chamber measurements was also tested and the two methods were compared. The effect on the absorption of row-row spacing as well as absorption by small numbers of seating rows was also investigated with model seats.
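
    For context, absorption measurements of this kind rest on the standard reverberation-chamber relation between reverberation time and equivalent absorption area; the sketch below follows that usual relation (air-absorption corrections and the paper's model-scale adjustments are omitted, and the example numbers are purely illustrative):

```python
def seat_absorption_area(volume_m3, rt_empty_s, rt_with_seats_s,
                         speed_of_sound=343.0):
    """Equivalent sound absorption area added by the test seats.

    A = 55.3 * V / c * (1/T_with - 1/T_empty), following the usual
    reverberation-chamber relation; dividing by the seat plan area then
    gives an absorption coefficient per unit floor area.
    """
    return 55.3 * volume_m3 / speed_of_sound * (
        1.0 / rt_with_seats_s - 1.0 / rt_empty_s)

# Example: a 200 m^3 chamber whose reverberation time drops from 6.0 s
# to 3.5 s when a block of seats is installed.
print(seat_absorption_area(200.0, 6.0, 3.5))   # about 3.8 m^2
```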

  4. Influence of free-stream disturbances on boundary-layer transition

    NASA Technical Reports Server (NTRS)

    Harvey, W. D.

    1978-01-01

    Considerable experimental evidence exists which shows that free stream disturbances (the ratio of root-mean-square pressure fluctuations to mean values) in conventional wind tunnels increase with increasing Mach number at low supersonic to moderate hypersonic speeds. In addition to local conditions, the free stream disturbance level influences transition behavior on simple test models. Based on this observation, existing noise transition data obtained in the same test facility were correlated for a large number of reference sharp cones and flat plates and are shown to collapse along a single curve. This result is a significant improvement over previous attempts to correlate noise transition data.

  5. Vertical interferometer workstation for testing large spherical optics

    NASA Astrophysics Data System (ADS)

    Truax, B.

    2013-09-01

    The design of an interferometer workstation for the testing of large concave and convex spherical optics is presented. The workstation handles optical components and mounts up to 425 mm in diameter with mass of up to 40 kg with 6 axes of adjustment. A unique method for the implementation of focus, roll and pitch was used allowing for extremely precise adjustment. The completed system includes transmission spheres with f-numbers from f/1.6 to f/0.82 incorporating reference surface diameters of up to 306 mm and surface accuracies of better than 63 nm PVr. The design challenges and resulting solutions are discussed. System performance results are presented.

  6. High-Fidelity PIV of a Naturally Grown High Reynolds Number Turbulent Boundary Layer

    NASA Astrophysics Data System (ADS)

    Biles, Drummond; White, Chris; Klewicki, Joeseph

    2017-11-01

    High-fidelity particle image velocimetry data acquired in the Flow Physics Facility (FPF) at the University of New Hampshire is presented. Having a test section length of 72m, the FPF employs the ``big and slow'' approach to obtain well-resolved turbulent boundary layer measurements at high Reynolds number. We report on PIV measurements acquired in the streamwise-wall-normal plane at a downstream position 59m from the test-section inlet over the friction Reynolds number range 7000 < Reτ < 15000 . Local flow tracer seeding is employed through a wall-mounted slot fed by a large volume plenum located 13.4m upstream of the PIV measurement station. Both time-independent and time-dependent turbulent flow statistics are presented and compared to existing data.

  7. Wind-tunnel/flight correlation study of aerodynamic characteristics of a large flexible supersonic cruise airplane (XB-70-1). 3: A comparison between characteristics predicted from wind-tunnel measurements and those measured in flight

    NASA Technical Reports Server (NTRS)

    Arnaiz, H. H.; Peterson, J. B., Jr.; Daugherty, J. C.

    1980-01-01

    A program was undertaken by NASA to evaluate the accuracy of a method for predicting the aerodynamic characteristics of large supersonic cruise airplanes. This program compared predicted and flight-measured lift, drag, angle of attack, and control surface deflection for the XB-70-1 airplane for 14 flight conditions with a Mach number range from 0.76 to 2.56. The predictions were derived from the wind-tunnel test data of a 0.03-scale model of the XB-70-1 airplane fabricated to represent the aeroelastically deformed shape at a 2.5 Mach number cruise condition. Corrections for shape variations at the other Mach numbers were included in the prediction. For most cases, differences between predicted and measured values were within the accuracy of the comparison. However, there were significant differences at transonic Mach numbers. At a Mach number of 1.06 differences were as large as 27 percent in the drag coefficients and 20 deg in the elevator deflections. A brief analysis indicated that a significant part of the difference between drag coefficients was due to the incorrect prediction of the control surface deflection required to trim the airplane.

  8. Act on Numbers: Numerical Magnitude Influences Selection and Kinematics of Finger Movement

    PubMed Central

    Rugani, Rosa; Betti, Sonia; Ceccarini, Francesco; Sartori, Luisa

    2017-01-01

    In the past decade hand kinematics has been reliably adopted for investigating cognitive processes and disentangling debated topics. One of the most controversial issues in numerical cognition literature regards the origin – cultural vs. genetically driven – of the mental number line (MNL), oriented from left (small numbers) to right (large numbers). To date, the majority of studies have investigated this effect by means of response times, whereas studies considering more culturally unbiased measures such as kinematic parameters are rare. Here, we present a new paradigm that combines a "free response" task with the kinematic analysis of movement. Participants were seated in front of two little soccer goals placed on a table, one on the left and one on the right side. They were presented with left- or right-directed arrows and they were instructed to kick a small ball with their right index toward the goal indicated by the arrow. In a few test trials, participants were also presented with a small (2) or a large (8) number, and they were allowed to choose the kicking direction. Participants performed more left responses with the small number and more right responses with the large number. The whole kicking movement was segmented into two temporal phases in order to allow a fine-grained analysis of hand kinematics. The Kick Preparation and Kick Finalization phases were selected on the basis of peak trajectory deviation from the virtual midline between the two goals. Results show an effect of both small and large numbers on action execution timing. Participants were faster to finalize the action when responding to small numbers toward the left and to large numbers toward the right. Here, we provide the first experimental demonstration that highlights how numerical processing affects action execution in a new and not-overlearned context. The employment of this innovative and unbiased paradigm will permit disentangling the role of nature and culture in shaping the direction of the MNL and the role of fingers in the acquisition of numerical skills. Last but not least, similar paradigms will allow researchers to determine how cognition can influence action execution. PMID:28912743
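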

  9. Appreciating Uncertainty and Personal Preference in Genetic Testing.

    PubMed

    Kadlac, Adam

    2015-01-01

    Genetic testing seems to hold out hope for the cure of a number of debilitating conditions. At the same time, many people fear the information that genetic testing can make available. In this commentary, I argue that as of now, the nature of the information revealed in such tests should lead to cautious views about the value of genetic testing. Moreover, I suggest that our overall views about such testing should account for the fact that individuals place different sorts of value on the possession of their own genetic information. As a result, we should largely defer to personal preference in thinking about the propriety of genetic testing.

  10. Experimental Investigation of Jet-Induced Mixing of a Large Liquid Hydrogen Storage Tank

    NASA Technical Reports Server (NTRS)

    Lin, C. S.; Hasan, M. M.; Vandresar, N. T.

    1994-01-01

    Experiments have been conducted to investigate the effect of fluid mixing on the depressurization of a large liquid hydrogen storage tank. The test tank is approximately ellipsoidal, having a volume of 4.89 m(exp 3) and an average wall heat flux of 4.2 W/m(exp 2) due to external heat input. A mixer unit was installed near the bottom of the tank to generate an upward directed axial jet flow normal to the liquid-vapor interface. Mixing tests were initiated after achieving thermally stratified conditions in the tank either by the introduction of hydrogen gas into the tank or by self-pressurization due to ambient heat leak through the tank wall. The subcooled liquid jet directed towards the liquid-vapor interface by the mixer induced vapor condensation and caused a reduction in tank pressure. Tests were conducted at two jet submergence depths for jet Reynolds numbers from 80,000 to 495,000 and Richardson numbers from 0.014 to 0.52. Results show that the rate of tank pressure change is controlled by the competing effects of subcooled jet flow and the free convection boundary layer flow due to external tank wall heating. It is shown that existing correlations for mixing time and vapor condensation rate based on small scale tanks may not be applicable to large scale liquid hydrogen systems.
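
    For reference, the sketch below forms a jet Reynolds number and a thermal Richardson number in the way such non-dimensional groups are commonly defined (the exact definitions and length scales used in the report are not stated in the abstract, so both forms, and the example property values, are assumptions):

```python
def jet_reynolds(density, jet_velocity, jet_diameter, dynamic_viscosity):
    """Re = rho * U_j * d / mu for the mixing jet."""
    return density * jet_velocity * jet_diameter / dynamic_viscosity

def jet_richardson(g, thermal_expansion, delta_T, length_scale, jet_velocity):
    """One common thermal form, Ri = g * beta * dT * L / U_j**2
    (assumed definition; the report may use a different length scale)."""
    return g * thermal_expansion * delta_T * length_scale / jet_velocity ** 2

# Example with illustrative liquid-hydrogen-like property values.
print(jet_reynolds(70.8, 0.5, 0.02, 1.3e-5))   # on the order of 5e4
```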

  11. Impact sensitivity test of liquid energetic materials

    NASA Astrophysics Data System (ADS)

    Tiutiaev, A.; Dolzhikov, A.; Zvereva, I.

    2017-10-01

    This paper presents a new experimental method for evaluating impact sensitivity. A large number of studies have shown that the probability of initiating an explosion in a liquid explosive by impact depends on its chemical nature and on various external characteristics, and that the sensitivity of a liquid explosive containing gas bubbles is many times higher than that of the same liquid without bubbles; in this case, local chemical reaction foci are formed by compression and heating of the gas inside the bubbles. Because gas bubbles are easily generated in a liquid by convection, wave motion, shock, and similar disturbances, it is necessary to develop methods for determining the impact sensitivity of liquid explosives and for studying the ignition of explosives containing bubbles. For the experimental investigation, the well-known impact machine and the so-called appliance 1 were used. Instead of the metal cup used in the standard method, a polyurethane foam cylindrical container holding the liquid explosive was used; this container deforms easily under impact. A large number of tests with different liquid explosives were carried out. Testing liquid explosives for impact sensitivity in appliance 1 with the polyurethane foam container was found to reflect the real mechanical sensitivity more closely, owing to the small loss of impact energy to deformation of a metal cup, and to give better differentiation of liquid explosive sensitivity owing to the higher resolution of the method.

  12. Upper wing surface boundary layer measurements and static aerodynamic data obtained on a 0.015-scale model (42-0) or the SSV orbiter configuration 140A/B in the LTV HSWT at a Mach number of 4.6 (LA58)

    NASA Technical Reports Server (NTRS)

    Ball, J. W.; Lindahl, R. H.

    1976-01-01

    The purpose of the test was to investigate the nature of the Orbiter boundary layer characteristics at angles of attack from -4 to 32 degrees at a Mach number of 4.6. The effect of large grit, employed as transition strips, on both the nature of the boundary layer and the force and moment characteristics was investigated, along with the effects of large negative elevon deflection on lee-side separation. In addition, laminar and turbulent boundary layer separation phenomena which could cause asymmetric flow separation were investigated.

  13. Progress made in the construction of giant airplanes in Germany during the war

    NASA Technical Reports Server (NTRS)

    Baumann, A

    1920-01-01

    The construction of giant airplanes was begun in Germany in August, 1914. The tables annexed here show that a large number of airplanes weighing up to 15.5 tons were constructed and tested in Germany during the War, and it is certain that no other country turned out airplanes of this weight nor in such large numbers. An examination of the tables shows that by the end of the War all the manufacturers had arrived at a well-defined type, namely an airplane of about 12 tons with four engines of 260 horsepower each. The aircraft listed here are discussed with regard to useful weight and aerodynamic qualities.

  14. Development of a flash, bang, and smoke simulation of a shell burst

    NASA Technical Reports Server (NTRS)

    Williamson, F. R.; Kinney, J. F.; Wallace, T. V.

    1982-01-01

    A large number of experiments (cue test firings) were performed in the definition of the cue concepts and packaging configurations. A total of 344 of these experiments were recorded with instrumentation photography to allow a quantitative analysis of the smoke cloud to be made as a function of time. These analyses were predominantly made using a short test site. Supplementary long range visibility tests were conducted to insure the required 3 kilometer visibility of the smoke signature.

  15. Comparison of hemagglutination inhibition test and ELISA in quantification of antibodies to egg drop syndrome virus.

    PubMed

    Raj, G Dhinakar; Ratnapraba, S; Matheswaran, K; Nachimuthu, K

    2004-01-01

    A single-serum dilution ELISA for egg drop syndrome (EDS) virus-specific antibodies was developed. In testing 425 chicken sera it was found to have a 93.6% sensitivity and 98.7% specificity relative to a hemagglutination inhibition (HI) test. The correlation coefficient for ELISA and HI titers was 0.793. The ELISA was efficacious in quantification of both vaccinal and infection antibodies and could routinely be used for screening large numbers of field sera.
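
    A minimal sketch of how sensitivity and specificity relative to a reference test such as HI are computed from paired positive/negative calls (the inputs in the example are hypothetical):

```python
def sensitivity_specificity(test_positive, reference_positive):
    """Sensitivity and specificity of a test relative to a reference test.

    Both arguments are equal-length sequences of booleans, one entry per
    serum sample (True = positive call).
    """
    pairs = list(zip(test_positive, reference_positive))
    tp = sum(t and r for t, r in pairs)
    tn = sum((not t) and (not r) for t, r in pairs)
    fp = sum(t and (not r) for t, r in pairs)
    fn = sum((not t) and r for t, r in pairs)
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return sensitivity, specificity

# Hypothetical example with four sera: ELISA calls vs HI calls.
print(sensitivity_specificity([True, True, False, False],
                              [True, False, False, False]))
```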

  16. Major International R and D Ranges and Test Facilities. Summary of Capabilities

    DTIC Science & Technology

    1990-01-01

    Abstract fragments (extraction residue) describing facility capabilities: a test capability with a maximum impulse of 1500 G and a maximum test item weight of 200 pounds; firing of large numbers of rounds by a weapon; range butts where training practice or training practice tracer ammunition may be fired safely; a facility with a maximum test item weight of 1,000 pounds; and predictions of coherent sound propagation loss in the ocean, useful in estimating the performance of low-frequency passive sonars.

  17. Numerical simulation of a plane turbulent mixing layer, with applications to isothermal, rapid reactions

    NASA Technical Reports Server (NTRS)

    Lin, P.; Pratt, D. T.

    1987-01-01

    A hybrid method has been developed for the numerical prediction of turbulent mixing in a spatially-developing, free shear layer. Most significantly, the computation incorporates the effects of large-scale structures, Schmidt number and Reynolds number on mixing, which have been overlooked in the past. In flow field prediction, large-eddy simulation was conducted by a modified 2-D vortex method with subgrid-scale modeling. The predicted mean velocities, shear layer growth rates, Reynolds stresses, and the RMS of longitudinal velocity fluctuations were found to be in good agreement with experiments, although the lateral velocity fluctuations were overpredicted. In scalar transport, the Monte Carlo method was extended to the simulation of the time-dependent pdf transport equation. For the first time, the mixing frequency in Curl's coalescence/dispersion model was estimated by using Broadwell and Breidenthal's theory of micromixing, which involves Schmidt number, Reynolds number and the local vorticity. Numerical tests were performed for a gaseous case and an aqueous case. Evidence that pure freestream fluids are entrained into the layer by large-scale motions was found in the predicted pdf. Mean concentration profiles were found to be insensitive to Schmidt number, while the unmixedness was higher for higher Schmidt number. Applications were made to mixing layers with isothermal, fast reactions. The predicted difference in product thickness of the two cases was in reasonable quantitative agreement with experimental measurements.
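
    A minimal sketch of the particle-pair exchange at the heart of Curl's coalescence/dispersion model as used in Monte Carlo pdf methods (the mixing frequency is taken as an external input here; the Broadwell-Breidenthal estimate of that frequency used in the paper is not reproduced):

```python
import numpy as np

def curl_mixing_step(phi, omega_mix, dt, rng=None):
    """One Curl coalescence/dispersion step on an ensemble of notional particles.

    phi       : (n_particles,) array of scalar (concentration) values.
    omega_mix : mixing frequency [1/s], supplied externally (e.g. from
                Schmidt-number, Reynolds-number and local-vorticity arguments).
    dt        : time step [s].

    A number of particle pairs proportional to omega_mix * dt is selected at
    random; each selected pair is replaced by its mean value (complete mixing).
    """
    rng = np.random.default_rng() if rng is None else rng
    phi = phi.copy()
    n = len(phi)
    n_pairs = int(0.5 * omega_mix * dt * n)       # pairs mixed this step
    for _ in range(n_pairs):
        i, j = rng.choice(n, size=2, replace=False)
        mean = 0.5 * (phi[i] + phi[j])
        phi[i] = phi[j] = mean
    return phi
```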

  18. Measurements in Flight of the Longitudinal-Stability Characteristics of a Republic YF-84A Airplane (Army Serial No. 45-59488) at High Subsonic Mach Numbers

    NASA Technical Reports Server (NTRS)

    Turner, Howard L.; Cooper, George E.

    1948-01-01

    A brief investigation was made of the longitudinal-stability characteristics of a YF-84A airplane (Army Serial No. 45-79488). The airplane developed a pitching-up tendency at approximately 0.80 Mach number which necessitated large push forces and down-elevator deflections for further increases in speed. In steady turns at 35,000 feet with the center of gravity at 28.3 percent mean aerodynamic chord for normal accelerations up to the maximum test value, the control-force gradients were excessive at Mach numbers over 0.78. Airplane buffeting did not present a serious problem in accelerated or unaccelerated flight at 15,000 and 35,000 feet up to the maximum test Mach number of 0.84. It is believed that excessive control force would be the limiting factor in attaining speeds in excess of 0.84 Mach number, especially at altitudes below 35,000 feet.

  19. Wheelchair Shuttle Test for Assessing Aerobic Fitness in Youth With Spina Bifida: Validity and Reliability

    PubMed Central

    de Groot, Janke F.; Backx, Frank J.G.; Benner, Joyce; Kruitwagen, Cas L.J.J.; Takken, Tim

    2017-01-01

    Abstract Background Testing aerobic fitness in youth is important because of expected relationships with health. Objective The purpose of the study was to estimate the validity and reliability of the Shuttle Ride Test in youth who have spina bifida and use a wheelchair for mobility and sport. Design This study is a validity and reliability study. Methods The Shuttle Ride Test, Graded Wheelchair Propulsion Test, and skill-related fitness tests were administered to 33 participants for the validity study (age = 14.5 ± 3.1 y) and to 28 participants for the reliability study (age = 14.7 ± 3.3 y). Results No significant differences were found between the Graded Wheelchair Propulsion Test and the Shuttle Ride Test for most cardiorespiratory responses. Correlations between the Graded Wheelchair Propulsion Test and the Shuttle Ride Test were moderate to high (r = .55–.97). The variance in peak oxygen uptake (VO2peak) could be predicted for 77% of the participants by height, number of shuttles completed, and weight, with large prediction intervals. High correlations were found between number of shuttles completed and skill-related fitness tests (CI = .73 to −.92). Intraclass correlation coefficients were high (.77–.98), with a smallest detectable change of 1.5 for number of shuttles completed and with coefficients of variation of 6.2% and 6.4% for absolute VO2peak and relative VO2peak, respectively. Conclusions When measuring VO2peak directly by using a mobile gas analysis system, the Shuttle Ride Test is highly valid for testing VO2peak in youth who have spina bifida and use a wheelchair for mobility and sport. The outcome measure of number of shuttles represents aerobic fitness and is also highly correlated with both anaerobic performance and agility. It is not possible to predict VO2peak accurately by using the number of shuttles completed. Moreover, the Shuttle Ride Test is highly reliable in youth with spina bifida, with a good smallest detectable change for the number of shuttles completed. PMID:29029556
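
    One common way to obtain a smallest detectable change such as the one reported above is from the test-retest ICC and the between-subject standard deviation; the sketch below follows that standard derivation (this may not be exactly the computation used in the study, and the example numbers are illustrative):

```python
import math

def smallest_detectable_change(sd_between_subjects, icc, confidence_z=1.96):
    """SDC = z * sqrt(2) * SEM, with SEM = SD * sqrt(1 - ICC).

    sd_between_subjects : standard deviation of the measured scores.
    icc                 : test-retest intraclass correlation coefficient.
    """
    sem = sd_between_subjects * math.sqrt(1.0 - icc)
    return confidence_z * math.sqrt(2.0) * sem

# Illustrative example: scores with SD = 4 shuttles and ICC = 0.98.
print(smallest_detectable_change(4.0, 0.98))   # about 1.6 shuttles
```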

  20. Biological-Warfare Agent Decontamination Efficacy Testing: Large-Scale Chamber mVHP (Trademark) Decontamination System Evaluation for Biological Contamination

    DTIC Science & Technology

    2007-08-01

    [Fragment of a coupon-result table and report-form fields; materials listed include aluminum, Viton, silicone, polyimide (Kapton), Apex, and Stens, with keywords vaporous hydrogen peroxide (mVHP), B. anthracis, G. stearothermophilus, and CARC metal.] ... aircraft, vehicles, and protective and sensitive equipment that encompass a variety of material properties, compositions, and porosities.

  1. Blue Whale Behavioral Response Study & Field Testing of the New Bioacoustic Probe

    DTIC Science & Technology

    2012-09-30

    L. T. Hatch and C. W. Clark. 2003. Variation in humpback whale (Megaptera novaeangliae) song length in relation to low-frequency sound broadcasts... Award Number: N000140811221. Long-Term Goals. Task 1: Blue Whale Behavioral Response Study. The behavioral response of large whales...

  2. Report on the operations of the coal-testing plant of the United States Geological Survey at the Louisiana Purchase Exposition, Saint Louis, Missouri, 1904: Part I.--Field work, classification of coals, chemical work

    USGS Publications Warehouse

    Parker, E.W.; Holmes, J.A.; Campbell, M.R.

    1906-01-01

    Notwithstanding these delays, the committee feels that through the hearty and patriotic cooperation of a large number of manufacturers of apparatus and machinery it was able to collect and install, within a notably short time, a testing plant that was well suited for such pioneer work.

  3. Development and tests of MCP based timing and multiplicity detector for MIPs

    NASA Astrophysics Data System (ADS)

    Feofilov, G.; Kondratev, V.; Stolyarov, O.; Tulina, T.; Valiev, F.; Vinogradov, L.

    2017-01-01

    We present summary of technological developments and tests of the MCP based large area detector aimed at precise timing and charged particles multiplicity measurements. Results obtained in course of these developments of isochronous (simultaneity) precise signal readout, passive summation of 1 ns signals, fast (1 GHz) front-end electronics, miniature vacuum systems, etc. could be potentially interesting for a number of future applications in different fields.

  4. Idaho storm warning system operational test : final report

    DOT National Transportation Integrated Search

    2000-12-01

    The Storm Warning Project was initiated in 1993 as a result of a large number of serious traffic crashes that occurred during periods of low visibility on I-84 in southeastern Idaho between 1988 and 1993. The purpose of the project was to determine i...

  5. Getting physical to fix pharma

    NASA Astrophysics Data System (ADS)

    Connelly, Patrick R.; Vuong, T. Minh; Murcko, Mark A.

    2011-09-01

    Powerful technologies allow the synthesis and testing of large numbers of new compounds, but the failure rate of pharmaceutical R&D remains very high. Greater understanding of the fundamental physical chemical behaviour of molecules could be the key to greatly enhancing the success rate of drug discovery.

  6. Characterization of Human Neural Progenitor Cell Models for Developmental Neurotoxicity Screening

    EPA Science Inventory

    Current testing methods for developmental neurotoxicity (DNT) make evaluation of the effects of large numbers of chemicals impractical and prohibitively expensive. As such, we are evaluating two different human neural progenitor cell (hNPC) models for their utility in screens for...

  7. PROTEOMICS IN ECOTOXICOLOGY: PROTEIN EXPRESSION PROFILING TO SCREEN CHEMICALS FOR ENDOCRINE ACTIVITY

    EPA Science Inventory

    Abstract for poster.

    Current endocrine testing methods are animal intensive and lack the throughput necessary to screen large numbers of environmental chemicals for adverse effects. In this study, Matrix Assisted Laser Desorption/Ionization Time-of-Flight Mass Spectrometry...

  8. TECHNICAL CHALLENGES ASSOCIATED WITH ASSESSING THE IN VITRO PULMONARY TOXICITY OF CARBON NANOTUBES

    EPA Science Inventory

    Nanotechnology continues to produce a large number of diverse engineered nanomaterials (NMs) with novel physicochemical properties for a variety of applications. Test methods that accurately assess/predict the toxicity of NMs are critically needed and it is unclear whether curren...

  9. Application of holography to flow visualization

    NASA Technical Reports Server (NTRS)

    Lee, G.

    1984-01-01

    Laser holographic interferometry is being applied to many different types of aerodynamics problems. These include two- and three-dimensional flows in wind tunnels, ballistic ranges, rotor test chambers and turbine facilities. Density over a large field is measured, and velocity, pressure, and Mach number can be deduced.

  10. Functional Assays and Alternative Species: Using Larval Zebrafish in Developmental Neurotoxicity Screening**

    EPA Science Inventory

    The U.S. Environmental Protection Agency is evaluating methods to screen and prioritize large numbers of chemicals for developmental toxicity. As such, we are exploring a behavioral testing paradigm, which can assess the effect of sublethal and subteratogenic concentrations of de...

  11. Studies of the Variables Affecting Behavior of Larval Zebrafish for Developmental Neurotoxicity Testing

    EPA Science Inventory

    The U.S. Environmental Protection Agency is evaluating methods to screen and prioritize large numbers of chemicals for developmental toxicity. We are exploring methods to detect developmentally neurotoxic chemicals using zebrafish behavior at 6 days of age. The behavioral paradig...

  12. Parameterized examination in econometrics

    NASA Astrophysics Data System (ADS)

    Malinova, Anna; Kyurkchiev, Vesselin; Spasov, Georgi

    2018-01-01

    The paper presents a parameterization of basic types of exam questions in Econometrics. This algorithm is used to automate and facilitate the process of examination, assessment and self-preparation of a large number of students. The proposed parameterization of testing questions reduces the time required to author tests and course assignments. It enables tutors to generate a large number of different but equivalent dynamic questions (with dynamic answers) on a certain topic, which are automatically assessed. The presented methods are implemented in DisPeL (Distributed Platform for e-Learning) and provide questions in the areas of filtering and smoothing of time-series data, forecasting, building and analysis of single-equation econometric models. Questions also cover elasticity, average and marginal characteristics, product and cost functions, measurement of monopoly power, supply, demand and equilibrium price, consumer and product surplus, etc. Several approaches are used to enable the required numerical computations in DisPeL - integration of third-party mathematical libraries, developing our own procedures from scratch, and wrapping our legacy math codes in order to modernize and reuse them.
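
    A parameterized question of the kind described (dynamic parameters with an automatically computed answer) can be sketched as follows; the linear demand template and the parameter ranges are hypothetical and are not taken from DisPeL.

```python
import random

def elasticity_question(seed):
    """Generate one parameterized question and its automatically computed
    answer. The linear demand template Q = a - b*P and the parameter ranges
    are hypothetical, not DisPeL's actual templates."""
    rng = random.Random(seed)
    a, b = rng.randint(80, 160), rng.randint(2, 6)
    price = rng.randint(5, 12)
    quantity = a - b * price
    elasticity = -b * price / quantity             # point price elasticity
    text = (f"Demand is Q = {a} - {b}P. Compute the point price elasticity "
            f"of demand at P = {price}.")
    return text, round(elasticity, 3)

question, answer = elasticity_question(seed=42)
print(question, "->", answer)
```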

  13. Random sampling of constrained phylogenies: conducting phylogenetic analyses when the phylogeny is partially known.

    PubMed

    Housworth, E A; Martins, E P

    2001-01-01

    Statistical randomization tests in evolutionary biology often require a set of random, computer-generated trees. For example, earlier studies have shown how large numbers of computer-generated trees can be used to conduct phylogenetic comparative analyses even when the phylogeny is uncertain or unknown. These methods were limited, however, in that (in the absence of molecular sequence or other data) they allowed users to assume that no phylogenetic information was available or that all possible trees were known. Intermediate situations where only a taxonomy or other limited phylogenetic information (e.g., polytomies) are available are technically more difficult. The current study describes a procedure for generating random samples of phylogenies while incorporating limited phylogenetic information (e.g., four taxa belong together in a subclade). The procedure can be used to conduct comparative analyses when the phylogeny is only partially resolved or can be used in other randomization tests in which large numbers of possible phylogenies are needed.
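
    One simple way to sample trees that honor a constraint of the kind described (a group of taxa forced to be monophyletic) is to resolve the constrained clade first and then treat it as a single unit. The sketch below uses uniform random joining at each step, which is one possible sampling scheme and not necessarily the distribution used by the authors.

```python
import random

def random_tree(units, rng):
    """Random binary tree built by repeatedly joining two random units."""
    units = list(units)
    while len(units) > 1:
        i, j = sorted(rng.sample(range(len(units)), 2), reverse=True)
        right, left = units.pop(i), units.pop(j)
        units.append((left, right))
    return units[0]

def constrained_random_tree(taxa, clade, rng):
    """Random tree in which the taxa in `clade` are forced to be
    monophyletic: resolve the clade first, then treat it as one unit."""
    subtree = random_tree(clade, rng)
    rest = [t for t in taxa if t not in clade]
    return random_tree(rest + [subtree], rng)

rng = random.Random(0)
print(constrained_random_tree(list("ABCDEFG"), clade=list("ABC"), rng=rng))
```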

  14. Quantum Monte Carlo studies of solvated systems

    NASA Astrophysics Data System (ADS)

    Schwarz, Kathleen; Letchworth Weaver, Kendra; Arias, T. A.; Hennig, Richard G.

    2011-03-01

    Solvation qualitatively alters the energetics of diverse processes from protein folding to reactions on catalytic surfaces. An explicit description of the solvent in quantum-mechanical calculations requires both a large number of electrons and exploration of a large number of configurations in the phase space of the solvent. These problems can be circumvented by including the effects of solvent through a rigorous classical density-functional description of the liquid environment, thereby yielding free energies and thermodynamic averages directly, while eliminating the need for explicit consideration of the solvent electrons. We have implemented and tested this approach within the CASINO Quantum Monte Carlo code. Our method is suitable for calculations in any basis within CASINO, including b-spline and plane wave trial wavefunctions, and is equally applicable to molecules, surfaces, and crystals. For our preliminary test calculations, we use a simplified description of the solvent in terms of an isodensity continuum dielectric solvation approach, though the method is fully compatible with more reliable descriptions of the solvent we shall employ in the future.

  15. Metabolic emergencies and the emergency physician.

    PubMed

    Fletcher, Janice Mary

    2016-02-01

    Fifty percent of inborn errors of metabolism are present in later childhood and adulthood, with crises commonly precipitated by minor viral illnesses or increased protein ingestion. Many physicians only consider IEM after more common conditions (such as sepsis) have been considered. In view of the large number of inborn errors, it might appear that their diagnosis requires precise knowledge of a large number of biochemical pathways and their interrelationship. As a matter of fact, an adequate diagnostic approach can be based on the proper use of only a few screening tests. A detailed history of antecedent events, together with these simple screening tests, can be diagnostic, leading to life-saving, targeted treatments for many disorders. Unrecognised, IEM can lead to significant mortality and morbidity. Advice is available 24/7 through the metabolic service based at the major paediatric hospital in each state and Starship Children's Health in New Zealand. © 2016 The Author. Journal of Paediatrics and Child Health © 2016 Paediatrics and Child Health Division (Royal Australasian College of Physicians).

  16. Development of Dynamic Flow Field Pressure Probes Suitable for Use in Large Scale Supersonic Wind Tunnels

    NASA Technical Reports Server (NTRS)

    Porro, A. Robert

    2000-01-01

    A series of dynamic flow field pressure probes were developed for use in large-scale supersonic wind tunnels at NASA Glenn Research Center. These flow field probes include pitot, static, and five-hole conical pressure probes that are capable of capturing fast-acting flow field pressure transients that occur on a millisecond time scale. The pitot and static probes can be used to determine local Mach number time histories during a transient event. The five-hole conical pressure probes are used primarily to determine local flow angularity, but can also determine local Mach number. These probes were designed, developed, and tested at the NASA Glenn Research Center. They were also used in a NASA Glenn 10-by 10-Foot Supersonic Wind Tunnel (SWT) test program where they successfully acquired flow field pressure data in the vicinity of a propulsion system during an engine compressor stall and inlet unstart transient event. Details of the design, development, and subsequent use of these probes are discussed in this report.

  17. On the use of total aerobic spore bacteria to make treatment decisions due to Cryptosporidium risk at public water system wells.

    PubMed

    Berger, Philip; Messner, Michael J; Crosby, Jake; Vacs Renwick, Deborah; Heinrich, Austin

    2018-05-01

    Spore reduction can be used as a surrogate measure of Cryptosporidium natural filtration efficiency. Estimates of log10 (log) reduction were derived from spore measurements in paired surface and well water samples in Casper, Wyoming, and Kearney, Nebraska. We found that these data were suitable for testing the hypothesis (H0) that the average reduction at each site was 2 log or less, using a one-sided Student's t-test. After establishing data quality objectives for the test (expressed as tolerable Type I and Type II error rates), we evaluated the test's performance as a function of (a) the true log reduction, (b) the number of paired samples assayed, and (c) the variance of observed log reductions. We found that 36 paired spore samples are sufficient to achieve the objectives over a wide range of variance, including the variances observed in the two data sets. We also explored the feasibility of using smaller numbers of paired spore samples to supplement bioparticle counts for screening purposes in alluvial aquifers, to differentiate wells with large volume surface water induced recharge from wells with negligible surface water induced recharge. With key assumptions, we propose a normal statistical test of the same hypothesis (H0), but with different performance objectives. As few as six paired spore samples appear adequate as a screening metric to supplement bioparticle counts to differentiate wells in alluvial aquifers with large volume surface water induced recharge. For the case when all available information (including failure to reject H0 based on the limited paired spore data) leads to the conclusion that wells have large surface water induced recharge, we recommend further evaluation using additional paired biweekly spore samples. Published by Elsevier GmbH.
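
    The core test described, a one-sided Student's t-test of H0: mean log reduction ≤ 2, can be run as sketched below; the paired log-reduction values are invented for illustration, and the one-sided alternative argument requires SciPy 1.6 or later.

```python
import numpy as np
from scipy import stats

# Illustrative paired log10 reductions (surface minus well spore counts);
# these values are invented, not the Casper or Kearney data.
log_reduction = np.array([2.4, 2.9, 2.1, 3.0, 2.6, 2.2, 2.8, 2.5,
                          3.1, 2.3, 2.7, 2.6])

# H0: mean reduction <= 2 log, tested against H1: mean reduction > 2 log
t_stat, p_value = stats.ttest_1samp(log_reduction, popmean=2.0,
                                    alternative="greater")
print(f"t = {t_stat:.2f}, one-sided p = {p_value:.4f}")
```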

  18. High Reynolds Number Investigation of a Flush Mounted, S-Duct Inlet With Large Amounts of Boundary Layer Ingestion

    NASA Technical Reports Server (NTRS)

    Berrier, Bobby L.; Carter, Melissa B.; Allan, Brian G.

    2005-01-01

    An experimental investigation of a flush-mounted, S-duct inlet with large amounts of boundary layer ingestion has been conducted at Reynolds numbers up to full scale. The study was conducted in the NASA Langley Research Center 0.3-Meter Transonic Cryogenic Tunnel. In addition, a supplemental computational study on one of the inlet configurations was conducted using the Navier-Stokes flow solver, OVERFLOW. Tests were conducted at Mach numbers from 0.25 to 0.83, Reynolds numbers (based on aerodynamic interface plane diameter) from 5.1 million to 13.9 million (full-scale value), and inlet mass-flow ratios from 0.29 to 1.22, depending on Mach number. Results of the study indicated that increasing Mach number, increasing boundary layer thickness (relative to inlet height) or ingesting a boundary layer with a distorted profile decreased inlet performance. At Mach numbers above 0.4, increasing inlet airflow increased inlet pressure recovery but also increased distortion. Finally, inlet distortion was found to be relatively insensitive to Reynolds number, but pressure recovery increased slightly with increasing Reynolds number. This CD-ROM supplement contains inlet data including: boundary layer data, duct static pressure data, performance-AIP (fan face) data, photos, tunnel wall P-PTO data, and definitions.

  19. Proficiency testing as a basis for estimating uncertainty of measurement: application to forensic alcohol and toxicology quantitations.

    PubMed

    Wallace, Jack

    2010-05-01

    While forensic laboratories will soon be required to estimate uncertainties of measurement for those quantitations reported to the end users of the information, the procedures for estimating this have been little discussed in the forensic literature. This article illustrates how proficiency test results provide the basis for estimating uncertainties in three instances: (i) For breath alcohol analyzers the interlaboratory precision is taken as a direct measure of uncertainty. This approach applies when the number of proficiency tests is small. (ii) For blood alcohol, the uncertainty is calculated from the differences between the laboratory's proficiency testing results and the mean quantitations determined by the participants; this approach applies when the laboratory has participated in a large number of tests. (iii) For toxicology, either of these approaches is useful for estimating comparability between laboratories, but not for estimating absolute accuracy. It is seen that data from proficiency tests enable estimates of uncertainty that are empirical, simple, thorough, and applicable to a wide range of concentrations.
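
    For approach (ii), a laboratory's uncertainty can be summarized from the differences between its proficiency-test results and the participant means. The sketch below uses hypothetical blood-alcohol values and a simple RMS summary, which is one possible convention rather than the article's exact procedure.

```python
import numpy as np

# Hypothetical blood-alcohol proficiency-test history: the laboratory's
# reported value and the all-participant mean for each round (g/100 mL).
lab_result       = np.array([0.081, 0.152, 0.098, 0.201, 0.119, 0.078])
participant_mean = np.array([0.080, 0.150, 0.100, 0.198, 0.121, 0.080])

diff = lab_result - participant_mean
bias = diff.mean()
# RMS of the differences is one simple pooled summary of bias plus scatter;
# a laboratory may instead report bias and standard deviation separately.
u = np.sqrt(np.mean(diff ** 2))
print(f"bias = {bias:+.4f} g/100 mL, uncertainty estimate u = {u:.4f} g/100 mL")
```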

  20. Simulating Large Area, High Intensity AM0 Illumination on Earth- Representative Testing at Elevated Temperatures for the BepiColombo and SolO Missions

    NASA Astrophysics Data System (ADS)

    Oberhuttinger, C.; Quabis, D.; Zimmermann, C. G.

    2014-08-01

    During the BepiColombo and Solar Orbiter (SolO) missions, severe environmental conditions with sun intensities of up to 10.6 and 12.8 solar constants (SCs), respectively, will be encountered. Therefore, a special cell design was developed that can withstand these environmental loads. To verify the solar cells under representative conditions, a set of specific tests is conducted. The key qualification test for these high-intensity, high-temperature (HIHT) missions is a combined test, which exposes a large number of cells simultaneously to the complete AM0 spectrum at the required irradiance and temperature. Such a test was set up in the VTC1.5 chamber located at ESTEC. This paper provides an overview of the challenges in designing a setup capable of achieving this HIHT simulation, presents the solutions that were developed, and illustrates the performance of the setup with actual test results.

  1. Is there an association between astrological data and personality?

    PubMed

    Hume, N

    1977-07-01

    A test was made of the hypothesis that personality characteristics can be predicted on the basis of various features of the individual's astrological chart. Astrological charts were prepared for 196 college-age Ss who also were administered the MMPI and the Leary Interpersonal Check List. Ss were divided into those who had extreme scores on any of the 13 personality variables studied and those who did not. For each personality variable, comparisons were made on a large number of astrological dimensions between distributions of Ss with and without extreme test scores. Six hundred thirty-two such comparisons were made and evaluated with chi-square tests. Because the number of statistically significant chi-squares obtained was less than would be expected by chance, the hypothesis was rejected.

  2. An experimental evaluation of S-duct inlet-diffuser configurations for turboprop offset gearbox applications

    NASA Technical Reports Server (NTRS)

    Mcdill, Paul L.

    1986-01-01

    A test program, utilizing a large scale model, was run in the NASA Lewis Research Center 10- by 10-ft wind tunnel to examine the influence on performance of design parameters of turboprop S-duct inlet/diffuser systems. The parametric test program investigated inlet lip thickness, inlet/diffuser cross-sectional geometry, throat design Mach number, and shaft fairing shape. The test program was run at angles of attack to 15 deg and tunnel Mach numbers to 0.35. Results of the program indicate that current design techniques can be used to design inlet/diffuser systems with acceptable total pressure recovery, but several of the design parameters, notably lip thickness (contraction ratio) and shaft fairing cross section, must be optimized to prevent excessive distortion at the compressor face.

  3. Experimental Study of Homogeneous Isotropic Slowly-Decaying Turbulence in Giant Grid-Wind Tunnel Set Up

    NASA Astrophysics Data System (ADS)

    Aliseda, Alberto; Bourgoin, Mickael; Eswirp Collaboration

    2014-11-01

    We present preliminary results from a recent grid turbulence experiment conducted at the ONERA wind tunnel in Modane, France. The ESWIRP Collaboration was conceived to probe the smallest scales of a canonical turbulent flow at very high Reynolds numbers. To achieve this, the largest scales of the turbulence need to be extremely big so that, even with the large separation of scales, the smallest scales remain well above the spatial and temporal resolution of the instruments. The ONERA wind tunnel in Modane (8-m-diameter test section) was chosen as a limit of the biggest large scales achievable in a laboratory setting. A giant inflatable grid (M = 0.8 m) was conceived to induce slowly-decaying homogeneous isotropic turbulence in a large region of the test section, with minimal structural risk. An international team of researchers collected hot wire anemometry, ultrasound anemometry, resonant cantilever anemometry, fast pitot tube anemometry, cold wire thermometry and high-speed particle tracking data of this canonical turbulent flow. While analysis of this large database, which will become publicly available over the next 2 years, has only started, the Taylor-scale Reynolds number is estimated to be between 400 and 800, with Kolmogorov scales as large as a few mm. The ESWIRP Collaboration is formed by an international team of scientists to investigate experimentally the smallest scales of turbulence. It was funded by the European Union to take advantage of the largest wind tunnel in Europe for fundamental research.

  4. Tests of Sunspot Number Sequences: 3. Effects of Regression Procedures on the Calibration of Historic Sunspot Data

    NASA Astrophysics Data System (ADS)

    Lockwood, M.; Owens, M. J.; Barnard, L.; Usoskin, I. G.

    2016-11-01

    We use sunspot-group observations from the Royal Greenwich Observatory (RGO) to investigate the effects of intercalibrating data from observers with different visual acuities. The tests are made by counting the number of groups [R_B] above a variable cut-off threshold of observed total whole spot area (uncorrected for foreshortening) to simulate what a lower-acuity observer would have seen. The synthesised annual means of R_B are then re-scaled to the full observed RGO group number [R_A] using a variety of regression techniques. It is found that a very high correlation between R_A and R_B (r_{AB} > 0.98) does not prevent large errors in the intercalibration (for example, sunspot-maximum values can be over 30 % too large even for such levels of r_{AB}). In generating the backbone sunspot number [R_{BB}], Svalgaard and Schatten (Solar Phys., 2016) force regression fits to pass through the scatter-plot origin, which generates unreliable fits (the residuals do not form a normal distribution) and causes sunspot-cycle amplitudes to be exaggerated in the intercalibrated data. It is demonstrated that the use of Quantile-Quantile ("Q-Q") plots to test for a normal distribution is a useful indicator of erroneous and misleading regression fits. Ordinary least-squares linear fits, not forced to pass through the origin, are sometimes reliable (although the optimum method used is shown to be different when matching peak and average sunspot-group numbers). However, other fits are only reliable if non-linear regression is used. From these results it is entirely possible that the inflation of solar-cycle amplitudes in the backbone group sunspot number as one goes back in time, relative to related solar-terrestrial parameters, is entirely caused by the use of inappropriate and non-robust regression techniques to calibrate the sunspot data.
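
    The comparison the authors describe, a fit forced through the origin versus an ordinary least-squares fit with an intercept, with residual normality checked in the spirit of a Q-Q plot, can be illustrated on synthetic data as below; the data-generating values are made up and a Shapiro-Wilk test stands in for visual Q-Q inspection.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

# Synthetic "low-acuity" counts R_B versus full counts R_A: an offset plus
# noise; the numbers are made up purely to demonstrate the procedure.
r_b = rng.uniform(0, 10, size=80)
r_a = 1.5 + 1.1 * r_b + rng.normal(0, 0.8, size=80)

slope_origin = (r_a @ r_b) / (r_b @ r_b)       # least squares through origin
fit = stats.linregress(r_b, r_a)               # OLS with an intercept

print(f"slope through origin = {slope_origin:.2f}, OLS slope = {fit.slope:.2f}")
for name, resid in [("through origin", r_a - slope_origin * r_b),
                    ("with intercept", r_a - (fit.intercept + fit.slope * r_b))]:
    w, p = stats.shapiro(resid)                # numerical stand-in for a Q-Q plot
    print(f"{name}: residual-normality p = {p:.3f}")
```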

  5. A review and meta-analysis of the enemy release hypothesis in plant–herbivorous insect systems

    PubMed Central

    Meijer, Kim; Schilthuizen, Menno; Beukeboom, Leo

    2016-01-01

    A suggested mechanism for the success of introduced non-native species is the enemy release hypothesis (ERH). Many studies have tested the predictions of the ERH using the community approach (native and non-native species studied in the same habitat) or the biogeographical approach (species studied in their native and non-native range), but results are highly variable, possibly due to the large variety of study systems incorporated. We therefore focused on one specific system: plants and their herbivorous insects. We performed a systematic review and compiled a large number (68) of datasets from studies comparing herbivorous insects on native and non-native plants using the community or biogeographical approach. We performed a meta-analysis to test the predictions from the ERH for insect diversity (number of species), insect load (number of individuals), and level of herbivory for both the community and biogeographical approaches. For both approaches, insect diversity was significantly higher on native than on non-native plants. Insect load tended to be higher on native than on non-native plants under the community approach only. Herbivory did not differ between native and non-native plants under the community approach, while there were too few data available to test the biogeographical approach. Our meta-analysis generally supports the predictions of the ERH for both approaches, but also shows that the outcome depends strongly on the response measured and the approach applied. So far, very few studies apply both approaches simultaneously in a reciprocal manner, although this is arguably the best way to test the ERH. PMID:28028463

  6. Reverberation Chamber Uniformity Validation and Radiated Susceptibility Test Procedures for the NASA High Intensity Radiated Fields Laboratory

    NASA Technical Reports Server (NTRS)

    Koppen, Sandra V.; Nguyen, Truong X.; Mielnik, John J.

    2010-01-01

    The NASA Langley Research Center's High Intensity Radiated Fields Laboratory has developed a capability based on the RTCA/DO-160F Section 20 guidelines for radiated electromagnetic susceptibility testing in reverberation chambers. Phase 1 of the test procedure utilizes mode-tuned stirrer techniques and E-field probe measurements to validate chamber uniformity, determines chamber loading effects, and defines a radiated susceptibility test process. The test procedure is segmented into numbered operations that are largely software controlled. This document is intended as a laboratory test reference and includes diagrams of test setups, equipment lists, as well as test results and analysis. Phase 2 of development is discussed.

  7. Recent developments in rotary-balance testing of fighter aircraft configurations at NASA Ames Research Center

    NASA Technical Reports Server (NTRS)

    Malcolm, G. N.; Schiff, L. B.

    1985-01-01

    Two rotary balance apparatuses were developed for testing airplane models in a coning motion. A large-scale apparatus, developed for use in the 12-Foot Pressure Wind Tunnel primarily to permit testing at high Reynolds numbers, was recently used to investigate the aerodynamics of a 0.05-scale model of the F-15 fighter aircraft. Effects of Reynolds number, spin rate parameter, model attitude, presence of a nose boom, and model/sting mounting angle were investigated. A smaller apparatus, which investigates the aerodynamics of bodies of revolution in a coning motion, was used in the 6- by 6-Foot Supersonic Wind Tunnel to investigate the aerodynamic behavior of a simple representation of a modern fighter, the Standard Dynamic Model (SDM). Effects of spin rate parameter and model attitude were investigated. A description of the two rigs and a discussion of some of the results obtained in the respective tests are presented.

  8. Experimental Investigation of a Large-Scale Low-Boom Inlet Concept

    NASA Technical Reports Server (NTRS)

    Hirt, Stefanie M.; Chima, Rodrick V.; Vyas, Manan A.; Wayman, Thomas R.; Conners, Timothy R.; Reger, Robert W.

    2011-01-01

    A large-scale low-boom inlet concept was tested in the NASA Glenn Research Center 8- by 6-Foot Supersonic Wind Tunnel. The purpose of this test was to assess inlet performance, stability and operability at various Mach numbers and angles of attack. During this effort, two models were tested: a dual stream inlet designed to mimic potential aircraft flight hardware integrating a high-flow bypass stream; and a single stream inlet designed to study a configuration with a zero-degree external cowl angle and to permit surface visualization of the vortex generator flow on the internal centerbody surface. During the course of the test, the low-boom inlet concept was demonstrated to have high recovery, excellent buzz margin, and high operability. This paper will provide an overview of the setup, show a brief comparison of the dual stream and single stream inlet results, and examine the dual stream inlet characteristics.

  9. Glass sample preparation and performance investigations. [solar x-ray imager

    NASA Technical Reports Server (NTRS)

    Johnson, R. Barry

    1992-01-01

    This final report details the work performed under this delivery order from April 1991 through April 1992. The currently available capabilities for integrated optical performance modeling at MSFC for large and complex systems such as AXAF were investigated. The Integrated Structural Modeling (ISM) program developed by Boeing for the U.S. Air Force was obtained and installed on two DECstations 5000 at MSFC. The structural, thermal and optical analysis programs available in ISM were evaluated. As part of the optomechanical engineering activities, technical support was provided in the design of support structure, mirror assembly, filter wheel assembly and material selection for the Solar X-ray Imager (SXI) program. As part of the fabrication activities, a large number of zerodur glass samples were prepared in different sizes and shapes for acid etching, coating and polishing experiments to characterize the subsurface damage and stresses produced by the grinding and polishing operations. Various optical components for AXAF video microscope and the x-ray test facility were also fabricated. A number of glass fabrication and test instruments such as a scatter plate interferometer, a gravity feed saw and some phenolic cutting blades were fabricated, integrated and tested.

  10. Endwall Heat Transfer Measurements in a Transonic Turbine Cascade

    NASA Technical Reports Server (NTRS)

    Giel, P. W.; Thurman, D. R.; VanFossen, G. J.; Hippensteele, S. A.; Boyle, R. J.

    1996-01-01

    Turbine blade endwall heat transfer measurements are given for a range of Reynolds and Mach numbers. Data were obtained for Reynolds numbers based on inlet conditions of 0.5 × 10^6 and 1.0 × 10^6, for isentropic exit Mach numbers of 1.0 and 1.3, and for freestream turbulence intensities of 0.25% and 7.0%. Tests were conducted in a linear cascade at the NASA Lewis Transonic Turbine Blade Cascade Facility. The test article was a turbine rotor with 136° of turning and an axial chord of 12.7 cm. The large scale allowed for very detailed measurements of both flow field and surface phenomena. The intent of the work is to provide benchmark quality data for computational fluid dynamics (CFD) code and model verification. The flow field in the cascade is highly three-dimensional as a result of thick boundary layers at the test section inlet. Endwall heat transfer data were obtained using a steady-state liquid crystal technique.

  11. Development of a Semi-Span Test Capability at the National Transonic Facility

    NASA Technical Reports Server (NTRS)

    Gatlin, G. M.; Parker, P. A.; Owens, L. R., Jr.

    2001-01-01

    A need for low-speed, high Reynolds number test capabilities has been identified for the design and development of advanced subsonic transport high-lift systems. In support of this need, multiple investigations have been conducted in the National Transonic Facility (NTF) at the NASA Langley Research Center to develop a semi-span testing capability that will provide the low-speed, flight Reynolds number data currently unattainable using conventional sting-mounted, full-span models. Although a semi-span testing capability will effectively double the Reynolds number capability over full-span models, it comes at the expense of contending with the interaction between the flow over the model and the wind-tunnel wall boundary layer. To address this issue, the size and shape of the semi-span model mounting geometry have been investigated, and the results are presented herein. The cryogenic operating environment of the NTF produced another semi-span test technique issue in that varying thermal gradients have developed on the large semi-span balance. The suspected cause of these thermal gradients and methods to eliminate them are presented. Data are also presented that demonstrate the successful elimination of these varying thermal gradients during cryogenic operations.

  12. Inverse sampling regression for pooled data.

    PubMed

    Montesinos-López, Osval A; Montesinos-López, Abelardo; Eskridge, Kent; Crossa, José

    2017-06-01

    Because pools are tested instead of individuals in group testing, this technique is helpful for estimating prevalence in a population or for classifying a large number of individuals into two groups at a low cost. For this reason, group testing is a well-known means of saving costs and producing precise estimates. In this paper, we developed a mixed-effect group testing regression that is useful when the data-collecting process is performed using inverse sampling. This model allows including covariate information at the individual level to incorporate heterogeneity among individuals and identify which covariates are associated with positive individuals. We present an approach to fit this model using maximum likelihood and we performed a simulation study to evaluate the quality of the estimates. Based on the simulation study, we found that the proposed regression method for inverse sampling with group testing produces parameter estimates with low bias when the pre-specified number of positive pools (r) to stop the sampling process is at least 10 and the number of clusters in the sample is also at least 10. We performed an application with real data and we provide an NLMIXED code that researchers can use to implement this method.
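
    For the covariate-free case, inverse sampling with group testing can be simulated and the prevalence back-calculated from the pool-level positive rate as sketched below; the prevalence, pool size, and stopping rule r are illustrative, and the paper's mixed-effect regression with covariates is more general than this.

```python
import numpy as np

def pools_until_r_positive(p, pool_size, r, rng):
    """Inverse sampling: test pools until r positive pools have been seen;
    return how many pools were tested in total."""
    pools, positives = 0, 0
    while positives < r:
        positives += rng.random() < 1 - (1 - p) ** pool_size
        pools += 1
    return pools

def prevalence_estimate(r, pools, pool_size):
    """Back-transform the pool-level positive rate r/pools to the
    individual level (no covariates, unlike the paper's regression)."""
    pi_hat = r / pools
    return 1 - (1 - pi_hat) ** (1 / pool_size)

rng = np.random.default_rng(7)
n_pools = pools_until_r_positive(p=0.03, pool_size=10, r=10, rng=rng)
print(n_pools, round(prevalence_estimate(10, n_pools, 10), 4))
```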

  13. Present Status and Extensions of the Monte Carlo Performance Benchmark

    NASA Astrophysics Data System (ADS)

    Hoogenboom, J. Eduard; Petrovic, Bojan; Martin, William R.

    2014-06-01

    The NEA Monte Carlo Performance benchmark started in 2011 with the aim of monitoring, over the years, the ability to perform a full-size Monte Carlo reactor core calculation with detailed power production for each fuel pin, including the axial distribution. This paper gives an overview of the results contributed thus far. It shows that reaching a statistical accuracy of 1 % for most of the small fuel zones requires about 100 billion neutron histories. The efficiency of parallel execution of Monte Carlo codes on a large number of processor cores shows clear limitations for computer clusters with common-type computer nodes. However, on true supercomputers the speedup of parallel calculations continues to increase up to large numbers of processor cores. More experience is needed from calculations on true supercomputers using large numbers of processors in order to predict whether the requested calculations can be done in a short time. As the specifications of the reactor geometry for this benchmark are well suited for further investigations of full-core Monte Carlo calculations, and a need is felt to test issues other than computational performance, proposals are presented for extending the benchmark to a suite of problems: evaluating fission-source convergence for a system with a high dominance ratio, coupling with thermal-hydraulics calculations to evaluate the use of different temperatures and coolant densities, and studying the correctness and effectiveness of burnup calculations. Moreover, other contemporary proposals for a full-core calculation with realistic geometry and material composition will be discussed.
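
    The quoted histories-per-accuracy figure follows the usual 1/sqrt(N) scaling of Monte Carlo statistical error; the reference run in the sketch below is an assumption chosen only to show the extrapolation.

```python
def histories_for_target(rel_error_target, rel_error_ref, histories_ref):
    """Monte Carlo statistical error scales roughly as 1/sqrt(N), so the
    histories needed for a target relative error can be extrapolated from
    a reference run. The reference figures used below are illustrative."""
    return histories_ref * (rel_error_ref / rel_error_target) ** 2

# If a 1e9-history run gave 10 % relative error in a small fuel zone,
# reaching 1 % would need about 100 times more histories:
print(f"{histories_for_target(0.01, 0.10, 1e9):.1e}")   # -> 1.0e+11
```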

  14. Efficient geostatistical inversion of transient groundwater flow using preconditioned nonlinear conjugate gradients

    NASA Astrophysics Data System (ADS)

    Klein, Ole; Cirpka, Olaf A.; Bastian, Peter; Ippisch, Olaf

    2017-04-01

    In the geostatistical inverse problem of subsurface hydrology, continuous hydraulic parameter fields, in most cases hydraulic conductivity, are estimated from measurements of dependent variables, such as hydraulic heads, under the assumption that the parameter fields are autocorrelated random space functions. Upon discretization, the continuous fields become large parameter vectors with O(10^4–10^7) elements. While cokriging-like inversion methods have been shown to be efficient for highly resolved parameter fields when the number of measurements is small, they require the calculation of the sensitivity of each measurement with respect to all parameters, which may become prohibitive with large sets of measured data such as those arising from transient groundwater flow. We present a Preconditioned Conjugate Gradient method for the geostatistical inverse problem, in which a single adjoint equation needs to be solved to obtain the gradient of the objective function. Using the autocovariance matrix of the parameters as preconditioning matrix, expensive multiplications with its inverse can be avoided, and the number of iterations is significantly reduced. We use a randomized spectral decomposition of the posterior covariance matrix of the parameters to perform a linearized uncertainty quantification of the parameter estimate. The feasibility of the method is tested by virtual examples of head observations in steady-state and transient groundwater flow. These synthetic tests demonstrate that transient data can reduce both parameter uncertainty and time spent conducting experiments, while the presented methods are able to handle the resulting large number of measurements.
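
    The preconditioning idea, applying the preconditioner by multiplication so that no inverse is ever formed, is illustrated by the generic linear preconditioned conjugate-gradient sketch below; the paper's method is a nonlinear, adjoint-based variant, and the tiny test problem and Jacobi-style preconditioner here are placeholders.

```python
import numpy as np

def preconditioned_cg(apply_A, apply_M, b, tol=1e-8, max_iter=200):
    """Generic preconditioned conjugate gradients for A x = b. The
    preconditioner is applied by multiplication (apply_M), so its inverse
    is never formed; this is a linear sketch, not the paper's nonlinear
    adjoint-based scheme."""
    x = np.zeros_like(b)
    r = b - apply_A(x)
    z = apply_M(r)
    p = z.copy()
    rz = r @ z
    for _ in range(max_iter):
        Ap = apply_A(p)
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol * np.linalg.norm(b):
            break
        z = apply_M(r)
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x

# Tiny SPD test problem with a Jacobi-style (diagonal) preconditioner
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = preconditioned_cg(lambda v: A @ v, lambda v: v / np.diag(A), b)
print(x, A @ x)   # A @ x should be close to b
```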

  15. A Large Scale Wind Tunnel for the Study of High Reynolds Number Turbulent Boundary Layer Physics

    NASA Astrophysics Data System (ADS)

    Priyadarshana, Paththage; Klewicki, Joseph; Wosnik, Martin; White, Chris

    2008-11-01

    Progress and the basic features of the University of New Hampshire's very large multi-disciplinary wind tunnel are reported. The refinement of the overall design has been greatly aided through consultations with an external advisory group. The facility test section is 73 m long, 6 m wide, and 2.5 m nominally high, and the maximum free stream velocity is 30 m/s. A very large tunnel with relatively low velocities makes the small scale turbulent motions resolvable by existing measurement systems. The maximum Reynolds number is estimated at δ⁺ = δuτ/ν ≈ 50,000, where δ is the boundary layer thickness and uτ is the friction velocity. The effects of scale separation on the generation of the Reynolds stress gradient appearing in the mean momentum equation are briefly discussed to justify the need to attain δ⁺ in excess of about 40,000. Lastly, plans for future utilization of the facility as a community-wide resource are outlined. This project is supported through the NSF-EPSCoR RII Program, grant number EPS0701730.
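
    The quoted value is the friction Reynolds number δ⁺ = δuτ/ν; the short check below plugs in assumed values of δ, uτ, and ν (not facility data) to show the arithmetic.

```python
# Rough arithmetic behind the quoted friction Reynolds number; the boundary
# layer thickness, friction velocity, and viscosity are assumed values.
delta = 0.75        # boundary layer thickness, m (assumption)
u_tau = 1.0         # friction velocity, m/s (assumption)
nu = 1.5e-5         # kinematic viscosity of air, m^2/s

delta_plus = delta * u_tau / nu
print(f"delta+ = {delta_plus:.0f}")   # 50000 with these assumed inputs
```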

  16. Species-area relationships and extinction forecasts.

    PubMed

    Halley, John M; Sgardeli, Vasiliki; Monokrousos, Nikolaos

    2013-05-01

    The species-area relationship (SAR) predicts that smaller areas contain fewer species. This is the basis of the SAR method that has been used to forecast large numbers of species committed to extinction every year due to deforestation. The method has a number of issues that must be handled with care to avoid error. These include the functional form of the SAR, the choice of equation parameters, the sampling procedure used, extinction debt, and forest regeneration. Concerns about the accuracy of the SAR technique often cite errors not much larger than the natural scatter of the SAR itself. Such errors do not undermine the credibility of forecasts predicting large numbers of extinctions, although they may be a serious obstacle in other SAR applications. Very large errors can arise from misinterpretation of extinction debt, inappropriate functional form, and ignoring forest regeneration. Major challenges remain to understand better the relationship between sampling protocol and the functional form of SARs and the dynamics of relaxation, especially in continental areas, and to widen the testing of extinction forecasts. © 2013 New York Academy of Sciences.
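
    The SAR extrapolation at the heart of such forecasts is the power law S = cA^z, under which the fraction of species remaining after habitat loss is (A_new/A_old)^z; the sketch below uses a commonly quoted illustrative exponent and ignores extinction debt and regeneration, the very caveats the paper stresses.

```python
def fraction_species_remaining(area_fraction_left, z=0.25):
    """Power-law SAR, S = c * A**z, implies S_new / S_old = (A_new/A_old)**z.
    The exponent z = 0.25 is a commonly quoted illustrative value only."""
    return area_fraction_left ** z

remaining = fraction_species_remaining(0.5)    # losing half the habitat area
print(f"{remaining:.3f} of species remain; "
      f"{1 - remaining:.3f} committed to extinction under these assumptions")
```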

  17. The SR-71 Test Bed Aircraft: A Facility for High-Speed Flight Research

    NASA Technical Reports Server (NTRS)

    Corda, Stephen; Moes, Timothy R.; Mizukami, Masashi; Hass, Neal E.; Jones, Daniel; Monaghan, Richard C.; Ray, Ronald J.; Jarvis, Michele L.; Palumbo, Nathan

    2000-01-01

    The SR-71 test bed aircraft is shown to be a unique platform to flight-test large experiments to supersonic Mach numbers. The test bed hardware mounted on the SR-71 upper fuselage is described. This test bed hardware is composed of a fairing structure called the "canoe" and a large "reflection plane" flat plate for mounting experiments. Total experiment weights, including the canoe and reflection plane, as heavy as 14,500 lb can be mounted on the aircraft and flight-tested to speeds as fast as Mach 3.2 and altitudes as high as 80,000 ft. A brief description of the SR-71 aircraft is given, including details of the structural modifications to the fuselage, modifications to the J58 engines to provide increased thrust, and the addition of a research instrumentation system. Information is presented based on flight data that describes the SR-71 test bed aerodynamics, stability and control, structural and thermal loads, the canoe internal environment, and reflection plane flow quality. Guidelines for designing SR-71 test bed experiments are also provided.

  18. D-OPTIMAL EXPERIMENTAL DESIGNS TO TEST FOR DEPARTURE FROM ADDITIVITY IN A FIXED-RATIO MIXTURE RAY.

    EPA Science Inventory

    Humans are exposed to mixtures of environmental compounds. A regulatory assumption is that the mixtures of chemicals act in an additive manner. However, this assumption requires experimental validation. Traditional experimental designs (full factorial) require a large number of e...

  19. IN VITRO ASSESSMENT OF DEVELOPMENTAL NEUROTOXICITY: USE OF MICROELECTRODE ARRAYS TO MEASURE FUNCTIONAL CHANGES IN NEURONAL NETWORK ONTOGENY

    EPA Science Inventory

    Because the Developmental Neurotoxicity Testing Battery requires large numbers of animals and is expensive, development of in vitro approaches to screen chemicals for potential developmental neurotoxicity is a high priority. Many proposed approaches for screening are biochemical,...

  20. In Vitro Assessment of Developmental Neurotoxicity: Use of Microelectrode Arrays to Measure Functional Changes in Neuronal Network Ontogeny*

    EPA Science Inventory

    Because the Developmental Neurotoxicity Testing Guidelines require large numbers of animals and are expensive, development of in vitro approaches to screen chemicals for potential developmental neurotoxicity is a high priority. Many proposed approaches for screening are biochemica...

  1. Components of Self-Regulated Learning; Implications for School Performance

    ERIC Educational Resources Information Center

    Mih, Codruta; Mih, Viorel

    2010-01-01

    Self-regulated school learning behavior includes the activation of a relatively large number of psychological dimensions. Among the most important self-regulation constructs that influence school learning are: learning goals, personal self-efficacy, metacognition and test-anxiety. The adaptive functioning of these is associated with high…

  2. Interspecies Correlation Estimation (ICE) models predict supplemental toxicity data for SSDs

    EPA Science Inventory

    Species sensitivity distributions (SSD) require a large number of toxicity values for a diversity of taxa to define a hazard level protective of multiple species. For most chemicals, measured toxicity data are limited to a few standard test species that are unlikely to adequately...

  3. USE OF RFID TO TRACK HAZARDOUS WASTE SHIPMENTS ACROSS DOMESTIC AND INTERNATIONAL BORDERS

    EPA Science Inventory

    Radio-frequency identification system (RFID) is an emerging commodity tracking technology that is being tested and implemented in a large number of applications worldwide. RFID is a method of transmitting data using radio waves, usually through communication with a tag. Both ac...

  4. The ToxCast Chemical Prioritization Program at the US EPA (UCLA Molecular Toxicology Program)

    EPA Science Inventory

    To meet the needs of chemical regulators reviewing large numbers of data-poor chemicals for safety, the EPA's National Center for Computational Toxicology is developing a means of efficiently testing thousands of compounds for potential toxicity. High-throughput bioactivity profi...

  5. D-OPTIMAL EXPERIMENTAL DESIGNS TO TEST FOR DEPARTURE FROM ADDITIVITY IN A FIXED-RATIO RAY MIXTURE.

    EPA Science Inventory

    Risk assessors are becoming increasingly aware of the importance of assessing interactions between chemicals in a mixture. Most traditional designs for evaluating interactions are prohibitive when the number of chemicals in the mixture is large. However, evaluation of interacti...

  6. D-OPTIMAL EXPERIMENTAL DESIGNS TO TEST FOR DEPARTURE FROM ADDITIVITY IN A FIXED-RATIO MIXTURE RAY.

    EPA Science Inventory

    Traditional factorial designs for evaluating interactions among chemicals in a mixture are prohibitive when the number of chemicals is large. However, recent advances in statistically-based experimental design have made it easier to evaluate interactions involving many chemicals...

  7. Studies of the Variables Affecting Behavior of Larval Zebrafish for Developmental Neurotoxicity Testing*

    EPA Science Inventory

    The U.S. Environmental Protection Agency is evaluating methods to screen and prioritize large numbers of chemicals for developmental toxicity. We are exploring methods to screen for developmentally neurotoxic chemicals using zebrafish behavior at 6 days of age. The behavioral par...

  8. 76 FR 159 - Discretionary Grant Program

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-03

    ... detection of iron deficiency, another pediatric health issue. Proficiency testing (PT) is a proven method... monthly PT and other lab quality improvement tools to nearly 600 laboratories across the U.S. and beyond... Competition: The participation of large numbers of these labs in voluntary proficiency was by design, and...

  9. Planning multi-arm screening studies within the context of a drug development program

    PubMed Central

    Wason, James M S; Jaki, Thomas; Stallard, Nigel

    2013-01-01

    Screening trials are small trials used to decide whether an intervention is sufficiently promising to warrant a large confirmatory trial. Previous literature examined the situation where treatments are tested sequentially until one is considered sufficiently promising to take forward to a confirmatory trial. An important consideration for sponsors of clinical trials is how screening trials should be planned to maximize the efficiency of the drug development process. It has been found previously that small screening trials are generally the most efficient. In this paper we consider the design of screening trials in which multiple new treatments are tested simultaneously. We derive analytic formulae for the expected number of patients until a successful treatment is found, and propose methodology to search for the optimal number of treatments, and optimal sample size per treatment. We compare designs in which only the best treatment proceeds to a confirmatory trial and designs in which multiple treatments may proceed to a multi-arm confirmatory trial. We find that inclusion of a large number of treatments in the screening trial is optimal when only one treatment can proceed, and a smaller number of treatments is optimal when more than one can proceed. The designs we investigate are compared on a real-life set of screening designs. Copyright © 2013 John Wiley & Sons, Ltd. PMID:23529936
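
    The expected-cost quantity discussed (patients enrolled in screening before a truly effective treatment is advanced) can be estimated by simulation as below; the per-arm sample size, number of arms, effect size, and prevalence of effective treatments are all illustrative assumptions, and the paper's analytic formulae and design optimization are not reproduced.

```python
import numpy as np
from scipy import stats

def mean_patients_until_success(n_per_arm=30, k_arms=4, p_effective=0.2,
                                effect=0.5, alpha=0.1, n_sim=2000, seed=0):
    """Monte Carlo estimate of the patients enrolled in screening trials
    before a truly effective treatment is advanced. Each trial tests k_arms
    new treatments against a shared control and advances the best arm if its
    one-sided z-test is significant. All parameter values are illustrative."""
    rng = np.random.default_rng(seed)
    z_crit = stats.norm.ppf(1 - alpha)
    totals = []
    for _ in range(n_sim):
        patients = 0
        while True:
            truly_effective = rng.random(k_arms) < p_effective
            arm_means = np.where(truly_effective, effect, 0.0)
            se = 1.0 / np.sqrt(n_per_arm)
            arm_obs = rng.normal(arm_means, se)     # observed arm means
            ctrl_obs = rng.normal(0.0, se)          # shared control arm
            patients += n_per_arm * (k_arms + 1)
            z = (arm_obs - ctrl_obs) / (se * np.sqrt(2.0))
            best = int(np.argmax(z))
            if z[best] > z_crit and truly_effective[best]:
                break
        totals.append(patients)
    return float(np.mean(totals))

print(round(mean_patients_until_success()))
```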

  10. Characterizing and Optimizing the Performance of the MAESTRO 49-Core Processor

    DTIC Science & Technology

    2014-03-27

    process large volumes of data, it is necessary during testing to vary the dimensions of the inbound data matrix to determine what effect this has on the...needed that can process the extra data these systems seek to collect. However, the space environment presents a number of threats, such as ambient or...induced faults, and that also have sufficient computational power to handle the large flow of data they encounter. This research investigates one

  11. Incremental wind tunnel testing of high lift systems

    NASA Astrophysics Data System (ADS)

    Victor, Pricop Mihai; Mircea, Boscoianu; Daniel-Eugeniu, Crunteanu

    2016-06-01

    Efficiency of trailing-edge high-lift systems is essential for future long-range transport aircraft evolving in the direction of laminar wings, because these systems have to compensate for the low performance of the leading-edge devices. Modern high-lift systems are subject to high performance requirements and are constrained to simple actuation, combined with a reduced number of aerodynamic elements. Passive or active flow control is thus required for performance enhancement. An experimental investigation of a reduced-kinematics flap combined with passive flow control took place in a low-speed wind tunnel. The most important features of the experimental setup are the relatively large size, corresponding to a Reynolds number of about 2 million, the sweep angle of 30 degrees, corresponding to long-range airliners with highly swept wings, and the large number of flap settings and mechanical vortex generators. The model description, flap settings, methodology, and results are presented.

  12. Cationic lipids: molecular structure/ transfection activity relationships and interactions with biomembranes.

    PubMed

    Koynova, Rumiana; Tenchov, Boris

    2010-01-01

    Synthetic cationic lipids, which form complexes (lipoplexes) with polyanionic DNA, are presently the most widely used constituents of nonviral gene carriers. A large number of cationic amphiphiles have been synthesized and tested in transfection studies. However, due to the complexity of the transfection pathway, no general schemes have emerged for correlating cationic lipid chemistry with transfection efficacy, and the approaches for optimizing their molecular structures are still largely empirical. Here we summarize data on the relationships between transfection activity and cationic lipid molecular structure and demonstrate that the transfection activity depends in a systematic way on the lipid hydrocarbon chain structure. A number of examples, including a large series of cationic phosphatidylcholine derivatives, show that optimum transfection is displayed by lipids with a chain length of approximately 14 carbon atoms and that the transfection efficiency strongly increases with chain unsaturation, specifically upon replacement of saturated with monounsaturated chains.

  13. Factors contributing to airborne particle dispersal in the operating room.

    PubMed

    Noguchi, Chieko; Koseki, Hironobu; Horiuchi, Hidehiko; Yonekura, Akihiko; Tomita, Masato; Higuchi, Takashi; Sunagawa, Shinya; Osaki, Makoto

    2017-07-06

    Surgical-site infections due to intraoperative contamination are chiefly ascribable to airborne particles carrying microorganisms. The purpose of this study is to identify the actions that increase the number of airborne particles in the operating room. Two surgeons and two surgical nurses performed three patterns of physical movements to mimic intraoperative actions, such as preparing the instrument table, gowning and donning/doffing gloves, and preparing for total knee arthroplasty. The generation and behavior of airborne particles were filmed using a fine particle visualization system, and the number of airborne particles in 2.83 m³ of air was counted using a laser particle counter. Each action was repeated five times, and the particle measurements were evaluated through one-way analysis of variance followed by Tukey-Kramer and Bonferroni-Dunn multiple comparison tests for post hoc analysis. Statistical significance was defined as a P value ≤ .01. A large number of airborne particles were observed while unfolding the surgical gown, removing gloves, and putting the arms through the sleeves of the gown. Although numerous airborne particles were observed while applying the stockinet and putting on large drapes for preparation of total knee arthroplasty, fewer particles (0.3-2.0 μm in size) were detected at the level of the operating table under laminar airflow compared to actions performed in a non-ventilated preoperative room (P < .01). The results of this study suggest that surgical staff should avoid unnecessary actions that produce a large number of airborne particles near a sterile area and that laminar airflow has the potential to reduce the incidence of bacterial contamination.

  14. Screening for Lung Cancer

    PubMed Central

    Mazzone, Peter J.; Naidich, David P.; Bach, Peter B.

    2013-01-01

    Background: Lung cancer is by far the major cause of cancer deaths largely because in the majority of patients it is at an advanced stage at the time it is discovered, when curative treatment is no longer feasible. This article examines the data regarding the ability of screening to decrease the number of lung cancer deaths. Methods: A systematic review was conducted of controlled studies that address the effectiveness of methods of screening for lung cancer. Results: Several large randomized controlled trials (RCTs), including a recent one, have demonstrated that screening for lung cancer using a chest radiograph does not reduce the number of deaths from lung cancer. One large RCT involving low-dose CT (LDCT) screening demonstrated a significant reduction in lung cancer deaths, with few harms to individuals at elevated risk when done in the context of a structured program of selection, screening, evaluation, and management of the relatively high number of benign abnormalities. Whether other RCTs involving LDCT screening are consistent is unclear because data are limited or not yet mature. Conclusions: Screening is a complex interplay of selection (a population with sufficient risk and few serious comorbidities), the value of the screening test, the interval between screening tests, the availability of effective treatment, the risk of complications or harms as a result of screening, and the degree with which the screened individuals comply with screening and treatment recommendations. Screening with LDCT of appropriate individuals in the context of a structured process is associated with a significant reduction in the number of lung cancer deaths in the screened population. Given the complex interplay of factors inherent in screening, many questions remain on how to effectively implement screening on a broader scale. PMID:23649455

  15. Designing a multiroute synthesis scheme in combinatorial chemistry.

    PubMed

    Akavia, Adi; Senderowitz, Hanoch; Lerner, Alon; Shamir, Ron

    2004-01-01

    Solid-phase mix-and-split combinatorial synthesis is often used to produce large arrays of compounds to be tested during the various stages of the drug development process. This method can be represented by a synthesis graph in which nodes correspond to grow operations and arcs to beads transferred among the different reaction vessels. In this work, we address the problem of designing such a graph which maximizes the number of produced target compounds (namely, compounds out of an input library of desired molecules), given constraints on the number of beads used for library synthesis and on the number of reaction vessels available for concurrent grow steps. We present a heuristic based on a discrete search for solving this problem, test our solution on several data sets, explore its behavior, and show that it achieves good performance.

  16. Pitot-probe displacement in a supersonic turbulent boundary layer

    NASA Technical Reports Server (NTRS)

    Allen, J. M.

    1972-01-01

    Eight circular pitot probes ranging in size from 2 to 70 percent of the boundary-layer thickness were tested to provide experimental probe displacement results in a two-dimensional turbulent boundary layer at a nominal free-stream Mach number of 2 and unit Reynolds number of 8 million per meter. The displacement obtained in the study was larger than that reported by previous investigators in either an incompressible turbulent boundary layer or a supersonic laminar boundary layer. The large probes indicated distorted Mach number profiles, probably due to separation. When the probes were small enough to cause no appreciable distortion, the displacement was constant over most of the boundary layer. The displacement in the near-wall region decreased to negative displacement in some cases. This near-wall region was found to extend to about one probe diameter from the test surface.

  17. Modeling number of bacteria per food unit in comparison to bacterial concentration in quantitative risk assessment: impact on risk estimates.

    PubMed

    Pouillot, Régis; Chen, Yuhuan; Hoelzer, Karin

    2015-02-01

    When developing quantitative risk assessment models, a fundamental consideration for risk assessors is to decide whether to evaluate changes in bacterial levels in terms of concentrations or in terms of bacterial numbers. Although modeling bacteria in terms of integer numbers may be regarded as a more intuitive and rigorous choice, modeling bacterial concentrations is more popular as it is generally less mathematically complex. We tested three different modeling approaches in a simulation study. The first approach considered bacterial concentrations; the second considered the number of bacteria in contaminated units, and the third considered the expected number of bacteria in contaminated units. Simulation results indicate that modeling concentrations tends to overestimate risk compared to modeling the number of bacteria. A sensitivity analysis using a regression tree suggests that processes which include drastic scenarios consisting of combinations of large bacterial inactivation followed by large bacterial growth frequently lead to a >10-fold overestimation of the average risk when modeling concentrations as opposed to bacterial numbers. Alternatively, the approach of modeling the expected number of bacteria in positive units generates results similar to the second method and is easier to use, thus potentially representing a promising compromise. Published by Elsevier Ltd.
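
    The difference between the two bookkeeping choices can be illustrated with a toy simulation. The sketch below is not the authors' model: the initial contamination level, inactivation, and growth factors are invented, and "risk" is simply the probability that a unit still contains at least one organism after processing.

      # Toy comparison of modeling a contaminated unit by concentration versus by
      # integer bacterial count through a large inactivation step followed by
      # large growth. All parameter values are hypothetical.
      import numpy as np

      rng = np.random.default_rng(1)
      n_units = 100_000
      initial_count = 50          # mean bacteria per contaminated unit (invented)
      log_reduction = 6.0         # drastic inactivation step
      log_growth = 5.0            # large subsequent growth

      # Approach 1: concentration-style bookkeeping (continuous, never hits zero).
      conc = initial_count * 10.0 ** (-log_reduction) * 10.0 ** log_growth
      risk_concentration = 1.0 - np.exp(-conc)   # P(>=1 organism) if dose ~ Poisson(conc)

      # Approach 2: integer counts; units whose count reaches zero stay at zero.
      counts = rng.poisson(initial_count, size=n_units)
      survivors = rng.binomial(counts, 10.0 ** (-log_reduction))
      final = survivors * 10.0 ** log_growth     # growth cannot revive a sterile unit
      risk_numbers = np.mean(final >= 1)

      print(f"risk (concentration model): {risk_concentration:.3e}")
      print(f"risk (integer-count model): {risk_numbers:.3e}")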

  18. The Development of an 8-inch by 8-inch Slotted Tunnel for Mach Numbers up to 1.28

    NASA Technical Reports Server (NTRS)

    Little, B. H., Jr.; Cubbage, James J., Jr.

    1961-01-01

    An 8-inch by 8-inch transonic tunnel model with a test section slotted on two opposite walls was constructed in which particular emphasis was given to the development of slot geometry, slot-flow reentry section, and short-diffuser configurations for good test-region flow and minimum total-pressure losses. Center-line static pressures through the test section, wall static pressures through the other parts of the tunnel, and total-pressure distributions at the inlet and exit stations of the diffuser were measured. With a slot length equal to two tunnel heights and 1/14 open-area-ratio slotted walls, a test region one tunnel height in length was obtained in which the deviation from the mean Mach number was less than +/- 0.01 up to Mach number 1.15. With 1/7 open-area-ratio slotted walls, a test region 0.84 tunnel heights in length with deviation less than +/- 0.01 was obtained up to Mach number 1.26. Increasing the tunnel diffuser angle from 6.4 to 10 deg. increased the pressure loss through the tunnel at Mach number 1.20 from 15 percent to 20 percent of the total pressure. The use of other diffusers with equivalent angles of 10 deg. but contoured so that the initial diffusion angle was less than 10 deg. and the final angle was 20 deg. reduced the losses to as low as 16 percent. A method for changing the test-section Mach number rapidly by controlling the flow through a bypass line from the tunnel settling chamber to the slot-flow plenum chamber of the test section was very effective. The test-section Mach number was reduced approximately 5 percent in 1/8 second by bleeding into the test section a flow of air equal to 2 percent of the mainstream flow, and 30 percent in 1/4 second with bleed flow equal to 10 percent of the mainstream flow. The rate of reduction was largely determined by the opening rate of the bleed-flow-control valve.

  19. CrossTalk. The Journal of Defense Software Engineering. Volume 13, Number 6, June 2000

    DTIC Science & Technology

    2000-06-01

    Techniques for Efficiently Generating and Testing Software This paper presents a proven process that uses advanced tools to design, develop and test... optimal software. by Keith R. Wegner Large Software Systems—Back to Basics Development methods that work on small problems seem to not scale well to...Ability Requirements for Teamwork: Implications for Human Resource Management, Journal of Management, Vol. 20, No. 2, 1994. 11. Ferguson, Pat, Watts S

  20. An Observational Test for Shock-induced Crystallization of Cometary Silicates

    NASA Technical Reports Server (NTRS)

    Nuth, J. A.; Johnson, N. M.

    2003-01-01

    Crystalline silicates have been observed in comets and in protostellar nebulae, and there are currently at least two explanations for their formation: thermal annealing in the inner nebula followed by transport to the regions of cometary formation, and in-situ shock processing of amorphous grains at 5 - 10 AU in the Solar Nebula. The tests suggested to date to validate these models have not yet been carried out: some of these tests require a long-term commitment to observe both the dust and gas compositions in a large number of comets. Here we suggest a simpler test.

  1. RabbitQR: fast and flexible big data processing at LSST data rates using existing, shared-use hardware

    NASA Astrophysics Data System (ADS)

    Kotulla, Ralf; Gopu, Arvind; Hayashi, Soichi

    2016-08-01

    Processing astronomical data to science readiness was and remains a challenge, in particular in the case of multi-detector instruments such as wide-field imagers. One such instrument, the WIYN One Degree Imager, is available to the astronomical community at large and, in order to be scientifically useful to its varied user community on a short timescale, provides its users fully calibrated data in addition to the underlying raw data. However, time-efficient re-processing of the often large datasets with improved calibration data and/or software requires more than just a large number of CPU cores and disk space. This is particularly relevant if all computing resources are general purpose and shared with a large number of users in a typical university setup. Our approach to address this challenge is a flexible framework, combining the best of both high-performance (large number of nodes, internal communication) and high-throughput (flexible/variable number of nodes, no dedicated hardware) computing. Based on the Advanced Message Queuing Protocol, we developed a Server-Manager-Worker framework. In addition to the server directing the work flow and the workers executing the actual work, the manager maintains a list of available workers, adds and/or removes individual workers from the worker pool, and re-assigns workers to different tasks. This provides the flexibility of optimizing the worker pool to the current task and workload, improves load balancing, and makes the most efficient use of the available resources. We present performance benchmarks and scaling tests, showing that, today and using existing, commodity shared-use hardware, we can process data with data throughputs (including data reduction and calibration) approaching those expected in the early 2020s for future observatories such as the Large Synoptic Survey Telescope.
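
    As a rough illustration of the Server-Manager-Worker pattern built on the Advanced Message Queuing Protocol, the sketch below shows a minimal AMQP worker using the pika client and a RabbitMQ broker. The queue name, message format, and processing function are hypothetical and do not reflect RabbitQR's actual protocol or its manager/server components.

      # Minimal AMQP "worker" sketch: consume task messages from a queue, process
      # them, and acknowledge. Queue name and payload format are invented; the real
      # framework also has a server and a manager coordinating the worker pool.
      import json
      import pika

      def process_exposure(task):
          # Placeholder for the actual data-reduction/calibration step.
          print(f"reducing exposure {task.get('exposure_id')}")

      def on_message(channel, method, properties, body):
          task = json.loads(body)
          process_exposure(task)
          channel.basic_ack(delivery_tag=method.delivery_tag)

      connection = pika.BlockingConnection(pika.ConnectionParameters(host="localhost"))
      channel = connection.channel()
      channel.queue_declare(queue="reduction_tasks", durable=True)
      channel.basic_qos(prefetch_count=1)          # one task at a time per worker
      channel.basic_consume(queue="reduction_tasks", on_message_callback=on_message)
      channel.start_consuming()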

  2. Random number generators for large-scale parallel Monte Carlo simulations on FPGA

    NASA Astrophysics Data System (ADS)

    Lin, Y.; Wang, F.; Liu, B.

    2018-05-01

    Through parallelization, field programmable gate arrays (FPGAs) can achieve unprecedented speeds in large-scale parallel Monte Carlo (LPMC) simulations. FPGAs present both new constraints and new opportunities for the implementation of random number generators (RNGs), which are key elements of any Monte Carlo (MC) simulation system. Using empirical and application-based tests, this study evaluates all four RNGs used in previous FPGA-based MC studies, together with newly proposed FPGA implementations of two well-known high-quality RNGs that are suitable for LPMC studies on FPGA. One of the newly proposed FPGA implementations, a parallel version of the additive lagged Fibonacci generator (Parallel ALFG), is found to be the best among the evaluated RNGs in fulfilling the needs of LPMC simulations on FPGA.
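
    For reference, an additive lagged Fibonacci generator produces each new value from two earlier values in its state, x[n] = (x[n-j] + x[n-k]) mod 2^m with lags k > j. The sketch below is a plain software version with commonly used lags (24, 55); it is illustrative only and says nothing about the paper's FPGA implementation or its parallelization scheme.

      # Plain software sketch of an additive lagged Fibonacci generator (ALFG):
      #   x[n] = (x[n-j] + x[n-k]) mod 2**m, with lags k > j.
      # The lags (24, 55), the 32-bit modulus, and the seeding below are common
      # textbook choices, not the implementation evaluated in the paper.
      class ALFG:
          def __init__(self, seed=12345, j=24, k=55, m=32):
              self.j, self.k, self.mask = j, k, (1 << m) - 1
              # Fill the lag table with a simple linear congruential generator.
              self.state = []
              x = seed
              for _ in range(k):
                  x = (1103515245 * x + 12345) & self.mask
                  self.state.append(x)
              self.pos = 0  # index of x[n-k], the oldest entry

          def next(self):
              new = (self.state[self.pos] +
                     self.state[(self.pos + self.k - self.j) % self.k]) & self.mask
              self.state[self.pos] = new          # overwrite the oldest entry
              self.pos = (self.pos + 1) % self.k
              return new

      gen = ALFG()
      print([gen.next() for _ in range(5)])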

  3. Computation of large-scale statistics in decaying isotropic turbulence

    NASA Technical Reports Server (NTRS)

    Chasnov, Jeffrey R.

    1993-01-01

    We have performed large-eddy simulations of decaying isotropic turbulence to test the prediction of self-similar decay of the energy spectrum and to compute the decay exponents of the kinetic energy. In general, good agreement between the simulation results and the assumption of self-similarity was obtained. However, the statistics of the simulations were insufficient to compute the value of gamma, which corrects the decay exponent when the spectrum follows a k(exp 4) wave number behavior near k = 0. To obtain good statistics, it was found necessary to average over a large ensemble of turbulent flows.
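
    For context, if the low-wavenumber spectrum behaves as E(k) ~ k^p near k = 0 and the decay is self-similar, the classical dimensional estimate for the kinetic-energy decay exponent is n = 2(p+1)/(p+3); the gamma referred to above is a correction to this estimate for the p = 4 case, where the Loitsyansky-type integral is not exactly invariant. The expression below is the standard textbook form, quoted for orientation rather than taken from this report.

      E(t) \propto t^{-n}, \qquad n = \frac{2(p+1)}{p+3}
      \quad\Rightarrow\quad n = \tfrac{6}{5}\ (p=2), \qquad n = \tfrac{10}{7}\ (p=4).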

  4. Validation of US3D for Capsule Aerodynamics using 05-CA Wind Tunnel Test Data

    NASA Technical Reports Server (NTRS)

    Schwing, Alan

    2012-01-01

    RANS is ill-suited for analysis of these problems. For transonic and supersonic cases, US3D shows fairly good agreement using DES across all cases. Separation prediction and the resulting backshell pressure are problems across all portions of this analysis. This becomes more of an issue at lower Mach numbers: stagnation pressures are not as large, so the wake and backshell are more significant, and errors on the shoulder act on a large area, so small discrepancies manifest as large changes. Subsonic comparisons are mixed with regard to integrated loads and merit more attention. Dominant unsteady behavior (wake shedding) is resolved well, though.

  5. Characteristics of acoustic wave from atmospheric nuclear explosions conducted at the USSR Test Sites

    NASA Astrophysics Data System (ADS)

    Sokolova, Inna

    2015-04-01

    The presence of an acoustic wave on a microbarograph record is one of the discriminating signs of atmospheric (surface layer of the atmosphere) and contact explosions. Nowadays there is a large number of air-wave records from chemical explosions recorded by the IMS infrasound stations installed during the recent decade, but only a small number of air-wave records from nuclear explosions, as air and contact nuclear explosions were conducted from 1945 to 1962, before the Limited Test Ban Treaty (the treaty banning nuclear weapon tests in the atmosphere, in outer space and under water) was signed in 1963 by Great Britain, the USSR and the USA. At that time few microbarographs were installed. The first infrasound stations in the USSR appeared in 1954, and by the time of the USSR's collapse the network consisted of 25 infrasound stations, 3 of which were located on Kazakhstan territory: in Kurchatov (East Kazakhstan), in Borovoye Observatory (North Kazakhstan) and in Talgar Observatory (Northern Tien Shan). The microbarograph at Talgar Observatory was installed in 1962 and recorded a large number of air nuclear explosions conducted at the Semipalatinsk Test Site (STS) and the Novaya Zemlya Test Site. The epicentral distance to the STS was ~700 km, and to the Novaya Zemlya Test Site ~3500 km. The historical analog records of the microbarograph were analyzed for the presence of the acoustic wave; the selected records were digitized, and a database of acoustic signals from nuclear explosions was created. In addition, acoustic signals from atmospheric nuclear explosions conducted at the USSR test sites were recorded by analog broadband seismic stations at a wide range of epicentral distances, 300-3600 km. These signals coincide well in form and spectral content with the microbarograph records and can be used for monitoring and discrimination tasks in places where infrasound observations are absent. The nuclear explosions whose records contained an acoustic wave ranged in yield from 0.03 to 30 kt for the STS and from 8.3 to 25 Mt for the Novaya Zemlya Test Site region. The peculiarities of the wave pattern and spectral content of the acoustic wave records, and the relationships of acoustic wave amplitude and period to explosion yield and distance, were investigated. The created database can be applied to different monitoring tasks, such as infrasound station calibration, discrimination of nuclear explosions, refinement of nuclear explosion parameters, determination of explosion yield, etc.

  6. Testing and validation of computerized decision support systems.

    PubMed

    Sailors, R M; East, T D; Wallace, C J; Carlson, D A; Franklin, M A; Heermann, L K; Kinder, A T; Bradshaw, R L; Randolph, A G; Morris, A H

    1996-01-01

    Systematic, thorough testing of decision support systems (DSSs) prior to release to general users is a critical aspect of high-quality software design. Omission of this step may lead to the dangerous, and potentially fatal, condition of relying on a system with outputs of uncertain quality. Thorough testing requires a great deal of effort and is a difficult job because the tools necessary to facilitate testing are not well developed. Testing is a job ill-suited to humans because it requires tireless attention to a large number of details. For these reasons, the majority of DSSs available are probably not well tested prior to release. We have successfully implemented a software design and testing plan which has helped us meet our goal of continuously improving the quality of our DSS software prior to release. While requiring a large amount of effort, we feel that the process of documenting and standardizing our testing methods is an important step toward meeting recognized national and international quality standards. Our testing methodology includes both functional and structural testing and requires input from all levels of development. Our system does not focus solely on meeting design requirements but also addresses the robustness of the system and the completeness of testing.

  7. The Influence of Low Wall Temperature on Boundary-Layer Transition and Local Heat Transfer on 2-Inch-Diameter Hemispheres at a Mach Number of 4.95 and a Reynolds Number per Foot of 73.2 x 10(exp 6)

    NASA Technical Reports Server (NTRS)

    Cooper, Morton; Mayo, Edward E.; Julius, Jerome D.

    1960-01-01

    Measurements of the location of boundary-layer transition and the local heat transfer have been made on 2-inch-diameter hemispheres in the Langley gas dynamics laboratory at a Mach number of 4.95, a Reynolds number per foot of 73.2 x 10(exp 6), and a stagnation temperature of approximately 400 F. The transient-heating thin-skin calorimeter technique was used, and the initial values of the wall-to-stream stagnation-temperature ratios were 0.16 (cold-model tests) and 0.65 (hot-model test). During two of the four cold tests, the boundary-layer flow changed from turbulent to laminar over large regions of the hemisphere as the model heated. On the basis of a detailed consideration of the magnitude of roughness possibly present during these two cold tests, it appears that this destabilizing effect of low wall temperatures (cooling) was not caused by roughness as a dominant influence. This idea of a decrease in boundary-layer stability with cooling has been previously suggested. (See, for example, NASA Memorandum 10-8-58E.) For the laminar data obtained during the early part of the hot test, the correlation of the local-heating data with laminar theory was excellent.

  8. The effects of an educational meeting and subsequent computer reminders on the ordering of laboratory tests by rheumatologists: an interrupted time series analysis.

    PubMed

    Lesuis, Nienke; den Broeder, Nathan; Boers, Nadine; Piek, Ester; Teerenstra, Steven; Hulscher, Marlies; van Vollenhoven, Ronald; den Broeder, Alfons A

    2017-01-01

    To examine the effects of an educational meeting and subsequent computer reminders on the number of ordered laboratory tests. Using interrupted time series analysis, we assessed whether trends in the number of laboratory tests ordered by rheumatologists between September 2012 and September 2015 at the Sint Maartenskliniek (the Netherlands) changed following an educational meeting (September 2013) and the introduction of computer reminders into the Computerised Physician Order Entry system (July 2014). The analyses were done for the set of tests on which both interventions had focussed (intervention tests: complement, cryoglobulins, immunoglobulins, myeloma protein) and a set of control tests unrelated to the interventions (alanine transferase, anti-cyclic citrullinated peptide, C-reactive protein, creatine, haemoglobin, leukocytes, mean corpuscular volume, rheumatoid factor and thrombocytes). At the start of the study, 101 intervention tests and 7660 control tests were ordered per month by the rheumatologists. After the educational meeting, neither the level nor the trend of ordered intervention and control tests changed significantly. After implementation of the reminders, the level of ordered intervention tests decreased by 85.0 tests (95% CI -133.3 to -36.8, p<0.01), while the level of control tests did not change. In summary, an educational meeting alone was not effective in decreasing the number of ordered intervention tests, but the combination with computer reminders did result in a large decrease in those tests. Therefore, we recommend using computer reminders in addition to education if a reduction in inappropriate test use is the aim.
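
    Interrupted time series analyses of this kind are commonly fitted as segmented regressions, with terms for the baseline level and trend plus a level change and a trend change at each interruption. The sketch below shows that structure for a single interruption using statsmodels; the monthly counts are simulated and the variable names are illustrative, not taken from the study.

      # Segmented-regression sketch for a single-interruption interrupted time
      # series: baseline level and trend, plus a step (level change) and an
      # interaction term (trend change) after the intervention. Data are simulated.
      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(2)
      months = np.arange(36)                      # 3 years of monthly counts
      post = (months >= 22).astype(int)           # intervention at month 22
      time_since = np.where(post == 1, months - 22, 0)

      # Simulated monthly test counts: gentle baseline trend, level drop afterwards.
      counts = 100 + 0.2 * months - 40 * post + rng.normal(0, 5, size=months.size)

      df = pd.DataFrame({"count": counts, "month": months,
                         "post": post, "time_since": time_since})
      model = smf.ols("count ~ month + post + time_since", data=df).fit()
      print(model.params)   # 'post' ~ level change, 'time_since' ~ trend change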

  9. Trends in serum creatinine testing in Oxfordshire, UK, 1993-2013: a population-based cohort study.

    PubMed

    Oke, Jason; Shine, Brian; McFadden, Emily; Stevens, Richard; Lasserson, Daniel; Perera, Rafael

    2015-12-16

    To determine how many kidney function tests are done, on whom, how frequently they are performed and how they have changed over time. Retrospective study of all serum creatinine, urine albumin and urine creatinine tests. Primary and secondary care in Oxfordshire from 1993 to 2013. Unselected population of 1,220,447 people. The total number of creatinine and urinary protein tests ordered from primary and secondary care, the number of tests per year stratified by categories of estimated glomerular filtration rate (eGFR), and the frequency of testing in patients having their kidney function monitored. Creatinine requests from primary care increased steadily from 1997 and exceeded 220,000 requests in 2013. Tests corresponding to normal kidney function (eGFR >60 mL/min/1.73 m(2)) constituted 59% of all kidney function tests in 1993 and accounted for 83% of all tests in 2013. Tests corresponding to chronic kidney disease (CKD) stages 3-5 declined after 2007. Reduced kidney function, albuminuria, male gender, diabetes and age were independently associated with more frequent monitoring. For a female patient aged between 61 and 80 years with stage 3a CKD, the average number of serum creatinine tests (95% CI) was 3.23/year (3.19 to 3.26), and for a similar woman with diabetes the average was 5.50 tests per year (5.44 to 5.56). There has been a large increase in the number of kidney function tests over the past two decades. However, we found little evidence that this increase is detecting more CKD. Tests are becoming more frequent in people with and without evidence of renal impairment. Future work using a richer data source could help unravel the underlying reasons for the increased testing and determine how much is necessary and useful.

  10. Number of Streptococcus mutans and Lactobacillus in saliva versus the status of cigarette smoking, considering duration of smoking and number of cigarettes smoked daily.

    PubMed

    Nakonieczna-Rudnicka, Marta; Bachanek, Teresa

    2017-09-21

    A large number of colonies of the cariogenic bacteria Streptococcus mutans (SM) and Lactobacillus (LB) in the saliva indicates a high risk of dental caries development. Cotinine is a biomarker of exposure to tobacco smoke. The aim of the study was to assess the number of Streptococcus mutans and Lactobacillus in the saliva of non-smokers and smokers, considering the duration of smoking and the number of cigarettes smoked daily. The numbers of SM and LB were also analysed in relation to the frequency of oral health check-ups. The investigated group comprised 124 people aged 20-54; 58 (46.8%) reported cigarette smoking, and 66 (53.2%) reported that they had never smoked cigarettes and had never attempted to smoke. Cotinine concentration in the saliva was assayed using the Cotinine test (Calbiotech), and the numbers of SM and LB with the CRT bacteria test (Ivoclar Vivadent, Liechtenstein). Statistical analysis was conducted using chi-squared and Mann-Whitney tests, with p<0.05 considered statistically significant. No significant correlation was found between the numbers of SM and LB and smoking status, the number of cigarettes smoked daily or the duration of cigarette smoking. Smokers who reported having dental check-ups at least once a year had low LB counts significantly more often than those who had check-ups less frequently than once a year. The numbers of SM and LB in saliva do not depend on smoking status, the number of cigarettes smoked daily or the duration of smoking.

  11. The use of mice in the Sereny test as a virulence assay of shigellae and enteroinvasive Escherichia coli.

    PubMed Central

    Murayama, S Y; Sakai, T; Makino, S; Kurata, T; Sasakawa, C; Yoshikawa, M

    1986-01-01

    We examined the possibility that mice could be used in the Sereny test instead of guinea pigs or rabbits. Although the reactions in mice were more transient and not as pronounced as those in guinea pigs, mice indeed could be used to distinguish even macroscopically between virulent and avirulent shigellae. Virulent enteroinvasive Escherichia coli strains were also positive for the mouse Sereny test. We described the macroscopic and microscopic appearance of the mouse eyes. Thus, mice are recommended for use in the Sereny test, particularly when a large number of samples are to be tested. Images PMID:3510985

  12. Large Payload Ground Transportation and Test Considerations

    NASA Technical Reports Server (NTRS)

    Rucker, Michelle A.

    2016-01-01

    Many spacecraft concepts under consideration by the National Aeronautics and Space Administration’s (NASA’s) Evolvable Mars Campaign take advantage of a Space Launch System payload shroud that may be 8 to 10 meters in diameter. Large payloads can theoretically save cost by reducing the number of launches needed, but only if it is possible to build, test, and transport a large payload to the launch site in the first place. Analysis performed previously for the Altair project identified several transportation and test issues with an 8.973-meter-diameter payload. Although the entire Constellation Program—including Altair—has since been canceled, these issues serve as important lessons learned for spacecraft designers and program managers considering large payloads for future programs. A transportation feasibility study found that, even broken up into an Ascent and a Descent Module, the Altair spacecraft would not fit inside available aircraft. Ground transportation of such large payloads over extended distances is not generally permitted, so overland transportation alone would not be an option. Limited ground transportation to the nearest waterway may be possible, but water transportation could take as long as 67 days per production unit, depending on point of origin and acceptance test facility; transportation from the western United States would require transit through the Panama Canal to access the Kennedy Space Center launch site. Large payloads also pose acceptance test and ground processing challenges. Although propulsion, mechanical vibration, and reverberant acoustic test facilities at NASA’s Plum Brook Station have been designed to accommodate large spacecraft, special handling and test work-arounds may be necessary, which could increase cost, schedule, and technical risk. Once at the launch site, there are no facilities currently capable of accommodating the combination of large payload size and hazardous processing such as hypergolic fuels, pyrotechnic devices, and high-pressure gases. Ironically, the limiting factor in a national heavy-lift strategy may not be the rocket technology needed to throw a heavy payload, but rather the terrestrial infrastructure—roads, bridges, airframes, and buildings—necessary to transport, acceptance test, and process large spacecraft. Failure to carefully consider where and how large spacecraft are manufactured, tested, and launched could result in unforeseen cost to modify existing (or develop new) infrastructure, or incur additional risk due to increased handling operations or eliminating key verifications. Although this paper focuses on the canceled Altair spacecraft as a case study, the issues identified here have wide applicability to other large payloads, including concepts under consideration for NASA’s Evolvable Mars Campaign.

  13. Developmental Exposure to Valproate or Ethanol Alters Locomotor Activity and Retino-Tectal Projection Area in Zebrafish Embryos

    EPA Science Inventory

    Given the minimal developmental neurotoxicity data available for the large number of new and existing chemicals, there is a critical need for alternative methods to identify and prioritize chemicals for further testing. We outline a developmental neurotoxicity screening approach ...

  14. A web-based genome browser for 'SNP-aware' assay design

    USDA-ARS?s Scientific Manuscript database

    Human and animal genomes contain an abundance of single nucleotide polymorphisms (SNPs) that are useful for genetic testing. However, the relatively large number of SNPs present in diverse populations can pose serious problems when designing assays. It is important to “mask” some SNP positions so ...

  15. Online Assessment of Learning and Engagement in University Laboratory Practicals

    ERIC Educational Resources Information Center

    Whitworth, David E.; Wright, Kate

    2015-01-01

    In science education, laboratory practicals are frequently assessed through submission of a report. A large increase in student numbers necessitated us adapting a traditional practical report into an online test with automated marking. The assessment was designed to retain positive features of the traditional laboratory report but with added…

  16. Use of human clearance rates to predict the biotransformation of pharmaceuticals by fish: A test of the read-across approach

    EPA Science Inventory

    Pharmaceuticals are increasingly found in aquatic environments near wastewater treatment plant discharge, and may be of particular concern to aquatic life given their pseudo-persistence. The large number of detected pharmaceuticals necessitates a prioritization method for hazard...

  17. Principal Appraisals Get a Remake

    ERIC Educational Resources Information Center

    Zubrzycki, Jaclyn

    2013-01-01

    A growing number of school districts--including large ones like those in Chicago, Dallas, Los Angeles, and Hawaii--have become recent converts to new principal-evaluation systems that tie school leaders' appraisals to student test scores. As of this school year, student achievement accounts for 40 percent to 50 percent of principals' evaluations…

  18. DEVELOPMENT OF AN OBJECTIVE AND QUANTIFIABLE TERATOLOGICAL SCREEN FOR USE IN ZEBRAFISH LARVAE.

    EPA Science Inventory

    To address EPA’s need to prioritize large numbers of chemicals for testing, a rapid, cost-effective in vivo screen for potential developmental toxicity using an alternative vertebrate species (zebrafish;Danio rerio) has been developed. A component of that screen is the observatio...

  19. Development of a Context-Rich Database of ToxCast Assay Annotations (SOT)

    EPA Science Inventory

    Major concerns exist for the large number of environmental chemicals which lack toxicity data. The tens of thousands of commercial substances in need of screening for potential human health effects would cost millions of dollars and several decades to test in traditional animal-b...

  20. Evaluation of PLS, LS-SVM, and LWR for quantitative spectroscopic analysis of soils

    USDA-ARS?s Scientific Manuscript database

    Soil testing requires the analysis of large numbers of samples in laboratory that are often time consuming and expensive. Mid-infrared spectroscopy (mid-IR) and near-infrared spectroscopy (NIRS) are fast, non-destructive, and inexpensive analytical methods that have been used for soil analysis, in l...

  1. Validation, acceptance, and extension of a predictive model of reproductive toxicity using ToxCast data

    EPA Science Inventory

    The EPA ToxCast research program uses a high-throughput screening (HTS) approach for predicting the toxicity of large numbers of chemicals. Phase-I tested 309 well-characterized chemicals (mostly pesticides) in over 500 assays of different molecular targets, cellular responses an...

  2. Problems in air traffic management. VI., Interaction of training-entry age with intellectual and personality characteristics of air traffic control specialists.

    DOT National Transportation Integrated Search

    1965-07-01

    Over 900 Enroute and Terminal Air Traffic Controller Specialist (ATCS) trainees were administered a large number of aptitude and personality tests. Examination of the relationships between the performance scores and age at entry into training reveale...

  3. Predictive Model of Rat Reproductive Toxicity from ToxCast High Throughput Screening

    EPA Science Inventory

    The EPA ToxCast research program uses high throughput screening for bioactivity profiling and predicting the toxicity of large numbers of chemicals. ToxCast Phase‐I tested 309 well‐characterized chemicals in over 500 assays for a wide range of molecular targets and cellular respo...

  4. Applying Adverse Outcome Pathways (AOPs) to support Integrated Approaches to Testing and Assessment (IATA workshop report)

    EPA Science Inventory

    Chemical regulation is challenged by the large number of chemicals requiring assessment for potential human health and environmental impacts. Current approaches are too resource intensive in terms of time, money and animal use to evaluate all chemicals under development or alread...

  5. Genetic structure of populations and differentiation in forest trees

    Treesearch

    Raymond P. Guries; F. Thomas Ledig

    1981-01-01

    Electrophoretic techniques permit population biologists to analyze genetic structure of natural populations by using large numbers of allozyme loci. Several methods of analysis have been applied to allozyme data, including chi-square contingency tests, F-statistics, and genetic distance. This paper compares such statistics for pitch pine (Pinus rigida...

  6. Complementing in vitro hazard assessment with exposure and pharmacokinetics considerations for chemical prioritization

    EPA Science Inventory

    Traditional toxicity testing involves a large investment in resources, often using low-throughput in vivo animal studies for limited numbers of chemicals. An alternative strategy is the emergence of high-throughput (HT) in vitro assays as a rapid, cost-efficient means to screen t...

  7. The Positivity Scale

    ERIC Educational Resources Information Center

    Caprara, Gian Vittorio; Alessandri, Guido; Eisenberg, Nancy; Kupfer, A.; Steca, Patrizia; Caprara, Maria Giovanna; Yamaguchi, Susumu; Fukuzawa, Ai; Abela, John

    2012-01-01

    Five studies document the validity of a new 8-item scale designed to measure "positivity," defined as the tendency to view life and experiences with a positive outlook. In the first study (N = 372), the psychometric properties of Positivity Scale (P Scale) were examined in accordance with classical test theory using a large number of…

  8. Biological profiling and dose-response modeling tools, characterizing uncertainty

    EPA Science Inventory

    Through its ToxCast project, the U.S. EPA has developed a battery of in vitro high throughput screening (HTS) assays designed to assess the potential toxicity of environmental chemicals. At present, over 1800 chemicals have been tested in up to 600 assays, yielding a large number...

  9. Predictive Signatures of Developmental Toxicity Modeled with HTS Data from ToxCast™ Bioactivity Profiles

    EPA Science Inventory

    The EPA ToxCast™ research program uses a high-throughput screening (HTS) approach for predicting the toxicity of large numbers of chemicals. Phase-I contains 309 well-characterized chemicals which are mostly pesticides tested in over 600 assays of different molecular targets, cel...

  10. Learning the Constellations: From Junior High to Undergraduate Descriptive Astronomy Class

    NASA Astrophysics Data System (ADS)

    Stephens, Denise C.; Hintz, Eric G.; Hintz, Maureen; Lawler, Jeannette; Jones, Michael; Bench, Nathan

    2015-01-01

    As part of two separate studies we have examined the ability of students to learn and remember a group of constellations, bright stars, and deep sky objects. For a group of junior high students we tested their knowledge of only the constellations by giving them a 'constellation quiz' without any instruction. We then provided the students with a lab session, and retested. We also tested a large number of undergraduate students in our descriptive astronomy classes, but in this case the material comprised the same 30 constellations, 17 bright stars, and 3 deep sky objects. The undergraduate students were tested in a number of ways: 1) pre-testing without instruction, 2) self-reporting of knowledge, 3) normal constellation quizzes as part of the class, and 4) retesting students from previous semesters. This provided us with a set of baseline measurements, allowed us to track the learning curve, and test retention of the material. We will present our early analysis of the data.

  11. Reliability of programs specified with equational specifications

    NASA Astrophysics Data System (ADS)

    Nikolik, Borislav

    Ultrareliability is desirable (and sometimes a demand of regulatory authorities) for safety-critical applications, such as commercial flight-control programs, medical applications, nuclear reactor control programs, etc. A method is proposed, called the Term Redundancy Method (TRM), for obtaining ultrareliable programs through specification-based testing. Current specification-based testing schemes need a prohibitively large number of testcases for estimating ultrareliability. They assume availability of an accurate program-usage distribution prior to testing, and they assume the availability of a test oracle. It is shown how to obtain ultrareliable programs (probability of failure near zero) with a practical number of testcases, without accurate usage distribution, and without a test oracle. TRM applies to the class of decision Abstract Data Type (ADT) programs specified with unconditional equational specifications. TRM is restricted to programs that do not exceed certain efficiency constraints in generating testcases. The effectiveness of TRM in failure detection and recovery is demonstrated on formulas from the aircraft collision avoidance system TCAS.

  12. A new low-cost procedure for detecting nucleic acids in low-incidence samples: a case study of detecting spores of Paenibacillus larvae from bee debris.

    PubMed

    Ryba, Stepan; Kindlmann, Pavel; Titera, Dalibor; Haklova, Marcela; Stopka, Pavel

    2012-10-01

    American foulbrood, because of its virulence and worldwide spread, is currently one of the most dangerous diseases of honey bees. Quick diagnosis of this disease is therefore vitally important. For its successful eradication, however, all the hives in the region must be tested. This is time consuming and costly. Therefore, a fast and sensitive method of detecting American foulbrood is needed. Here we present a method that significantly reduces the number of tests needed by combining batches of samples from different hives. The results of this method were verified by testing each sample. A simulation study was used to compare the efficiency of the new method with testing all the samples and to develop a decision tool for determining when best to use the new method. The method is suitable for testing large numbers of samples (over 100) when the incidence of the disease is low (10% or less).
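
    The abstract does not give the exact pooling scheme, but the economics of combining samples can be illustrated with classic two-stage (Dorfman) pooling: batches that test negative clear every member, and only positive batches are retested sample by sample. The sketch below computes the expected number of tests per sample under that scheme; the prevalence values and batch sizes are arbitrary illustrations, not the paper's decision tool.

      # Expected tests per sample under two-stage (Dorfman) pooled testing:
      # one test per batch of k samples, plus k individual retests whenever the
      # batch is positive (probability 1 - (1 - p)**k for prevalence p).
      # Illustrates why pooling pays off at low incidence; not the paper's procedure.
      def tests_per_sample(p, k):
          return 1.0 / k + 1.0 - (1.0 - p) ** k

      for p in (0.01, 0.05, 0.10, 0.30):
          best_k = min(range(2, 51), key=lambda k: tests_per_sample(p, k))
          print(f"prevalence {p:.2f}: batch size {best_k:2d}, "
                f"{tests_per_sample(p, best_k):.2f} tests/sample "
                f"(vs 1.00 for individual testing)")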

  13. Force and pressure tests of the GA(W)-1 airfoil with a 20% aileron and pressure tests with a 30% Fowler flap

    NASA Technical Reports Server (NTRS)

    Wentz, W. H., Jr.; Seetharam, H. C.; Fiscko, K. A.

    1977-01-01

    Wind tunnel force and pressure tests were conducted for the GA(W)-1 airfoil equipped with a 20% aileron, and pressure tests were conducted with a 30% Fowler flap. All tests were conducted at a Reynolds number of 2.2 x 10(exp 6) and a Mach number of 0.13. The aileron provides control effectiveness similar to ailerons applied to more conventional airfoils. Effects of aileron gaps from 0% to 2% chord were evaluated, as well as hinge moment characteristics. The aft camber of the GA(W)-1 section results in a substantial up-aileron moment, but the hinge moments associated with aileron deflection are similar to other configurations. Fowler flap pressure distributions indicate that unseparated flow is achieved for flap settings up to 40 deg., over a limited angle-of-attack range. Theoretical pressure distributions compare favorably with experiments for low flap deflections, but show substantial errors at large deflections.

  14. A replacement for islet equivalents with improved reliability and validity.

    PubMed

    Huang, Han-Hung; Ramachandran, Karthik; Stehno-Bittel, Lisa

    2013-10-01

    Islet equivalent (IE), the standard estimate of isolated islet volume, is an essential measure to determine the amount of transplanted islet tissue in the clinic and is used in research laboratories to normalize results, yet it is based on the false assumption that all islets are spherical. Here, we developed and tested a new easy-to-use method to quantify islet volume with greater accuracy. Isolated rat islets were dissociated into single cells, and the total cell number per islet was determined by using computer-assisted cytometry. Based on the cell number per islet, we created a regression model to convert islet diameter to cell number with a high R2 value (0.8) and good validity and reliability with the same model applicable to young and old rats and males or females. Conventional IE measurements overestimated the tissue volume of islets. To compare results obtained using IE or our new method, we compared Glut2 protein levels determined by Western Blot and proinsulin content via ELISA between small (diameter≤100 μm) and large (diameter≥200 μm) islets. When normalized by IE, large islets showed significantly lower Glut2 level and proinsulin content. However, when normalized by cell number, large and small islets had no difference in Glut2 levels, but large islets contained more proinsulin. In conclusion, normalizing islet volume by IE overestimated the tissue volume, which may lead to erroneous results. Normalizing by cell number is a more accurate method to quantify tissue amounts used in islet transplantation and research.
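
    For orientation, the conventional islet equivalent normalizes each islet to the volume of an idealized 150 μm diameter sphere, so the count rises with the cube of the measured diameter; this spherical assumption is what the authors argue overestimates tissue in large, non-spherical islets. The expressions below give that standard conversion; the 150 μm reference diameter is the widely used convention rather than a value taken from this paper.

      V_{\mathrm{islet}} \approx \frac{\pi}{6}\, d^{3}, \qquad
      \mathrm{IE} \approx \sum_{i} \left( \frac{d_i}{150\ \mu\mathrm{m}} \right)^{3}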

  15. Melanocytic nevi, nevus genes and melanoma risk in a large case-control study in the United Kingdom

    PubMed Central

    Newton-Bishop, Julia A; Chang, Yu-Mei; Iles, Mark M; Taylor, John C; Bakker, Bert; Chan, May; Leake, Susan; Karpavicius, Birute; Haynes, Sue; Fitzgibbon, Elaine; Elliott, Faye; Kanetsky, Peter A.; Harland, Mark; Barrett, Jennifer H; Bishop, D Timothy

    2010-01-01

    Background Increased number of melanocytic nevi is a potent melanoma risk factor. We have carried out a large population-based case-control study to explore the environmental and genetic determinants of nevi and the relationship with melanoma risk. Methods We report nevus phenotype in relation to differing patterns of sun exposure, inherited variation at loci shown in recent genome-wide association studies to be nevus genes, and risk. Results Increased numbers of nevi were associated with holiday sun exposure, particularly on intermittently sun-exposed body sites (test for trend p<0.0001). Large nevi were also associated with holiday sun exposure (p=0.002). Single nucleotide polymorphisms (SNPs) on chromosomes 9 and 22 were associated with increased numbers of nevi (p=0.04 and p=0.002 respectively) and larger nevi (p=0.03 and p=0.002), whereas that on chromosome 6 was associated only with large nevi (p=0.01). Melanoma risk was associated with increased nevus count, large nevi and atypical nevi for tumors in all body sites (including rare sites) irrespective of age. The risk persisted when adjusted for inheritance of nevus SNPs. Conclusions The at-risk nevus phenotype is associated with behaviors known to increase melanoma risk (holiday sun exposure). Although SNPs on chromosomes 6, 9 and 22 were shown to be nevus genes they explained only a small proportion of melanoma risk and nevus phenotype; therefore a number of nevus genes likely remain to be identified. Impact This paper confirms the importance of nevi in melanoma pathogenesis and increases understanding of their genetic determinants. PMID:20647408

  16. Toxicology in Addiction Medicine.

    PubMed

    Schwarz, Daniel A; George, M P; Bluth, Martin H

    2016-12-01

    Toxicology testing in addiction medicine varies across the spectrum, yet remains a powerful tool in monitoring addictive patients. There are many reference laboratories offering toxicology testing, and physicians should have some understanding of laboratory, methodology, testing portfolio, and customer support structure to aid them in selecting the best toxicology laboratory for their patients. Consultation with a clinical pathologist/toxicologist in conjunction with the consideration of monitoring large numbers of illicit and psychoactive drugs in the addictive patient may provide important clinical information for their treatment. Copyright © 2016 Elsevier Inc. All rights reserved.

  17. Simple Tests for Rapid Detection of Canine Parvovirus Antigen and Canine Parvovirus-Specific Antibodies▿ †

    PubMed Central

    Marulappa, Shashidhara Y.; Kapil, Sanjay

    2009-01-01

    Canine parvovirus (CPV) is the number one viral cause of enteritis, morbidity, and mortality in 8-week-old young puppies. We have developed twin assays (slide agglutination test [SAT] for CPV antigen and slide inhibition test [SIT] for CPV antibody) that are sensitive, specific, cost-effective, generic for all genotypes of CPV, and provide instant results for CPV antigen detection in feces and antibody quantification in serum. We found these assays to be useful for routine applications in kennels with large numbers of puppies at risk. The results of these assays are available in 1 min and do not require any special instrumentation. SAT-SIT technology will find applications in rapid screening of samples for other hemagglutinating emerging viruses of animals and humans (influenza virus and severe acute respiratory syndrome coronavirus). PMID:18987166

  18. Experimental Surface Pressure Data Obtained on 65 deg Delta Wing Across Reynolds Number and Mach Number Ranges. Vol. 4: Large-radius leading edge

    NASA Technical Reports Server (NTRS)

    Chu, Julio; Luckring, James M.

    1996-01-01

    An experimental wind tunnel test of a 65 deg delta wing model with interchangeable leading edges was conducted in the Langley National Transonic Facility (NTF). The objective was to investigate the effects of Reynolds and Mach numbers on slender-wing leading-edge vortex flows with four values of wing leading-edge bluntness. Experimentally obtained pressure data are presented without analysis in tabulated and graphical formats across a Reynolds number range of 6 x 10(exp 6) to 120 x 10(exp 6) at a Mach number of 0.85 and across a Mach number range of 0.4 to 0.9 at Reynolds numbers of 6 x 10(exp 6) and 60 x 10(exp 6). Normal-force and pitching-moment coefficient plots for these Reynolds number and Mach number ranges are also presented.

  19. Evaluation of integration methods for hybrid simulation of complex structural systems through collapse

    NASA Astrophysics Data System (ADS)

    Del Carpio R., Maikol; Hashemi, M. Javad; Mosqueda, Gilberto

    2017-10-01

    This study examines the performance of integration methods for hybrid simulation of large and complex structural systems in the context of structural collapse due to seismic excitations. The target application is not necessarily real-time testing, but rather models that involve large-scale physical sub-structures and highly nonlinear numerical models. Four case studies are presented and discussed. In the first case study, the accuracy of integration schemes including two widely used methods, namely a modified version of the implicit Newmark method with a fixed number of iterations (iterative) and the operator-splitting method (non-iterative), is examined through pure numerical simulations. The second case study presents the results of 10 hybrid simulations repeated with the two aforementioned integration methods considering various time steps and fixed numbers of iterations for the iterative integration method. The physical sub-structure in these tests consists of a single-degree-of-freedom (SDOF) cantilever column with replaceable steel coupons that provides repeatable highly nonlinear behavior including fracture-type strength and stiffness degradations. In case study three, the implicit Newmark method with a fixed number of iterations is applied to hybrid simulations of a 1:2 scale steel moment frame that includes a relatively complex nonlinear numerical substructure. Lastly, a more complex numerical substructure is considered by constructing a nonlinear computational model of a moment frame coupled to a hybrid model of a 1:2 scale steel gravity frame. The last two case studies are conducted on the same prototype structure, and the selection of time steps and fixed numbers of iterations is closely examined in pre-test simulations. The generated unbalance force is used as an index to track the equilibrium error and to predict the accuracy and stability of the simulations.
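
    As background for the iterative scheme mentioned above, the sketch below steps a single-degree-of-freedom oscillator with the standard implicit Newmark method (average acceleration, gamma = 1/2, beta = 1/4), iterating on equilibrium with a fixed number of Newton-type corrections per step. It is a generic SDOF sketch with invented parameters, not the modified formulation or the hybrid-simulation architecture used in the study.

      # Implicit Newmark (average acceleration) for an SDOF system
      # m*a + c*v + r(u) = p(t), with a fixed number of equilibrium iterations
      # per step. All parameter values are invented.
      import numpy as np

      def restoring_force(u, k=1.0e3):
          return k * u              # linear spring; swap in a nonlinear law if desired

      def newmark_sdof(p, dt, m=1.0, c=2.0, gamma=0.5, beta=0.25, n_iter=5, k=1.0e3):
          n = len(p)
          u = np.zeros(n); v = np.zeros(n); a = np.zeros(n)
          a[0] = (p[0] - c * v[0] - restoring_force(u[0], k)) / m
          for i in range(n - 1):
              u_new = u[i]          # initial guess: previous displacement
              for _ in range(n_iter):   # fixed number of iterations, as in the study
                  # Newmark kinematics expressed in terms of the unknown displacement.
                  a_new = (u_new - u[i] - dt * v[i]) / (beta * dt**2) - (0.5 / beta - 1.0) * a[i]
                  v_new = v[i] + dt * ((1.0 - gamma) * a[i] + gamma * a_new)
                  residual = p[i + 1] - m * a_new - c * v_new - restoring_force(u_new, k)
                  k_eff = k + gamma * c / (beta * dt) + m / (beta * dt**2)
                  u_new += residual / k_eff   # Newton-type correction, effective stiffness
              # Final kinematics consistent with the last displacement update.
              a_new = (u_new - u[i] - dt * v[i]) / (beta * dt**2) - (0.5 / beta - 1.0) * a[i]
              v_new = v[i] + dt * ((1.0 - gamma) * a[i] + gamma * a_new)
              u[i + 1], v[i + 1], a[i + 1] = u_new, v_new, a_new
          return u, v, a

      # Example: response to a short initial load pulse (illustrative only).
      dt, steps = 0.01, 500
      load = np.zeros(steps); load[0] = 1.0
      u, v, a = newmark_sdof(load, dt)
      print(f"peak displacement: {u.max():.4e}")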

  20. Trends in Animal Rabies Surveillance in the Endemic State of Minas Gerais, Brazil

    PubMed Central

    Oviedo-Pastrana, Misael E.; Oliveira, Camila S. F.; Capanema, Renato O.; Nicolino, Rafael R.; Oviedo-Socarras, Teresa J.; Haddad, João Paulo A.

    2015-01-01

    Rabies is a viral zoonosis affecting mammal species and causes large economic losses. Included among the neglected diseases, it is still insufficiently addressed by governments and the international community, despite formal surveillance and control programs. This study used a dataset of 10,112 rabies diagnoses in animals provided by the Brazilian passive surveillance system from 2001 to 2012. The positivity rate of the tested samples was 26.4%, and a reduction in the total samples sent during the last six years was observed. The kernel density map indicated case concentration in the south region and a decrease in density of rabies cases in the second period studied (2007 to 2012). The directional trend of positive rabies diagnoses remained in the south region, as shown by the standard deviational ellipse. The spatial scan statistic identified three large clusters of positive diagnoses, one in the first period (2001-2006) and two in the second period (2007-2012), indicating an expansion of risk areas. The decrease in rabies cases from 2006 to 2012 does not necessarily reflect lower viral circulation or improvement in actions by epidemiological surveillance; this decrease could indicate a deficiency in epidemiological surveillance during the observation period due to the increase in the silent areas. Surveillance should maintain an increasing or constant number of tests during the years in addition to a reduction in the number of outbreaks of rabies, which would indicate a lower positivity rate. The findings in this study indicate deterioration in the effectiveness of the passive surveillance for rabies. The number of rabies cases, total number of tests performed and positivity rate are good indicators for evaluating passive surveillance. This paper can function as a guide for the assessment and improvement of the actions in passive surveillance of rabies. PMID:25774775

  1. An index of biological integrity (IBI) for Pacific Northwest rivers

    USGS Publications Warehouse

    Mebane, C.A.; Maret, T.R.; Hughes, R.M.

    2003-01-01

    The index of biotic integrity (IBI) is a commonly used measure of relative aquatic ecosystem condition; however, its application to coldwater rivers over large geographic areas has been limited. A seven-step process was used to construct and test an IBI applicable to fish assemblages in coldwater rivers throughout the U.S. portion of the Pacific Northwest. First, fish data from the region were compiled from previous studies and candidate metrics were selected. Second, reference conditions were estimated from historical reports and minimally disturbed reference sites in the region. Third, data from the upper Snake River basin were used to test metrics and develop the initial index. Fourth, candidate metrics were evaluated for their redundancy, variability, precision, and ability to reflect a wide range of conditions while distinguishing reference sites from disturbed sites. Fifth, the selected metrics were standardized by being scored continuously from 0 to 1 and then weighted as necessary to produce an IBI ranging from 0 to 100. The resulting index included 10 metrics: number of native coldwater species, number of age-classes of sculpins Cottus spp., percentage of sensitive native individuals, percentage of coldwater individuals, percentage of tolerant individuals, number of alien species, percentage of common carp Cyprinus carpio individuals, number of selected salmonid age-classes, catch per unit effort of coldwater individuals, and percentage of individuals with selected anomalies. Sixth, the IBI responses were tested with additional data sets from throughout the Pacific Northwest. Last, scores from two minimally disturbed reference rivers were evaluated for longitudinal gradients along the river continuum. The IBI responded to environmental disturbances and was spatially and temporally stable at over 150 sites in the Pacific Northwest. The results support its use across a large geographic area to describe the relative biological condition of coolwater and coldwater rivers with low species richness.
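
    The final two steps, continuous 0-1 scoring of each metric and aggregation to a 0-100 index, can be sketched generically as below. The metric names, reference bounds, sample values, and equal weighting are placeholders; they are not the calibrated values developed for the Pacific Northwest IBI.

      # Generic sketch of continuous IBI scoring: each metric is rescaled linearly
      # between a "worst" and a "best" reference value (clamped to [0, 1]), scores
      # are averaged with equal weights, and the result is scaled to 0-100.
      # Metric bounds and site values are invented placeholders.

      def score_metric(value, worst, best):
          """Linear 0-1 score; works whether higher or lower values are better."""
          s = (value - worst) / (best - worst)
          return min(max(s, 0.0), 1.0)

      # (worst, best) reference bounds per metric -- hypothetical, not the paper's.
      bounds = {
          "native_coldwater_species": (0, 6),      # more is better
          "pct_sensitive_individuals": (0, 70),    # more is better
          "pct_tolerant_individuals": (100, 0),    # fewer is better
          "alien_species": (5, 0),                 # fewer is better
      }

      site = {"native_coldwater_species": 4, "pct_sensitive_individuals": 35,
              "pct_tolerant_individuals": 20, "alien_species": 1}

      scores = {m: score_metric(site[m], *bounds[m]) for m in bounds}
      ibi = 100.0 * sum(scores.values()) / len(scores)
      print(scores)
      print(f"IBI = {ibi:.1f} (0-100)")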

  2. In situ performance curves measurements of large pumps

    NASA Astrophysics Data System (ADS)

    Anton, A.

    2010-08-01

    The complex energetic system on the river Lotru in Romania comprises a series of lakes and pumping stations and a major hydroelectric power plant: Lotru-Ciunget. All efforts have been oriented towards the maintenance of the Pelton turbines, and very little attention has been directed to the pumps. There are three large pumping stations in the system, and only in the last 5 years have the pump performances become a concern. The performances were determined using portable ultrasonic flow meters, a Yates meter, precision manometers and appropriate electrical equipment for power measurement (Power Analyser NORMA D4000 LEM). The measurements are not supposed to interfere with normal operation, so only a limited number of tests could be performed. Based on those tests, portions of the test curves have been measured and represented in specific diagrams.

  3. Small-scale fixed wing airplane software verification flight test

    NASA Astrophysics Data System (ADS)

    Miller, Natasha R.

    The increased demand for micro Unmanned Air Vehicles (UAV) driven by military requirements, commercial use, and academia is creating a need for the ability to quickly and accurately conduct low Reynolds Number aircraft design. There exist several open source software programs that are free or inexpensive that can be used for large scale aircraft design, but few software programs target the realm of low Reynolds Number flight. XFLR5 is an open source, free to download, software program that attempts to take into consideration viscous effects that occur at low Reynolds Number in airfoil design, 3D wing design, and 3D airplane design. An off the shelf, remote control airplane was used as a test bed to model in XFLR5 and then compared to flight test collected data. Flight test focused on the stability modes of the 3D plane, specifically the phugoid mode. Design and execution of the flight tests were accomplished for the RC airplane using methodology from full scale military airplane test procedures. Results from flight test were not conclusive in determining the accuracy of the XFLR5 software program. There were several sources of uncertainty that did not allow for a full analysis of the flight test results. An off the shelf drone autopilot was used as a data collection device for flight testing. The precision and accuracy of the autopilot is unknown. Potential future work should investigate flight test methods for small scale UAV flight.

  4. A comparison of soil moisture characteristics predicted by the Arya-Paris model with laboratory-measured data

    NASA Technical Reports Server (NTRS)

    Arya, L. M.; Richter, J. C.; Davidson, S. A. (Principal Investigator)

    1982-01-01

    Soil moisture characteristics predicted by the Arya-Paris model were compared with the laboratory measured data for 181 New Jersey soil horizons. For a number of soil horizons, the predicted and the measured moisture characteristic curves are almost coincident; for a large number of other horizons, despite some disparity, their shapes are strikingly similar. Uncertainties in the model input and laboratory measurement of the moisture characteristic are indicated, and recommendations for additional experimentation and testing are made.

  5. North American pollinosis due to insect-pollinated plants.

    PubMed

    Lewis, W H; Vinay, P

    1979-05-01

    In warmer regions of North America many newly introduced plants are cultivated widely and others are becoming aggressive naturalized weeds. A large number are insect-pollinated and shed considerable quantities of airborne pollen. Levels of allergenicity based on skin test data, numbers of patients having immediate hypersensitivity and localities where airborne pollen grains have been identified are presented for each genus of entomophilous plants considered incitants of pollinosis. Among the most relevant are Acacia, Brassica, Citrus, Ligustrum, Olea and Schinus.

  6. Aquatic Plant Control Research Program. Large-Scale Operations Management Test (LSOMT) of Insects and Pathogens for Control of Waterhyacinth in Louisiana. Volume 1. Results for 1979-1981.

    DTIC Science & Technology

    1985-01-01

    [Report documentation page residue; no abstract text recovered. Performing organizations: Army Engineer Waterways Experiment Station, Vicksburg, MS, and University of Tennessee-Chattanooga. Keywords: aquatic plant control, Louisiana, biological control.]

  7. Using a Math Pre-Test in a Large General Education Geoscience Course: How Effective?

    NASA Astrophysics Data System (ADS)

    Richardson, R. M.

    2006-12-01

    Teaching large (150 or more students) General Education Geoscience courses presents many challenges, but one of the most important is how to effectively incorporate quantitative literacy. Many students are math phobic, and will run to General Education courses that minimize quantitative aspects. I will present results from one approach that we have used successfully for at least two years: a math pre-test. Our General Education Geoscience course has no prerequisites other than admission to the University, and is designed for first and second year non-science students. Fortunately, with limited exceptions, all entering students at the University of Arizona take a Math Readiness Test (MRT) for math placement. With the cooperation of the Mathematics Department, we have used old MRT exams to selectively use questions that are of the highest utility for the course material: understanding graphs, linear equations and extrapolations, scientific notation and large numbers, word problems, and scaling/unit conversions. We administer the exam in the first discussion section. Students receive full credit for a 'serious effort', and we score the exam. In recent semesters the percentage of correct answers has varied from just under 50% to nearly 90% on individual questions. The pre-test has several important benefits. First, it lets students know clearly up front that there will be mathematics in the class. Second, it lets students know the range of skills expected to be successful. Third, because the average score is between 70-80% it gives students confidence that they can do the math in the course. Fourth, we contact all students who score less than 50%, and offer help, including referral to tutoring service in Mathematics. Feedback from students has been positive. Unfortunately, when we compared scores on the math pre-test to final grades in the course, we found essentially no correlation. We are exploring a number of possible explanations. We are also seeing if our math pre-test scores correlate with the initial MRT score, and overall student success.
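    The pre-test-versus-final-grade comparison mentioned above amounts to a simple correlation check; a minimal sketch with made-up scores (the arrays below are hypothetical, not the course data) might look like this:

```python
import numpy as np

# Hypothetical per-student pre-test percentages and final course grades (0-100).
pretest = np.array([45, 62, 78, 88, 70, 55, 93, 67])
final = np.array([71, 80, 75, 90, 68, 77, 85, 72])

r = np.corrcoef(pretest, final)[0, 1]
print(f"Pearson r between pre-test score and final grade: {r:.2f}")
```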

  8. Inner-outer predictive wall model for wall-bounded turbulence in hypersonic flow

    NASA Astrophysics Data System (ADS)

    Martin, M. Pino; Helm, Clara M.

    2017-11-01

    The inner-outer predictive wall model of Mathis et al. is modified for hypersonic turbulent boundary layers. The model is based on a modulation of the energized motions in the inner layer by large-scale momentum fluctuations in the logarithmic layer. Using direct numerical simulation (DNS) data of turbulent boundary layers with free-stream Mach numbers from 3 to 10, it is shown that the variation of the fluid properties in the compressible flows leads to large Reynolds number (Re) effects in the outer layer and facilitates the modulation observed in high-Re incompressible flows. The modulation effect by the large scales increases with increasing free-stream Mach number. The model is extended to include spanwise and wall-normal velocity fluctuations and is generalized through Morkovin scaling. Temperature fluctuations are modeled using an appropriate Reynolds analogy. Density fluctuations are calculated using an equation of state and a scaling with Mach number. DNS data are used to obtain the universal signal and parameters. The model is tested by using the universal signal to reproduce the flow conditions of Mach 3 and Mach 7 turbulent boundary layer DNS data and comparing turbulence statistics between the modeled flow and the DNS data. This work is supported by the Air Force Office of Scientific Research under Grant FA9550-17-1-0104.
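    For reference, the incompressible inner-outer model of Mathis et al. that this work extends is commonly written in the following form (notation here is ours, with u* the universal small-scale signal and u'_OL the large-scale outer signal):

```latex
% Inner-outer prediction of the streamwise fluctuation at wall distance y^+:
% superposition (alpha) plus amplitude modulation (beta) of the universal signal
u'_p(y^+) \;=\; u^*(y^+)\,\bigl\{\,1 + \beta(y^+)\,u'_{OL}(y^+)\,\bigr\} \;+\; \alpha(y^+)\,u'_{OL}(y^+)
```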

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    Fundamental Alloying. Studies of crystal structures, reactions at metal surfaces, spectroscopy of molten salts, mechanical deformation, and alloy theory are reported. Long-Range Applied Metallurgy. A thermal comparator is described and the characteristic temperature of UO2 determined. Sintering studies were carried out on ThO2. The diffusion of fission products in fuel and of Al-26 and Mn-54 in Al and the reaction of Be with UC were studied. Transformation and oxidation data were obtained for a number of Zr alloys. Reactor Metallurgy. A large number of ceramic technology projects are described. Some corrosion data are given for metals exposed to impure He and molten fluorides. Studies were made of the fission-gas-retention properties of ceramic fuel bodies. A large number of materials compatibility studies are described. The mechanical properties of some reactor materials were studied. Fabrication work was conducted to develop materials for application in low-, medium-, and high-temperature reactors or systems. A large number of new metallographic and nondestructive testing techniques are reported. Studies were carried out on the oxidation, carburization, and stability of alloys. Equipment for postirradiation examination is described. Preparation of some alloys and dispersion fuels by powder metallurgy methods was studied. The development of welding and brazing techniques for reactor materials is described. (D.L.C.)

  10. Improvement of reliability in multi-interferometer-based counterfactual deterministic communication with dissipation compensation.

    PubMed

    Liu, Chao; Liu, Jinhong; Zhang, Junxiang; Zhu, Shiyao

    2018-02-05

    The direct counterfactual quantum communication (DCQC) is a surprising phenomenon in which quantum information can be transmitted without any physical particles acting as carriers. Nested interferometers are promising devices for realizing DCQC, provided the number of interferometers goes to infinity. Considering the inevitable loss or dissipation in practical experimental interferometers, we analyze the dependence of reliability on the number of interferometers, and show that the reliability of direct communication degrades rapidly as the number of interferometers becomes large. Furthermore, we simulate and test this counterfactual deterministic communication protocol with a finite number of interferometers, and demonstrate the improvement of the reliability using dissipation compensation in the interferometers.
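    The tension described above — ideal performance improves with more interferometers while loss compounds with each added stage — can be illustrated with a toy chained-quantum-Zeno estimate. The per-stage loss value below is an illustrative assumption and the loss model is deliberately simplistic; it is not the paper's analysis.

```python
import math

def ideal_zeno_probability(n_stages):
    """Ideal chained-interferometer (quantum Zeno) survival probability:
    a total pi/2 rotation split over n stages gives cos^(2n)(pi/(2n)), which -> 1."""
    return math.cos(math.pi / (2 * n_stages)) ** (2 * n_stages)

def lossy_probability(n_stages, per_stage_loss):
    """Toy dissipation model: scale the ideal result by (1 - loss)^n."""
    return ideal_zeno_probability(n_stages) * (1.0 - per_stage_loss) ** n_stages

for n in (5, 25, 50, 100):
    print(f"N={n:3d}  ideal={ideal_zeno_probability(n):.3f}  "
          f"with 1% loss per stage={lossy_probability(n, 0.01):.3f}")
```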

  11. Factors affecting the sticking of insects on modified aircraft wings

    NASA Technical Reports Server (NTRS)

    Yi, O.; Chitsaz-Z, M. R.; Eiss, N. S.; Wightman, J. P.

    1988-01-01

    Previous work showed that the total number of insects sticking to an aluminum surface was reduced by coating the aluminum surface with elastomers. Due to a large number of possible experimental errors, no correlation between the modulus of elasticity of the elastomer and the total number of insects sticking to a given elastomer was obtained. One of the errors assumed to be introduced during the road test is a variable insect flux, so the number of insects striking one surface might be different from that striking another sample. To eliminate this source of error, the road test used to collect insects was simulated in a laboratory by developing an insect-impacting technique using a pipe and high-pressure compressed air. The insects are accelerated by a compressed air gun to high velocities and are then impacted against a stationary target on which the sample is mounted. The velocity of an object exiting from the pipe was determined, and the technique was further improved to obtain a uniform air velocity distribution.

  12. Applications of Magnetic Suspension Technology to Large Scale Facilities: Progress, Problems and Promises

    NASA Technical Reports Server (NTRS)

    Britcher, Colin P.

    1997-01-01

    This paper will briefly review previous work in wind tunnel Magnetic Suspension and Balance Systems (MSBS) and will examine the handful of systems around the world currently known to be in operational condition or undergoing recommissioning. Technical developments emerging from research programs at NASA and elsewhere will be reviewed briefly, where there is potential impact on large-scale MSBSs. The likely aerodynamic applications for large MSBSs will be addressed, since these applications should properly drive system designs. A recently proposed application to ultra-high Reynolds number testing will then be addressed in some detail. Finally, some opinions on the technical feasibility and usefulness of a large MSBS will be given.

  13. Dissociations and interactions between time, numerosity and space processing

    PubMed Central

    Cappelletti, Marinella; Freeman, Elliot D.; Cipolotti, Lisa

    2009-01-01

    This study investigated time, numerosity and space processing in a patient (CB) with a right hemisphere lesion. We tested whether these magnitude dimensions share a common magnitude system or whether they are processed by dimension-specific magnitude systems. Five experimental tasks were used: Tasks 1–3 assessed time and numerosity independently and time and numerosity jointly. Tasks 4 and 5 investigated space processing independently and space and numbers jointly. Patient CB was impaired at estimating time and at discriminating between temporal intervals, his errors being underestimations. In contrast, his ability to process numbers and space was normal. A unidirectional interaction between numbers and time was found in both the patient and the control subjects. Strikingly, small numbers were perceived as lasting shorter and large numbers as lasting longer. In contrast, number processing was not affected by time, i.e. short durations did not result in perceiving fewer numbers and long durations in perceiving more numbers. Numbers and space also interacted, with small numbers answered faster when presented on the left side of space, and the reverse for large numbers. Our results demonstrate that time processing can be selectively impaired. This suggests that mechanisms specific for time processing may be partially independent from those involved in processing numbers and space. However, the interaction between numbers and time and between numbers and space also suggests that although independent, there may be some overlap between time, numbers and space. These data suggest a partly shared mechanism between time, numbers and space which may be involved in magnitude processing or may be recruited to perform cognitive operations on magnitude dimensions. PMID:19501604

  14. Antinuclear Antibodies predict a higher number of Pregnancy Loss in Unexplained Recurrent Pregnancy Loss.

    PubMed

    Sakthiswary, R; Rajalingam, S; Norazman, M R; Hussein, H

    The etiology of recurrent pregnancy loss (RPL) is unknown in a significant proportion of patients. Autoimmune processes have been implicated in the pathogenesis. The role of antinuclear antibody (ANA) in this context is largely undetermined. In an attempt to address the lack of evidence in this area, we explored the clinical significance of ANA in unexplained RPL. We studied 68 patients with RPL and 60 healthy controls from September 2005 to May 2012. All subjects were tested for ANA by immunofluorescence testing, and a titer of 1:80 and above was considered positive. We compared the pregnancy outcome between the ANA positive and ANA negative RPL cases. The incidence of ANA positivity among the cases (35.3%) was significantly higher than in the controls (13.3%) (p=0.005). ANA positive cases showed a significantly higher number of RPL (p=0.006) and a lower number of successful pregnancies (p=0.013) compared to the ANA negative cases. The ANA titer had a significant association with the number of RPL (p<0.05, r=0.724) but not with the number of successful pregnancies (p=0.054). ANA positivity predicts a less favorable pregnancy outcome in RPL. Our findings suggest that the ANA titer is a useful positive predictor of the number of RPL. Hence, the ANA test is a potential prognostic tool for this condition, which merits further research.

  15. A finite-volume ELLAM for three-dimensional solute-transport modeling

    USGS Publications Warehouse

    Russell, T.F.; Heberton, C.I.; Konikow, Leonard F.; Hornberger, G.Z.

    2003-01-01

    A three-dimensional finite-volume ELLAM method has been developed, tested, and successfully implemented as part of the U.S. Geological Survey (USGS) MODFLOW-2000 ground water modeling package. It is included as a solver option for the Ground Water Transport process. The FVELLAM uses space-time finite volumes oriented along the streamlines of the flow field to solve an integral form of the solute-transport equation, thus combining local and global mass conservation with the advantages of Eulerian-Lagrangian characteristic methods. The USGS FVELLAM code simulates solute transport in flowing ground water for a single dissolved solute constituent and represents the processes of advective transport, hydrodynamic dispersion, mixing from fluid sources, retardation, and decay. Implicit time discretization of the dispersive and source/sink terms is combined with a Lagrangian treatment of advection, in which forward tracking moves mass to the new time level, distributing mass among destination cells using approximate indicator functions. This allows the use of large transport time increments (large Courant numbers) with accurate results, even for advection-dominated systems (large Peclet numbers). Four test cases, including comparisons with analytical solutions and benchmarking against other numerical codes, are presented that indicate that the FVELLAM can usually yield excellent results, even if relatively few transport time steps are used, although the quality of the results is problem-dependent.
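    The forward-tracking step described above — moving cell masses along characteristics and distributing them among destination cells — can be illustrated with a minimal one-dimensional sketch. Uniform velocity, a uniform grid, and linear mass splitting are simplifying assumptions; this is not the USGS FVELLAM code.

```python
import numpy as np

def advect_mass_forward(mass, velocity, dt, dx):
    """Move each cell's mass along the characteristic x -> x + v*dt and split it
    linearly between the two destination cells it lands between."""
    n = len(mass)
    new_mass = np.zeros(n)
    shift = velocity * dt / dx        # displacement in cell widths; may exceed 1 (large Courant number)
    for i in range(n):
        dest = i + shift              # destination position of cell i's center
        left = int(np.floor(dest))
        frac = dest - left
        if 0 <= left < n:
            new_mass[left] += (1.0 - frac) * mass[i]
        if 0 <= left + 1 < n:
            new_mass[left + 1] += frac * mass[i]
    return new_mass

mass = np.zeros(50)
mass[10] = 1.0                        # unit slug of solute in one cell
mass = advect_mass_forward(mass, velocity=2.5, dt=1.0, dx=1.0)   # Courant number 2.5
print("total mass after the step:", mass.sum())
```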

  16. Mite fauna and fungal flora in house dust from homes of asthmatic children.

    PubMed

    Ishii, A; Takaoka, M; Ichinoe, M; Kabasawa, Y; Ouchi, T

    1979-12-01

    Mite fauna and fungal flora in the house dust from homes of asthmatic children with positive and negative skin tests to house dust allergen and from non-asthmatic controls were examined. There was no conspicuous difference in mite species distribution among the three groups. Pyroglyphid mites dominated the mite fauna in house dust, more than half of them being Dermatophagoides: D. pteronyssinus and D. farinae. There was no statistically significant difference in numbers between the two species, and either species could dominate depending on the conditions of the individual houses. The average number of acarina in 0.5 g of fine dust did not differ statistically among the three groups; however, mite number per square meter of floor differed between patients with positive and negative skin tests. The results suggest that house-cleaning might influence the possible sensitization of children. The distribution of mould fungi genera in house dust was largely similar to that of airborne fungi. The average number of fungal colonies detected in 0.5 g of dust did not differ statistically among the three groups. Wallemia, with its minute spores, may cause sensitization but has so far been insufficiently investigated.

  17. A Benchmark for Comparing Different Approaches for Specifying and Verifying Real-Time Systems

    DTIC Science & Technology

    1993-01-01

    To be considered correct or useful, real-time systems must deliver results within specified time intervals, either without exception or with high...probability. Recently, a large number of formal methods have been invented for specifying and verifying real-time systems. It has been suggested that...these formal methods need to be tested out on actual real-time systems. Such testing will allow the scalability of the methods to be assessed and also

  18. Defense AT&L. Volume 40, Number 4, July-August 2011

    DTIC Science & Technology

    2011-08-01

    First Article Test. First article testing (FAT) can ensure that the contractor can furnish a product that conforms to all contract requirements...for acceptance. It allows you to verify capability before committing to a single vendor for a large quantity. On most occasions, FAT will increase...schedule and cost. To manufacture one item is usually very inefficient, so the cost to include FAT has to be considered. On the other hand, if you

  19. Investigations on the performance of chevron type plate heat exchangers

    NASA Astrophysics Data System (ADS)

    Dutta, Oruganti Yaga; Nageswara Rao, B.

    2018-01-01

    This paper presents empirical relations for chevron-type plate heat exchangers (PHEs) and demonstrates their validity through comparison with PHE test data. In order to examine the performance of PHEs, the pressure drop (ΔP), the overall heat transfer coefficient (U_m) and the effectiveness (ε) are estimated by considering the properties of the plate material and working fluid, the number of plates (N_t) and the chevron angle (β). It is a known fact that a larger plate surface area provides a higher rate of heat transfer (Q̇) and thereby higher effectiveness (ε). However, it is possible to achieve the required performance by increasing the number of plates without altering the plate dimensions, which avoids a new design of the system. The application of Taguchi's design of experiments is examined with a smaller number of experiments, demonstrated by setting the levels for the parameters and comparing the test data with the estimated output responses.
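    As a reminder of how the reported quantities are typically extracted from test readings, the sketch below computes Q̇, the log-mean temperature difference, U_m and ε from assumed inlet/outlet temperatures, flow rates and plate area; every numerical value is an illustrative assumption, not data from the paper.

```python
import math

# Assumed readings for a water-to-water chevron PHE test (illustrative only).
m_hot = m_cold = 0.6              # mass flow rates, kg/s
cp = 4180.0                       # specific heat of water, J/(kg K)
Th_in, Th_out = 70.0, 50.0        # hot-side temperatures, degC
Tc_in, Tc_out = 30.0, 49.5        # cold-side temperatures, degC
A = 2.4                           # total effective plate area, m^2

Q = m_hot * cp * (Th_in - Th_out)            # heat duty, W
dT1, dT2 = Th_in - Tc_out, Th_out - Tc_in    # terminal temperature differences
lmtd = (dT1 - dT2) / math.log(dT1 / dT2)     # log-mean temperature difference
U_m = Q / (A * lmtd)                         # overall heat transfer coefficient
C_min = min(m_hot, m_cold) * cp
eps = Q / (C_min * (Th_in - Tc_in))          # effectiveness

print(f"Q = {Q/1000:.1f} kW, LMTD = {lmtd:.1f} K, U_m = {U_m:.0f} W/(m^2 K), eps = {eps:.2f}")
```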

  20. Development and testing of aluminum micro channel heat sink

    NASA Astrophysics Data System (ADS)

    Kumaraguruparan, G.; Sornakumar, T.

    2010-06-01

    Microchannel heat sinks constitute an innovative cooling technology for the removal of a large amount of heat from a small area and are suitable for electronics cooling. In the present work, a Tool Steel D2 grade slitting-saw type plain milling cutter is fabricated. The microchannels are machined in aluminum workpieces to form the microchannel heat sink using the fabricated milling cutter in a horizontal milling machine. A new experimental set-up is fabricated to conduct the tests on the microchannel heat sink. The heat carried by the water increases with mass flow rate and heat input. The heat transfer coefficient and Nusselt number increase with mass flow rate and heat input. The pressure drop increases with Reynolds number and decreases with input heat. The friction factor decreases with Reynolds number and with input heat. The thermal resistance decreases with pumping power and with input heat.
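    The quantities reported above follow from standard definitions applied to the measured data; a minimal post-processing sketch is shown below, with all numerical inputs being illustrative assumptions rather than values from the experiment.

```python
# Minimal post-processing of microchannel heat-sink readings using standard
# definitions (all numbers are illustrative assumptions, SI units).
rho, mu, k, cp = 998.0, 1.0e-3, 0.6, 4180.0   # water properties
m_dot = 0.01                                  # mass flow rate, kg/s
T_in, T_out, T_wall = 25.0, 33.0, 45.0        # water inlet/outlet and mean wall temperature, degC
A_heat = 0.004                                # heated surface area, m^2
D_h, L = 0.5e-3, 0.05                         # hydraulic diameter and channel length, m
A_cross = 1.0e-5                              # total flow cross-section, m^2
dP = 12000.0                                  # measured pressure drop, Pa
q_input = 400.0                               # electrical heat input, W

Q = m_dot * cp * (T_out - T_in)               # heat carried by the water, W
h = Q / (A_heat * (T_wall - 0.5 * (T_in + T_out)))   # heat transfer coefficient
Nu = h * D_h / k                              # Nusselt number
u = m_dot / (rho * A_cross)                   # mean channel velocity
Re = rho * u * D_h / mu                       # Reynolds number
f = dP * (D_h / L) / (0.5 * rho * u**2)       # Darcy friction factor
R_th = (T_wall - T_in) / q_input              # thermal resistance, K/W

print(f"Q={Q:.0f} W  h={h:.0f} W/(m^2 K)  Nu={Nu:.1f}  Re={Re:.0f}  f={f:.2f}  R_th={R_th:.3f} K/W")
```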

  1. Statistical analysis of early failures in electromigration

    NASA Astrophysics Data System (ADS)

    Gall, M.; Capasso, C.; Jawarani, D.; Hernandez, R.; Kawasaki, H.; Ho, P. S.

    2001-07-01

    The detection of early failures in electromigration (EM) and the complicated statistical nature of this important reliability phenomenon have been difficult issues to treat in the past. A satisfactory experimental approach for the detection and the statistical analysis of early failures has not yet been established. This is mainly due to the rare occurrence of early failures and difficulties in testing of large sample populations. Furthermore, experimental data on the EM behavior as a function of varying number of failure links are scarce. In this study, a technique utilizing large interconnect arrays in conjunction with the well-known Wheatstone Bridge is presented. Three types of structures with a varying number of Ti/TiN/Al(Cu)/TiN-based interconnects were used, starting from a small unit of five lines in parallel. A serial arrangement of this unit enabled testing of interconnect arrays encompassing 480 possible failure links. In addition, a Wheatstone Bridge-type wiring using four large arrays in each device enabled simultaneous testing of 1920 interconnects. In conjunction with a statistical deconvolution to the single interconnect level, the results indicate that the electromigration failure mechanism studied here follows perfect lognormal behavior down to the four sigma level. The statistical deconvolution procedure is described in detail. Over a temperature range from 155 to 200 °C, a total of more than 75 000 interconnects were tested. None of the samples have shown an indication of early, or alternate, failure mechanisms. The activation energy of the EM mechanism studied here, namely the Cu incubation time, was determined to be Q=1.08±0.05 eV. We surmise that interface diffusion of Cu along the Al(Cu) sidewalls and along the top and bottom refractory layers, coupled with grain boundary diffusion within the interconnects, constitutes the Cu incubation mechanism.
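    The deconvolution from array-level failure statistics back to the single-interconnect level rests on the weakest-link relation for serial links; a sketch under that assumption is shown below (the lognormal parameters are placeholders, and the full procedure in the paper also accounts for the Wheatstone Bridge wiring).

```python
import numpy as np
from scipy import stats

def array_cdf_from_single(t, n_links, mu, sigma):
    """Weakest-link CDF of an n-link serial array when each link fails lognormally."""
    F1 = stats.lognorm.cdf(t, s=sigma, scale=np.exp(mu))
    return 1.0 - (1.0 - F1) ** n_links

def single_cdf_from_array(F_array, n_links):
    """Deconvolution: recover the single-link failure probability from array-level data."""
    return 1.0 - (1.0 - F_array) ** (1.0 / n_links)

t = np.linspace(50.0, 500.0, 5)                          # failure times, arbitrary units
F_480 = array_cdf_from_single(t, 480, mu=5.0, sigma=0.3)
print(single_cdf_from_array(F_480, 480))                 # recovers the assumed single-link CDF
```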

  2. Experimental and Computational Evaluation of Flush-Mounted, S-Duct Inlets

    NASA Technical Reports Server (NTRS)

    Berrier, Bobby L.; Allan, Brian G.

    2004-01-01

    A new high Reynolds number test capability for boundary layer ingesting inlets has been developed for the NASA Langley Research Center 0.3-Meter Transonic Cryogenic Tunnel. Using this new capability, an experimental investigation of four S-duct inlet configurations was conducted. A computational study of one of the inlets was also conducted using a Navier-Stokes solver. The objectives of this investigation were to: 1) develop a new high Reynolds number inlet test capability for flush-mounted inlets; 2) provide a database for CFD tool validation; 3) evaluate the performance of S-duct inlets with large amounts of boundary layer ingestion; and 4) provide a baseline inlet for future inlet flow-control studies. Tests were conducted at Mach numbers from 0.25 to 0.83, Reynolds numbers (based on duct exit diameter) from 5.1 million to a full-scale value of 13.9 million, and inlet mass-flow ratios from 0.39 to 1.58 depending on Mach number. Results of the experimental study indicate that inlet pressure recovery generally decreased and inlet distortion generally increased with increasing Mach number. Except at low Mach numbers, increasing inlet mass-flow increased pressure recovery and increased distortion. Increasing the amount of boundary layer ingestion or ingesting a boundary layer with a distorted profile decreased pressure recovery and increased distortion. Finally, increasing Reynolds number had almost no effect on inlet distortion but increased inlet recovery by about one-half percent at a Mach number near cruise. The computational results captured the inlet pressure recovery and distortion trends with Mach number and inlet mass-flow well; the reversal of the pressure recovery trend with increasing inlet mass-flow at low and high Mach numbers was predicted by CFD. However, CFD results were generally more pessimistic (larger losses) than measured experimentally.

  3. [Particulate distribution characteristics of a Chinese Phase V diesel engine based on butanol-diesel blends].

    PubMed

    Lou, Di-Ming; Xu, Ning; Fan, Wen-Jia; Zhang, Tao

    2014-02-01

    With a common rail diesel engine without any modification and the engine exhaust particle number and particle size analyzer EEPS, this study used the air-fuel ratio to investigate the particulate number concentration, mass concentration and number distribution characteristics of a diesel engine fueled with butanol-diesel blends (Bu10, Bu15, Bu20, Bu30 and Bu40) and petroleum diesel. The results show that, for all test fuels, the particle number distributions are unimodal. With increasing butanol content, the numbers of nucleation mode particles and small accumulation mode particles decrease. At low speed and low load conditions, the number of large accumulation mode particles increases slightly, but under higher speed and load conditions it does not increase. When the fuels contain butanol, the total particle number concentration and mass concentration decrease under all conditions, and the reduction is more obvious at high speed and load.

  4. Emissions of air pollutants from scented candles burning in a test chamber

    NASA Astrophysics Data System (ADS)

    Derudi, Marco; Gelosa, Simone; Sliepcevich, Andrea; Cattaneo, Andrea; Rota, Renato; Cavallo, Domenico; Nano, Giuseppe

    2012-08-01

    Burning scented candles in indoor environments can release a large number of toxic chemicals. However, in spite of the large market penetration of scented candles, very few works have investigated their organic pollutant emissions. This paper investigates volatile organic compound emissions, with particular reference to the priority indoor pollutants identified by the European Commission, from the burning of scented candles in a laboratory-scale test chamber. It has been found that BTEX and PAH emission factors show large differences among different candles, possibly due to the raw paraffinic material used, while aldehyde emission factors seem more related to the presence of additives. This clearly evidences the need for simple and cheap methodologies to measure the emission factors of commercial candles in order to predict the expected pollutant concentration in a given indoor environment and compare it with health safety standards.

  5. Built-In Data-Flow Integration Testing in Large-Scale Component-Based Systems

    NASA Astrophysics Data System (ADS)

    Piel, Éric; Gonzalez-Sanchez, Alberto; Gross, Hans-Gerhard

    Modern large-scale component-based applications and service ecosystems are built following a number of different component models and architectural styles, such as the data-flow architectural style. In this style, each building block receives data from a previous one in the flow and sends output data to other components. This organisation expresses information flows adequately, and also favours decoupling between the components, leading to easier maintenance and quicker evolution of the system. Integration testing is a major means to ensure the quality of large systems. Their size and complexity, together with the fact that they are developed and maintained by several stakeholders, make Built-In Testing (BIT) an attractive approach to manage their integration testing. However, so far no technique has been proposed that combines BIT and data-flow integration testing. We have introduced the notion of a virtual component in order to realize such a combination. It makes it possible to define the behaviour of several components assembled to process a flow of data, using BIT. Test cases are defined in a way that makes them simple to write and flexible to adapt. We present two implementations of our proposed virtual component integration testing technique, and we extend our previous proposal to detect and handle errors in the definition by the user. The evaluation of the virtual component testing approach suggests that more issues can be detected in systems with data-flows than through other integration testing approaches.
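    One way to picture a virtual component is as a wrapper that treats an assembly of data-flow stages as a single unit with its own built-in test cases. The sketch below is a generic Python illustration of that idea, not the authors' framework; the stage functions are hypothetical.

```python
from typing import Callable, List

class VirtualComponent:
    """Wraps a chain of data-flow stages and runs built-in test cases on the assembly."""

    def __init__(self, stages: List[Callable]):
        self.stages = stages
        self.test_cases = []              # (input_data, expected_output) pairs

    def process(self, data):
        for stage in self.stages:         # push the datum through the flow
            data = stage(data)
        return data

    def add_test_case(self, input_data, expected_output):
        self.test_cases.append((input_data, expected_output))

    def run_built_in_tests(self) -> bool:
        return all(self.process(inp) == exp for inp, exp in self.test_cases)

# Hypothetical stages of a small data-flow assembly.
parse = lambda s: [int(x) for x in s.split(",")]
scale = lambda xs: [2 * x for x in xs]

vc = VirtualComponent([parse, scale])
vc.add_test_case("1,2,3", [2, 4, 6])
print("integration tests pass:", vc.run_built_in_tests())
```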

  6. Results of the NASP Ames Integrated Mixing Hypersonic Engine (AIMHYE) Scramjet Test Program

    NASA Technical Reports Server (NTRS)

    Cavolowsky, John A.; Loomis, Mark P.; Deiwert, George S.

    1995-01-01

    This paper describes the test techniques and results from the National Aerospace Plane Government Work Package 53, the Ames Integrated Mixing Hypersonic Engine (AIMHYE) Scramjet Test program conducted in the NASA Ames 16-Inch Combustion Driven Shock Tunnel. This was a series of near full-scale scramjet combustor tests with the objective of obtaining high speed combustor and nozzle data from an engine with injector configurations similar to the NASP E21 and E22a designs. The experimental test approach was to use a large combustor model (80-100% throat height) designed and fabricated for testing in the semi-free jet mode. The conditions tested were similar to the "blue book" conditions at Mach 12, 14, and 16. GWP 53 validated the use of large, long-test-time impulse facilities, specifically the Ames 16-Inch Shock Tunnel, for high Mach number scramjet propulsion testing of an integrated test rig (inlet, combustor, and nozzle). Discussion of key features of the test program will include: effects of the 2-D combustor inlet pressure profile; performance of the large injectors' fueling system that included nozzlettes, base injection, and film cooling; and heat transfer measurements to the combustor. Significant instrumentation development and application efforts include the following: combustor force balance application for measurement of combustor drag for comparison with integrated point measurements of skin friction; nozzle metric strip for measuring thrust with comparison to integrated pressure measurements; and nonintrusive optical fiber-based diode laser absorption measurements of combustion products for determination of combustor performance. Direct measurements will be reported for specific test article configurations and compared with CFD solutions.

  7. An Evaluation of Systematic Tuberculosis Screening at Private Facilities in Karachi, Pakistan

    PubMed Central

    Creswell, Jacob; Khowaja, Saira; Codlin, Andrew; Hashmi, Rabia; Rasheed, Erum; Khan, Mubashir; Durab, Irfan; Mergenthaler, Christina; Hussain, Owais; Khan, Faisal; Khan, Aamir J.

    2014-01-01

    Background In Pakistan, like many Asian countries, a large proportion of healthcare is provided through the private sector. We evaluated a systematic screening strategy to identify people with tuberculosis in private facilities in Karachi and assessed the approaches' ability to diagnose patients earlier in their disease progression. Methods and Findings Lay workers at 89 private clinics and a large hospital outpatient department screened all attendees for tuberculosis using a mobile phone-based questionnaire during one year. The number needed to screen to detect a case of tuberculosis was calculated. To evaluate early diagnosis, we tested for differences in cough duration and smear grading by screening facility. 529,447 people were screened, 1,010 smear-positive tuberculosis cases were detected and 942 (93.3%) started treatment, representing 58.7% of all smear-positive cases notified in the intervention area. The number needed to screen to detect a smear-positive case was 124 (prevalence 806/100,000) at the hospital and 763 (prevalence 131/100,000) at the clinics; however, ten times the number of individuals were screened in clinics. People with smear-positive TB detected at the hospital were less likely to report cough lasting 2–3 weeks (RR 0.66 95%CI [0.49–0.90]) and more likely to report cough duration >3 weeks (RR 1.10 95%CI [1.03–1.18]). Smear-positive cases at the clinics were less likely to have a +3 grade (RR 0.76 95%CI [0.63–0.92]) and more likely to have +1 smear grade (RR 1.24 95%CI [1.02–1.51]). Conclusions Tuberculosis screening at private facilities is acceptable and can yield large numbers of previously undiagnosed cases. Screening at general practitioner clinics may find cases earlier than at hospitals although more people must be screened to identify a case of tuberculosis. Limitations include lack of culture testing, therefore underestimating true TB prevalence. Using more sensitive and specific screening and diagnostic tests such as chest x-ray and Xpert MTB/RIF may improve results. PMID:24705600

  8. An evaluation of systematic tuberculosis screening at private facilities in Karachi, Pakistan.

    PubMed

    Creswell, Jacob; Khowaja, Saira; Codlin, Andrew; Hashmi, Rabia; Rasheed, Erum; Khan, Mubashir; Durab, Irfan; Mergenthaler, Christina; Hussain, Owais; Khan, Faisal; Khan, Aamir J

    2014-01-01

    In Pakistan, like many Asian countries, a large proportion of healthcare is provided through the private sector. We evaluated a systematic screening strategy to identify people with tuberculosis in private facilities in Karachi and assessed the approaches' ability to diagnose patients earlier in their disease progression. Lay workers at 89 private clinics and a large hospital outpatient department screened all attendees for tuberculosis using a mobile phone-based questionnaire during one year. The number needed to screen to detect a case of tuberculosis was calculated. To evaluate early diagnosis, we tested for differences in cough duration and smear grading by screening facility. 529,447 people were screened, 1,010 smear-positive tuberculosis cases were detected and 942 (93.3%) started treatment, representing 58.7% of all smear-positive cases notified in the intervention area. The number needed to screen to detect a smear-positive case was 124 (prevalence 806/100,000) at the hospital and 763 (prevalence 131/100,000) at the clinics; however, ten times the number of individuals were screened in clinics. People with smear-positive TB detected at the hospital were less likely to report cough lasting 2-3 weeks (RR 0.66 95%CI [0.49-0.90]) and more likely to report cough duration >3 weeks (RR 1.10 95%CI [1.03-1.18]). Smear-positive cases at the clinics were less likely to have a +3 grade (RR 0.76 95%CI [0.63-0.92]) and more likely to have +1 smear grade (RR 1.24 95%CI [1.02-1.51]). Tuberculosis screening at private facilities is acceptable and can yield large numbers of previously undiagnosed cases. Screening at general practitioner clinics may find cases earlier than at hospitals although more people must be screened to identify a case of tuberculosis. Limitations include lack of culture testing, therefore underestimating true TB prevalence. Using more sensitive and specific screening and diagnostic tests such as chest x-ray and Xpert MTB/RIF may improve results.
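    The number needed to screen quoted in both versions of this study is simply the number of people screened divided by the number of smear-positive cases found, or equivalently the reciprocal of the detected prevalence; a quick check against the reported prevalences:

```python
# Number needed to screen (NNS) from the detected prevalence per 100,000 screened.
def nns(prevalence_per_100k):
    return 100_000 / prevalence_per_100k

print(f"hospital outpatient department: NNS ~ {nns(806):.0f}")   # ~124, as reported
print(f"private clinics:                NNS ~ {nns(131):.0f}")   # ~763, as reported
```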

  9. Towards a theory of tiered testing.

    PubMed

    Hansson, Sven Ove; Rudén, Christina

    2007-06-01

    Tiered testing is an essential part of any resource-efficient strategy for the toxicity testing of a large number of chemicals, which is required for instance in the risk management of general (industrial) chemicals. In spite of this, no general theory seems to be available for the combination of single tests into efficient tiered testing systems. A first outline of such a theory is developed. It is argued that chemical, toxicological, and decision-theoretical knowledge should be combined in the construction of such a theory. A decision-theoretical approach for the optimization of test systems is introduced. It is based on expected utility maximization with simplified assumptions covering factual and value-related information that is usually missing in the development of test systems.
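    A minimal expected-utility comparison of a single definitive test against a cheap-screen-then-confirm tier can be sketched as follows. All prevalences, costs, and error rates are invented for illustration, and the single-number penalty for a missed toxic chemical is a deliberately crude stand-in for a value-laden judgment.

```python
# Expected cost per chemical of two testing strategies (illustrative numbers only).
p_toxic = 0.05                    # assumed prevalence of toxic chemicals
cost_cheap, cost_full = 1.0, 50.0 # assumed test costs
sens_cheap, spec_cheap = 0.90, 0.80
cost_missed_toxic = 1000.0        # assumed penalty for a toxic chemical passed as safe

# Strategy A: run the definitive test on everything (assumed error-free).
cost_A = cost_full

# Strategy B: cheap screen first; only screen-positives get the definitive test.
p_screen_positive = p_toxic * sens_cheap + (1 - p_toxic) * (1 - spec_cheap)
p_false_negative = p_toxic * (1 - sens_cheap)
cost_B = cost_cheap + p_screen_positive * cost_full + p_false_negative * cost_missed_toxic

print(f"expected cost, definitive test only: {cost_A:.2f}")
print(f"expected cost, tiered strategy:      {cost_B:.2f}")
```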

  10. Influenza newspaper reports and the influenza epidemic: an observational study in Fukuoka City, Japan.

    PubMed

    Hagihara, Akihito; Onozuka, Daisuke; Miyazaki, Shougo; Abe, Takeru

    2015-12-30

    We examined whether the weekly number of newspaper articles reporting on influenza was related to the incidence of influenza in a large city. Prospective, non-randomised, observational study. Registry data of influenza cases in Fukuoka City, Japan. A total of 83,613 cases of influenza that occurred between October 1999 and March 2007 in Fukuoka City, Japan. A linear model with autoregressive time series errors was fitted to time series data on the incidence of influenza and the accumulated number of influenza-related newspaper articles with different time lags in Fukuoka City, Japan. In order to obtain further evidence that the number of newspaper articles a week with specific time lags is related to the incidence of influenza, Granger causality was also tested. Of the 16 models including 'number of newspaper articles' with different time lags between 2 and 17 weeks (x_{t-2} to x_{t-17}), the β coefficients of 'number of newspaper articles' at time lags between t-5 and t-13 were significant. However, the β coefficients of 'number of newspaper articles' that were significant with respect to the Granger causality tests (p<0.05) were the weekly numbers of newspaper articles at time lags between t-6 and t-10 (time shift of 10 weeks, β=-0.301, p<0.01; time shift of 9 weeks, β=-0.200, p<0.01; time shift of 8 weeks, β=-0.156, p<0.01; time shift of 7 weeks, β=-0.122, p<0.05; time shift of 6 weeks, β=-0.113, p<0.05). We found that the number of newspaper articles reporting on influenza in a week was related to the incidence of influenza 6-10 weeks after media coverage in a large city in Japan. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/
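    A comparable lag-by-lag Granger test can be run in Python with statsmodels; the file name and column names below are placeholders, not the study's data.

```python
import pandas as pd
from statsmodels.tsa.stattools import grangercausalitytests

# Hypothetical weekly data with columns "influenza_cases" and "newspaper_articles".
df = pd.read_csv("fukuoka_weekly.csv")

# Test whether past article counts help predict current incidence, for lags 1-13 weeks.
# The first column is the series being predicted; the second is the candidate cause.
results = grangercausalitytests(df[["influenza_cases", "newspaper_articles"]], maxlag=13)
```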

  11. Influenza newspaper reports and the influenza epidemic: an observational study in Fukuoka City, Japan

    PubMed Central

    Hagihara, Akihito; Onozuka, Daisuke; Miyazaki, Shougo; Abe, Takeru

    2015-01-01

    Objectives We examined whether the weekly number of newspaper articles reporting on influenza was related to the incidence of influenza in a large city. Design Prospective, non-randomised, observational study. Setting Registry data of influenza cases in Fukuoka City, Japan. Participants A total of 83 613 cases of influenza that occurred between October 1999 and March 2007 in Fukuoka City, Japan. Main outcome measure A linear model with autoregressive time series errors was fitted to time series data on the incidence of influenza and the accumulated number of influenza-related newspaper articles with different time lags in Fukuoka City, Japan. In order to obtain further evidence that the number of newspaper articles a week with specific time lags is related to the incidence of influenza, Granger causality was also tested. Results Of the 16 models including ‘number of newspaper articles’ with different time lags between 2 and 17 weeks (x_{t-2} to x_{t-17}), the β coefficients of ‘number of newspaper articles’ at time lags between t-5 and t-13 were significant. However, the β coefficients of ‘number of newspaper articles’ that were significant with respect to the Granger causality tests (p<0.05) were the weekly numbers of newspaper articles at time lags between t-6 and t-10 (time shift of 10 weeks, β=−0.301, p<0.01; time shift of 9 weeks, β=−0.200, p<0.01; time shift of 8 weeks, β=−0.156, p<0.01; time shift of 7 weeks, β=−0.122, p<0.05; time shift of 6 weeks, β=−0.113, p<0.05). Conclusions We found that the number of newspaper articles reporting on influenza in a week was related to the incidence of influenza 6–10 weeks after media coverage in a large city in Japan. PMID:26719323

  12. Measuring working memory capacity in children using adaptive tasks: Example validation of an adaptive complex span.

    PubMed

    Gonthier, Corentin; Aubry, Alexandre; Bourdin, Béatrice

    2018-06-01

    Working memory tasks designed for children usually present trials in order of ascending difficulty, with testing discontinued when the child fails a particular level. Unfortunately, this procedure comes with a number of issues, such as decreased engagement from high-ability children, vulnerability of the scores to temporary mind-wandering, and large between-subjects variations in number of trials, testing time, and proactive interference. To circumvent these problems, the goal of the present study was to demonstrate the feasibility of assessing working memory using an adaptive testing procedure. The principle of adaptive testing is to dynamically adjust the level of difficulty as the task progresses to match the participant's ability. We used this method to develop an adaptive complex span task (the ACCES) comprising verbal and visuo-spatial subtests. The task presents a fixed number of trials to all participants, allows for partial credit scoring, and can be used with children regardless of ability level. The ACCES demonstrated satisfactory psychometric properties in a sample of 268 children aged 8-13 years, confirming the feasibility of using adaptive tasks to measure working memory capacity in children. A free-to-use implementation of the ACCES is provided.
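    The adaptive principle described above — raise the difficulty after a success, lower it after a failure, over a fixed number of trials — can be sketched with a simple one-up/one-down staircase. This is a generic illustration with a simulated respondent, not the ACCES task or its scoring rule.

```python
import random

def run_adaptive_span(n_trials=20, start_len=3, min_len=2, max_len=9, ability=5.5):
    """Simple 1-up/1-down staircase over a fixed number of trials.
    'ability' is the longest list length the simulated child reliably recalls."""
    length, history = start_len, []
    for _ in range(n_trials):
        # Noisy simulated response: correct if within ability, flipped 10% of the time.
        correct = (length <= ability) if random.random() > 0.1 else (length > ability)
        history.append((length, correct))
        length = min(length + 1, max_len) if correct else max(length - 1, min_len)
    # Partial-credit style summary: mean list length attempted over the fixed trials.
    return sum(l for l, _ in history) / n_trials

random.seed(0)
print("estimated span score:", run_adaptive_span())
```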

  13. Review of Skin Friction Measurements Including Recent High-Reynolds Number Results from NASA Langley NTF

    NASA Technical Reports Server (NTRS)

    Watson, Ralph D.; Hall, Robert M.; Anders, John B.

    2000-01-01

    This paper reviews flat plate skin friction data from early correlations of drag on plates in water to measurements in the cryogenic environment of the NASA Langley National Transonic Facility (NTF) in late 1996. The flat plate (zero pressure gradient with negligible surface curvature) incompressible skin friction at high Reynolds numbers is emphasized in this paper, due to its importance in assessing the accuracy of measurements, and as being important to the aerodynamics of large scale vehicles. A correlation of zero pressure gradient skin friction data minimizing extraneous effects between tests is often used as the first step in the calculation of skin friction in complex flows. Early data compiled by Schoenherr for a range of momentum thickness Reynolds numbers, R_θ, from 860 to 370,000 contained large scatter, but has proved surprisingly accurate in its correlated form. Subsequent measurements in wind tunnels under more carefully controlled conditions have provided inputs to this database, usually to a maximum R_θ of about 40,000. Data on a large axisymmetric model in the NASA Langley National Transonic Facility extend the upper limit in incompressible R_θ to 619,800 using the van Driest transformation. Previous data, test techniques, and error sources are discussed, and the NTF data are discussed in detail. The NTF Preston tube and Clauser-inferred data accuracy is estimated to be within ±2 percent of a power-law curve fit, and falls above the Spalding theory by 1 percent at R_θ of about 600,000.
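    The Schoenherr correlation referred to above is usually quoted in the implicit Kármán-Schoenherr form relating the mean flat-plate friction coefficient to the length Reynolds number (notation here is ours):

```latex
% Karman-Schoenherr flat-plate skin-friction line (smooth wall, incompressible)
\frac{0.242}{\sqrt{C_F}} \;=\; \log_{10}\!\left(Re_L\, C_F\right)
```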

  14. Characterizing the Response of Composite Panels to a Pyroshock Induced Environment Using Design of Experiments Methodology

    NASA Technical Reports Server (NTRS)

    Parsons, David S.; Ordway, David; Johnson, Kenneth

    2013-01-01

    This experimental study seeks to quantify the impact various composite parameters have on the structural response of a composite structure in a pyroshock environment. The prediction of an aerospace structure's response to pyroshock induced loading is largely dependent on empirical databases created from collections of development and flight test data. While there is significant structural response data due to pyroshock induced loading for metallic structures, there is much less data available for composite structures. One challenge of developing a composite pyroshock response database as well as empirical prediction methods for composite structures is the large number of parameters associated with composite materials. This experimental study uses data from a test series planned using design of experiments (DOE) methods. Statistical analysis methods are then used to identify which composite material parameters most greatly influence a flat composite panel's structural response to pyroshock induced loading. The parameters considered are panel thickness, type of ply, ply orientation, and pyroshock level induced into the panel. The results of this test will aid in future large scale testing by eliminating insignificant parameters as well as aid in the development of empirical scaling methods for composite structures' response to pyroshock induced loading.
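    A two-level design over the four factors named above can be enumerated directly, as in the sketch below; the factor levels are placeholders, and an actual DOE plan would likely use a fractional or otherwise reduced design rather than the full factorial.

```python
from itertools import product

# Placeholder levels for the four factors (illustrative only).
factors = {
    "panel_thickness": ["thin", "thick"],
    "ply_type": ["tape", "fabric"],
    "ply_orientation": ["0/90", "+/-45"],
    "pyroshock_level": ["low", "high"],
}

runs = [dict(zip(factors, combo)) for combo in product(*factors.values())]
print(f"{len(runs)} runs in the full factorial")   # 2^4 = 16
for run in runs[:3]:
    print(run)
```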

  15. Characterizing the Response of Composite Panels to a Pyroshock Induced Environment using Design of Experiments Methodology

    NASA Technical Reports Server (NTRS)

    Parsons, David S.; Ordway, David O.; Johnson, Kenneth L.

    2013-01-01

    This experimental study seeks to quantify the impact various composite parameters have on the structural response of a composite structure in a pyroshock environment. The prediction of an aerospace structure's response to pyroshock induced loading is largely dependent on empirical databases created from collections of development and flight test data. While there is significant structural response data due to pyroshock induced loading for metallic structures, there is much less data available for composite structures. One challenge of developing a composite pyroshock response database as well as empirical prediction methods for composite structures is the large number of parameters associated with composite materials. This experimental study uses data from a test series planned using design of experiments (DOE) methods. Statistical analysis methods are then used to identify which composite material parameters most greatly influence a flat composite panel's structural response to pyroshock induced loading. The parameters considered are panel thickness, type of ply, ply orientation, and pyroshock level induced into the panel. The results of this test will aid in future large scale testing by eliminating insignificant parameters as well as aid in the development of empirical scaling methods for composite structures' response to pyroshock induced loading.

  16. Imputation of Missing Genotypes From Sparse to High Density Using Long-Range Phasing

    USDA-ARS?s Scientific Manuscript database

    Related individuals in a population share long chromosome segments which trace to a common ancestor. We describe a long-range phasing algorithm that makes use of this property to phase whole chromosomes and simultaneously impute a large number of missing markers. We test our method by imputing marke...

  17. Computing Integrated Ratings from Heterogeneous Phenotypic Assessments: A Case Study of Lettuce Postharvest Quality and Downy Mildew Resistance

    USDA-ARS?s Scientific Manuscript database

    Comparing performance of a large number of accessions simultaneously is not always possible. Typically, only subsets of all accessions are tested in separate trials with only some (or none) of the accessions overlapping between subsets. Using standard statistical approaches to combine data from such...

  18. Data retrieval system provides unlimited hardware design information

    NASA Technical Reports Server (NTRS)

    Rawson, R. D.; Swanson, R. L.

    1967-01-01

    Data is input to magnetic tape on a single format card that specifies the system, location, and component, the test point identification number, the operator's initials, the date, a data code, and the data itself. This method is efficient for large-volume data storage and retrieval, and permits output variations without continuous program modifications.

  19. Evaluation of Microelectrode Array Data using Bayesian Modeling as an Approach to Screening and Prioritization for Neurotoxicity Testing*

    EPA Science Inventory

    The need to assess large numbers of chemicals for their potential toxicities has resulted in increased emphasis on medium- and high-throughput in vitro screening approaches. For such approaches to be useful, efficient and reliable data analysis and hit detection methods are also ...

  20. Multimodal Reading Comprehension: Curriculum Expectations and Large-Scale Literacy Testing Practices

    ERIC Educational Resources Information Center

    Unsworth, Len

    2014-01-01

    Interpreting the image-language interface in multimodal texts is now well recognized as a crucial aspect of reading comprehension in a number of official school syllabi such as the recently published Australian Curriculum: English (ACE). This article outlines the relevant expected student learning outcomes in this curriculum and draws attention to…

  1. Cottonwood Breeding Strategies for the Future

    Treesearch

    D. T. Cooper

    1976-01-01

    A large number of genotypes of eastern cottonwood of diverse parentage should be evaluated, followed by multiple-stage selection for the most important characters, to obtain substantial gains per sexual cycle while retaining genetic diversity. More intensive testing should be practiced in selecting clones for commercial use than for use as parents of the next generation...

  2. Transcriptome amplification coupled with nanopore sequencing as a surveillance tool for plant pathogens in plant and insect tissues

    USDA-ARS?s Scientific Manuscript database

    There are many plant pathogen-specific diagnostic assays, based on PCR and immune-detection. However, the ability to test for large numbers of pathogens simultaneously is lacking. Next generation sequencing (NGS) allows one to detect all organisms within a given sample, but has computational limitat...

  3. Evidence for Different Components in Children's Visuospatial Working Memory

    ERIC Educational Resources Information Center

    Mammarella, Irene C.; Pazzaglia, Francesca; Cornoldi, Cesare

    2008-01-01

    There are a large number of studies demonstrating that visuospatial working memory (VSWM) involves different subcomponents, but there is no agreement on the identity of these dimensions. The present study attempts to combine different theoretical accounts by measuring VSWM. A battery composed of 13 tests was used to assess working memory and, in…

  4. USE OF HIGH CONTENT IMAGE ANALYSES TO DETECT CHEMICAL-MEDIATED EFFECTS ON NEURITE SUB-POPULATIONS IN PRIMARY RAT CORTICAL NEURONS

    EPA Science Inventory

    Traditional developmental neurotoxicity tests performed in vivo are costly, time-consuming and utilize a large number of animals. In order to address these inefficiencies, in vitro models of neuronal development have been used in a first tier screening approach for developmenta...

  5. TOXCAST, A TOOL FOR CATEGORIZATION AND PRIORITIZATION OF CHEMICAL HAZARD BASED ON MULTI-DIMENSIONAL INFORMATION DOMAINS

    EPA Science Inventory

    Across several EPA Program Offices (e.g., OPPTS, OW, OAR), there is a clear need to develop strategies and methods to screen large numbers of chemicals for potential toxicity, and to use the resulting information to prioritize the use of testing resources towards those entities a...

  6. Use of the disease severity index for null hypothesis testing

    USDA-ARS?s Scientific Manuscript database

    A disease severity index (DSI) is a single number for summarizing a large amount of disease severity information. It is used to indicate relative resistance of cultivars, to relate disease severity to yield loss, or to compare treatments. The DSI has most often been based on a special type of ordina...

  7. LES tests on airfoil trailing edge serration

    NASA Astrophysics Data System (ADS)

    Zhu, Wei Jun; Shen, Wen Zhong

    2016-09-01

    In the present study, a large number of acoustic simulations are carried out for a low-noise airfoil with different Trailing Edge Serrations (TES). The Ffowcs Williams-Hawkings (FWH) acoustic analogy is used for noise prediction at the trailing edge. The acoustic solver runs on the platform of our in-house incompressible flow solver EllipSys3D. The flow solution is first obtained from Large Eddy Simulation (LES); the acoustic part is then carried out based on the instantaneous hydrodynamic pressure and velocity field. To obtain the time history of the sound pressure, the flow quantities are integrated around the airfoil surface through the FWH approach. For all the simulations, the chord-based Reynolds number is around 1.5×10^6. In the test matrix, the effects of angle of attack, TE flap angle, and the length/width of the TES are investigated. Even though the airfoil under investigation is already optimized for low noise emission, most numerical simulations and wind tunnel experiments show that the noise level is further decreased by adding the TES device.

  8. Application and Analysis of Measurement Model for Calibrating Spatial Shear Surface in Triaxial Test

    NASA Astrophysics Data System (ADS)

    Zhang, Zhihua; Qiu, Hongsheng; Zhang, Xiedong; Zhang, Hang

    2017-12-01

    Discrete element method has great advantages in simulating the contacts, fractures, large displacements and deformations between particles. In order to analyze the spatial distribution of the shear surface in the three-dimensional triaxial test, a measurement model is inserted in the numerical triaxial model, which is generated by a weighted-average assembling method. Because the internal shear surface is not visible in laboratory tests, judging its trend only from the superficial cracks of the sheared sample is largely insufficient; therefore, the measurement model is introduced. The trend of the internal shear zone is analyzed according to the variations of porosity, coordination number and volumetric strain in each layer. As a case study at a confining stress of 0.8 MPa, the spatial shear surface is calibrated against the rotated-particle distribution and the theoretical value, showing the characteristic increase of porosity, decrease of coordination number, and increase of volumetric strain, which indicates that the measurement model used in the three-dimensional model is applicable.
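    Layer-wise porosity and coordination number of the kind the measurement model reports can be computed from particle data roughly as follows; this is a schematic sketch (array-based, with a hypothetical contact list), not the DEM package's built-in measurement spheres.

```python
import numpy as np

def layer_statistics(z, radius, contacts, z_min, z_max, n_layers, layer_volume):
    """Per-layer porosity and mean coordination number for a 3D particle assembly.
    z, radius: particle centre heights and radii; contacts: list of (i, j) index pairs."""
    edges = np.linspace(z_min, z_max, n_layers + 1)
    layer_of = np.clip(np.digitize(z, edges) - 1, 0, n_layers - 1)
    solid = np.zeros(n_layers)
    n_contacts = np.zeros(n_layers)
    n_particles = np.zeros(n_layers)
    np.add.at(solid, layer_of, 4.0 / 3.0 * np.pi * radius**3)
    np.add.at(n_particles, layer_of, 1.0)
    for i, j in contacts:                        # count each contact for both particles
        n_contacts[layer_of[i]] += 1
        n_contacts[layer_of[j]] += 1
    porosity = 1.0 - solid / layer_volume
    coordination = np.divide(n_contacts, n_particles,
                             out=np.zeros(n_layers), where=n_particles > 0)
    return porosity, coordination

# Tiny synthetic example: three particles in two layers, two contacts.
z = np.array([0.1, 0.5, 0.9])
r = np.full(3, 0.05)
por, coord = layer_statistics(z, r, [(0, 1), (1, 2)], 0.0, 1.0, n_layers=2, layer_volume=0.5)
print("porosity per layer:", por, " coordination number per layer:", coord)
```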

  9. LeRC NATR Free-Jet Development

    NASA Technical Reports Server (NTRS)

    Long-Davis, M.; Cooper, B. A.

    1999-01-01

    The Nozzle Acoustic Test Rig (NATR) was developed to provide additional test capabilities at Lewis needed to meet HSR program goals. The NATR is a large free-jet facility (free-jet diameter = 53 in.) with a design Mach number of 0.3. It is located inside a geodesic dome, adjacent to the existing Powered Lift Facility (PLF). The NATR allows nozzle concepts to be acoustically assessed for far-field (approximately 50 feet) noise characteristics under conditions simulating forward flight. An ejector concept was identified as a means of supplying the required airflow for this free-jet facility. The primary stream is supplied through a circular array of choked nozzles and the resulting low pressure in the constant, annular-area mixing section causes a "pumping" action that entrains the secondary stream. The mixed flow expands through an annular diffuser and into a plenum chamber. Once inside the plenum, the flow passes over a honeycomb/screen combination intended to remove large disturbances and provide uniform flow. The flow accelerates through an elliptical contraction section where it achieves a free-jet Mach number of up to 0.3.

  10. Contact Binaries on Their Way Towards Merging

    NASA Astrophysics Data System (ADS)

    Gazeas, K.

    2015-07-01

    Contact binaries are the most frequently observed type of eclipsing star system. They are small, cool, low-mass binaries belonging to a relatively old stellar population. They follow certain empirical relationships that closely connect a number of physical parameters with each other, largely because of constraints coming from the Roche geometry. As a result, contact binaries provide an excellent test of stellar evolution, specifically for stellar merger scenarios. Observing campaigns by many authors have led to the cataloging of thousands of contact binaries and enabled statistical studies of many of their properties. A large number of contact binaries have been found to exhibit extraordinary behavior, requiring follow-up observations to study their peculiarities in detail. For example, a doubly-eclipsing quadruple system consisting of a contact binary and a detached binary is a highly constrained system offering an excellent laboratory to test evolutionary theories for binaries. A new observing project was initiated at the University of Athens in 2012 in order to investigate the possible lower limit for the orbital period of binary systems before coalescence, prior to merging.

  11. Pressure Distribution Over Airfoils at High Speeds

    NASA Technical Reports Server (NTRS)

    Briggs, L J; Dryden, H L

    1927-01-01

    This report deals with the pressure distribution over airfoils at high speeds, and describes an extension of an investigation of the aerodynamic characteristics of certain airfoils which was presented in NACA Technical Report no. 207. The results presented in report no. 207 have been confirmed and extended to higher speeds through a more extensive and systematic series of tests. Observations were also made of the air flow near the surface of the airfoils, and the large changes in lift coefficients were shown to be associated with a sudden breaking away of the flow from the upper surface. The tests were made on models of 1-inch chord and comparison with the earlier measurements on models of 3-inch chord shows that the sudden change in the lift coefficient is due to compressibility and not to a change in the Reynolds number. The Reynolds number still has a large effect, however, on the drag coefficient. The pressure distribution observations furnish the propeller designer with data on the load distribution at high speeds, and also give a better picture of the air-flow changes.

  12. Development of high impedance measurement system for water leakage detection in implantable neuroprosthetic devices.

    PubMed

    Yousif, Aziz; Kelly, Shawn K

    2016-08-01

    There has been a push for a greater number of channels in implantable neuroprosthetic devices, but that number has largely been limited by current hermetic packaging technology. Microfabricated packaging is becoming a reality, but a standard testing system is needed to prepare these devices for clinical trials. Impedance measurements of electrodes built into the packaging layers may give an early warning of device failure and predict device lifetime. Because the impedance magnitudes of such devices can be on the order of gigaohms, a versatile system was designed to accommodate ultra-high impedances and allow future integrated circuit implementation in current neural prosthetic technologies. Here we present the circuitry, control software, and preliminary testing results of our designed system.

  13. Boundary-layer measurements on a transonic low-aspect ratio wing

    NASA Technical Reports Server (NTRS)

    Keener, Earl R.

    1985-01-01

    Tabulations and plots are presented of boundary-layer velocity and flow-direction surveys from wind-tunnel tests of a large-scale (0.90 m semi-span) model of the NASA/Lockheed Wing C. This wing is a generic, transonic, supercritical, highly three-dimensional, low-aspect-ratio configuration designed with the use of a three-dimensional, transonic full-potential-flow wing code (FLO22). Tests were conducted at the design angle of attack of 5 deg over a Mach number range from 0.25 to 0.96 and a Reynolds number range of 3.4x10 to the 6th power. Wing pressures were measured at five span stations, and boundary-layer surveys were measured at the midspan station. The data are presented without analysis.

  14. Slicing of Silicon into Sheet Material. Silicon Sheet Growth Development for the Large Area Silicon Sheet Task of the Low Cost Solar Array Project

    NASA Technical Reports Server (NTRS)

    Fleming, J. R.; Holden, S. C.; Wolfson, R. G.

    1979-01-01

    The use of multiblade slurry sawing to produce silicon wafers from ingots was investigated. The commercially available state of the art process was improved by 20% in terms of area of silicon wafers produced from an ingot. The process was improved 34% on an experimental basis. Economic analyses presented show that further improvements are necessary to approach the desired wafer costs, mostly reduction in expendable materials costs. Tests which indicate that such reduction is possible are included, although demonstration of such reduction was not completed. A new, large capacity saw was designed and tested. Performance comparable with current equipment (in terms of number of wafers/cm) was demonstrated.

  15. Building Stakeholder Partnerships for an On-Site HIV Testing Programme

    PubMed Central

    Woods, William J.; Erwin, Kathleen; Lazarus, Margery; Serice, Heather; Grinstead, Olga; Binson, Diane

    2009-01-01

    Because of the large number of individuals at risk for HIV infection who visit gay saunas and sex clubs, these venues are useful settings in which to offer HIV outreach programmes for voluntary counselling and testing (VCT). Nevertheless, establishing a successful VCT programme in such a setting can be a daunting challenge, in large part because there are many barriers to managing the various components likely to be involved. Using qualitative data from a process evaluation of a new VCT programme at a gay sauna in California, USA, we describe how the various stakeholders overcame barriers of disparate interests and responsibilities to work together to successfully facilitate a regular and frequent on-site VCT programme that was fully utilized by patrons. PMID:18432424

  16. Robust Airfoil Optimization in High Resolution Design Space

    NASA Technical Reports Server (NTRS)

    Li, Wu; Padula, Sharon L.

    2003-01-01

    Robust airfoil shape optimization is a direct method for drag reduction over a given range of operating conditions and has three advantages: (1) it prevents severe degradation in the off-design performance by using a smart descent direction in each optimization iteration, (2) it uses a large number of B-spline control points as design variables yet the resulting airfoil shape is fairly smooth, and (3) it allows the user to make a trade-off between the level of optimization and the amount of computing time consumed. The robust optimization method is demonstrated by solving a lift-constrained drag minimization problem for a two-dimensional airfoil in viscous flow with a large number of geometric design variables. Our experience with robust optimization indicates that our strategy produces reasonable airfoil shapes that are similar to the original airfoils, but these new shapes provide drag reduction over the specified range of Mach numbers. We have tested this strategy on a number of advanced airfoil models produced by knowledgeable aerodynamic design team members and found that our strategy produces airfoils better than or equal to any designs produced by traditional design methods.
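
    As a schematic of how a lift-constrained drag minimization over B-spline control-point design variables can be posed, the sketch below uses scipy.optimize with placeholder quadratic drag and lift surrogates standing in for a viscous flow solver. It is not the authors' method; the surrogate functions, bounds, and lift target are hypothetical and only illustrate the problem structure.

        import numpy as np
        from scipy.optimize import minimize

        # Hypothetical surrogates standing in for a CFD evaluation of an airfoil whose
        # shape is parameterized by B-spline control-point perturbations x.
        n_ctrl = 16                       # number of B-spline control points (design variables)
        x0 = np.zeros(n_ctrl)             # baseline airfoil: no perturbation

        def drag(x):
            # placeholder: baseline drag plus a smooth penalty for shape changes
            return 0.0085 + 0.5 * np.sum(x**2) - 0.002 * x[3]

        def lift(x):
            # placeholder: baseline lift plus a linear sensitivity to the control points
            sens = np.linspace(0.1, 0.5, n_ctrl)
            return 0.60 + sens @ x

        cl_target = 0.60
        cons = [{"type": "ineq", "fun": lambda x: lift(x) - cl_target}]  # CL >= target
        bnds = [(-0.02, 0.02)] * n_ctrl   # keep perturbations small so the shape stays smooth

        res = minimize(drag, x0, method="SLSQP", bounds=bnds, constraints=cons)
        print("optimized drag:", res.fun, "lift:", lift(res.x))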

  17. Comment on "Universal relation between skewness and kurtosis in complex dynamics"

    NASA Astrophysics Data System (ADS)

    Celikoglu, Ahmet; Tirnakli, Ugur

    2015-12-01

    In a recent paper [M. Cristelli, A. Zaccaria, and L. Pietronero, Phys. Rev. E 85, 066108 (2012), 10.1103/PhysRevE.85.066108], the authors analyzed the relation between skewness and kurtosis for complex dynamical systems, and they identified two power-law regimes of non-Gaussianity, one of which scales with an exponent of 2 and the other with 4/3. They concluded that the observed relation is a universal fact in complex dynamical systems. In this Comment, we test the proposed universal relation between skewness and kurtosis with a large number of synthetic data, and we show that in fact it is not a universal relation and originates only from the small number of data points in the datasets considered. The proposed relation is tested using a family of non-Gaussian distributions known as q-Gaussians. We show that this relation disappears for sufficiently large datasets provided that the fourth moment of the distribution is finite. We find that kurtosis saturates to a single value, which is of course different from the Gaussian case (K = 3), as the number of data points is increased, and this indicates that the kurtosis will converge to a finite single value if all moments of the distribution up to the fourth are finite. The converged kurtosis value for the finite fourth-moment distributions and the number of data points needed to reach this value depend on the deviation of the original distribution from the Gaussian case.
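
    The dependence of the sample kurtosis on the number of data points can be reproduced with synthetic heavy-tailed data. The sketch below is a generic illustration, not the authors' analysis: it draws Student-t samples (which coincide with q-Gaussians for 1 < q < 3 up to rescaling) and uses scipy.stats to show the kurtosis estimate settling toward its finite theoretical value as the sample grows, under the assumption that the fourth moment exists (degrees of freedom > 4).

        import numpy as np
        from scipy.stats import kurtosis, skew

        rng = np.random.default_rng(42)
        df = 6  # Student-t degrees of freedom; df > 4 ensures a finite fourth moment
        # Theoretical (Pearson) kurtosis of Student-t: 3 + 6/(df - 4)
        print("theoretical kurtosis:", 3 + 6 / (df - 4))

        for n in (10**3, 10**4, 10**5, 10**6):
            x = rng.standard_t(df, size=n)
            # fisher=False returns Pearson kurtosis, so a Gaussian would give K = 3
            print(f"n={n:>7d}  skewness={skew(x):+.3f}  kurtosis={kurtosis(x, fisher=False):.3f}")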

  18. Improved technique that allows the performance of large-scale SNP genotyping on DNA immobilized by FTA technology.

    PubMed

    He, Hongbin; Argiro, Laurent; Dessein, Helia; Chevillard, Christophe

    2007-01-01

    FTA technology is a novel method designed to simplify the collection, shipment, archiving and purification of nucleic acids from a wide variety of biological sources. The number of punches that can normally be obtained from a single specimen card is often, however, insufficient for testing the large numbers of loci required to identify genetic factors that control human susceptibility or resistance to multifactorial diseases. In this study, we propose an improved technique to perform large-scale SNP genotyping. We applied a whole genome amplification method to amplify DNA from buccal cell samples stabilized using FTA technology. The results show that using the improved technique it is possible to perform up to 15,000 genotypes from one buccal cell sample. Furthermore, the procedure is simple. We consider this improved technique to be a promising method for performing large-scale SNP genotyping because the FTA technology simplifies the collection, shipment, archiving and purification of DNA, while whole genome amplification of FTA card bound DNA produces sufficient material for the determination of thousands of SNP genotypes.

  19. Set size and culture influence children's attention to number.

    PubMed

    Cantrell, Lisa; Kuwabara, Megumi; Smith, Linda B

    2015-03-01

    Much research evidences a system in adults and young children for approximately representing quantity. Here we provide evidence that the bias to attend to discrete quantity versus other dimensions may be mediated by set size and culture. Preschool-age English-speaking children in the United States and Japanese-speaking children in Japan were tested in a match-to-sample task where number was pitted against cumulative surface area in both large and small numerical set comparisons. Results showed that children from both cultures were biased to attend to the number of items for small sets. Large set responses also showed a general attention to number when ratio difficulty was easy. However, relative to the responses for small sets, attention to number decreased for both groups; moreover, both U.S. and Japanese children showed a significant bias to attend to total amount for difficult numerical ratio distances, although Japanese children shifted attention to total area at relatively smaller set sizes than U.S. children. These results add to our growing understanding of how quantity is represented and how such representation is influenced by context--both cultural and perceptual. Copyright © 2014 Elsevier Inc. All rights reserved.

  20. Safety and Abuse Testing of Energizer LiFeS2 AA Cells

    NASA Technical Reports Server (NTRS)

    Jeevarajan, Judith A.; Baldwin, Laura; Bragg, Bobby J.

    2003-01-01

    The LiFeS2 test program was part of the study on state-of-the-art batteries/cells available in the commercial market. It was carried out in an effort to replace alkaline AA cells for Shuttle and Station applications. A large number of alkaline cells are used for numerous Shuttle and Station applications as loose cells. Other government agencies reported good performance and abuse tolerance of the AA LiFeS2 cells. In this study, only abuse testing was performed on the cells to determine their tolerance. The tests carried out were over-discharge, external short circuit, heat-to-vent, vibration and drop.

  1. Large-Scale Boundary-Layer Control Tests on Two Wings in the NACA 20-Foot Wind Tunnel, Special Report

    NASA Technical Reports Server (NTRS)

    Freeman, Hugh B.

    1935-01-01

    Tests were made in the N.A.C.A. 20-foot wind tunnel on: (1) a wing, of 6.5-foot span, 5.5-foot chord, and 30 percent maximum thickness, fitted with large end plates and (2) a 16-foot span 2.67-foot chord wing of 15 percent maximum thickness to determine the increase in lift obtainable by removing the boundary layer and the power required for the blower. The results of the tests on the stub wing appeared more favorable than previous small-scale tests and indicated that: (1) the suction method was considerably superior to the pressure method, (2) single slots were more effective than multiple slots (where the same pressure was applied to all slots), the slot efficiency increased rapidly for increasing slot widths up to 2 percent of the wing chord and remained practically constant for all larger widths tested, (3) suction pressure and power requirements were quite low (a computation for a light airplane showed that a lift coefficient of 3.0 could be obtained with a suction as low as 2.3 times the dynamic pressure and a power expenditure less than 3 percent of the rated engine power), and (4) the volume of air required to be drawn off was quite high (approximately 0.5 cubic feet per second per unit wing area for an airplane landing at 40 miles per hour with a lift coefficient of 3.0), indicating that considerable duct area must be provided in order to prevent flow losses inside the wing and insure uniform distribution of suction along the span. The results from the tests of the large-span wing were less favorable than those on the stub wing. The reasons for this were probably: (1) the uneven distribution of suction along the span, (2) the flow losses inside the wing, (3) the small radius of curvature of the leading edge of the wing section, and (4) the low Reynolds Number of these tests, which was about one half that of the stub wing. The results showed a large increase in the maximum lift coefficient with an increase in Reynolds Number in the range of the tests. The results of drag tests showed that the profile drag of the wing was reduced and the L/D ratio was increased throughout the range of lift coefficients corresponding to take-off and climb but that the minimum drag was increased. The slot arrangement that is best for low drag is not the same, however, as that for maximum lift.

  2. Software-Defined Network Solutions for Science Scenarios: Performance Testing Framework and Measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Settlemyer, Bradley; Kettimuthu, R.; Boley, Josh

    High-performance scientific workflows utilize supercomputers, scientific instruments, and large storage systems. Their executions require fast setup of a small number of dedicated network connections across the geographically distributed facility sites. We present Software-Defined Network (SDN) solutions consisting of site daemons that use dpctl, Floodlight, ONOS, or OpenDaylight controllers to set up these connections. The development of these SDN solutions could be quite disruptive to the infrastructure, while requiring close coordination among multiple sites; in addition, the large number of possible controller and device combinations to investigate could make the infrastructure unavailable to regular users for extended periods of time. In response, we develop a Virtual Science Network Environment (VSNE) using virtual machines, Mininet, and custom scripts that support the development, testing, and evaluation of SDN solutions, without the constraints and expenses of multi-site physical infrastructures; furthermore, the chosen solutions can be directly transferred to production deployments. By complementing VSNE with a physical testbed, we conduct targeted performance tests of various SDN solutions to help choose the best candidates. In addition, we propose a switching response method to assess the setup times and throughput performances of different SDN solutions, and present experimental results that show their advantages and limitations.
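
    A minimal emulated topology of the kind VSNE builds can be sketched with Mininet's Python API. The two-switch, two-host layout and the controller address below are illustrative assumptions (the actual VSNE scripts and controller configurations are not reproduced here); the remote controller stands in for a Floodlight, ONOS, or OpenDaylight instance.

        from mininet.net import Mininet
        from mininet.node import RemoteController
        from mininet.topo import Topo

        class TwoSiteTopo(Topo):
            """Two emulated 'sites', each a switch with one host, joined by an inter-site link."""
            def build(self):
                s1, s2 = self.addSwitch('s1'), self.addSwitch('s2')
                h1, h2 = self.addHost('h1'), self.addHost('h2')
                self.addLink(h1, s1)
                self.addLink(h2, s2)
                self.addLink(s1, s2)   # the "wide-area" connection to be set up by the controller

        if __name__ == '__main__':
            net = Mininet(topo=TwoSiteTopo(), controller=None)
            # Assumed controller endpoint; point this at a Floodlight/ONOS/OpenDaylight instance.
            net.addController('c0', controller=RemoteController, ip='127.0.0.1', port=6653)
            net.start()
            net.pingAll()   # quick end-to-end connectivity check across the emulated sites
            net.stop()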

  3. Assessing copy number from exome sequencing and exome array CGH based on CNV spectrum in a large clinical cohort.

    PubMed

    Retterer, Kyle; Scuffins, Julie; Schmidt, Daniel; Lewis, Rachel; Pineda-Alvarez, Daniel; Stafford, Amanda; Schmidt, Lindsay; Warren, Stephanie; Gibellini, Federica; Kondakova, Anastasia; Blair, Amanda; Bale, Sherri; Matyakhina, Ludmila; Meck, Jeanne; Aradhya, Swaroop; Haverfield, Eden

    2015-08-01

    Detection of copy-number variation (CNV) is important for investigating many genetic disorders. Testing a large clinical cohort by array comparative genomic hybridization provides a deep perspective on the spectrum of pathogenic CNV. In this context, we describe a bioinformatics approach to extract CNV information from whole-exome sequencing and demonstrate its utility in clinical testing. Exon-focused arrays and whole-genome chromosomal microarray analysis were used to test 14,228 and 14,000 individuals, respectively. Based on these results, we developed an algorithm to detect deletions/duplications in whole-exome sequencing data and a novel whole-exome array. In the exon array cohort, we observed a positive detection rate of 2.4% (25 duplications, 318 deletions), of which 39% involved one or two exons. Chromosomal microarray analysis identified 3,345 CNVs affecting single genes (18%). We demonstrate that our whole-exome sequencing algorithm resolves CNVs of three or more exons. These results demonstrate the clinical utility of single-exon resolution in CNV assays. Our whole-exome sequencing algorithm approaches this resolution but is complemented by a whole-exome array to unambiguously identify intragenic CNVs and single-exon changes. These data illustrate the next advancements in CNV analysis through whole-exome sequencing and whole-exome array. Genet Med 17(8): 623-629.
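
    The core idea behind depth-based exon CNV detection can be illustrated with a normalized read-depth comparison. The sketch below is generic log2-ratio logic with made-up thresholds and toy data, not the authors' clinical algorithm: it flags candidate deletions and duplications from a sample's per-exon depths against a reference-panel median.

        import numpy as np

        def call_exon_cnvs(sample_depth, panel_depths, del_cut=-0.7, dup_cut=0.4):
            """Flag candidate per-exon deletions/duplications from normalized read depth.

            sample_depth : (n_exons,) raw depth for the test sample
            panel_depths : (n_refs, n_exons) raw depths for a reference panel
            Thresholds are illustrative; ideal single-copy loss and gain are -1.0 and
            +0.58 on a log2 scale, relaxed here to allow for noise.
            """
            # Normalize each sample by its own median depth to remove library-size effects
            sample = sample_depth / np.median(sample_depth)
            panel = panel_depths / np.median(panel_depths, axis=1, keepdims=True)

            ref = np.median(panel, axis=0)                        # per-exon reference depth
            log2_ratio = np.log2((sample + 1e-6) / (ref + 1e-6))  # small offset avoids log(0)

            calls = np.full(sample.shape, "normal", dtype=object)
            calls[log2_ratio <= del_cut] = "deletion"
            calls[log2_ratio >= dup_cut] = "duplication"
            return log2_ratio, calls

        # Toy example: exon 2 deleted (half depth), exon 6 duplicated (1.5x depth)
        rng = np.random.default_rng(1)
        panel = rng.poisson(100, size=(20, 8)).astype(float)
        sample = rng.poisson([100, 50, 100, 100, 100, 150, 100, 100]).astype(float)
        ratios, calls = call_exon_cnvs(sample, panel)
        print(list(zip(np.round(ratios, 2), calls)))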

  4. Mind as Space

    NASA Astrophysics Data System (ADS)

    McKinstry, Chris

    The present article describes a possible method for the automatic discovery of a universal human semantic-affective hyperspatial approximation of the human subcognitive substrate - the associative network which French (1990) asserts is the ultimate foundation of the human ability to pass the Turing Test - that does not require a machine to have direct human experience or a physical human body. This method involves automatic programming - such as Koza's genetic programming (1992) - guided in the discovery of the proposed universal hypergeometry by feedback from a Minimum Intelligent Signal Test or MIST (McKinstry, 1997) constructed from a very large number of human validated probabilistic propositions collected from a large population of Internet users. It will be argued that though a lifetime of human experience is required to pass a rigorous Turing Test, a probabilistic propositional approximation of this experience can be constructed via public participation on the Internet, and then used as a fitness function to direct the artificial evolution of a universal hypergeometry capable of classifying arbitrary propositions. A model of this hypergeometry will be presented; it predicts Miller's "Magical Number Seven" (1956) as the size of human short-term memory from fundamental hypergeometric properties. A system that can lead to the generation of novel propositions or "artificial thoughts" will also be described.

  5. Hypothesis test of mediation effect in causal mediation model with high-dimensional continuous mediators.

    PubMed

    Huang, Yen-Tsung; Pan, Wen-Chi

    2016-06-01

    Causal mediation modeling has become a popular approach for studying the effect of an exposure on an outcome through a mediator. However, current methods are not applicable to the setting with a large number of mediators. We propose a testing procedure for mediation effects of high-dimensional continuous mediators. We characterize the marginal mediation effect, the multivariate component-wise mediation effects, and the L2 norm of the component-wise effects, and develop a Monte-Carlo procedure for evaluating their statistical significance. To accommodate the setting with a large number of mediators and a small sample size, we further propose a transformation model using the spectral decomposition. Under the transformation model, mediation effects can be estimated using a series of regression models with a univariate transformed mediator, and examined by our proposed testing procedure. Extensive simulation studies are conducted to assess the performance of our methods for continuous and dichotomous outcomes. We apply the methods to analyze genomic data investigating the effect of microRNA miR-223 on a dichotomous survival status of patients with glioblastoma multiforme (GBM). We identify nine gene ontology sets with expression values that significantly mediate the effect of miR-223 on GBM survival. © 2015, The International Biometric Society.
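
    The building blocks of such a test can be sketched for a single continuous mediator: estimate the exposure-to-mediator coefficient alpha and the mediator-to-outcome coefficient beta (adjusting for exposure), take alpha * beta as the indirect effect, and assess it with a Monte Carlo approximation to its sampling distribution. The code below is a generic single-mediator illustration on simulated data, not the authors' high-dimensional procedure.

        import numpy as np

        rng = np.random.default_rng(7)
        n = 500
        exposure = rng.normal(size=n)
        mediator = 0.5 * exposure + rng.normal(size=n)                   # true alpha = 0.5
        outcome = 0.4 * mediator + 0.2 * exposure + rng.normal(size=n)   # true beta = 0.4

        def ols(y, predictors):
            """Least-squares fit (with intercept) returning coefficients and standard errors."""
            X = np.column_stack([np.ones(len(y))] + list(predictors))
            beta, *_ = np.linalg.lstsq(X, y, rcond=None)
            resid = y - X @ beta
            sigma2 = resid @ resid / (len(y) - X.shape[1])
            se = np.sqrt(np.diag(sigma2 * np.linalg.inv(X.T @ X)))
            return beta, se

        # alpha: exposure -> mediator; beta: mediator -> outcome, adjusting for exposure
        a_coef, a_se = ols(mediator, [exposure])
        b_coef, b_se = ols(outcome, [mediator, exposure])
        alpha, alpha_se = a_coef[1], a_se[1]
        beta, beta_se = b_coef[1], b_se[1]

        # Monte Carlo assessment of the indirect effect alpha * beta
        draws = rng.normal(alpha, alpha_se, 100_000) * rng.normal(beta, beta_se, 100_000)
        p_value = 2 * min((draws <= 0).mean(), (draws >= 0).mean())
        print(f"indirect effect = {alpha * beta:.3f}, Monte Carlo p = {p_value:.4f}")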

  6. FDTD method for laser absorption in metals for large scale problems.

    PubMed

    Deng, Chun; Ki, Hyungson

    2013-10-21

    The FDTD method has been successfully used for many electromagnetic problems, but its application to laser material processing has been limited because even a several-millimeter domain requires a prohibitively large number of grid points. In this article, we present a novel FDTD method for simulating large-scale laser beam absorption problems, especially for metals, by enlarging the laser wavelength while maintaining the material's reflection characteristics. For validation purposes, the proposed method has been tested with in-house FDTD codes to simulate p-, s-, and circularly polarized 1.06 μm irradiation on Fe and Sn targets, and the simulation results are in good agreement with theoretical predictions.

  7. Solar Sail Loads, Dynamics, and Membrane Studies

    NASA Technical Reports Server (NTRS)

    Slade, K. N.; Belvin, W. K.; Behun, V.

    2002-01-01

    While a number of solar sail missions have been proposed recently, these missions have not been selected for flight validation. Although the reasons for non-selection are varied, principal among them is the lack of subsystem integration and ground testing. This paper presents some early results from a large-scale ground testing program for integrated solar sail systems. In this series of tests, a 10 meter solar sail is subjected to dynamic excitation in both ambient atmospheric and vacuum conditions. Laser vibrometry is used to determine resonant frequencies and deformation shapes. The results include some low-order sail modes which can only be seen in vacuum, pointing to the necessity of testing in that environment.

  8. Endocrine disrupters--testing strategies to assess human hazard.

    PubMed

    Baker, V A

    2001-01-01

    During the last decade an hypothesis has been developed linking certain chemicals (natural and synthetic) to observed and suspected adverse effects on reproduction in both wildlife and humans. The issue of 'endocrine disruption' originally focused on chemicals that mimic the action of the natural hormone oestrogen. However, the concern now encompasses effects on the whole endocrine system. In response to public awareness, regulatory agencies (including the US EPA) and the OECD are formulating potential testing strategies and have begun the process of validating defined tests to systematically assess chemicals for their endocrine-disrupting activities. In order to investigate chemicals that have the potential to cause endocrine disruption, a large number of in vitro and in vivo assays have been identified. In vitro test systems (particularly when used in combination) offer the possibility of providing an early screen for large numbers of chemicals and can be useful in characterising the mechanism of action and potency. In vitro assays in widespread use for the screening/characterisation of endocrine disrupting potential include hormone receptor ligand binding assays (determination of the ability of a chemical to bind to the hormone receptor), cell proliferation assays (analysis of the ability of a chemical to stimulate growth of oestrogen sensitive cells), reporter gene assays in yeast or mammalian cells (analysis of the ability of a chemical to stimulate the transcription of a reporter gene construct in cell culture), and the analysis of the regulation of endogenous oestrogen sensitive genes in cell lines. However, in vitro assays do not always reliably predict the outcome in vivo due to differences in metabolic capabilities of the test systems used and the diverse range of mechanisms by which endocrine disrupting chemicals may act. Therefore a complementary battery of short- and long-term in vitro and in vivo assays (that assess both receptor and non-receptor mediated mechanisms of action) seems the most appropriate way at present of assessing the potential endocrine disrupting activities of chemicals. At Unilever we have used a combination of in vitro assays (receptor binding, reporter gene and cell proliferation assays) together with short-term in vivo tests (uterotrophic assay in immature rodents) to examine the oestrogenic potential of a large number of chemicals. An evaluation of the advantages and limitations of these methods is provided. Finally, any potential test system needs to be validated and standardized before the information generated can be used for the identification of hazard, and possibly for risk assessment purposes.

  9. Supersonic aerodynamic characteristics of a maneuvering canard-controlled missile with fixed and free-rolling tail fins

    NASA Technical Reports Server (NTRS)

    Blair, A. B., Jr.

    1990-01-01

    Wind tunnel investigations were conducted on a generic cruciform canard-controlled missile configuration. The model featured fixed or free-rolling tail-fin afterbodies to provide an expanded aerodynamic data base with particular emphasis on alleviating large induced rolling moments and/or for providing canard roll control throughout the entire test angle-of-attack range. The tests were conducted in the NASA Langley Unitary Plan Wind Tunnel at Mach numbers from 2.50 to 3.50 at a constant Reynolds number per foot of 2.00 x 10 to the 6th. Selected test results are presented to show the effects of a fixed or free-rolling tail-fin afterbody on the static longitudinal and lateral-directional aerodynamic characteristics of a canard-controlled missile with pitch, yaw, and roll control at model roll angles of 0 deg and 45 deg.

  10. Ground-Handling Forces on a 1/40-scale Model of the U. S. Airship "Akron."

    NASA Technical Reports Server (NTRS)

    Silverstein, Abe; Gulick, B G

    1937-01-01

    This report presents the results of full-scale wind tunnel tests conducted to determine the ground-handling forces on a 1/40-scale model of the U. S. Airship "Akron." Ground-handling conditions were simulated by establishing a velocity gradient above a special ground board in the tunnel comparable with that encountered over a landing field. The tests were conducted at Reynolds numbers ranging from 5,000,000 to 19,000,000 at each of six angles of yaw between 0 degree and 180 degrees and at four heights of the model above the ground board. The ground-handling forces vary greatly with the angle of yaw and reach large values at appreciable angles of yaw. Small changes in height, pitch, or roll did not critically affect the forces on the model. In the range of Reynolds numbers tested, no significant variation of the forces with the scale was disclosed.

  11. The Mach number of the cosmic flow - A critical test for current theories

    NASA Technical Reports Server (NTRS)

    Ostriker, Jeremiah P.; Suto, Yasushi

    1990-01-01

    A new cosmological, self-contained test using the ratio of mean velocity and the velocity dispersion in the mean flow frame of a group of test objects is presented. To allow comparison with linear theory, the velocity field must first be smoothed on a suitable scale. In the context of linear perturbation theory, the Mach number M(R) which measures the ratio of power on scales larger than to scales smaller than the patch size R, is independent of the perturbation amplitude and also of bias. An apparent inconsistency is found for standard values of power-law index n = 1 and cosmological density parameter Omega = 1, when comparing values of M(R) predicted by popular models with tentative available observations. Nonstandard models based on adiabatic perturbations with either negative n or small Omega value also fail, due to creation of unacceptably large microwave background fluctuations.
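
    The statistic itself is simple: for a patch of smoothed peculiar velocities, the cosmic Mach number is the bulk-flow speed divided by the velocity dispersion about that bulk flow. The sketch below computes M for a synthetic set of 3-D velocities; the 300 km/s bulk flow and 250 km/s dispersion are illustrative numbers only, not values from the paper.

        import numpy as np

        def cosmic_mach_number(velocities):
            """M = |bulk flow| / dispersion about the bulk flow for (n, 3) peculiar velocities."""
            v = np.asarray(velocities, dtype=float)
            bulk = v.mean(axis=0)                                    # mean (bulk) velocity of the patch
            residual = v - bulk
            dispersion = np.sqrt((residual**2).sum(axis=1).mean())   # rms residual speed
            return np.linalg.norm(bulk) / dispersion

        # Synthetic example: a 300 km/s bulk flow with 250 km/s small-scale dispersion
        rng = np.random.default_rng(3)
        v = rng.normal(0.0, 250.0 / np.sqrt(3.0), size=(200, 3)) + np.array([300.0, 0.0, 0.0])
        print(f"M = {cosmic_mach_number(v):.2f}")   # roughly 300/250 = 1.2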

  12. Aerodynamics, aeroelasticity, and stability of hang gliders. Experimental results. [Ames 7- by 10-ft wind tunnel tests

    NASA Technical Reports Server (NTRS)

    Kroo, I. M.

    1981-01-01

    One-fifth-scale models of three basic ultralight glider designs were constructed to simulate the elastic properties of full scale gliders and were tested at Reynolds numbers close to full scale values. Twenty-four minor modifications were made to the basic configurations in order to evaluate the effects of twist, reflex, dihedral, and various stability enhancement devices. Longitudinal and lateral data were obtained at several speeds through an angle of attack range of -30 deg to +45 deg with sideslip angles of up to 20 deg. The importance of vertical center of gravity displacement is discussed. Lateral data indicate that effective dihedral is lost at low angles of attack for nearly all of the configurations tested. Drag data suggest that lift-dependent viscous drag is a large part of the glider's total drag as is expected for thin, cambered sections at these relatively low Reynolds numbers.

  13. Demographic and psychological variables affecting test subject evaluations of ride quality

    NASA Technical Reports Server (NTRS)

    Duncan, N. C.; Conley, H. W.

    1975-01-01

    Ride-quality experiments similar in objectives, design, and procedure were conducted, one using the U.S. Air Force Total In-Flight Simulator and the other using the Langley Passenger Ride Quality Apparatus to provide the motion environments. Large samples (80 or more per experiment) of test subjects were recruited from the Tidewater Virginia area and asked to rate the comfort (on a 7-point scale) of random aircraft motion typical of that encountered during STOL flights. Test subject characteristics of age, sex, and previous flying history (number of previous airplane flights) were studied in a two by three by three factorial design. Correlations were computed between one dependent measure, the subject's mean comfort rating, and various demographic characteristics, attitudinal variables, and the scores on Spielberger's State-Trait Anxiety Inventory. An effect of sex was found in one of the studies. Males made higher (more uncomfortable) ratings of the ride than females. Age and number of previous flights were not significantly related to comfort ratings. No significant interactions between the variables of age, sex, or previous number of flights were observed.

  14. Large number discrimination by mosquitofish.

    PubMed

    Agrillo, Christian; Piffer, Laura; Bisazza, Angelo

    2010-12-22

    Recent studies have demonstrated that fish display rudimentary numerical abilities similar to those observed in mammals and birds. The mechanisms underlying the discrimination of small quantities (<4) were recently investigated while, to date, no study has examined the discrimination of large numerosities in fish. Subjects were trained to discriminate between two sets of small geometric figures using social reinforcement. In the first experiment mosquitofish were required to discriminate 4 from 8 objects with or without experimental control of the continuous variables that co-vary with number (area, space, density, total luminance). Results showed that fish can use the sole numerical information to compare quantities but that they preferentially use cumulative surface area as a proxy of the number when this information is available. A second experiment investigated the influence of the total number of elements to discriminate large quantities. Fish proved to be able to discriminate up to 100 vs. 200 objects, without showing any significant decrease in accuracy compared with the 4 vs. 8 discrimination. The third experiment investigated the influence of the ratio between the numerosities. Performance was found to decrease when decreasing the numerical distance. Fish were able to discriminate numbers when ratios were 1:2 or 2:3 but not when the ratio was 3:4. The performance of a sample of undergraduate students, tested non-verbally using the same sets of stimuli, largely overlapped that of fish. Fish are able to use pure numerical information when discriminating between quantities larger than 4 units. As observed in human and non-human primates, the numerical system of fish appears to have virtually no upper limit while the numerical ratio has a clear effect on performance. These similarities further reinforce the view of a common origin of non-verbal numerical systems in all vertebrates.

  15. Wall-Resolved Large-Eddy Simulation of Flow Separation Over NASA Wall-Mounted Hump

    NASA Technical Reports Server (NTRS)

    Uzun, Ali; Malik, Mujeeb R.

    2017-01-01

    This paper reports the findings from a study that applies wall-resolved large-eddy simulation to investigate flow separation over the NASA wall-mounted hump geometry. Despite its conceptually simple flow configuration, this benchmark problem has proven to be a challenging test case for various turbulence simulation methods that have attempted to predict flow separation arising from the adverse pressure gradient on the aft region of the hump. The momentum-thickness Reynolds number of the incoming boundary layer has a value that is near the upper limit achieved by recent direct numerical simulation and large-eddy simulation of incompressible turbulent boundary layers. The high Reynolds number of the problem necessitates a significant number of grid points for wall-resolved calculations. The present simulations show a significant improvement in the separation-bubble length prediction compared to Reynolds-Averaged Navier-Stokes calculations. The current simulations also provide good overall prediction of the skin-friction distribution, including the relaminarization observed over the front portion of the hump due to the strong favorable pressure gradient. We discuss a number of problems that were encountered during the course of this work and present possible solutions. A systematic study regarding the effect of domain span, subgrid-scale model, tunnel back pressure, upstream boundary layer conditions and grid refinement is performed. The predicted separation-bubble length is found to be sensitive to the span of the domain. Despite the large number of grid points used in the simulations, some differences between the predictions and experimental observations still exist (particularly for Reynolds stresses) in the case of the wide-span simulation, suggesting that additional grid resolution may be required.

  16. Low-speed wind tunnel investigation of the lateral-directional characteristics of a large-scale variable wing-sweep fighter model in the high-lift configuration

    NASA Technical Reports Server (NTRS)

    Eckert, W. T.; Maki, R. L.

    1973-01-01

    The low-speed characteristics of a large-scale model of the F-14A aircraft were studied in tests conducted in the Ames Research Center 40- by 80-Foot Wind Tunnel. The primary purpose of the present tests was the determination of lateral-directional stability levels and control effectiveness of the aircraft in its high-lift configuration. Tests were conducted at wing angles of attack between minus 2 deg and 30 deg and with sideslip angles between minus 12 deg and 12 deg. Data were taken at a Reynolds number of 8.0 million based on a wing mean aerodynamic chord of 2.24 m (7.36 ft). The model configuration was changed as required to show the effects of direct lift control (spoilers) at yaw, yaw angle with speed brake deflected, and various amounts and combinations of roll control.

  17. Low Pressure Seeder Development for PIV in Large Scale Open Loop Wind Tunnels

    NASA Astrophysics Data System (ADS)

    Schmit, Ryan

    2010-11-01

    A low pressure seeding technique has been developed for Particle Image Velocimetry (PIV) in large scale wind tunnel facilities and was demonstrated at the Subsonic Aerodynamic Research Laboratory (SARL) facility at Wright-Patterson Air Force Base. The SARL facility is an open loop tunnel with a 7 by 10 foot octagonal test section that has 56% optical access, and the Mach number varies from 0.2 to 0.5. A low pressure seeder sprayer was designed and tested in the inlet of the wind tunnel. The seeder sprayer was designed to produce an even and uniform distribution of seed while reducing the seeder's influence in the test section. A ViCount Compact 5000 with Smoke Oil 180 was used to generate the seeding material. The results show that this low pressure seeder does produce streaky seeding, but excellent PIV images are produced.

  18. Identifying fMRI Model Violations with Lagrange Multiplier Tests

    PubMed Central

    Cassidy, Ben; Long, Christopher J; Rae, Caroline; Solo, Victor

    2013-01-01

    The standard modeling framework in Functional Magnetic Resonance Imaging (fMRI) is predicated on assumptions of linearity, time invariance and stationarity. These assumptions are rarely checked because doing so requires specialised software, although failure to do so can lead to bias and mistaken inference. Identifying model violations is an essential but largely neglected step in standard fMRI data analysis. Using Lagrange Multiplier testing methods we have developed simple and efficient procedures for detecting model violations such as non-linearity, non-stationarity and validity of the common Double Gamma specification for hemodynamic response. These procedures are computationally cheap and can easily be added to a conventional analysis. The test statistic is calculated at each voxel and displayed as a spatial anomaly map which shows regions where a model is violated. The methodology is illustrated with a large number of real data examples. PMID:22542665
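
    The flavor of such a test can be shown on a single simulated time series: fit the linear model, regress the residuals on the original design augmented with candidate violation terms (here a quadratic in the fitted response as a stand-in for non-linearity), and use n times the R-squared of that auxiliary regression, which is asymptotically chi-squared under the null. The sketch below is a generic Lagrange Multiplier test on simulated data, not the authors' fMRI-specific implementation.

        import numpy as np
        from scipy.stats import chi2

        rng = np.random.default_rng(0)
        n = 400
        x = rng.normal(size=n)
        # Simulated voxel response with a mild quadratic (non-linear) component
        y = 1.0 + 0.8 * x + 0.3 * x**2 + rng.normal(size=n)

        # Step 1: fit the (possibly misspecified) linear model
        X = np.column_stack([np.ones(n), x])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        resid = y - X @ beta
        yhat = X @ beta

        # Step 2: auxiliary regression of residuals on design + candidate violation term
        Z = np.column_stack([X, yhat**2])          # quadratic term probes non-linearity
        gamma, *_ = np.linalg.lstsq(Z, resid, rcond=None)
        fitted = Z @ gamma
        r2 = 1.0 - ((resid - fitted)**2).sum() / ((resid - resid.mean())**2).sum()

        # Step 3: LM statistic ~ chi-squared with dof = number of added terms under the null
        lm_stat = n * r2
        p_value = chi2.sf(lm_stat, df=1)
        print(f"LM = {lm_stat:.2f}, p = {p_value:.2e}")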

  19. Design of a fast computer-based partial discharge diagnostic system

    NASA Technical Reports Server (NTRS)

    Oliva, Jose R.; Karady, G. G.; Domitz, Stan

    1991-01-01

    Partial discharges cause progressive deterioration of insulating materials working in high voltage conditions and may lead ultimately to insulator failure. Experimental findings indicate that deterioration increases with the number of discharges and is consequently proportional to the magnitude and frequency of the applied voltage. In order to obtain a better understanding of the mechanisms of deterioration produced by partial discharges, instrumentation capable of individual pulse resolution is required. A new computer-based partial discharge detection system was designed and constructed to conduct long duration tests on sample capacitors. This system is capable of recording a large number of pulses without dead time and producing valuable information related to amplitude, polarity, and charge content of the discharges. The operation of the system is automatic and no human supervision is required during the testing stage. Ceramic capacitors were tested at high voltage in long duration tests. The obtained results indicated that the charge content of partial discharges shifts towards high levels of charge as the level of deterioration in the capacitor increases.

  20. Hypersonic research engine/aerothermodynamic integration model, experimental results. Volume 1: Mach 6 component integration

    NASA Technical Reports Server (NTRS)

    Andrews, E. H., Jr.; Mackley, E. A.

    1976-01-01

    The NASA Hypersonic Research Engine (HRE) Project was initiated for the purpose of advancing the technology of airbreathing propulsion for hypersonic flight. The project encompassed a large component (inlet, combustor, and nozzle) and structures development program. Tests of a full-scale (18 in. diameter cowl and 87 in. long) HRE concept, designated the Aerothermodynamic Integration Model (AIM), were conducted at Mach numbers of 5, 6, and 7. Computer program results for Mach 6 component integration tests are presented.
