Pharmacogenomics in diverse practice settings: implementation beyond major metropolitan areas
Dorfman, Elizabeth H; Trinidad, Susan Brown; Morales, Chelsea T; Howlett, Kevin; Burke, Wylie; Woodahl, Erica L
2015-01-01
Aim The limited formal study of the clinical feasibility of implementing pharmacogenomic tests has thus far focused on providers at large medical centers in urban areas. Our research focuses on small metropolitan, rural and tribal practice settings. Materials & methods We interviewed 17 healthcare providers in western Montana regarding pharmacogenomic testing. Results Participants were optimistic about the potential of pharmacogenomic tests, but noted unique barriers in small and rural settings including cost, adherence, patient acceptability and testing timeframe. Participants in tribal settings identified heightened sensitivity to genetics and need for community leadership approval as additional considerations. Conclusion Implementation differences in small metropolitan, rural and tribal communities may affect pharmacogenomic test adoption and utilization, potentially impacting many patients. PMID:25712186
Wickenberg-Bolin, Ulrika; Göransson, Hanna; Fryknäs, Mårten; Gustafsson, Mats G; Isaksson, Anders
2006-03-13
Supervised learning for classification of cancer employs a set of design examples to learn how to discriminate between tumors. In practice it is crucial to confirm that the classifier is robust with good generalization performance to new examples, or at least that it performs better than random guessing. A suggested alternative is to obtain a confidence interval of the error rate using repeated design and test sets selected from available examples. However, it is known that even in the ideal situation of repeated designs and tests with completely novel samples in each cycle, a small test set size leads to a large bias in the estimate of the true variance between design sets. Therefore, different methods for small-sample performance estimation, such as the recently proposed procedure called Repeated Random Sampling (RRS), are also expected to result in heavily biased estimates, which in turn translates into biased confidence intervals. Here we explore such biases and develop a refined algorithm called Repeated Independent Design and Test (RIDT). Our simulations reveal that repeated designs and tests based on resampling in a fixed bag of samples yield a biased variance estimate. We also demonstrate that it is possible to obtain an improved variance estimate by means of a procedure that explicitly models how this bias depends on the number of samples used for testing. For the special case of repeated designs and tests using new samples for each design and test, we present an exact analytical expression for how the expected value of the bias decreases with the size of the test set. We show that via modeling and subsequent reduction of the small sample bias, it is possible to obtain an improved estimate of the variance of classifier performance between design sets. However, the uncertainty of the variance estimate is large in the simulations performed, indicating that the method in its present form cannot be directly applied to small data sets.
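The bias mechanism described above can be illustrated with a small simulation. The following Python sketch (not the authors' RIDT implementation; the data set, classifier and split sizes are arbitrary stand-ins) repeatedly splits one fixed bag of samples into design and test sets and shows how the spread of the error-rate estimates depends on the test-set size.

```python
# Minimal sketch: estimate the between-design variability of a classifier's
# error rate by repeated random splits of one fixed sample bag, for two
# different test-set sizes.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 20))                        # small "omics-like" data set
y = (X[:, 0] + 0.5 * rng.normal(size=60) > 0).astype(int)

def split_error_rates(test_size, n_repeats=200):
    """Error rates from repeated design/test splits of the same sample bag."""
    errors = []
    for seed in range(n_repeats):
        X_tr, X_te, y_tr, y_te = train_test_split(
            X, y, test_size=test_size, random_state=seed, stratify=y)
        clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
        errors.append(np.mean(clf.predict(X_te) != y_te))
    return np.array(errors)

for test_size in (5, 20):
    e = split_error_rates(test_size)
    # The raw variance across repeats mixes the true between-design variance
    # with a bias term that grows as the test set shrinks (the effect studied above).
    print(f"test size {test_size:2d}: mean error {e.mean():.3f}, "
          f"variance across splits {e.var(ddof=1):.4f}")
```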
Naugle, Alecia Larew; Barlow, Kristina E; Eblen, Denise R; Teter, Vanessa; Umholtz, Robert
2006-11-01
The U.S. Food Safety and Inspection Service (FSIS) tests sets of samples of selected raw meat and poultry products for Salmonella to ensure that federally inspected establishments meet performance standards defined in the pathogen reduction-hazard analysis and critical control point system (PR-HACCP) final rule. In the present report, sample set results are described and associations between set failure and set and establishment characteristics are identified for 4,607 sample sets collected from 1998 through 2003. Sample sets were obtained from seven product classes: broiler chicken carcasses (n = 1,010), cow and bull carcasses (n = 240), market hog carcasses (n = 560), steer and heifer carcasses (n = 123), ground beef (n = 2,527), ground chicken (n = 31), and ground turkey (n = 116). Of these 4,607 sample sets, 92% (4,255) were collected as part of random testing efforts (A sets), and 93% (4,166) passed. However, the percentage of positive samples relative to the maximum number of positive results allowable in a set increased over time for broilers but decreased or stayed the same for the other product classes. Three factors associated with set failure were identified: establishment size, product class, and year. Set failures were more likely early in the testing program (relative to 2003). Small and very small establishments were more likely to fail than large ones. Set failure was less likely in ground beef than in other product classes. Despite an overall decline in set failures through 2003, these results highlight the need for continued vigilance to reduce Salmonella contamination in broiler chicken and continued implementation of programs designed to assist small and very small establishments with PR-HACCP compliance issues.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-05-15
... proposed rules provide a clear set of guidelines for small businesses to understand and a bright-line test..., bright-line test for SBIR and STTR applicants to apply when determining eligibility with respect to size... owns 33% or more of the company) in order to create a bright-line test for applicants; (2) find...
Beran, Michael J; Parrish, Audrey E
2016-08-01
A key issue in understanding the evolutionary and developmental emergence of numerical cognition is to learn what mechanism(s) support perception and representation of quantitative information. Two such systems have been proposed, one for dealing with approximate representation of sets of items across an extended numerical range and another for highly precise representation of only small numbers of items. Evidence for the first system is abundant across species and in many tests with human adults and children, whereas the second system is primarily evident in research with children and in some tests with non-human animals. A recent paper (Choo & Franconeri, Psychonomic Bulletin & Review, 21, 93-99, 2014) with adult humans also reported "superprecise" representation of small sets of items in comparison to large sets of items, which would provide more support for the presence of a second system in human adults. We first presented capuchin monkeys with a test similar to that of Choo and Franconeri in which small or large sets with the same ratios had to be discriminated. We then presented the same monkeys with an expanded range of comparisons in the small number range (all comparisons of 1-9 items) and the large number range (all comparisons of 10-90 items in 10-item increments). Capuchin monkeys showed no increased precision for small over large sets in making these discriminations in either experiment. These data indicate a difference in the performance of monkeys to that of adult humans, and specifically that monkeys do not show improved discrimination performance for small sets relative to large sets when the relative numerical differences are held constant.
Small UAS Test Area at NASA's Dryden Flight Research Center
NASA Technical Reports Server (NTRS)
Bauer, Jeffrey T.
2008-01-01
This viewgraph presentation reviews the areas that Dryden Flight Research Center has set up for testing small Unmanned Aerial Systems (UAS). It also reviews the requirements and process to use an area for UAS test.
Bobby, Zachariah; Nandeesha, H; Sridhar, M G; Soundravally, R; Setiya, Sajita; Babu, M Sathish; Niranjan, G
2014-01-01
Graduate medical students often get little opportunity to clarify their doubts and reinforce their concepts after lecture classes. The Medical Council of India (MCI) encourages group discussions among students. We evaluated the effect of identifying mistakes in a given set of wrong statements and correcting them in a small group discussion by graduate medical students as a revision exercise. At the end of a module, a pre-test consisting of multiple-choice questions (MCQs) was conducted. Later, a set of incorrect statements related to the topic was given to the students and they were asked to identify the mistakes and correct them in a small group discussion. The effects on low, medium and high achievers were evaluated by a post-test and delayed post-tests with the same set of MCQs. The mean post-test marks were significantly higher in all three groups compared with the pre-test marks. The gain from the small group discussion was equal among low, medium and high achievers, and was retained after 15 days. Identification of mistakes in statements and their correction by a small group discussion is an effective, but unconventional, revision exercise in biochemistry. Copyright 2014, NMJI.
Comparison of eigenvectors for coupled seismo-electromagnetic layered-Earth modelling
NASA Astrophysics Data System (ADS)
Grobbe, N.; Slob, E. C.; Thorbecke, J. W.
2016-07-01
We study the accuracy and numerical stability of three eigenvector sets for modelling the coupled poroelastic and electromagnetic layered-Earth response. We use a known eigenvector set, its flux-normalized version and a newly derived flux-normalized set. The new set is chosen such that the system is properly uncoupled when the coupling between the poroelastic and electromagnetic fields vanishes. We carry out two different numerical stability tests: the first test focuses on the internal system, eigenvector and eigenvalue consistency; the second test investigates the stability and precision of the flux-normalized systems by looking at identity relations. We find that the known set shows the largest deviation for both tests, whereas the new set performs best. In two additional numerical modelling experiments, these numerical inaccuracies are shown to generate numerical noise levels comparable to small signals, such as signals coming from the important interface conversion responses, especially when the coupling coefficient is small. When coupling vanishes completely, the known set does not produce proper results. The new set produces numerically stable and accurate results in all situations. We therefore strongly recommend using this newly derived set for future layered-Earth seismo-electromagnetic modelling experiments.
Eblen, Denise R; Barlow, Kristina E; Naugle, Alecia Larew
2006-11-01
The U.S. Food Safety and Inspection Service (FSIS) pathogen reduction-hazard analysis critical control point systems final rule, published in 1996, established Salmonella performance standards for broiler chicken, cow and bull, market hog, and steer and heifer carcasses and for ground beef, chicken, and turkey meat. In 1998, the FSIS began testing to verify that establishments are meeting performance standards. Samples are collected in sets in which the number of samples is defined but varies according to product class. A sample set fails when the number of positive Salmonella samples exceeds the maximum number of positive samples allowed under the performance standard. Salmonella sample sets collected at 1,584 establishments from 1998 through 2003 were examined to identify factors associated with failure of one or more sets. Overall, 1,282 (80.9%) of establishments never had failed sets. In establishments that did experience set failure(s), generally the failed sets were collected early in the establishment testing history, with the exception of broiler establishments where failure(s) occurred both early and late in the course of testing. Small establishments were more likely to have experienced a set failure than were large or very small establishments, and broiler establishments were more likely to have failed than were ground beef, market hog, or steer-heifer establishments. Agency response to failed Salmonella sample sets in the form of in-depth verification reviews and related establishment-initiated corrective actions have likely contributed to declines in the number of establishments that failed sets. A focus on food safety measures in small establishments and broiler processing establishments should further reduce the number of sample sets that fail to meet the Salmonella performance standard.
Comparison of Two Procedures for Analyzing Small Sets of Repeated Measures Data
ERIC Educational Resources Information Center
Vallejo, Guillermo; Livacic-Rojas, Pablo
2005-01-01
This article compares two methods for analyzing small sets of repeated measures data under normal and non-normal heteroscedastic conditions: a mixed model approach with the Kenward-Roger correction and a multivariate extension of the modified Brown-Forsythe (BF) test. These procedures differ in their assumptions about the covariance structure of…
Cheng, Phillip M; Tejura, Tapas K; Tran, Khoa N; Whang, Gilbert
2018-05-01
The purpose of this pilot study is to determine whether a deep convolutional neural network can be trained with limited image data to detect high-grade small bowel obstruction patterns on supine abdominal radiographs. Grayscale images from 3663 clinical supine abdominal radiographs were categorized into obstructive and non-obstructive categories independently by three abdominal radiologists, and the majority classification was used as ground truth; 74 images were found to be consistent with small bowel obstruction. Images were rescaled and randomized, with 2210 images constituting the training set (39 with small bowel obstruction) and 1453 images constituting the test set (35 with small bowel obstruction). Weight parameters for the final classification layer of the Inception v3 convolutional neural network, previously trained on the 2014 Large Scale Visual Recognition Challenge dataset, were retrained on the training set. After training, the neural network achieved an AUC of 0.84 on the test set (95% CI 0.78-0.89). At the maximum Youden index (sensitivity + specificity-1), the sensitivity of the system for small bowel obstruction is 83.8%, with a specificity of 68.1%. The results demonstrate that transfer learning with convolutional neural networks, even with limited training data, may be used to train a detector for high-grade small bowel obstruction gas patterns on supine radiographs.
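The transfer-learning setup described above can be sketched in Python with Keras: an ImageNet-pretrained Inception v3 is frozen and only a new sigmoid classification layer is trained for the binary obstructive/non-obstructive label. This is not the authors' code; the directory paths, hyperparameters and the assumption that the radiographs have been exported as three-channel images are all hypothetical.

```python
import tensorflow as tf

# Frozen ImageNet-pretrained feature extractor; only the new head is trained.
base = tf.keras.applications.InceptionV3(
    weights="imagenet", include_top=False, pooling="avg",
    input_shape=(299, 299, 3))
base.trainable = False

model = tf.keras.Sequential([
    tf.keras.layers.Rescaling(1.0 / 127.5, offset=-1.0),  # Inception expects inputs in [-1, 1]
    base,
    tf.keras.layers.Dense(1, activation="sigmoid"),       # obstructive vs. non-obstructive
])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=[tf.keras.metrics.AUC(name="auc")])

# Hypothetical layout: radiographs/{train,test}/{obstruction,normal}/*.png
train_ds = tf.keras.utils.image_dataset_from_directory(
    "radiographs/train", image_size=(299, 299), batch_size=32, label_mode="binary")
test_ds = tf.keras.utils.image_dataset_from_directory(
    "radiographs/test", image_size=(299, 299), batch_size=32, label_mode="binary")

model.fit(train_ds, epochs=10)
print(model.evaluate(test_ds, return_dict=True))           # reports loss and test AUC
```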
Li, Der-Chiang; Liu, Chiao-Wen; Hu, Susan C
2011-05-01
Medical data sets are usually small and have very high dimensionality. Too many attributes will make the analysis less efficient and will not necessarily increase accuracy, while too few data will decrease the modeling stability. Consequently, the main objective of this study is to extract the optimal subset of features to increase analytical performance when the data set is small. This paper proposes a fuzzy-based non-linear transformation method to extend classification-related information from the original data attribute values for a small data set. Based on the new transformed data set, this study applies principal component analysis (PCA) to extract the optimal subset of features. Finally, we use the transformed data with these optimal features as the input data for a learning tool, a support vector machine (SVM). Six medical data sets, Pima Indians' diabetes, Wisconsin diagnostic breast cancer, Parkinson disease, echocardiogram, the BUPA liver disorders dataset, and bladder cancer cases in Taiwan, are employed to illustrate the approach presented in this paper. This research uses the t-test to evaluate the classification accuracy for a single data set, and the Friedman test to show that the proposed method is better than other methods over the multiple data sets. The experimental results indicate that the proposed method has better classification performance than either PCA or kernel principal component analysis (KPCA) when the data set is small, and suggest creating new purpose-related information to improve the analysis performance. This paper has shown that feature extraction is important as a function of feature selection for efficient data analysis. When the data set is small, using the fuzzy-based transformation method presented in this work to increase the information available produces better results than the PCA and KPCA approaches. Copyright © 2011 Elsevier B.V. All rights reserved.
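The overall pipeline can be sketched generically in Python with scikit-learn. The fuzzy expansion below is a simple triangular-membership stand-in rather than the specific transformation proposed in the paper, and the Wisconsin breast cancer data set is used only because it is a convenient small medical example.

```python
# Generic sketch of the pipeline: fuzzy-style feature expansion -> PCA -> SVM.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.decomposition import PCA
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import FunctionTransformer, StandardScaler
from sklearn.svm import SVC

def fuzzy_expand(X):
    """Append low/medium/high triangular memberships for every attribute.
    Memberships are computed from each batch's own range, for simplicity."""
    lo, hi = X.min(axis=0), X.max(axis=0)
    mid = (lo + hi) / 2.0
    span = np.where(hi - lo == 0, 1.0, hi - lo)
    low = np.clip((mid - X) / (span / 2), 0, 1)
    high = np.clip((X - mid) / (span / 2), 0, 1)
    med = np.clip(1.0 - np.abs(X - mid) / (span / 2), 0, 1)
    return np.hstack([X, low, med, high])

X, y = load_breast_cancer(return_X_y=True)            # stand-in medical data set
pipe = make_pipeline(StandardScaler(),
                     FunctionTransformer(fuzzy_expand),
                     PCA(n_components=10),
                     SVC(kernel="rbf"))
print(cross_val_score(pipe, X, y, cv=5).mean())        # mean classification accuracy
```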
Effect of simulated forward airspeed on small-scale-model externally blown flap noise
NASA Technical Reports Server (NTRS)
Goodykoontz, J. H.; Dorsch, R. G.; Olsen, W. A.
1976-01-01
Noise tests were conducted on a small-scale model of an externally blown flap lift augmentation system. The nozzle/wing model was subjected to external flow that simulated takeoff and landing flight velocities by placing it in a 33-centimeter-diameter free jet. The results showed that external flow attenuated the noise associated with the various configurations tested. The amount of attenuation depended on flap setting. More attenuation occurred with a trailing-flap setting of 20 deg than with one of 60 deg. Noise varied with relative velocity as a function of the trailing-flap setting and the angle from the nozzle inlet.
Tygert, Mark
2010-09-21
We discuss several tests for determining whether a given set of independent and identically distributed (i.i.d.) draws does not come from a specified probability density function. The most commonly used are Kolmogorov-Smirnov tests, particularly Kuiper's variant, which focus on discrepancies between the cumulative distribution function for the specified probability density and the empirical cumulative distribution function for the given set of i.i.d. draws. Unfortunately, variations in the probability density function often get smoothed over in the cumulative distribution function, making it difficult to detect discrepancies in regions where the probability density is small in comparison with its values in surrounding regions. We discuss tests without this deficiency, complementing the classical methods. The tests of the present paper are based on the plain fact that it is unlikely to draw a random number whose probability is small, provided that the draw is taken from the same distribution used in calculating the probability (thus, if we draw a random number whose probability is small, then we can be confident that we did not draw the number from the same distribution used in calculating the probability).
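A hedged sketch of the underlying idea follows (Python; this is not Tygert's exact statistic or calibration): if the draws really come from the specified density, it is unlikely that any of them lands where that density is very small, so the smallest density value among the draws can serve as a test statistic whose null distribution is obtained by Monte Carlo simulation under the specified density.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
f = stats.norm(loc=0.0, scale=1.0)           # specified density
draws = rng.standard_t(df=2, size=50)        # data actually from a heavier-tailed law

# Statistic: smallest density value attained by the observed draws.
observed_stat = f.pdf(draws).min()

# Null distribution of the statistic under the specified density f.
null_stats = np.array([f.pdf(f.rvs(size=draws.size, random_state=s)).min()
                       for s in range(2000)])
p_value = np.mean(null_stats <= observed_stat)
print(f"p-value for 'draws come from f': {p_value:.3f}")
```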
Maintaining Equivalent Cut Scores for Small Sample Test Forms
ERIC Educational Resources Information Center
Dwyer, Andrew C.
2016-01-01
This study examines the effectiveness of three approaches for maintaining equivalent performance standards across test forms with small samples: (1) common-item equating, (2) resetting the standard, and (3) rescaling the standard. Rescaling the standard (i.e., applying common-item equating methodology to standard setting ratings to account for…
Achievement, attributions, self-efficacy, and goal setting by accounting undergraduates.
Cheng, Pi-Yueh; Chiou, Wen-Bin
2010-02-01
Correlations were examined between two measures of accounting self-efficacy, achievement goal setting, attributions, and scores on the Accounting Practice Achievement Test, obtained 1 yr. apart for 124 freshmen in junior college. Analysis indicated favorable attribution contributed to a higher mean score on accounting self-efficacy. Students with higher perceived self-efficacy performed better on the proficiency tests. Those with higher self-efficacy also set higher goals for subsequent achievement tests. Moreover, students who set higher achievement goals performed better. Goal setting mediated the relation of initial self-efficacy with subsequent test performance. However, the amount of variance accounted for by self-efficacy was small. An effective method for enhancing performance on an accounting achievement test might be to increase beneficial attributions and self-efficacy in accounting, and to encourage setting reasonable achievement goals.
Ngo, Tuan Anh; Lu, Zhi; Carneiro, Gustavo
2017-01-01
We introduce a new methodology that combines deep learning and level set for the automated segmentation of the left ventricle of the heart from cardiac cine magnetic resonance (MR) data. This combination is relevant for segmentation problems, where the visual object of interest presents large shape and appearance variations, but the annotated training set is small, which is the case for various medical image analysis applications, including the one considered in this paper. In particular, level set methods are based on shape and appearance terms that use small training sets, but present limitations for modelling the visual object variations. Deep learning methods can model such variations using relatively small amounts of annotated training, but they often need to be regularised to produce good generalisation. Therefore, the combination of these methods brings together the advantages of both approaches, producing a methodology that needs small training sets and produces accurate segmentation results. We test our methodology on the MICCAI 2009 left ventricle segmentation challenge database (containing 15 sequences for training, 15 for validation and 15 for testing), where our approach achieves the most accurate results in the semi-automated problem and state-of-the-art results for the fully automated challenge. Crown Copyright © 2016. Published by Elsevier B.V. All rights reserved.
NASA Technical Reports Server (NTRS)
Judd, M.; Wolf, S. W. D.; Goodyer, M. J.
1976-01-01
A method has been developed for accurately computing the imaginary flow fields outside a flexible walled test section, applicable to lifting and non-lifting models. The tolerances in the setting of the flexible walls introduce only small levels of aerodynamic interference at the model. While it is not possible to apply corrections for the interference effects, they may be reduced by improving the setting accuracy of the portions of wall immediately above and below the model. Interference effects of the truncation of the length of the streamlined portion of a test section are brought to an acceptably small level by the use of a suitably long test section with the model placed centrally.
NASA Astrophysics Data System (ADS)
Wang, Xuan-yu; Hu, Rui; Wang, Rui-xin
2015-10-01
A simple method has been developed to rapidly measure emissivity with an infrared thermal imaging system over a small distance, based on the theory of infrared temperature measurement, which rests on the Planck radiation law and the Lambert-Beer law. The object's temperature is raised and held constant by a heater so that a temperature difference exists between the target and the environment. The emissivity of human skin, galvanized iron plate, black rubber and liquid water was tested with the instrument emissivity set to 1.0 and a testing distance of 1 m. Exploiting the near-constancy of human body temperature, a calibration curve was established describing how the thermal imaging temperature varies as the emissivity setting is changed from 0.9 to 1.0, and this curve was used to verify the method. The results show that the emissivity of human skin is 0.95. The emissivities of galvanized iron plate, black rubber and liquid water decrease as the object's temperature increases; the emissivity of galvanized iron plate is far smaller than that of human skin, black rubber or water, and the emissivity of water decreases slowly and linearly with increasing temperature. The study shows that, over a small distance and in a clean atmosphere, the infrared emissivity of an object can be conveniently measured with an infrared thermal imaging system by raising the object's temperature above the environment temperature and then simultaneously measuring the environmental temperature, the object's real temperature and its thermal imaging temperature with the emissivity set to 1.0 at a testing distance of 1.0 m.
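A minimal sketch of the arithmetic behind such a measurement is shown below, under a total-radiation (Stefan-Boltzmann) approximation rather than the band-limited Planck treatment used in the paper: with the camera emissivity fixed at 1.0, the apparent temperature mixes radiation emitted by the object with ambient radiation it reflects, so the emissivity can be backed out from the apparent, true and environmental temperatures. The numbers in the example are hypothetical.

```python
def estimate_emissivity(t_apparent_c, t_object_c, t_ambient_c):
    """Estimate emissivity from camera reading (emissivity set to 1.0),
    true object temperature and ambient temperature, all in deg C."""
    t_app = t_apparent_c + 273.15
    t_obj = t_object_c + 273.15
    t_amb = t_ambient_c + 273.15
    # Apparent radiance = emitted + reflected ambient (total-radiation approximation):
    #   T_app^4 = eps * T_obj^4 + (1 - eps) * T_amb^4
    eps = (t_app**4 - t_amb**4) / (t_obj**4 - t_amb**4)
    return min(max(eps, 0.0), 1.0)

# Hypothetical reading: heated sample at 40.0 C, room at 22.0 C,
# camera (emissivity fixed at 1.0, 1 m distance) reads 39.1 C.
print(round(estimate_emissivity(39.1, 40.0, 22.0), 2))
```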
Arroz, Erin; Jordan, Michael; Dumancas, Gerard G
2017-07-01
An ultraviolet visible (UV-Vis) spectrophotometric and partial least squares (PLS) chemometric method was developed for the simultaneous determination of erythrosine B (red), Brilliant Blue, and tartrazine (yellow) dyes. A training set (n = 64) was generated using a full factorial design and its accuracy was tested on a test set (n = 13) using a Box-Behnken design. The test set gave a root mean square error (RMSE) of 1.79 × 10⁻⁷ for blue, 4.59 × 10⁻⁷ for red, and 1.13 × 10⁻⁶ for yellow dyes. The relatively small RMSE indicates only a small difference between predicted and measured concentrations, demonstrating the accuracy of the model. The relative errors of prediction (REP) for the test set were 11.73%, 19.52%, and 19.38% for blue, red, and yellow dyes, respectively. A comparable overlay between the actual candy samples and their replicated synthetic spectra was also obtained, indicating that the model is a potentially accurate method for determining concentrations of dyes in food samples.
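The chemometric workflow can be sketched with synthetic mixture spectra in Python using scikit-learn; the pure-dye spectra, concentration ranges and noise level below are invented for illustration, with only the training-set and test-set sizes taken from the abstract.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
wavelengths = np.linspace(400, 700, 151)
pure = np.stack([np.exp(-((wavelengths - c) / 40.0) ** 2)   # stand-in pure-dye spectra
                 for c in (630, 525, 430)])                   # blue, red, yellow absorption peaks

def mixtures(n):
    """Random dye concentrations (arbitrary units) and their noisy mixture spectra."""
    conc = rng.uniform(0.1, 1.0, size=(n, 3))
    spectra = conc @ pure + rng.normal(0, 0.01, size=(n, wavelengths.size))
    return spectra, conc

X_train, C_train = mixtures(64)          # training set size from the abstract
X_test, C_test = mixtures(13)            # test set size from the abstract

pls = PLSRegression(n_components=3).fit(X_train, C_train)
C_pred = pls.predict(X_test)

rmse = np.sqrt(np.mean((C_pred - C_test) ** 2, axis=0))
rep = 100 * rmse / C_test.mean(axis=0)                        # relative error of prediction
for name, r, p in zip(("blue", "red", "yellow"), rmse, rep):
    print(f"{name}: RMSE={r:.3f}, REP={p:.1f}%")
```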
Combination of large and small basis sets in electronic structure calculations on large systems
NASA Astrophysics Data System (ADS)
Røeggen, Inge; Gao, Bin
2018-04-01
Two basis sets, a large and a small one, are associated with each nucleus of the system. Each atom has its own separate one-electron basis comprising the large basis set of the atom in question and the small basis sets for the partner atoms in the complex. The perturbed atoms in molecules and solids model is at the core of the approach, since it allows for the definition of perturbed atoms in a system. It is argued that this basis set approach should be particularly useful for periodic systems. Test calculations are performed on one-dimensional arrays of H and Li atoms. The ground-state energy per atom in the linear H array is determined versus bond length.
Using Small-Scale Randomized Controlled Trials to Evaluate the Efficacy of New Curricular Materials
Bass, Kristin M.; Stark, Louisa A.
2014-01-01
How can researchers in K–12 contexts stay true to the principles of rigorous evaluation designs within the constraints of classroom settings and limited funding? This paper explores this question by presenting a small-scale randomized controlled trial (RCT) designed to test the efficacy of curricular supplemental materials on epigenetics. The researchers asked whether the curricular materials improved students’ understanding of the content more than an alternative set of activities. The field test was conducted in a diverse public high school setting with 145 students who were randomly assigned to a treatment or comparison condition. Findings indicate that students in the treatment condition scored significantly higher on the posttest than did students in the comparison group (effect size: Cohen's d = 0.40). The paper discusses the strengths and limitations of the RCT, the contextual factors that influenced its enactment, and recommendations for others wishing to conduct small-scale rigorous evaluations in educational settings. Our intention is for this paper to serve as a case study for university science faculty members who wish to employ scientifically rigorous evaluations in K–12 settings while limiting the scope and budget of their work. PMID:25452482
NASA Technical Reports Server (NTRS)
Rakoczy, John; Heater, Daniel; Lee, Ashley
2013-01-01
Marshall Space Flight Center's (MSFC) Small Projects Rapid Integration and Test Environment (SPRITE) is a Hardware-In-The-Loop (HWIL) facility that provides rapid development, integration, and testing capabilities for small projects (CubeSats, payloads, spacecraft, and launch vehicles). This facility environment focuses on efficient processes and modular design to support rapid prototyping, integration, testing and verification of small projects at an affordable cost, especially compared to larger type HWIL facilities. SPRITE (Figure 1) consists of a "core" capability or "plant" simulation platform utilizing a graphical programming environment capable of being rapidly re-configured for any potential test article's space environments, as well as a standard set of interfaces (i.e. Mil-Std 1553, Serial, Analog, Digital, etc.). SPRITE also allows this level of interface testing of components and subsystems very early in a program, thereby reducing program risk.
III. FROM SMALL TO BIG: METHODS FOR INCORPORATING LARGE SCALE DATA INTO DEVELOPMENTAL SCIENCE.
Davis-Kean, Pamela E; Jager, Justin
2017-06-01
For decades, developmental science has been based primarily on relatively small-scale data collections with children and families. Part of the reason for the dominance of this type of data collection is the complexity of collecting cognitive and social data on infants and small children. These small data sets are limited in both the power to detect differences and the demographic diversity needed to generalize clearly and broadly. Thus, in this chapter we discuss the value of using existing large-scale data sets to test the complex questions of child development, and how to develop future large-scale data sets that are both representative and able to answer the important questions of developmental scientists. © 2017 The Society for Research in Child Development, Inc.
Altering Test Environments for Reducing Test Anxiety and for Improving Academic Performance.
ERIC Educational Resources Information Center
Bushnell, Don D.
To test the effects of altering situational variables in stressful examinations on high test anxious and low test anxious undergraduates, mid-terms and final examinations were administered in two environmental settings: large lecture halls and small language laboratories. Mean test scores for high test anxious students in the language labs were…
Automating Media Centers and Small Libraries: A Microcomputer-Based Approach.
ERIC Educational Resources Information Center
Meghabghab, Dania Bilal
Although the general automation process can be applied to most libraries, small libraries and media centers require a customized approach. Using a systematic approach, this guide covers each step and aspect of automation in a small library setting, and combines the principles of automation with field-tested activities. After discussing needs…
Liver Rapid Reference Set Application: Hemken - Abbott (2015) — EDRN Public Portal
The aim for this testing is to find a small panel of biomarkers (n=2-5) that can be tested on the Abbott ARCHITECT automated immunoassay platform for the early detection of hepatocellular carcinoma (HCC). This panel of biomarkers should perform significantly better than alpha-fetoprotein (AFP) alone based on multivariate statistical analysis. This testing of the EDRN reference set will help expedite the selection of a small panel of ARCHITECT biomarkers for the early detection of HCC. The panel of ARCHITECT biomarkers Abbott plans to test includes: AFP, protein induced by vitamin K absence or antagonist-II (PIVKA-II), Golgi protein 73 (GP73), hepatocyte growth factor (HGF), dipeptidyl peptidase 4 (DPP4) and the DPP4/seprase (surface expressed protease) heterodimer hybrid. PIVKA-II is abnormal des-carboxylated prothrombin (DCP) present in vitamin K deficiency.
Multiscale 3D Shape Analysis using Spherical Wavelets
Nain, Delphine; Haker, Steven; Bobick, Aaron; Tannenbaum, Allen
2013-01-01
Shape priors attempt to represent biological variations within a population. When variations are global, Principal Component Analysis (PCA) can be used to learn major modes of variation, even from a limited training set. However, when significant local variations exist, PCA typically cannot represent such variations from a small training set. To address this issue, we present a novel algorithm that learns shape variations from data at multiple scales and locations using spherical wavelets and spectral graph partitioning. Our results show that when the training set is small, our algorithm significantly improves the approximation of shapes in a testing set over PCA, which tends to oversmooth data. PMID:16685992
Evaluating the uniformity of color spaces and performance of color difference formulae
NASA Astrophysics Data System (ADS)
Lian, Yusheng; Liao, Ningfang; Wang, Jiajia; Tan, Boneng; Liu, Zilong
2010-11-01
Using small color difference data sets (the MacAdam ellipses dataset and the RIT-DuPont suprathreshold color difference ellipses dataset) and large color difference data sets (the Munsell Renotation Data and the OSA Uniform Color Scales dataset), the uniformity of several color spaces and the performance of color difference formulae based on these color spaces are evaluated. The color spaces used are CIELAB, DIN99d, IPT, and CIECAM02-UCS. It is found that the uniformity of lightness is better than that of saturation and hue. Overall, for all these color spaces, the uniformity in the blue region is inferior to that in the other regions. The uniformity of CIECAM02-UCS is superior to the other color spaces over the whole color-difference range from small to large. The uniformity of CIELAB and IPT for the large color difference data sets is better than that for the small color difference data sets, whereas DIN99d shows the opposite trend. Two common performance factors (PF/3 and STRESS) and the statistical F-test are calculated to assess the performance of the color difference formulae. The results show that the performance of the color difference formulae based on these four color spaces is consistent with the uniformity of those color spaces.
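For reference, the STRESS index mentioned above can be computed as in the short Python sketch below; the paired colour-difference values in the example are hypothetical. Lower STRESS indicates better agreement between a colour-difference formula and the visual data.

```python
import numpy as np

def stress(dE, dV):
    """STRESS between computed colour differences dE and visual differences dV."""
    dE, dV = np.asarray(dE, float), np.asarray(dV, float)
    F1 = np.sum(dE ** 2) / np.sum(dE * dV)            # optimal scaling factor
    return 100.0 * np.sqrt(np.sum((dE - F1 * dV) ** 2) / np.sum((F1 * dV) ** 2))

# Hypothetical pairs of formula predictions and visual differences.
dE = [1.2, 0.8, 2.5, 1.9, 3.1]
dV = [1.0, 1.0, 2.0, 2.0, 3.0]
print(f"STRESS = {stress(dE, dV):.1f}")
```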
Experimental Spin Testing of Integrally Damped Composite Plates
NASA Technical Reports Server (NTRS)
Kosmatka, John
1998-01-01
The experimental behavior of spinning laminated composite pretwisted plates (turbo-fan blade-like) with small (less than 10% by volume) integral viscoelastic damping patches was investigated at NASA-Lewis Research Center. Ten different plate sets were experimentally spin tested and the resulting data were analyzed. The first four plate sets investigated tailoring patch locations and definitions to damp specific modes on spinning flat graphite/epoxy plates as a function of rotational speed. The remaining six plate sets investigated damping patch size and location on specific modes of pretwisted (30 degrees) graphite/epoxy plates. The results reveal that: (1) a significant amount of damping can be added using a small amount of damping material, (2) the damped plates experienced no failures up to the tested 28,000 g's and 750,000 cycles, (3) centrifugal loads caused an increase in bending frequencies and corresponding reductions in bending damping levels that are proportional to the bending stiffness increase, and (4) the centrifugal loads caused a decrease in torsion natural frequency and an increase in damping levels of pretwisted composite plates.
Testing Bell's inequality with cosmic photons: closing the setting-independence loophole.
Gallicchio, Jason; Friedman, Andrew S; Kaiser, David I
2014-03-21
We propose a practical scheme to use photons from causally disconnected cosmic sources to set the detectors in an experimental test of Bell's inequality. In current experiments, with settings determined by quantum random number generators, only a small amount of correlation between detector settings and local hidden variables, established less than a millisecond before each experiment, would suffice to mimic the predictions of quantum mechanics. By setting the detectors using pairs of quasars or patches of the cosmic microwave background, observed violations of Bell's inequality would require any such coordination to have existed for billions of years, an improvement of 20 orders of magnitude.
ERIC Educational Resources Information Center
Van Blankenstein, Floris M.; Dolmans, Diana H. J. M.; Van der Vleuten, Cees P. M.; Schmidt, Henk G.
2013-01-01
This study set out to test whether relevant prior knowledge would moderate a positive effect on academic achievement of elaboration during small-group discussion. In a 2 × 2 experimental design, 66 undergraduate students observed a video showing a small-group problem-based discussion about thunder and lightning. In the video, a teacher asked…
The Advantages of Using Planned Comparisons over Post Hoc Tests.
ERIC Educational Resources Information Center
Kuehne, Carolyn C.
There are advantages to using a priori or planned comparisons rather than omnibus multivariate analysis of variance (MANOVA) tests followed by post hoc or a posteriori testing. A small heuristic data set is used to illustrate these advantages. An omnibus MANOVA test was performed on the data followed by a post hoc test (discriminant analysis). A…
Does sensitivity measured from screening test-sets predict clinical performance?
NASA Astrophysics Data System (ADS)
Soh, BaoLin P.; Lee, Warwick B.; Mello-Thoms, Claudia R.; Tapia, Kriscia A.; Ryan, John; Hung, Wai Tak; Thompson, Graham J.; Heard, Rob; Brennan, Patrick C.
2014-03-01
Aim: To examine the relationship between sensitivity measured from the BREAST test-set and clinical performance. Background: Although the UK and Australia national breast screening programs have regarded the PERFORMS and BREAST test-set strategies as possible methods of estimating readers' clinical efficacy, the relationship between test-set and real-life performance results has never been satisfactorily understood. Methods: Forty-one radiologists from BreastScreen New South Wales participated in this study. Each reader interpreted a BREAST test-set which comprised sixty de-identified mammographic examinations sourced from the BreastScreen Digital Imaging Library. Spearman's rank correlation coefficient was used to compare the sensitivity measured from the BREAST test-set with screen readers' clinical audit data. Results: Statistically significant, moderate positive correlations were found between test-set sensitivity and each of the following metrics: rate of invasive cancer per 10 000 reads (r=0.495; p < 0.01); rate of small invasive cancer per 10 000 reads (r=0.546; p < 0.001); detection rate of all invasive cancers and DCIS per 10 000 reads (r=0.444; p < 0.01). Conclusion: Comparison between sensitivity measured from the BREAST test-set and real-life detection rate demonstrated statistically significant, moderate positive correlations, which validates that such test-set strategies can reflect readers' clinical performance and be used as a quality assurance tool. The strength of correlation demonstrated in this study was higher than previously found by others.
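The correlation analysis itself is straightforward; a tiny Python sketch with hypothetical per-reader numbers illustrates the Spearman comparison between test-set sensitivity and a clinical audit metric.

```python
from scipy import stats

# Hypothetical per-reader values, for illustration only.
test_set_sensitivity = [0.62, 0.75, 0.70, 0.81, 0.66, 0.78, 0.73, 0.85]
invasive_cancer_rate = [38, 52, 47, 60, 41, 55, 50, 63]   # per 10 000 reads

rho, p = stats.spearmanr(test_set_sensitivity, invasive_cancer_rate)
print(f"Spearman r = {rho:.2f}, p = {p:.3f}")
```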
Learning Data Set Influence on Identification Accuracy of Gas Turbine Neural Network Model
NASA Astrophysics Data System (ADS)
Kuznetsov, A. V.; Makaryants, G. M.
2018-01-01
Many studies have addressed gas turbine engine identification using dynamic neural network models; during identification, the error between the model and the real object should be minimized. However, the question of how the training data set is constructed is usually overlooked. This article studies the influence of the training data set type on the accuracy of a gas turbine neural network model. The identification object is a thermodynamic model of a micro gas turbine engine, whose input signal is the fuel consumption and whose output signal is the engine rotor rotation frequency. Four types of input signal were used to create training and test data sets for the dynamic neural network models: step, fast, slow and mixed. Four dynamic neural networks were created, one from each type of training data set, and each network was tested against all four types of test data set. As a result, 16 transient responses from the four neural networks and four test data sets were compared with the corresponding solutions of the thermodynamic model, and the errors of all neural networks were compared within each test data set. The comparison shows the range of error values for each test data set; these ranges are small, so the influence of the training data set type on identification accuracy is low.
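A compact Python sketch can illustrate this experimental design; the plant here is a simple first-order lag rather than the paper's thermodynamic micro gas-turbine model, and the excitation signals, network size and lag structure are assumptions chosen only for illustration.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

def plant(u, tau=5.0):
    """Toy engine: rotor speed responds to fuel flow as a first-order lag."""
    y = np.zeros_like(u)
    for k in range(1, len(u)):
        y[k] = y[k - 1] + (u[k] - y[k - 1]) / tau
    return y

def lagged(u, y, n_lags=3):
    """Build regressors from past inputs and outputs (NARX-style)."""
    X = np.column_stack(
        [u[n_lags - i:len(u) - i] for i in range(n_lags)] +
        [y[n_lags - 1 - i:len(y) - 1 - i] for i in range(n_lags)])
    return X, y[n_lags:]

t = np.arange(600, dtype=float)
signals = {
    "step": (t > 300).astype(float),
    "fast": 0.5 + 0.5 * np.sign(np.sin(2 * np.pi * t / 40)),
    "slow": 0.5 + 0.5 * np.sin(2 * np.pi * t / 300),
}
signals["mixed"] = np.concatenate([s[:200] for s in signals.values()])

results = {}
for train_name, u_train in signals.items():
    X_tr, y_tr = lagged(u_train, plant(u_train))
    net = MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000,
                       random_state=0).fit(X_tr, y_tr)
    row = {}
    for test_name, u_test in signals.items():
        X_te, y_te = lagged(u_test, plant(u_test))
        row[test_name] = round(float(np.sqrt(np.mean((net.predict(X_te) - y_te) ** 2))), 4)
    results[train_name] = row
print(results)   # RMSE of each trained network on each test signal type
```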
Mini-tapping sugar maples for sap-sugar testing
William J. Gabriel
1982-01-01
Describes a technique using cannulas, surgical tubing, and small containers to obtain sap samples for use in determining the sugar content of sap in small sugar maple trees. This technique is used on trees directly exposed to the weather, and sets a minimum tappable tree diameter of 1.5 cm.
NASA Technical Reports Server (NTRS)
Wolf, S. W. D.
1984-01-01
Self streamlining two dimensional flexible walled test sections eliminate the uncertainties found in data from conventional test sections particularly at transonic speeds. The test section sidewalls are rigid, while the floor and ceiling are flexible and are positioned to streamline shapes by a system of jacks, without reference to the model. The walls are therefore self streamlining. Data are taken from the model when the walls are good streamlines such that the inevitable residual wall induced interference is acceptably small and correctable. Successful two dimensional validation testing at low speeds has led to the development of a new transonic flexible walled test section. Tunnel setting times are minimized by the development of a rapid wall setting strategy coupled with on line computer control of wall shapes using motorized jacks. Two dimensional validation testing using symmetric and cambered aerofoils in the Mach number range up to about 0.85 where the walls are just supercritical, shows good agreement with reference data using small height-chord ratios between 1.5 and unity.
Accurate Methods for Large Molecular Systems (Preprint)
2009-01-06
tensor, EFP calculations are basis set dependent. The smallest recommended basis set is 6-31++G(d,p). The dependence of the computational cost of... and second order perturbation theory (MP2) levels with the 6-31G(d,p) basis set. Additional SFM tests are presented for a small set of alpha... helices using the 6-31++G(d,p) basis set. The larger 6-311++G(3df,2p) basis set is employed for creating all EFPs used for non-bonded interactions, since
Firefighters as distributors of workplace safety and health information to small businesses
Keller, Brenna M.; Cunningham, Thomas R.
2016-01-01
Background Small businesses bear a large burden of injury and death, and are difficult to reach with occupational safety and health (OSH) information. The National Institute for Occupational Safety and Health (NIOSH) developed a pilot study testing the feasibility of fire departments disseminating OSH information to small businesses during fire inspections. Methods Two sets of postcards were developed with unique, trackable URLs for the NIOSH Small Business Resource Guide. One set was distributed by firefighters, the other was mailed to small businesses. Participating inspectors were met with to discuss their experience. Results Neither distribution method resulted in a substantial number of site visits. Inspectors believed distributing postcards was an easy addition to their duties, and saw value in safety information. Conclusions There are barriers beyond awareness of availability that prevent small business owners from seeking OSH information. Research should focus on identifying barriers and developing better OSH information diffusion mechanisms. PMID:27594768
Quality Control for Scoring Tests Administered in Continuous Mode: An NCME Instructional Module
ERIC Educational Resources Information Center
Allalouf, Avi; Gutentag, Tony; Baumer, Michal
2017-01-01
Quality control (QC) in testing is paramount. QC procedures for tests can be divided into two types. The first type, one that has been well researched, is QC for tests administered to large population groups on few administration dates using a small set of test forms (e.g., large-scale assessment). The second type is QC for tests, usually…
A Battery Certification Testbed for Small Satellite Missions
NASA Technical Reports Server (NTRS)
Cameron, Zachary; Kulkarni, Chetan S.; Luna, Ali Guarneros; Goebel, Kai; Poll, Scott
2015-01-01
A battery pack consisting of standard cylindrical 18650 lithium-ion cells has been chosen for small satellite missions based on previous flight heritage and compliance with NASA battery safety requirements. However, for batteries that transit through the International Space Station (ISS), additional certification tests are required for individual cells as well as the battery packs. In this manuscript, we discuss the development of generalized testbeds for testing and certifying different types of batteries critical to small satellite missions. Test procedures developed and executed for this certification effort include: a detailed physical inspection before and after experiments; electrical cycling characterization at the cell and pack levels; battery-pack overcharge, over-discharge, external short testing; battery-pack vacuum leak and vibration testing. The overall goals of these certification procedures are to conform to requirements set forth by the agency and identify unique safety hazards. The testbeds, procedures, and experimental results are discussed for batteries chosen for small satellite missions to be launched from the ISS.
An Analysis of Testing Time within a Mastery-Based Medical School Course.
ERIC Educational Resources Information Center
Wade, David R.; Williams, Reed G.
1979-01-01
Southern Illinois University School of Medicine's personalized teaching system has the following features: students are provided with behavioral objectives prior to instruction; passing levels for tests are set in advance and are independent of class performance; and the program is divided into small units and students are tested frequently. (LBH)
Set size and culture influence children's attention to number.
Cantrell, Lisa; Kuwabara, Megumi; Smith, Linda B
2015-03-01
Much research evidences a system in adults and young children for approximately representing quantity. Here we provide evidence that the bias to attend to discrete quantity versus other dimensions may be mediated by set size and culture. Preschool-age English-speaking children in the United States and Japanese-speaking children in Japan were tested in a match-to-sample task where number was pitted against cumulative surface area in both large and small numerical set comparisons. Results showed that children from both cultures were biased to attend to the number of items for small sets. Large set responses also showed a general attention to number when ratio difficulty was easy. However, relative to the responses for small sets, attention to number decreased for both groups; moreover, both U.S. and Japanese children showed a significant bias to attend to total amount for difficult numerical ratio distances, although Japanese children shifted attention to total area at relatively smaller set sizes than U.S. children. These results add to our growing understanding of how quantity is represented and how such representation is influenced by context, both cultural and perceptual. Copyright © 2014 Elsevier Inc. All rights reserved.
Fryer-Edwards, Kelly; Arnold, Robert M; Baile, Walter; Tulsky, James A; Petracca, Frances; Back, Anthony
2006-07-01
Small-group teaching is particularly suited for complex skills such as communication. Existing work has identified the basic elements of small-group teaching, but few descriptions of higher-order teaching practices exist in the medical literature. Thus the authors developed an empirically driven and theoretically grounded model for small-group communication-skills teaching. Between 2002 and 2005, teaching observations were collected over 100 hours of direct contact time between four expert facilitators and 120 medical oncology fellows participating in Oncotalk, a semiannual, four-day retreat focused on end-of-life communication skills. The authors conducted small-group teaching observations, semistructured interviews with faculty participants, video or audio recording with transcript review, and evaluation of results by faculty participants. Teaching skills observed during the retreats included a linked set of reflective, process-oriented teaching practices: identifying a learning edge, proposing and testing hypotheses, and calibrating learner self-assessments. Based on observations and debriefings with facilitators, the authors developed a conceptual model of teaching that illustrates an iterative loop of teaching practices aimed at enhancing learners' engagement and self-efficacy. Through longitudinal, empirical observations, this project identified a set of specific teaching skills for small-group settings with applicability to other clinical teaching settings. This study extends current theory and teaching practice prescriptions by describing specific teaching practices required for effective teaching. These reflective teaching practices, while developed for communication skills training, may be useful for teaching other challenging topics such as ethics and professionalism.
NASA Technical Reports Server (NTRS)
Miller, D. P.; Prahst, P. S.
1995-01-01
An axial compressor test rig has been designed for the operation of small turbomachines. A flow test was run to calibrate and determine the source and magnitudes of the loss mechanisms in the compressor inlet for a highly loaded two-stage axial compressor test. Several flow conditions and inlet guide vane (IGV) angle settings were established, for which detailed surveys were completed. Boundary layer bleed was also provided along the casing of the inlet behind the support struts and ahead of the IGV. Several computational fluid dynamics (CFD) calculations were made for selected flow conditions established during the test. Good agreement between the CFD and test data was obtained for these test conditions.
Aggregation Bias and the Analysis of Necessary and Sufficient Conditions in fsQCA
ERIC Educational Resources Information Center
Braumoeller, Bear F.
2017-01-01
Fuzzy-set qualitative comparative analysis (fsQCA) has become one of the most prominent methods in the social sciences for capturing causal complexity, especially for scholars with small- and medium-"N" data sets. This research note explores two key assumptions in fsQCA's methodology for testing for necessary and sufficient…
Advantages of Synthetic Noise and Machine Learning for Analyzing Radioecological Data Sets.
Shuryak, Igor
2017-01-01
The ecological effects of accidental or malicious radioactive contamination are insufficiently understood because of the hazards and difficulties associated with conducting studies in radioactively-polluted areas. Data sets from severely contaminated locations can therefore be small. Moreover, many potentially important factors, such as soil concentrations of toxic chemicals, pH, and temperature, can be correlated with radiation levels and with each other. In such situations, commonly-used statistical techniques like generalized linear models (GLMs) may not be able to provide useful information about how radiation and/or these other variables affect the outcome (e.g. abundance of the studied organisms). Ensemble machine learning methods such as random forests offer powerful alternatives. We propose that analysis of small radioecological data sets by GLMs and/or machine learning can be made more informative by using the following techniques: (1) adding synthetic noise variables to provide benchmarks for distinguishing the performances of valuable predictors from irrelevant ones; (2) adding noise directly to the predictors and/or to the outcome to test the robustness of analysis results against random data fluctuations; (3) adding artificial effects to selected predictors to test the sensitivity of the analysis methods in detecting predictor effects; (4) running a selected machine learning method multiple times (with different random-number seeds) to test the robustness of the detected "signal"; (5) using several machine learning methods to test the "signal's" sensitivity to differences in analysis techniques. Here, we applied these approaches to simulated data, and to two published examples of small radioecological data sets: (I) counts of fungal taxa in samples of soil contaminated by the Chernobyl nuclear power plan accident (Ukraine), and (II) bacterial abundance in soil samples under a ruptured nuclear waste storage tank (USA). We show that the proposed techniques were advantageous compared with the methodology used in the original publications where the data sets were presented. Specifically, our approach identified a negative effect of radioactive contamination in data set I, and suggested that in data set II stable chromium could have been a stronger limiting factor for bacterial abundance than the radionuclides 137Cs and 99Tc. This new information, which was extracted from these data sets using the proposed techniques, can potentially enhance the design of radioactive waste bioremediation.
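Technique (1) above can be illustrated with a short Python sketch on synthetic data; the predictor names and effect sizes are invented, not the Chernobyl or storage-tank measurements. Pure-noise columns are added as benchmarks, and only real predictors whose random-forest importance exceeds the best noise column are treated as informative.

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(42)
n = 80                                                  # deliberately small data set
radiation = rng.lognormal(0.0, 1.0, n)
soil_ph = rng.normal(6.5, 0.5, n)
chromium = 0.4 * radiation + rng.normal(0.0, 0.5, n)    # correlated with radiation
abundance = np.exp(-0.8 * np.log1p(radiation) + 0.2 * soil_ph
                   + rng.normal(0.0, 0.3, n))           # outcome driven by radiation and pH

X = pd.DataFrame({"radiation": radiation, "soil_ph": soil_ph, "chromium": chromium})
for k in range(3):
    X[f"noise_{k}"] = rng.normal(size=n)                # synthetic noise benchmarks

rf = RandomForestRegressor(n_estimators=500, random_state=0).fit(X, abundance)
importances = pd.Series(rf.feature_importances_, index=X.columns).sort_values(ascending=False)
noise_ceiling = importances[[c for c in X.columns if c.startswith("noise_")]].max()

print(importances.round(3))
print("predictors beating the noise benchmark:",
      [c for c in importances.index
       if not c.startswith("noise_") and importances[c] > noise_ceiling])
```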
Explaining the Gap in Black-White Scores on IQ and College Admission Tests.
ERIC Educational Resources Information Center
Cross, Theodore, Ed.
1998-01-01
Argues that differences in black performance and white performance on standardized tests likely comes from deeply rooted environmental forces such as expectations of one's life being restricted to a small and poorly rewarded set of social roles. Issues of test bias, the influence of caste-like minorities, the conflict between African American…
Soh, BaoLin Pauline; Lee, Warwick Bruce; Mello-Thoms, Claudia; Tapia, Kriscia; Ryan, John; Hung, Wai Tak; Thompson, Graham; Heard, Rob; Brennan, Patrick
2015-08-01
Test sets have been increasingly utilised to augment clinical audit in breast screening programmes; however, their relationship has never been satisfactorily understood. This study examined the relationship between mammographic test set performance and clinical audit data. Clinical audit data over a 2-year period was generated for each of 20 radiologists. Sixty mammographic examinations, consisting of 40 normal and 20 cancer cases, formed the test set. Readers located any identifiable cancer, and levels of confidence were scored from 2 to 5, where a score of 3 and above is considered a recall rating. Jackknifing free response operating characteristic (JAFROC) figure-of-merit (FOM), location sensitivity and specificity were calculated for individual readers and then compared with clinical audit values using Spearman's rho. JAFROC FOM showed significant correlations to: recall rate at a first round of screening (r = 0.51; P = 0.02); rate of small invasive cancers per 10 000 reads (r = 0.5; P = 0.02); percentage of all cancers read that were not recalled (r = -0.51; P = 0.02); and sensitivity (r = 0.51; P = 0.02). Location sensitivity demonstrated significant correlations with: rate of small invasive cancers per 10 000 reads (r = 0.46; P = 0.04); rate of DCIS (ductal carcinoma in situ) per 10 000 reads (r = 0.44; P = 0.05); detection rate of all invasive cancers and DCIS per 10 000 reads (r = 0.54; P = 0.01); percentage of all cancers read that were not recalled (r = -0.57; P = 0.009); and sensitivity (r = 0.57; P = 0.009). No other significant relationships were noted. Performance indicators from test set demonstrate significant correlations with specific aspects of clinical performance, although caution needs to be exercised when generalising test set specificity to the clinical situation. © 2015 The Royal Australian and New Zealand College of Radiologists.
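As a rough illustration of the reported analysis, the sketch below computes Spearman's rho between a test-set figure-of-merit and one clinical audit indicator for 20 readers; the arrays are hypothetical stand-ins, not the study's data.

```python
import numpy as np
from scipy.stats import spearmanr

# Hypothetical per-reader values for the 20 radiologists in the study design.
rng = np.random.default_rng(1)
jafroc_fom = rng.uniform(0.6, 0.9, size=20)                        # test-set figure of merit
recall_rate = 0.10 * jafroc_fom + rng.normal(scale=0.01, size=20)  # clinical audit indicator

rho, p_value = spearmanr(jafroc_fom, recall_rate)
print(f"Spearman's rho = {rho:.2f}, P = {p_value:.3f}")
```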
Performance analysis of SA-3 missile second stage
NASA Technical Reports Server (NTRS)
Helmy, A. M.
1981-01-01
One SA-3 missile was disassembled. The constituents of the second stage were thoroughly investigated for geometrical details. The second stage slotted composite propellant grain was subjected to mechanical properties testing, physiochemical analyses, and burning rate measurements at different conditions. To determine the propellant performance parameters, the slotted composite propellant grain was machined into a set of small-size tubular grains. These grains were fired in a small size rocket motor with a set of interchangeable nozzles with different throat diameters. The firings were carried out at three different conditions. The data from test motor firings, physiochemical properties of the propellant, burning rate measurement results and geometrical details of the second stage motor, were used as input data in a computer program to compute the internal ballistic characteristics of the second stage.
Initial Investigation into the Psychoacoustic Properties of Small Unmanned Aerial System Noise
NASA Technical Reports Server (NTRS)
Christian, Andrew; Cabell, Randolph
2017-01-01
For the past several years, researchers at NASA Langley have been engaged in a series of projects to study the degree to which existing facilities and capabilities, originally created for work on full-scale aircraft, are extensible to smaller scales, those of the small unmanned aerial systems (sUAS, also UAVs and, colloquially, 'drones') that have been showing up in the nation's airspace of late. This paper follows an effort that has led to an initial human-subject psychoacoustic test regarding the annoyance generated by sUAS noise. This effort spans three phases: 1. The collection of the sounds through field recordings. 2. The formulation and execution of a psychoacoustic test using those recordings. 3. The initial analysis of the data from that test. The data suggest a lack of parity between the noise of the recorded sUAS and that of a set of road vehicles that were also recorded and included in the test, as measured by a set of contemporary noise metrics. Future work, including the possibility of further human-subject testing, is discussed in light of this suggestion.
Using Small-Scale Randomized Controlled Trials to Evaluate the Efficacy of New Curricular Materials
ERIC Educational Resources Information Center
Drits-Esser, Dina; Bass, Kristin M.; Stark, Louisa A.
2014-01-01
How can researchers in K-12 contexts stay true to the principles of rigorous evaluation designs within the constraints of classroom settings and limited funding? This paper explores this question by presenting a small-scale randomized controlled trial (RCT) designed to test the efficacy of curricular supplemental materials on epigenetics. The…
Xenon monitoring and the Comprehensive Nuclear-Test-Ban Treaty
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bowyer, Theodore W.
How do you monitor (verify) a CTBT? It is a difficult challenge to monitor the entire world for nuclear tests, regardless of size. Nuclear tests 'normally' occur underground, above ground or underwater. Setting aside very small tests (let's limit our thinking to 1 kiloton or more), nuclear tests shake the ground, emit large amounts of radioactivity, and make loud noises if in the atmosphere (or hydroacoustic waves if underwater)
ERIC Educational Resources Information Center
Papenberg, Martin; Musch, Jochen
2017-01-01
In multiple-choice tests, the quality of distractors may be more important than their number. We therefore examined the joint influence of distractor quality and quantity on test functioning by providing a sample of 5,793 participants with five parallel test sets consisting of items that differed in the number and quality of distractors.…
Internal Temperature Control For Vibration Testers
NASA Technical Reports Server (NTRS)
Dean, Richard J.
1996-01-01
Vibration test fixtures with internal thermal-transfer capabilities developed. Made of aluminum for rapid thermal transfer. Small size gives rapid response to changing temperatures, with better thermal control. Setup quicker and internal ducting facilitates access to parts being tested. In addition, internal flows smaller, so less energy consumed in maintaining desired temperature settings.
Stress assessment in small ruminants kept on city farms in southern Germany.
Schilling, Anna-Katarina; Reese, Sven; Palme, Rupert; Erhard, Michael; Wöhr, Anna-Caroline
2015-01-01
Sheep and goats are frequently used in nonhuman animal-assisted activities on city farms. There are few data available on this type of usage of small ruminants. Health evaluations, behavioral observations (feeding, resting, comfort, explorative and social behaviors), behavioral tests (human approach tests and touch test), and measurements of fecal cortisol metabolites and heart rate were performed to assess stress levels in 25 sheep and 32 goats on 7 city farms and 2 activity playgrounds in Germany. No evidence was found that the animals suffered from major distress. Health evaluations, behavioral observations, and behavioral tests proved to be the methods of stress assessment most suitable for routine on-farm checks in these settings.
Predicting Fatigue Lives Of Metal-Matrix/Fiber Composites
NASA Technical Reports Server (NTRS)
Bartolotta, Paul A.
1994-01-01
Method of prediction of fatigue lives of intermetallic-matrix/fiber composite parts at high temperatures styled after method of universal slopes. It suffices to perform relatively small numbers of fatigue tests. Data from fatigue tests correlated with tensile-test data by fitting universal-slopes equation to both sets of data. Thereafter, universal-slopes equation used to predict fatigue lives from tensile properties.
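The "method of universal slopes" referenced here is usually quoted in Manson's form shown below; the abstract does not give the constants actually fitted to the composite data, so this is only the standard textbook relation.

```latex
% Manson's universal slopes relation (standard form; constants as usually quoted)
\Delta\varepsilon_t = 3.5\,\frac{\sigma_u}{E}\,N_f^{-0.12} + \varepsilon_f^{0.6}\,N_f^{-0.6}
```

Here Δε_t is the total strain range, σ_u the ultimate tensile strength from the tensile test, E the elastic modulus, ε_f the true fracture ductility, and N_f the number of cycles to failure; fitting both the fatigue and tensile data to this form is what allows fatigue lives to be predicted from tensile properties.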
A primer set to determine sex in the small Indian mongoose, Herpestes auropunctatus.
Murata, C; Ogura, G; Kuroiwa, A
2011-03-01
To enable the accurate sexing of individuals of introduced populations of the small Indian mongoose, Herpestes auropunctatus, we designed a primer set for the amplification of the sex-specific fragments EIF2S3Y and EIF2S3X. Using this primer set, the expected amplification products were obtained for all samples of genomic DNA tested: males yielded two bands and females a single band. Sequencing of each PCR product confirmed that the 769-bp fragment amplified from DNA samples of both sexes was derived from EIF2S3X, whereas the 546-bp fragment amplified only from male DNA samples was derived from EIF2S3Y. The results indicated that this primer set is useful for sex identification in this species. © 2010 Blackwell Publishing Ltd.
Rohde, Palle Duun; Demontis, Ditte; Cuyabano, Beatriz Castro Dias; Børglum, Anders D; Sørensen, Peter
2016-08-01
Schizophrenia is a psychiatric disorder with large personal and social costs, and understanding the genetic etiology is important. Such knowledge can be obtained by testing the association between a disease phenotype and individual genetic markers; however, such single-marker methods have limited power to detect genetic markers with small effects. Instead, aggregating genetic markers based on biological information might increase the power to identify sets of genetic markers of etiological significance. Several set test methods have been proposed: Here we propose a new set test derived from genomic best linear unbiased prediction (GBLUP), the covariance association test (CVAT). We compared the performance of CVAT to other commonly used set tests. The comparison was conducted using a simulated study population having the same genetic parameters as for schizophrenia. We found that CVAT was among the top performers. When extending CVAT to utilize a mixture of SNP effects, we found an increase in power to detect the causal sets. Applying the methods to a Danish schizophrenia case-control data set, we found genomic evidence for association of schizophrenia with vitamin A metabolism and immunological responses, which previously have been implicated with schizophrenia based on experimental and observational studies. Copyright © 2016 by the Genetics Society of America.
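The sketch below is not the authors' CVAT derivation from GBLUP; it is a simplified illustration of the general marker-set testing idea, using ridge-regression effect estimates and a permutation null over random sets of the same size. All data and parameter values are simulated.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy data: 500 individuals, 200 SNPs; the first 20 SNPs form the candidate marker set.
n, m = 500, 200
X = rng.binomial(2, 0.3, size=(n, m)).astype(float)
X -= X.mean(axis=0)                              # centre genotype codes
beta_true = np.zeros(m)
beta_true[:20] = rng.normal(scale=0.3, size=20)  # causal effects concentrated in the set
y = X @ beta_true + rng.normal(size=n)

# Ridge (GBLUP-like) estimates of all marker effects fitted jointly.
lam = 50.0
beta_hat = np.linalg.solve(X.T @ X + lam * np.eye(m), X.T @ y)

def set_statistic(idx):
    # Covariance between the set-specific genomic value and the phenotype:
    # a simplified stand-in for the covariance statistic described in the abstract.
    g_set = X[:, idx] @ beta_hat[idx]
    return np.cov(g_set, y)[0, 1]

observed = set_statistic(np.arange(20))

# Empirical null distribution from random marker sets of the same size.
null = np.array([set_statistic(rng.choice(m, size=20, replace=False)) for _ in range(1000)])
p_value = (np.sum(null >= observed) + 1) / (null.size + 1)
print(f"set statistic = {observed:.2f}, permutation P = {p_value:.3f}")
```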
Geist, Kamile; Hitchcock, John H
2014-01-01
The profession would benefit from greater and routine generation of causal evidence pertaining to the impact of music therapy interventions on client outcomes. One way to meet this goal is to revisit the use of Single Case Designs (SCDs) in clinical practice and research endeavors in music therapy. Given the appropriate setting and goals, this design can be accomplished with small sample sizes and it is often appropriate for studying music therapy interventions. In this article, we promote and discuss implementation of SCD studies in music therapy settings, review the meaning of internal study validity and by extension the notion of causality, and describe two of the most commonly used SCDs to demonstrate how they can help generate causal evidence to inform the field. In closing, we describe the need for replication and future meta-analysis of SCD studies completed in music therapy settings. SCD studies are both feasible and appropriate for use in music therapy clinical practice settings, particularly for testing effectiveness of interventions for individuals or small groups. © the American Music Therapy Association 2014. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
Xu, Fang; Wallace, Robyn C.; Garvin, William; Greenlund, Kurt J.; Bartoli, William; Ford, Derek; Eke, Paul; Town, G. Machell
2016-01-01
Public health researchers have used a class of statistical methods to calculate prevalence estimates for small geographic areas with few direct observations. Many researchers have used Behavioral Risk Factor Surveillance System (BRFSS) data as a basis for their models. The aims of this study were to 1) describe a new BRFSS small area estimation (SAE) method and 2) investigate the internal and external validity of the BRFSS SAEs it produced. The BRFSS SAE method uses 4 data sets (the BRFSS, the American Community Survey Public Use Microdata Sample, Nielsen Claritas population totals, and the Missouri Census Geographic Equivalency File) to build a single weighted data set. Our findings indicate that internal and external validity tests were successful across many estimates. The BRFSS SAE method is one of several methods that can be used to produce reliable prevalence estimates in small geographic areas. PMID:27418213
Developing a Suitable Model for Water Uptake for Biodegradable Polymers Using Small Training Sets.
Valenzuela, Loreto M; Knight, Doyle D; Kohn, Joachim
2016-01-01
Prediction of the dynamic properties of water uptake across polymer libraries can accelerate polymer selection for a specific application. We first built semiempirical models using Artificial Neural Networks and all water uptake data, as individual input. These models give very good correlations (R² > 0.78 for test set) but very low accuracy on cross-validation sets (less than 19% of experimental points within experimental error). Instead, using consolidated parameters like equilibrium water uptake, a good model is obtained (R² = 0.78 for test set), with accurate predictions for 50% of tested polymers. The semiempirical model was applied to the 56-polymer library of L-tyrosine-derived polyarylates, identifying groups of polymers that are likely to satisfy design criteria for water uptake. This research demonstrates that a surrogate modeling effort can reduce the number of polymers that must be synthesized and characterized to identify an appropriate polymer that meets certain performance criteria.
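As a rough illustration of the modeling workflow described (train a small neural network, judge it by test-set R²), here is a sketch with made-up descriptors and water-uptake values; it is not the authors' model or data.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import r2_score

rng = np.random.default_rng(3)

# Made-up polymer descriptors predicting a consolidated parameter
# (equilibrium water uptake) for a 56-polymer library.
n = 56
descriptors = rng.normal(size=(n, 4))
uptake = 10 + descriptors @ np.array([3.0, -1.5, 0.5, 0.0]) + rng.normal(scale=1.0, size=n)

X_train, X_test, y_train, y_test = train_test_split(
    descriptors, uptake, test_size=0.25, random_state=0)

model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
model.fit(X_train, y_train)
print(f"test-set R^2 = {r2_score(y_test, model.predict(X_test)):.2f}")
```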
A Summary Catalogue of Microbial Drinking Water Tests for Low and Medium Resource Settings
Bain, Robert; Bartram, Jamie; Elliott, Mark; Matthews, Robert; McMahan, Lanakila; Tung, Rosalind; Chuang, Patty; Gundry, Stephen
2012-01-01
Microbial drinking-water quality testing plays an essential role in measures to protect public health. However, such testing remains a significant challenge where resources are limited. With a wide variety of tests available, researchers and practitioners have expressed difficulties in selecting the most appropriate test(s) for a particular budget, application and setting. To assist the selection process we identified the characteristics associated with low and medium resource settings and we specified the basic information that is needed for different forms of water quality monitoring. We then searched for available faecal indicator bacteria tests and collated this information. In total 44 tests have been identified, 18 of which yield a presence/absence result and 26 of which provide enumeration of bacterial concentration. The suitability of each test is assessed for use in the three settings. The cost per test was found to vary from $0.60 to $5.00 for a presence/absence test and from $0.50 to $7.50 for a quantitative format, though it is likely to be only a small component of the overall costs of testing. This article presents the first comprehensive catalogue of the characteristics of available and emerging low-cost tests for faecal indicator bacteria. It will be of value to organizations responsible for monitoring national water quality, water service providers, researchers and policy makers in selecting water quality tests appropriate for a given setting and application. PMID:22754460
A summary catalogue of microbial drinking water tests for low and medium resource settings.
Bain, Robert; Bartram, Jamie; Elliott, Mark; Matthews, Robert; McMahan, Lanakila; Tung, Rosalind; Chuang, Patty; Gundry, Stephen
2012-05-01
Microbial drinking-water quality testing plays an essential role in measures to protect public health. However, such testing remains a significant challenge where resources are limited. With a wide variety of tests available, researchers and practitioners have expressed difficulties in selecting the most appropriate test(s) for a particular budget, application and setting. To assist the selection process we identified the characteristics associated with low and medium resource settings and we specified the basic information that is needed for different forms of water quality monitoring. We then searched for available faecal indicator bacteria tests and collated this information. In total 44 tests have been identified, 18 of which yield a presence/absence result and 26 of which provide enumeration of bacterial concentration. The suitability of each test is assessed for use in the three settings. The cost per test was found to vary from $0.60 to $5.00 for a presence/absence test and from $0.50 to $7.50 for a quantitative format, though it is likely to be only a small component of the overall costs of testing. This article presents the first comprehensive catalogue of the characteristics of available and emerging low-cost tests for faecal indicator bacteria. It will be of value to organizations responsible for monitoring national water quality, water service providers, researchers and policy makers in selecting water quality tests appropriate for a given setting and application.
NASA Technical Reports Server (NTRS)
Misoda, J.; Magliozzi, B.
1973-01-01
The development is described of improved, low noise level fan and pump concepts for the space shuttle. In addition, a set of noise design criteria for small fans and pumps was derived. The concepts and criteria were created by obtaining Apollo hardware test data to correlate and modify existing noise estimating procedures. A set of space shuttle selection criteria was used to determine preliminary fan and pump concepts. These concepts were tested and modified to obtain noise sources and characteristics which yield the design criteria and quiet, efficient space shuttle fan and pump concepts.
Kerschbamer, Rudolf
2015-05-01
This paper proposes a geometric delineation of distributional preference types and a non-parametric approach for their identification in a two-person context. It starts with a small set of assumptions on preferences and shows that this set (i) naturally results in a taxonomy of distributional archetypes that nests all empirically relevant types considered in previous work; and (ii) gives rise to a clean experimental identification procedure - the Equality Equivalence Test - that discriminates between archetypes according to core features of preferences rather than properties of specific modeling variants. As a by-product the test yields a two-dimensional index of preference intensity.
1959-11-01
Multi-Axis Test Facility, Space Progress Report, November 1, 1959: The Multi Axis Space Test Inertia Facility [MASTIF], informally referred to as the Gimbal Rig, was installed inside the Altitude Wind Tunnel. The rig, which spun on three axes simultaneously, was used to train the Mercury astronauts on how to bring a spinning spacecraft under control and to determine the effects of rapid spinning on the astronaut's eyesight and psyche. Small gaseous nitrogen jets were operated by the pilot to gain control of the rig after it had been set in motion. Part 1 shows pilot Joe Algranti in the rig as it rotates over one, two, and three axes. It also has overall views of the test set-up with researchers and technicians on the test platform. Part 2 shows Algranti being secured in the rig prior to the test. The rig is set in motion and the pilot slowly brings it under control. The Mercury astronauts trained on the MASTIF in early spring of 1960.
Questionnaire-based assessment of executive functioning: Psychometrics.
Castellanos, Irina; Kronenberger, William G; Pisoni, David B
2018-01-01
The psychometric properties of the Learning, Executive, and Attention Functioning (LEAF) scale were investigated in an outpatient clinical pediatric sample. As a part of clinical testing, the LEAF scale, which broadly measures neuropsychological abilities related to executive functioning and learning, was administered to parents of 118 children and adolescents referred for psychological testing at a pediatric psychology clinic; 85 teachers also completed LEAF scales to assess reliability across different raters and settings. Scores on neuropsychological tests of executive functioning and academic achievement were abstracted from charts. Psychometric analyses of the LEAF scale demonstrated satisfactory internal consistency, parent-teacher inter-rater reliability in the small to large effect size range, and test-retest reliability in the large effect size range, similar to values for other executive functioning checklists. Correlations between corresponding subscales on the LEAF and other behavior checklists were large, while most correlations with neuropsychological tests of executive functioning and achievement were significant but in the small to medium range. Results support the utility of the LEAF as a reliable and valid questionnaire-based assessment of delays and disturbances in executive functioning and learning. Applications and advantages of the LEAF and other questionnaire measures of executive functioning in clinical neuropsychology settings are discussed.
Code of Federal Regulations, 2013 CFR
2013-01-01
...-disabled veteran-owned small business set-aside, WOSB or EDWOSB set-aside, or 8(a) contract? 121.406... items under a small business set-aside, service-disabled veteran-owned small business set-aside, WOSB or... small business set-aside, service-disabled veteran-owned small business set-aside, WOSB or EDWOSB set...
Code of Federal Regulations, 2014 CFR
2014-01-01
...-disabled veteran-owned small business set-aside, WOSB or EDWOSB set-aside, or 8(a) contract? 121.406... items under a small business set-aside, service-disabled veteran-owned small business set-aside, WOSB or... small business set-aside, service-disabled veteran-owned small business set-aside, WOSB or EDWOSB set...
Code of Federal Regulations, 2012 CFR
2012-01-01
...-disabled veteran-owned small business set-aside, WOSB or EDWOSB set-aside, or 8(a) contract? 121.406... items under a small business set-aside, service-disabled veteran-owned small business set-aside, WOSB or... small business set-aside, service-disabled veteran-owned small business set-aside, WOSB or EDWOSB set...
NASA Astrophysics Data System (ADS)
Sangsawang, T.
2018-02-01
This research had the following purposes: 1) to determine the efficiency of a self-learning activity set for developing fine motor skills in children with intellectual disabilities; 2) to compare the children's ability to use their small muscles before and after studying with the self-learning activity set; and 3) to assess the children's satisfaction with the self-learning activity set on small-muscle use. The sample consisted of 7 children with intellectual disabilities at the Maha Chakri Sirindhorn special education center, Nakhon Nayok Province, in the 2016 school year. The research instruments were the self-learning activity set on small-muscle use for children with intellectual disabilities, an observation form for small-muscle ability administered before and after using the activity set, and an observation form for the children's satisfaction with the self-learning activity set. The statistics used were percentage, mean, standard deviation, and the t-test for dependent samples. The research found that the self-learning activity set met the efficiency criterion, with an average efficiency of 77.78/76.51, and that the children's scores were higher after the study than before: the mean score before the study was 55.14 (S.D. = 3.72) and after the study 68.86 (S.D. = 2.73), giving a dependent-samples t value of 7.94, a statistically significant difference at the 0.05 level. The satisfaction of the Down syndrome children with the self-learning activity set on small-muscle use averaged 4.58, a high level.
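The comparison reported above rests on a dependent-samples (paired) t-test; a minimal sketch follows, with invented pre/post scores for seven children rather than the study's actual measurements.

```python
import numpy as np
from scipy.stats import ttest_rel

# Invented pre/post fine-motor scores for the 7 children (the abstract reports
# group means of 55.14 before and 68.86 after; these individual values are made up).
before = np.array([52, 54, 58, 55, 60, 51, 56], dtype=float)
after = np.array([66, 68, 72, 67, 73, 65, 71], dtype=float)

t_stat, p_value = ttest_rel(after, before)
print(f"dependent-samples t = {t_stat:.2f}, P = {p_value:.4f}")
```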
Close-range sensors for small unmanned bottom vehicles: update
NASA Astrophysics Data System (ADS)
Bernstein, Charles L.
2000-07-01
The Surf Zone Reconnaissance Project is developing sensors for small, autonomous, Underwater Bottom-crawling Vehicles. The objective is to enable small, crawling robots to autonomously detect and classify mines and obstacles on the ocean bottom in depths between 0 and 10 feet. We have identified a promising set of techniques that will exploit the electromagnetic, shape, texture, image, and vibratory-modal features of these targets. During FY99 and FY00 we have worked toward refining these techniques. Signature data sets have been collected for a standard target set to facilitate the development of sensor fusion and target detection and classification algorithms. Specific behaviors, termed microbehaviors, are developed to utilize the robot's mobility to position and operate the sensors. A first-generation, close-range sensor suite, composed of 5 sensors, will be completed and tested on a crawling platform in FY00, and will be further refined and demonstrated in FY01 as part of the Mine Countermeasures 6.3 core program sponsored by the Office of Naval Research.
48 CFR 19.502-2 - Total small business set-asides.
Code of Federal Regulations, 2014 CFR
2014-10-01
... 48 Federal Acquisition Regulations System 1 2014-10-01 2014-10-01 false Total small business set... SOCIOECONOMIC PROGRAMS SMALL BUSINESS PROGRAMS Set-Asides for Small Business 19.502-2 Total small business set... exclusively for small business concerns and shall be set aside for small business unless the contracting...
A small-diameter NMR logging tool for groundwater investigations
Walsh, David; Turner, Peter; Grunewald, Elliot; Zhang, Hong; Butler, James J.; Reboulet, Ed; Knobbe, Steve; Christy, Tom; Lane, John W.; Johnson, Carole D.; Munday, Tim; Fitzpatrick, Andrew
2013-01-01
A small-diameter nuclear magnetic resonance (NMR) logging tool has been developed and field tested at various sites in the United States and Australia. A novel design approach has produced relatively inexpensive, small-diameter probes that can be run in open or PVC-cased boreholes as small as 2 inches in diameter. The complete system, including surface electronics and various downhole probes, has been successfully tested in small-diameter monitoring wells in a range of hydrogeological settings. A variant of the probe that can be deployed by a direct-push machine has also been developed and tested in the field. The new NMR logging tool provides reliable, direct, and high-resolution information that is of importance for groundwater studies. Specifically, the technology provides direct measurement of total water content (total porosity in the saturated zone or moisture content in the unsaturated zone), and estimates of relative pore-size distribution (bound vs. mobile water content) and hydraulic conductivity. The NMR measurements show good agreement with ancillary data from lithologic logs, geophysical logs, and hydrogeologic measurements, and provide valuable information for groundwater investigations.
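NMR logging tools commonly turn porosity and the logarithmic-mean T2 relaxation time into a hydraulic-conductivity (permeability) estimate through relations such as the SDR equation below; the abstract does not state which relation this tool uses, so the formula is only the widely used generic form.

```latex
% SDR (Schlumberger-Doll Research) permeability estimator -- a common, generic NMR relation
k_{\mathrm{SDR}} = C\,\phi^{4}\,T_{2\mathrm{ML}}^{2}
```

Here φ is the NMR porosity (total water content), T_{2ML} the logarithmic-mean T2 relaxation time (which tracks pore size and the bound/mobile water split), and C an empirically calibrated constant.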
A Generally Robust Approach for Testing Hypotheses and Setting Confidence Intervals for Effect Sizes
ERIC Educational Resources Information Center
Keselman, H. J.; Algina, James; Lix, Lisa M.; Wilcox, Rand R.; Deering, Kathleen N.
2008-01-01
Standard least squares analysis of variance methods suffer from poor power under arbitrarily small departures from normality and fail to control the probability of a Type I error when standard assumptions are violated. This article describes a framework for robust estimation and testing that uses trimmed means with an approximate degrees of…
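A minimal sketch of the trimmed-means approach the abstract refers to, assuming SciPy 1.7 or later for the `trim` argument of `ttest_ind` (which performs a Yuen-type trimmed test); the data are simulated heavy-tailed samples, not from the article.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)

# Simulated heavy-tailed samples, the kind of small departure from normality
# under which ordinary least-squares ANOVA methods lose power.
group_a = rng.standard_t(df=3, size=30)
group_b = rng.standard_t(df=3, size=30) + 0.8

# 20% trimmed means as robust location estimates.
print("trimmed means:", stats.trim_mean(group_a, 0.2), stats.trim_mean(group_b, 0.2))

# Yuen-type test on trimmed means (the `trim` argument requires SciPy >= 1.7).
t_stat, p_value = stats.ttest_ind(group_a, group_b, trim=0.2)
print(f"robust t = {t_stat:.2f}, P = {p_value:.4f}")
```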
Economic assessments of small-scale drinking-water interventions in pursuit of MDG target 7C.
Cameron, John; Jagals, Paul; Hunter, Paul R; Pedley, Steve; Pond, Katherine
2011-12-01
This paper uses an applied rural case study of a safer water intervention in South Africa to illustrate how three levels of economic assessment can be used to understand the impact of the intervention on people's well-being. It is set in the context of Millennium Development Goal 7 which sets a target (7C) for safe drinking-water provision and the challenges of reaching people in remote rural areas with relatively small-scale schemes. The assessment moves from cost efficiency to cost effectiveness to a full social cost-benefit analysis (SCBA) with an associated sensitivity test. In addition to demonstrating techniques of analysis, the paper brings out many of the challenges in understanding how safer drinking-water impacts on people's livelihoods. The SCBA shows the case study intervention is justified economically, though the sensitivity test suggests 'downside' vulnerability. Copyright © 2011 Elsevier B.V. All rights reserved.
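As an illustration of moving from cost figures to a social cost-benefit verdict with a 'downside' sensitivity test, here is a minimal sketch; all monetary values, discount rates, and the time horizon are invented, not taken from the South African case study.

```python
# All monetary figures, rates, and the horizon below are invented for illustration.
def npv(benefits, costs, rate):
    """Net present value of annual benefit and cost streams at a given discount rate."""
    return sum((b - c) / (1 + rate) ** t for t, (b, c) in enumerate(zip(benefits, costs)))

annual_benefits = [0.0] + [25_000.0] * 10        # health and time savings, years 1-10
annual_costs = [120_000.0] + [4_000.0] * 10      # capital outlay in year 0, then O&M

base_case = npv(annual_benefits, annual_costs, rate=0.08)
print(f"base-case NPV: {base_case:,.0f}")

# 'Downside' sensitivity test: benefits 30% lower and a higher discount rate.
downside = npv([0.7 * b for b in annual_benefits], annual_costs, rate=0.12)
print(f"downside NPV:  {downside:,.0f}")
```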
A SMALL-ANGLE DRILL-HOLE WHIPSTOCK
Nielsen, D.E.; Olsen, J.L.; Bennett, W.P.
1963-01-29
A small-angle whipstock is described for accurately correcting or deviating a drill hole by a very small angle. The whipstock is primarily utilized when drilling extremely accurate, line-of-sight test holes as required for diagnostic studies related to underground nuclear test shots. The invention is constructed of a length of cylindrical pipe or casing, with a whipstock seating spike extending from the lower end. A wedge-shaped segment is secured to the outer circumference of the upper end of the cylinder at a position diametrically opposite the circumferential position of the spike. Pin means are provided for affixing the whipstock to a directional drill bit and stem to allow orienting and setting the whipstock properly in the drill hole. (AEC)
Messiah College Biodiesel Fuel Generation Project Final Technical Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zummo, Michael M; Munson, J; Derr, A
Many obvious and significant concerns arise when considering the concept of small-scale biodiesel production. Does the fuel produced meet the stringent requirements set by the commercial biodiesel industry? Is the process safe? How are small-scale producers collecting and transporting waste vegetable oil? How is waste from the biodiesel production process handled by small-scale producers? These concerns and many others were the focus of the research performed in the Messiah College Biodiesel Fuel Generation project over the last three years. This project was a unique research program in which undergraduate engineering students at Messiah College set out to research the feasibility of small-scale biodiesel production for application on a campus of approximately 3000 students. This Department of Energy (DOE) funded research program developed out of almost a decade of small-scale biodiesel research and development work performed by students at Messiah College. Over the course of the last three years the research team focused on four key areas related to small-scale biodiesel production: Quality Testing and Assurance, Process and Processor Research, Process and Processor Development, and Community Education. The objectives for the Messiah College Biodiesel Fuel Generation Project included the following: 1. Preparing a laboratory facility for the development and optimization of processors and processes, ASTM quality assurance, and performance testing of biodiesel fuels. 2. Developing scalable processor and process designs suitable for ASTM certifiable small-scale biodiesel production, with the goals of cost reduction and increased quality. 3. Conducting research into biodiesel process improvement and cost optimization using various biodiesel feedstocks and production ingredients.
Clinical relevance of small copy-number variants in chromosomal microarray clinical testing.
Hollenbeck, Dana; Williams, Crescenda L; Drazba, Kathryn; Descartes, Maria; Korf, Bruce R; Rutledge, S Lane; Lose, Edward J; Robin, Nathaniel H; Carroll, Andrew J; Mikhail, Fady M
2017-04-01
The 2010 consensus statement on diagnostic chromosomal microarray (CMA) testing recommended an array resolution ≥400 kb throughout the genome as a balance of analytical and clinical sensitivity. In spite of the clear evidence for pathogenicity of large copy-number variants (CNVs) in neurodevelopmental disorders and/or congenital anomalies, the significance of small, nonrecurrent CNVs (<500 kb) has not been well established in a clinical setting. We investigated the clinical significance of all nonpolymorphic small, nonrecurrent CNVs (<500 kb) in patients referred for CMA clinical testing over a period of 6 years, from 2009 to 2014 (a total of 4,417 patients). We excluded from our study patients with benign or likely benign CNVs and patients with only recurrent microdeletions/microduplications <500 kb. In total, 383 patients (8.67%) were found to carry at least one small, nonrecurrent CNV, of whom 176 patients (3.98%) had one small CNV classified as a variant of uncertain significance (VUS), 45 (1.02%) had two or more small VUS CNVs, 20 (0.45%) had one small VUS CNV and a recurrent CNV, 113 (2.56%) had one small pathogenic or likely pathogenic CNV, 17 (0.38%) had two or more small pathogenic or likely pathogenic CNVs, and 12 (0.27%) had one small pathogenic or likely pathogenic CNV and a recurrent CNV. Within the pathogenic group, 80 of 142 patients (56% of all small pathogenic CNV cases) were found to have a single whole-gene or exonic deletion. The themes that emerged from our study are presented in the Discussion section. Our study demonstrates the diagnostic clinical relevance of small, nonrecurrent CNVs <500 kb during CMA clinical testing and underscores the need for careful clinical interpretation of these CNVs. Genet Med 19(4), 377-385.
48 CFR 819.502-2 - Total small business set-asides.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 48 Federal Acquisition Regulations System 5 2010-10-01 2010-10-01 false Total small business set... SOCIOECONOMIC PROGRAMS SMALL BUSINESS PROGRAMS Set-Asides for Small Business 819.502-2 Total small business set-asides. (a) When a total small business set-aside is made, one of the following statements, as applicable...
48 CFR 819.502-2 - Total small business set-asides.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 48 Federal Acquisition Regulations System 5 2011-10-01 2011-10-01 false Total small business set... SOCIOECONOMIC PROGRAMS SMALL BUSINESS PROGRAMS Set-Asides for Small Business 819.502-2 Total small business set-asides. (a) When a total small business set-aside is made, one of the following statements, as applicable...
48 CFR 819.502-2 - Total small business set-asides.
Code of Federal Regulations, 2013 CFR
2013-10-01
... 48 Federal Acquisition Regulations System 5 2013-10-01 2013-10-01 false Total small business set... SOCIOECONOMIC PROGRAMS SMALL BUSINESS PROGRAMS Set-Asides for Small Business 819.502-2 Total small business set-asides. (a) When a total small business set-aside is made, one of the following statements, as applicable...
48 CFR 19.502-2 - Total small business set-asides.
Code of Federal Regulations, 2012 CFR
2012-10-01
... 48 Federal Acquisition Regulations System 1 2012-10-01 2012-10-01 false Total small business set... SOCIOECONOMIC PROGRAMS SMALL BUSINESS PROGRAMS Set-Asides for Small Business 19.502-2 Total small business set... contracting officer does not proceed with the small business set-aside and purchases on an unrestricted basis...
48 CFR 19.502-2 - Total small business set-asides.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 48 Federal Acquisition Regulations System 1 2011-10-01 2011-10-01 false Total small business set... SOCIOECONOMIC PROGRAMS SMALL BUSINESS PROGRAMS Set-Asides for Small Business 19.502-2 Total small business set... contracting officer does not proceed with the small business set-aside and purchases on an unrestricted basis...
48 CFR 19.502-2 - Total small business set-asides.
Code of Federal Regulations, 2013 CFR
2013-10-01
... 48 Federal Acquisition Regulations System 1 2013-10-01 2013-10-01 false Total small business set... SOCIOECONOMIC PROGRAMS SMALL BUSINESS PROGRAMS Set-Asides for Small Business 19.502-2 Total small business set... contracting officer does not proceed with the small business set-aside and purchases on an unrestricted basis...
48 CFR 819.502-2 - Total small business set-asides.
Code of Federal Regulations, 2014 CFR
2014-10-01
... 48 Federal Acquisition Regulations System 5 2014-10-01 2014-10-01 false Total small business set... SOCIOECONOMIC PROGRAMS SMALL BUSINESS PROGRAMS Set-Asides for Small Business 819.502-2 Total small business set-asides. (a) When a total small business set-aside is made, one of the following statements, as applicable...
48 CFR 819.502-2 - Total small business set-asides.
Code of Federal Regulations, 2012 CFR
2012-10-01
... 48 Federal Acquisition Regulations System 5 2012-10-01 2012-10-01 false Total small business set... SOCIOECONOMIC PROGRAMS SMALL BUSINESS PROGRAMS Set-Asides for Small Business 819.502-2 Total small business set-asides. (a) When a total small business set-aside is made, one of the following statements, as applicable...
TRAC-PF1 code verification with data from the OTIS test facility. [Once-Through Integral System]
DOE Office of Scientific and Technical Information (OSTI.GOV)
Childerson, M.T.; Fujita, R.K.
1985-01-01
A computer code (TRAC-PF1/MOD1) developed for predicting transient thermal and hydraulic integral nuclear steam supply system (NSSS) response was benchmarked. Post-small break loss-of-coolant accident (LOCA) data from a scaled, experimental facility, designated the Once-Through Integral System (OTIS), were obtained for the Babcock and Wilcox NSSS and compared to TRAC predictions. The OTIS tests provided a challenging small break LOCA data set for TRAC verification. The major phases of a small break LOCA observed in the OTIS tests included pressurizer draining and loop saturation, intermittent reactor coolant system circulation, boiler-condenser mode, and the initial stages of refill. The TRAC code was successful in predicting OTIS loop conditions (system pressures and temperatures) after modification of the steam generator model. In particular, the code predicted both pool and auxiliary-feedwater initiated boiler-condenser mode heat transfer.
SkData: data sets and algorithm evaluation protocols in Python
NASA Astrophysics Data System (ADS)
Bergstra, James; Pinto, Nicolas; Cox, David D.
2015-01-01
Machine learning benchmark data sets come in all shapes and sizes, whereas classification algorithms assume sanitized input, such as (x, y) pairs with vector-valued input x and integer class label y. Researchers and practitioners know all too well how tedious it can be to get from the URL of a new data set to a NumPy ndarray suitable for e.g. pandas or sklearn. The SkData library handles that work for a growing number of benchmark data sets (small and large) so that one-off in-house scripts for downloading and parsing data sets can be replaced with library code that is reliable, community-tested, and documented. The SkData library also introduces an open-ended formalization of training and testing protocols that facilitates direct comparison with published research. This paper describes the usage and architecture of the SkData library.
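The snippet below is not SkData's actual API; it is a generic, hypothetical sketch of the pattern the abstract describes: one object turns raw data into (x, y) arrays, and a separate protocol object fixes a reproducible train/test split.

```python
import io
import numpy as np

# Hypothetical names, not SkData's real API: one class parses raw data into (x, y)
# arrays, another fixes a reproducible evaluation protocol.
RAW_CSV = """5.1,3.5,0
4.9,3.0,0
6.2,2.9,1
6.7,3.1,1
"""

class ToyDataset:
    def build_arrays(self):
        data = np.loadtxt(io.StringIO(RAW_CSV), delimiter=",")
        return data[:, :-1], data[:, -1].astype(int)   # vector-valued x, integer labels y

class HoldoutProtocol:
    """A fixed train/test split so results are directly comparable across studies."""
    def __init__(self, test_fraction=0.5, seed=0):
        self.test_fraction, self.seed = test_fraction, seed

    def split(self, x, y):
        rng = np.random.default_rng(self.seed)
        idx = rng.permutation(len(y))
        n_test = max(1, int(len(y) * self.test_fraction))
        return (x[idx[n_test:]], y[idx[n_test:]]), (x[idx[:n_test]], y[idx[:n_test]])

(x_train, y_train), (x_test, y_test) = HoldoutProtocol().split(*ToyDataset().build_arrays())
print(x_train.shape, x_test.shape)
```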
Strategy to discover diverse optimal molecules in the small molecule universe.
Rupakheti, Chetan; Virshup, Aaron; Yang, Weitao; Beratan, David N
2015-03-23
The small molecule universe (SMU) is defined as a set of over 10^60 synthetically feasible organic molecules with molecular weight less than ∼500 Da. Exhaustive enumerations and evaluation of all SMU molecules for the purpose of discovering favorable structures is impossible. We take a stochastic approach and extend the ACSESS framework (Virshup et al. J. Am. Chem. Soc. 2013, 135, 7296-7303) to develop diversity oriented molecular libraries that can generate a set of compounds that is representative of the small molecule universe and that also biases the library toward favorable physical property values. We show that the approach is efficient compared to exhaustive enumeration and to existing evolutionary algorithms for generating such libraries by testing in the NKp fitness landscape model and in the fully enumerated GDB-9 chemical universe containing 3 × 10^5 molecules.
Strategy To Discover Diverse Optimal Molecules in the Small Molecule Universe
2015-01-01
The small molecule universe (SMU) is defined as a set of over 10^60 synthetically feasible organic molecules with molecular weight less than ∼500 Da. Exhaustive enumerations and evaluation of all SMU molecules for the purpose of discovering favorable structures is impossible. We take a stochastic approach and extend the ACSESS framework (Virshup et al. J. Am. Chem. Soc. 2013, 135, 7296-7303) to develop diversity oriented molecular libraries that can generate a set of compounds that is representative of the small molecule universe and that also biases the library toward favorable physical property values. We show that the approach is efficient compared to exhaustive enumeration and to existing evolutionary algorithms for generating such libraries by testing in the NKp fitness landscape model and in the fully enumerated GDB-9 chemical universe containing 3 × 10^5 molecules. PMID:25594586
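As a loose stand-in for the diversity-oriented library idea (not the ACSESS algorithm itself), the sketch below greedily picks a maximally spread subset of a toy descriptor space while biasing selection toward a favorable property window; the descriptors, property values, and target window are invented.

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy 'chemical space': 2000 candidate molecules, 5 descriptors each, plus one
# physical property (e.g. a logP-like value) we want to bias the library toward.
descriptors = rng.normal(size=(2000, 5))
prop = rng.normal(loc=2.0, scale=1.5, size=2000)
desirability = np.exp(-((prop - 2.5) ** 2))      # peaks near the favorable property value

def greedy_diverse_subset(X, scores, k):
    """Greedy maximin selection: repeatedly add the candidate farthest from the
    current library, weighted by a desirability score (higher is better)."""
    chosen = [int(np.argmax(scores))]
    for _ in range(k - 1):
        dist = np.min(np.linalg.norm(X[:, None, :] - X[None, chosen, :], axis=2), axis=1)
        dist[chosen] = -np.inf                    # never pick the same molecule twice
        chosen.append(int(np.argmax(dist * scores)))
    return chosen

library = greedy_diverse_subset(descriptors, desirability, k=25)
print(len(set(library)), "diverse, property-biased molecules selected")
```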
The influence of perceptual load on age differences in selective attention.
Maylor, E A; Lavie, N
1998-12-01
The effect of perceptual load on age differences in visual selective attention was examined in 2 studies. In Experiment 1, younger and older adults made speeded choice responses indicating which of 2 target letters was present in a relevant set of letters in the center of the display while they attempted to ignore an irrelevant distractor in the periphery. The perceptual load of relevant processing was manipulated by varying the central set size. When the relevant set size was small, the adverse effect of an incompatible distractor was much greater for the older participants than for the younger ones. However, with larger relevant set sizes, this was no longer the case, with the distractor effect decreasing for older participants at lower levels of perceptual load than for younger ones. In Experiment 2, older adults were tested with the empty locations in the central set either unmarked (as in Experiment 1) or marked by small circles to form a group of 6 items irrespective of set size; the 2 conditions did not differ markedly, ruling out an explanation based entirely on perceptual grouping.
Preliminary noise tests of the engine-over-the-wing concept. 2: 10 deg - 20 deg flap position
NASA Technical Reports Server (NTRS)
Reshotko, M.; Olsen, W. A.; Dorsch, R. G.
1972-01-01
Preliminary acoustic tests of the engine-over-the-wing concept as a method for reducing the aerodynamic noise created by conventional and short takeoff aircraft are discussed. Tests were conducted with a small wing section model having two flaps which can be set for either the landing or takeoff positions. Data was acquired with the flaps set at 10 degrees and 20 degrees for takeoff and 30 and 60 degrees for landing. The engine exhaust was simulated by an air jet from a convergent nozzle. Far field noise data are presented for nominal pressure ratios of 1.25, 1.4 and 1.7 for both the flyover and sideline modes.
NASA Technical Reports Server (NTRS)
Miller, D. P.; Prahst, P. S.
1994-01-01
An axial compressor test rig has been designed for the operation of small turbomachines. The inlet region consisted of a long flowpath region with two series of support struts and a flapped inlet guide vane. A flow test was run to calibrate and determine the sources and magnitudes of the loss mechanisms in the inlet for a highly loaded two-stage axial compressor test. Several flow conditions and IGV angle settings were established in which detailed surveys were completed. Boundary layer bleed was also provided along the casing of the inlet behind the support struts and ahead of the IGV. A detailed discussion of the flowpath design along with a summary of the experimental results is provided in Part 1.
Nazarian, Dalar; Ganesh, P.; Sholl, David S.
2015-09-30
We compiled a test set of chemically and topologically diverse Metal–Organic Frameworks (MOFs) with high-accuracy, experimentally derived crystallographic structure data. The test set was used to benchmark the performance of Density Functional Theory (DFT) functionals (M06L, PBE, PW91, PBE-D2, PBE-D3, and vdW-DF2) for predicting lattice parameters, unit cell volume, bonded parameters and pore descriptors. On average PBE-D2, PBE-D3, and vdW-DF2 predict more accurate structures, but all functionals predicted pore diameters within 0.5 Å of the experimental diameter for every MOF in the test set. The test set was also used to assess the variance in performance of DFT functionals for elastic properties and atomic partial charges. The DFT predicted elastic properties such as minimum shear modulus and Young's modulus can differ by an average of 3 and 9 GPa for rigid MOFs such as those in the test set. Moreover, we found that the partial charges calculated by vdW-DF2 deviate the most from other functionals, while there is no significant difference between the partial charges calculated by M06L, PBE, PW91, PBE-D2 and PBE-D3 for the MOFs in the test set. We find that while there are differences in the magnitude of the properties predicted by the various functionals, these discrepancies are small compared to the accuracy necessary for most practical applications.
Predicting Mouse Liver Microsomal Stability with “Pruned” Machine Learning Models and Public Data
Perryman, Alexander L.; Stratton, Thomas P.; Ekins, Sean; Freundlich, Joel S.
2015-01-01
Purpose: Mouse efficacy studies are a critical hurdle to advance translational research of potential therapeutic compounds for many diseases. Although mouse liver microsomal (MLM) stability studies are not a perfect surrogate for in vivo studies of metabolic clearance, they are the initial model system used to assess metabolic stability. Consequently, we explored the development of machine learning models that can enhance the probability of identifying compounds possessing MLM stability. Methods: Published assays on MLM half-life values were identified in PubChem, reformatted, and curated to create a training set with 894 unique small molecules. These data were used to construct machine learning models assessed with internal cross-validation, external tests with a published set of antitubercular compounds, and independent validation with an additional diverse set of 571 compounds (PubChem data on percent metabolism). Results: "Pruning" out the moderately unstable/moderately stable compounds from the training set produced models with superior predictive power. Bayesian models displayed the best predictive power for identifying compounds with a half-life ≥1 hour. Conclusions: Our results suggest the pruning strategy may be of general benefit to improve test set enrichment and provide machine learning models with enhanced predictive value for the MLM stability of small organic molecules. This study represents the most exhaustive study to date of using machine learning approaches with MLM data from public sources. PMID:26415647
Predicting Mouse Liver Microsomal Stability with "Pruned" Machine Learning Models and Public Data.
Perryman, Alexander L; Stratton, Thomas P; Ekins, Sean; Freundlich, Joel S
2016-02-01
Mouse efficacy studies are a critical hurdle to advance translational research of potential therapeutic compounds for many diseases. Although mouse liver microsomal (MLM) stability studies are not a perfect surrogate for in vivo studies of metabolic clearance, they are the initial model system used to assess metabolic stability. Consequently, we explored the development of machine learning models that can enhance the probability of identifying compounds possessing MLM stability. Published assays on MLM half-life values were identified in PubChem, reformatted, and curated to create a training set with 894 unique small molecules. These data were used to construct machine learning models assessed with internal cross-validation, external tests with a published set of antitubercular compounds, and independent validation with an additional diverse set of 571 compounds (PubChem data on percent metabolism). "Pruning" out the moderately unstable / moderately stable compounds from the training set produced models with superior predictive power. Bayesian models displayed the best predictive power for identifying compounds with a half-life ≥1 h. Our results suggest the pruning strategy may be of general benefit to improve test set enrichment and provide machine learning models with enhanced predictive value for the MLM stability of small organic molecules. This study represents the most exhaustive study to date of using machine learning approaches with MLM data from public sources.
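A sketch of the "pruning" idea follows, using a naive Bayes classifier as a generic stand-in for the Bayesian models mentioned; the descriptors, half-lives, and thresholds are simulated, and unlike the study the evaluation here is only a simple cross-validation rather than external test-set enrichment.

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(6)

# Simulated training data: 894 compounds, 10 descriptors, and MLM half-lives (hours).
n = 894
X = rng.normal(size=(n, 10))
half_life = np.exp(0.8 * X[:, 0] + rng.normal(scale=0.7, size=n))
y = (half_life >= 1.0).astype(int)               # stable vs. unstable label

# "Pruning": drop the moderately unstable / moderately stable middle of the
# half-life distribution from the training data (thresholds here are illustrative).
keep = (half_life < 0.5) | (half_life > 2.0)

full_auc = cross_val_score(GaussianNB(), X, y, cv=5, scoring="roc_auc").mean()
pruned_auc = cross_val_score(GaussianNB(), X[keep], y[keep], cv=5, scoring="roc_auc").mean()
print(f"5-fold AUC, full training set:   {full_auc:.2f}")
print(f"5-fold AUC, pruned training set: {pruned_auc:.2f}")
```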
Grimme, Stefan; Brandenburg, Jan Gerit; Bannwarth, Christoph; Hansen, Andreas
2015-08-07
A density functional theory (DFT) based composite electronic structure approach is proposed to efficiently compute structures and interaction energies in large chemical systems. It is based on the well-known and numerically robust Perdew-Burke-Ernzerhof (PBE) generalized-gradient-approximation in a modified global hybrid functional with a relatively large amount of non-local Fock-exchange. The orbitals are expanded in Ahlrichs-type valence-double zeta atomic orbital (AO) Gaussian basis sets, which are available for many elements. In order to correct for the basis set superposition error (BSSE) and to account for the important long-range London dispersion effects, our well-established atom-pairwise potentials are used. In the design of the new method, particular attention has been paid to an accurate description of structural parameters in various covalent and non-covalent bonding situations as well as in periodic systems. Together with the recently proposed three-fold corrected (3c) Hartree-Fock method, the new composite scheme (termed PBEh-3c) represents the next member in a hierarchy of "low-cost" electronic structure approaches. They are mainly free of BSSE and account for most interactions in a physically sound and asymptotically correct manner. PBEh-3c yields good results for thermochemical properties in the huge GMTKN30 energy database. Furthermore, the method shows excellent performance for non-covalent interaction energies in small and large complexes. For evaluating its performance on equilibrium structures, a new compilation of standard test sets is suggested. These consist of small (light) molecules, partially flexible, medium-sized organic molecules, molecules comprising heavy main group elements, larger systems with long bonds, 3d-transition metal systems, non-covalently bound complexes (S22 and S66×8 sets), and peptide conformations. For these sets, overall deviations from accurate reference data are smaller than for various other tested DFT methods and reach that of triple-zeta AO basis set second-order perturbation theory (MP2/TZ) level at a tiny fraction of computational effort. Periodic calculations conducted for molecular crystals to test structures (including cell volumes) and sublimation enthalpies indicate very good accuracy competitive to computationally more involved plane-wave based calculations. PBEh-3c can be applied routinely to several hundreds of atoms on a single processor and it is suggested as a robust "high-speed" computational tool in theoretical chemistry and physics.
48 CFR 19.506 - Withdrawing or modifying small business set-asides.
Code of Federal Regulations, 2010 CFR
2010-10-01
... a withdrawal of an individual small business set-aside by giving written notice to the agency small... small business set-asides. 19.506 Section 19.506 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION SOCIOECONOMIC PROGRAMS SMALL BUSINESS PROGRAMS Set-Asides for Small Business 19.506...
48 CFR 19.506 - Withdrawing or modifying small business set-asides.
Code of Federal Regulations, 2014 CFR
2014-10-01
... small business set-asides. 19.506 Section 19.506 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION SOCIOECONOMIC PROGRAMS SMALL BUSINESS PROGRAMS Set-Asides for Small Business 19.506 Withdrawing or modifying small business set-asides. (a) If, before award of a contract involving a small...
48 CFR 19.506 - Withdrawing or modifying small business set-asides.
Code of Federal Regulations, 2013 CFR
2013-10-01
... small business set-asides. 19.506 Section 19.506 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION SOCIOECONOMIC PROGRAMS SMALL BUSINESS PROGRAMS Set-Asides for Small Business 19.506 Withdrawing or modifying small business set-asides. (a) If, before award of a contract involving a small...
48 CFR 19.506 - Withdrawing or modifying small business set-asides.
Code of Federal Regulations, 2011 CFR
2011-10-01
... small business set-asides. 19.506 Section 19.506 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION SOCIOECONOMIC PROGRAMS SMALL BUSINESS PROGRAMS Set-Asides for Small Business 19.506 Withdrawing or modifying small business set-asides. (a) If, before award of a contract involving a small...
48 CFR 19.506 - Withdrawing or modifying small business set-asides.
Code of Federal Regulations, 2012 CFR
2012-10-01
... small business set-asides. 19.506 Section 19.506 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION SOCIOECONOMIC PROGRAMS SMALL BUSINESS PROGRAMS Set-Asides for Small Business 19.506 Withdrawing or modifying small business set-asides. (a) If, before award of a contract involving a small...
Catherine M. Marx; Russell C. Moody
1981-01-01
A total of 180 small Douglas Fir-Larch (DF-L) or Southern Pine (SP) glued-laminated beams were evaluated to determine the tension lamination quality necessary to obtain desired design stresses. The test beams had either the regular laminating grades of L1 DF-L/No. 1D SP or the special 302-24 laminating grade as tension laminations. Because an initial set of SP beams...
Modeling of 1.5 μm range gated imaging for small surface vessel identification
NASA Astrophysics Data System (ADS)
Espinola, Richard L.; Steinvall, Ove; Elmquist, Magnus; Karlsson, Kjell
2010-10-01
Within the framework of the NATO group (NATO SET-132/RTG-72) on imaging ladars, a test was performed to collect simultaneous multi-mode LADAR signatures of maritime objects entering and leaving San Diego Harbor. Besides ladars, passive sensors were also employed during the test, which occurred during April 2009 from Point Loma and the harbor in San Diego. This paper will report on 1.5 μm gated imaging of a number of small civilian surface vessels with the aim of presenting human perception experimental results and comparisons with sensor performance models developed by US Army RDECOM CERDEC NVESD. We use controlled human perception tests to measure target identification performance and compare the experimental results with model predictions.
Basile, Benjamin M; Hampton, Robert R
2010-02-01
The combination of primacy and recency produces a U-shaped serial position curve typical of memory for lists. In humans, primacy is often thought to result from rehearsal, but there is little evidence for rehearsal in nonhumans. To further evaluate the possibility that rehearsal contributes to primacy in monkeys, we compared memory for lists of familiar stimuli (which may be easier to rehearse) to memory for unfamiliar stimuli (which are likely difficult to rehearse). Six rhesus monkeys saw lists of five images drawn from either large, medium, or small image sets. After presentation of each list, memory for one item was assessed using a serial probe recognition test. Across four experiments, we found robust primacy and recency with lists drawn from small and medium, but not large, image sets. This finding is consistent with the idea that familiar items are easier to rehearse and that rehearsal contributes to primacy, warranting further study of the possibility of rehearsal in monkeys. However, alternative interpretations are also viable and are discussed. Copyright 2009 Elsevier B.V. All rights reserved.
About-face on face recognition ability and holistic processing
Richler, Jennifer J.; Floyd, R. Jackie; Gauthier, Isabel
2015-01-01
Previous work found a small but significant relationship between holistic processing measured with the composite task and face recognition ability measured by the Cambridge Face Memory Test (CFMT; Duchaine & Nakayama, 2006). Surprisingly, recent work using a different measure of holistic processing (Vanderbilt Holistic Face Processing Test [VHPT-F]; Richler, Floyd, & Gauthier, 2014) and a larger sample found no evidence for such a relationship. In Experiment 1 we replicate this unexpected result, finding no relationship between holistic processing (VHPT-F) and face recognition ability (CFMT). A key difference between the VHPT-F and other holistic processing measures is that unique face parts are used on each trial in the VHPT-F, unlike in other tasks where a small set of face parts repeat across the experiment. In Experiment 2, we test the hypothesis that correlations between the CFMT and holistic processing tasks are driven by stimulus repetition that allows for learning during the composite task. Consistent with our predictions, CFMT performance was correlated with holistic processing in the composite task when a small set of face parts repeated over trials, but not when face parts did not repeat. A meta-analysis confirms that relationships between the CFMT and holistic processing depend on stimulus repetition. These results raise important questions about what is being measured by the CFMT, and challenge current assumptions about why faces are processed holistically. PMID:26223027
About-face on face recognition ability and holistic processing.
Richler, Jennifer J; Floyd, R Jackie; Gauthier, Isabel
2015-01-01
Previous work found a small but significant relationship between holistic processing measured with the composite task and face recognition ability measured by the Cambridge Face Memory Test (CFMT; Duchaine & Nakayama, 2006). Surprisingly, recent work using a different measure of holistic processing (Vanderbilt Holistic Face Processing Test [VHPT-F]; Richler, Floyd, & Gauthier, 2014) and a larger sample found no evidence for such a relationship. In Experiment 1 we replicate this unexpected result, finding no relationship between holistic processing (VHPT-F) and face recognition ability (CFMT). A key difference between the VHPT-F and other holistic processing measures is that unique face parts are used on each trial in the VHPT-F, unlike in other tasks where a small set of face parts repeat across the experiment. In Experiment 2, we test the hypothesis that correlations between the CFMT and holistic processing tasks are driven by stimulus repetition that allows for learning during the composite task. Consistent with our predictions, CFMT performance was correlated with holistic processing in the composite task when a small set of face parts repeated over trials, but not when face parts did not repeat. A meta-analysis confirms that relationships between the CFMT and holistic processing depend on stimulus repetition. These results raise important questions about what is being measured by the CFMT, and challenge current assumptions about why faces are processed holistically.
Gaining Insight Into Femtosecond-scale CMOS Effects using FPGAs
2015-03-24
paths or detecting gross path delay faults, but for characterizing subtle aging effects, there is a need to isolate very short paths and detect very... data using COTS FPGAs and novel self-test. Hardware experiments using a 28 nm FPGA demonstrate isolation of small sets of transistors, detection of... hold the static configuration data specifying the LUT function. A set of inverters drive the SRAM contents into a pass-gate multiplexor tree; we
Coordinated platooning with multiple speeds
Luo, Fengqiao; Larson, Jeffrey; Munson, Todd
2018-03-22
In a platoon, vehicles travel one after another with small intervehicle distances; trailing vehicles in a platoon save fuel because they experience less aerodynamic drag. This work presents a coordinated platooning model with multiple speed options that integrates scheduling, routing, speed selection, and platoon formation/dissolution in a mixed-integer linear program that minimizes the total fuel consumed by a set of vehicles while traveling between their respective origins and destinations. The performance of this model is numerically tested on a grid network and the Chicago-area highway network. We find that the fuel-savings factor of a multivehicle system significantly depends on the time each vehicle is allowed to stay in the network; this time affects vehicles’ available speed choices, possible routes, and the amount of time for coordinating platoon formation. For problem instances with a large number of vehicles, we propose and test a heuristic decomposed approach that applies a clustering algorithm to partition the set of vehicles and then routes each group separately. When the set of vehicles is large and the available computational time is small, the decomposed approach finds significantly better solutions than does the full model.
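As a rough illustration of the decomposition heuristic summarized in the abstract above (cluster the vehicle set, then plan each group separately), the sketch below partitions vehicles by origin/destination coordinates with k-means; the feature choice, group count, and per-group solver stub are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch of the decomposition heuristic: cluster vehicles by
# origin/destination, then plan each cluster independently. The feature
# choice and the per-group "solver" below are assumptions for illustration.
import numpy as np
from sklearn.cluster import KMeans

def solve_group_milp(group):
    # Stub: a real implementation would build and solve the per-group
    # mixed-integer program (routing, speed selection, platoon formation).
    return {"vehicles": len(group)}

def decompose_and_route(vehicles, n_groups=2):
    """vehicles: list of dicts with 'origin' and 'destination' (x, y) tuples."""
    features = np.array([[*v["origin"], *v["destination"]] for v in vehicles])
    labels = KMeans(n_clusters=n_groups, n_init=10, random_state=0).fit_predict(features)
    plans = {}
    for g in range(n_groups):
        group = [v for v, lab in zip(vehicles, labels) if lab == g]
        plans[g] = solve_group_milp(group)
    return plans

vehicles = [{"origin": (0, 0), "destination": (10, 1)},
            {"origin": (0, 1), "destination": (10, 0)},
            {"origin": (5, 5), "destination": (0, 9)}]
print(decompose_and_route(vehicles, n_groups=2))
```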
Coordinated platooning with multiple speeds
DOE Office of Scientific and Technical Information (OSTI.GOV)
Luo, Fengqiao; Larson, Jeffrey; Munson, Todd
In a platoon, vehicles travel one after another with small intervehicle distances; trailing vehicles in a platoon save fuel because they experience less aerodynamic drag. This work presents a coordinated platooning model with multiple speed options that integrates scheduling, routing, speed selection, and platoon formation/dissolution in a mixed-integer linear program that minimizes the total fuel consumed by a set of vehicles while traveling between their respective origins and destinations. The performance of this model is numerically tested on a grid network and the Chicago-area highway network. We find that the fuel-savings factor of a multivehicle system significantly depends on the time each vehicle is allowed to stay in the network; this time affects vehicles’ available speed choices, possible routes, and the amount of time for coordinating platoon formation. For problem instances with a large number of vehicles, we propose and test a heuristic decomposed approach that applies a clustering algorithm to partition the set of vehicles and then routes each group separately. When the set of vehicles is large and the available computational time is small, the decomposed approach finds significantly better solutions than does the full model.
Code of Federal Regulations, 2010 CFR
2010-10-01
...). (i) Except as authorized by law, a contract may not be awarded as a result of a small business set... BUSINESS PROGRAMS Set-Asides for Small Business 19.501 General. (a) The purpose of small business set-asides is to award certain acquisitions exclusively to small business concerns. A “set-aside for small...
48 CFR 1319.202-70 - Small business set-aside review form.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 48 Federal Acquisition Regulations System 5 2010-10-01 2010-10-01 false Small business set-aside... COMMERCE SOCIOECONOMIC PROGRAMS SMALL BUSINESS PROGRAMS Policies. 1319.202-70 Small business set-aside review form. Form CD 570, Small Business Set-Aside Review, shall be submitted for approval to the...
48 CFR 919.502-2 - Total small business set-asides.
Code of Federal Regulations, 2013 CFR
2013-10-01
... 48 Federal Acquisition Regulations System 5 2013-10-01 2013-10-01 false Total small business set-asides. 919.502-2 Section 919.502-2 Federal Acquisition Regulations System DEPARTMENT OF ENERGY SOCIOECONOMIC PROGRAMS SMALL BUSINESS PROGRAMS Set-Asides for Small Business 919.502-2 Total small business set...
48 CFR 1319.202-70 - Small business set-aside review form.
Code of Federal Regulations, 2014 CFR
2014-10-01
... 48 Federal Acquisition Regulations System 5 2014-10-01 2014-10-01 false Small business set-aside... COMMERCE SOCIOECONOMIC PROGRAMS SMALL BUSINESS PROGRAMS Policies. 1319.202-70 Small business set-aside review form. Form CD 570, Small Business Set-Aside Review, shall be submitted for approval to the...
48 CFR 1419.506 - Withdrawing or modifying small business set-asides.
Code of Federal Regulations, 2014 CFR
2014-10-01
... small business set-asides. 1419.506 Section 1419.506 Federal Acquisition Regulations System DEPARTMENT OF THE INTERIOR SOCIOECONOMIC PROGRAMS SMALL BUSINESS PROGRAMS Set-Asides for Small Business 1419.506 Withdrawing or modifying small business set-asides. The HCA is authorized, without the power of redelegation...
48 CFR 1419.506 - Withdrawing or modifying small business set-asides.
Code of Federal Regulations, 2012 CFR
2012-10-01
... small business set-asides. 1419.506 Section 1419.506 Federal Acquisition Regulations System DEPARTMENT OF THE INTERIOR SOCIOECONOMIC PROGRAMS SMALL BUSINESS PROGRAMS Set-Asides for Small Business 1419.506 Withdrawing or modifying small business set-asides. The HCA is authorized, without the power of redelegation...
48 CFR 1319.202-70 - Small business set-aside review form.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 48 Federal Acquisition Regulations System 5 2011-10-01 2011-10-01 false Small business set-aside... COMMERCE SOCIOECONOMIC PROGRAMS SMALL BUSINESS PROGRAMS Policies. 1319.202-70 Small business set-aside review form. Form CD 570, Small Business Set-Aside Review, shall be submitted for approval to the...
48 CFR 919.502-2 - Total small business set-asides.
Code of Federal Regulations, 2014 CFR
2014-10-01
... 48 Federal Acquisition Regulations System 5 2014-10-01 2014-10-01 false Total small business set-asides. 919.502-2 Section 919.502-2 Federal Acquisition Regulations System DEPARTMENT OF ENERGY SOCIOECONOMIC PROGRAMS SMALL BUSINESS PROGRAMS Set-Asides for Small Business 919.502-2 Total small business set...
48 CFR 1419.506 - Withdrawing or modifying small business set-asides.
Code of Federal Regulations, 2013 CFR
2013-10-01
... small business set-asides. 1419.506 Section 1419.506 Federal Acquisition Regulations System DEPARTMENT OF THE INTERIOR SOCIOECONOMIC PROGRAMS SMALL BUSINESS PROGRAMS Set-Asides for Small Business 1419.506 Withdrawing or modifying small business set-asides. The HCA is authorized, without the power of redelegation...
48 CFR 1319.202-70 - Small business set-aside review form.
Code of Federal Regulations, 2012 CFR
2012-10-01
... 48 Federal Acquisition Regulations System 5 2012-10-01 2012-10-01 false Small business set-aside... COMMERCE SOCIOECONOMIC PROGRAMS SMALL BUSINESS PROGRAMS Policies. 1319.202-70 Small business set-aside review form. Form CD 570, Small Business Set-Aside Review, shall be submitted for approval to the...
48 CFR 919.502-2 - Total small business set-asides.
Code of Federal Regulations, 2012 CFR
2012-10-01
... 48 Federal Acquisition Regulations System 5 2012-10-01 2012-10-01 false Total small business set-asides. 919.502-2 Section 919.502-2 Federal Acquisition Regulations System DEPARTMENT OF ENERGY SOCIOECONOMIC PROGRAMS SMALL BUSINESS PROGRAMS Set-Asides for Small Business 919.502-2 Total small business set...
48 CFR 1419.506 - Withdrawing or modifying small business set-asides.
Code of Federal Regulations, 2011 CFR
2011-10-01
... small business set-asides. 1419.506 Section 1419.506 Federal Acquisition Regulations System DEPARTMENT OF THE INTERIOR SOCIOECONOMIC PROGRAMS SMALL BUSINESS PROGRAMS Set-Asides for Small Business 1419.506 Withdrawing or modifying small business set-asides. The HCA is authorized, without the power of redelegation...
48 CFR 1319.202-70 - Small business set-aside review form.
Code of Federal Regulations, 2013 CFR
2013-10-01
... 48 Federal Acquisition Regulations System 5 2013-10-01 2013-10-01 false Small business set-aside... COMMERCE SOCIOECONOMIC PROGRAMS SMALL BUSINESS PROGRAMS Policies. 1319.202-70 Small business set-aside review form. Form CD 570, Small Business Set-Aside Review, shall be submitted for approval to the...
48 CFR 919.502-2 - Total small business set-asides.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 48 Federal Acquisition Regulations System 5 2011-10-01 2011-10-01 false Total small business set-asides. 919.502-2 Section 919.502-2 Federal Acquisition Regulations System DEPARTMENT OF ENERGY SOCIOECONOMIC PROGRAMS SMALL BUSINESS PROGRAMS Set-Asides for Small Business 919.502-2 Total small business set...
Nonlinear seismic analysis of a reactor structure impact between core components
NASA Technical Reports Server (NTRS)
Hill, R. G.
1975-01-01
The seismic analysis of the FFTF-PIOTA (Fast Flux Test Facility-Postirradiation Open Test Assembly), subjected to a horizontal DBE (Design Base Earthquake), is presented. The PIOTA is the first in a set of open test assemblies to be designed for the FFTF. Employing the direct method of transient analysis, the governing differential equations describing the motion of the system are set up directly and are implicitly integrated numerically in time. A simple lumped-mass beam model of the FFTF which includes small clearances between core components is used as a "driver" for a fine mesh model of the PIOTA. The nonlinear forces due to the impact of the core components and their effect on the PIOTA are computed.
Lewin, Matthew; Samuel, Stephen; Merkel, Janie; Bickler, Philip
2016-01-01
Snakebite remains a neglected medical problem of the developing world with up to 125,000 deaths each year despite more than a century of calls to improve snakebite prevention and care. An estimated 75% of fatalities from snakebite occur outside the hospital setting. Because phospholipase A2 (PLA2) activity is an important component of venom toxicity, we sought candidate PLA2 inhibitors by directly testing drugs. Surprisingly, varespladib and its orally bioavailable prodrug, methyl-varespladib showed high-level secretory PLA2 (sPLA2) inhibition at nanomolar and picomolar concentrations against 28 medically important snake venoms from six continents. In vivo proof-of-concept studies with varespladib had striking survival benefit against lethal doses of Micrurus fulvius and Vipera berus venom, and suppressed venom-induced sPLA2 activity in rats challenged with 100% lethal doses of M. fulvius venom. Rapid development and deployment of a broad-spectrum PLA2 inhibitor alone or in combination with other small molecule inhibitors of snake toxins (e.g., metalloproteases) could fill the critical therapeutic gap spanning pre-referral and hospital setting. Lower barriers for clinical testing of safety tested, repurposed small molecule therapeutics are a potentially economical and effective path forward to fill the pre-referral gap in the setting of snakebite. PMID:27571102
A low-cost machine vision system for the recognition and sorting of small parts
NASA Astrophysics Data System (ADS)
Barea, Gustavo; Surgenor, Brian W.; Chauhan, Vedang; Joshi, Keyur D.
2018-04-01
An automated machine vision-based system for the recognition and sorting of small parts was designed, assembled and tested. The system was developed to address a need to expose engineering students to the issues of machine vision and assembly automation technology, with readily available and relatively low-cost hardware and software. This paper outlines the design of the system and presents experimental performance results. Three different styles of plastic gears, together with three different styles of defective gears, were used to test the system. A pattern matching tool was used for part classification. Nine experiments were conducted to demonstrate the effects of changing various hardware and software parameters, including: conveyor speed, gear feed rate, classification, and identification score thresholds. It was found that the system could achieve a maximum system accuracy of 95% at a feed rate of 60 parts/min, for a given set of parameter settings. Future work will be looking at the effect of lighting.
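A minimal sketch of threshold-based part classification with normalized template matching, in the spirit of the pattern matching tool mentioned above; the gear templates, score threshold, and synthetic test image are illustrative assumptions, not the system's actual configuration.

```python
# Minimal sketch of score-threshold classification with normalized template
# matching (OpenCV). Templates, threshold value, and the synthetic scene
# below are illustrative assumptions.
import cv2
import numpy as np

rng = np.random.default_rng(0)
templates = {"gear_A": rng.integers(0, 255, (40, 40), dtype=np.uint8),
             "gear_B": rng.integers(0, 255, (40, 40), dtype=np.uint8)}

def classify(part_image, templates, score_threshold=0.8):
    best_label, best_score = None, -1.0
    for label, tmpl in templates.items():
        result = cv2.matchTemplate(part_image, tmpl, cv2.TM_CCOEFF_NORMED)
        _, score, _, _ = cv2.minMaxLoc(result)
        if score > best_score:
            best_label, best_score = label, score
    # Reject as "defective/unknown" if the best match falls below the threshold.
    return (best_label if best_score >= score_threshold else "reject"), best_score

scene = np.zeros((120, 120), dtype=np.uint8)
scene[30:70, 50:90] = templates["gear_A"]   # paste a known part into the scene
print(classify(scene, templates))
```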
Grear, Daniel A.; Dusek, Robert J.; Walsh, Daniel P.; Hall, Jeffrey S.
2017-01-01
We evaluated the potential transmission of avian influenza viruses (AIV) in wildlife species in three settings in association with an outbreak at a poultry facility: 1) small birds and small mammals on a poultry facility that was affected with highly pathogenic AIV (HPAIV) in April 2015; 2) small birds and small mammals on a nearby poultry facility that was unaffected by HPAIV; and 3) small birds, small mammals, and waterfowl in a nearby natural area. We live-captured small birds and small mammals and collected samples from hunter-harvested waterfowl to test for active viral shedding and evidence of exposure (serum antibody) to AIV and the H5N2 HPAIV that affected the poultry facility. We detected no evidence of shedding or specific antibody to AIV in small mammals and small birds 5 mo after depopulation of the poultry. We detected viral shedding and exposure to AIV in waterfowl and estimated approximately 15% viral shedding and 60% antibody prevalence. In waterfowl, we did not detect shedding or exposure to the HPAIV that affected the poultry facility. We also conducted camera trapping around poultry carcass depopulation composting barns and found regular visitation by four species of medium-sized mammals. We provide preliminary data suggesting that peridomestic wildlife were not an important factor in the transmission of AIV during the poultry outbreak, nor did small birds and mammals in natural wetland settings show wide evidence of AIV shedding or exposure, despite the opportunity for exposure.
Grear, Daniel A; Dusek, Robert J; Walsh, Daniel P; Hall, Jeffrey S
2017-01-01
We evaluated the potential transmission of avian influenza viruses (AIV) in wildlife species in three settings in association with an outbreak at a poultry facility: 1) small birds and small mammals on a poultry facility that was affected with highly pathogenic AIV (HPAIV) in April 2015; 2) small birds and small mammals on a nearby poultry facility that was unaffected by HPAIV; and 3) small birds, small mammals, and waterfowl in a nearby natural area. We live-captured small birds and small mammals and collected samples from hunter-harvested waterfowl to test for active viral shedding and evidence of exposure (serum antibody) to AIV and the H5N2 HPAIV that affected the poultry facility. We detected no evidence of shedding or specific antibody to AIV in small mammals and small birds 5 mo after depopulation of the poultry. We detected viral shedding and exposure to AIV in waterfowl and estimated approximately 15% viral shedding and 60% antibody prevalence. In waterfowl, we did not detect shedding or exposure to the HPAIV that affected the poultry facility. We also conducted camera trapping around poultry carcass depopulation composting barns and found regular visitation by four species of medium-sized mammals. We provide preliminary data suggesting that peridomestic wildlife were not an important factor in the transmission of AIV during the poultry outbreak, nor did small birds and mammals in natural wetland settings show wide evidence of AIV shedding or exposure, despite the opportunity for exposure.
NASA Technical Reports Server (NTRS)
Zhang, Yuhan; Lu, Dr. Thomas
2010-01-01
The objectives of this project were to develop an ROI (Region of Interest) detector using Haar-like features, similar to the face detection in Intel's OpenCV library, implement it in Matlab code, and test the performance of the new ROI detector against the existing ROI detector based on the Optimal Trade-off Maximum Average Correlation Height (OTMACH) filter. The ROI detector included three parts: (1) automated Haar-like feature selection to find a small set of the most relevant Haar-like features for detecting ROIs that contained a target; (2) training a neural network to recognize ROIs with targets, taking the selected Haar-like features as inputs; and (3) developing a filtering method that processes the neural network responses into a small set of regions of interest. All three parts needed to be coded in Matlab. The parameters in the detector needed to be trained by machine learning and tested with specific datasets. Since the OpenCV library and its Haar-like features were not available in Matlab, the Haar-like feature calculation needed to be implemented in Matlab. Matlab code for Adaptive Boosting and max/min filters could be found on the Internet but needed to be integrated to serve the purpose of this project. The performance of the new detector was tested by comparing its accuracy and speed against the existing OTMACH detector. Speed was defined as the average speed at which the regions of interest in an image were found, and accuracy was measured by the number of false positives (false alarms) at the same detection rate for the two detectors.
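A minimal sketch of one building block mentioned above, a two-rectangle Haar-like feature computed from an integral image; written here in NumPy for brevity (the project itself targeted Matlab), with window coordinates chosen arbitrarily.

```python
# Sketch of a two-rectangle Haar-like feature using an integral image.
import numpy as np

def integral_image(img):
    return img.cumsum(axis=0).cumsum(axis=1)

def rect_sum(ii, r0, c0, r1, c1):
    """Sum of img[r0:r1, c0:c1] using the integral image ii."""
    total = ii[r1 - 1, c1 - 1]
    if r0 > 0:
        total -= ii[r0 - 1, c1 - 1]
    if c0 > 0:
        total -= ii[r1 - 1, c0 - 1]
    if r0 > 0 and c0 > 0:
        total += ii[r0 - 1, c0 - 1]
    return total

def haar_two_rect_vertical(ii, r0, c0, h, w):
    """Left-half minus right-half intensity: a simple edge-like Haar feature."""
    left = rect_sum(ii, r0, c0, r0 + h, c0 + w // 2)
    right = rect_sum(ii, r0, c0 + w // 2, r0 + h, c0 + w)
    return left - right

img = np.random.default_rng(1).random((64, 64))
ii = integral_image(img)
print(haar_two_rect_vertical(ii, 10, 10, 24, 24))
```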
2011-01-01
Background The increase in the number of people with dementia will lead to greater demand for residential care. Currently, large nursing homes are trying to transform their traditional care for residents with dementia to a more home-like approach, by developing small-scale living facilities. It is often assumed that small-scale living will improve the quality of life of residents with dementia. However, little scientific evidence is currently available to test this. The following research question is addressed in this study: Which (combination of) changes in elements affects (different dimensions of) the quality of life of elderly residents with dementia in long-term care settings over the course of one year? Methods/design A longitudinal comparative study in traditional and small-scale long-term care settings, which follows a quasi-experimental design, will be carried out in Belgium and the Netherlands. To answer the research question, a model has been developed which incorporates relevant elements influencing quality of life in long-term care settings. Validated instruments will be used to evaluate the role of these elements, divided into environmental characteristics (country, type of ward, group size and nursing staff); basic personal characteristics (age, sex, cognitive decline, weight and activities of daily living); behavioural characteristics (behavioural problems and depression); behavioural interventions (use of restraints and use of psychotropic medication); and social interaction (social engagement and visiting frequency of relatives). The main outcome measure for residents in the model is quality of life. Data are collected at baseline, after six and twelve months, from residents living in either small-scale or traditional care settings. Discussion The results of this study will provide an insight into the determinants of quality of life for people with dementia living in traditional and small-scale long-term care settings in Belgium and the Netherlands. Possible relevant strengths and weaknesses of the study are discussed in this article. Trial registration ISRCTN: ISRCTN23772945 PMID:21539731
Gungor, Anil; Houser, Steven M; Aquino, Benjamin F; Akbar, Imran; Moinuddin, Rizwan; Mamikoglu, Bulent; Corey, Jacquelynne P
2004-01-01
Among the many methods of allergy diagnosis are intradermal testing (IDT) and skin-prick testing (SPT). The usefulness of IDT has been called into question by some authors, while others believe that studies demonstrating that SPT was superior might have been subject to bias. We conducted a study to compare the validity of SPT and IDT--specifically, the skin endpoint titration (SET) type of IDT--in diagnosing allergic rhinitis. We performed nasal provocation testing on 62 patients to establish an unbiased screening criterion for study entry. Acoustic rhinometric measurements of the nasal responses revealed that 34 patients tested positive and 28 negative. All patients were subsequently tested by SET and SPT. We found that SPT was more sensitive (85.3 vs 79.4%) and more specific (78.6 vs 67.9%) than SET as a screening procedure. The positive predictive value of SPT was greater than that of SET (82.9 vs 75.0%), as was the negative predictive value (81.5 vs 73.0%). None of these differences was statistically significant; because of the relatively small sample size, our study was powered to show only equivalency. The results of our study suggest that the information obtained by the SET method of IDT is comparable to that obtained by SPT in terms of sensitivity, specificity, and overall performance and that both SET and SPT correlate well with nasal provocation testing for ragweed. Therefore, the decision as to which to use can be based on other factors, such as the practitioner's training, the desire for quantitative results, the desire for rapid results, and the type of treatment (i.e., immunotherapy or pharmacotherapy) that is likely to be chosen on the basis of test results.
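For readers unfamiliar with the screening metrics quoted above, the following sketch computes sensitivity, specificity, PPV, and NPV from 2x2 counts; the counts are a reconstruction chosen to roughly reproduce the reported SPT figures (34 provocation-positive, 28 provocation-negative patients), not published raw data.

```python
# Sketch of screening-test metrics from 2x2 counts. The counts below are an
# inferred reconstruction that approximately matches the SPT results quoted
# above; they are not the study's tabulated data.
def screening_metrics(tp, fp, fn, tn):
    return {
        "sensitivity": tp / (tp + fn),   # positives correctly detected
        "specificity": tn / (tn + fp),   # negatives correctly ruled out
        "ppv": tp / (tp + fp),           # positive predictive value
        "npv": tn / (tn + fn),           # negative predictive value
    }

print(screening_metrics(tp=29, fp=6, fn=5, tn=22))
```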
Multi-institutional MicroCT image comparison of image-guided small animal irradiators
NASA Astrophysics Data System (ADS)
Johnstone, Chris D.; Lindsay, Patricia; E Graves, Edward; Wong, Eugene; Perez, Jessica R.; Poirier, Yannick; Ben-Bouchta, Youssef; Kanesalingam, Thilakshan; Chen, Haijian; E Rubinstein, Ashley; Sheng, Ke; Bazalova-Carter, Magdalena
2017-07-01
To recommend imaging protocols and establish tolerance levels for microCT image quality assurance (QA) performed on conformal image-guided small animal irradiators. A fully automated QA software SAPA (small animal phantom analyzer) for image analysis of the commercial Shelley micro-CT MCTP 610 phantom was developed, in which quantitative analyses of CT number linearity, signal-to-noise ratio (SNR), uniformity and noise, geometric accuracy, spatial resolution by means of modulation transfer function (MTF), and CT contrast were performed. Phantom microCT scans from eleven institutions acquired with four image-guided small animal irradiator units (including the commercial PXi X-RAD SmART and Xstrahl SARRP systems) with varying parameters used for routine small animal imaging were analyzed. Multi-institutional data sets were compared using SAPA, based on which tolerance levels for each QA test were established and imaging protocols for QA were recommended. By analyzing microCT data from 11 institutions, we established image QA tolerance levels for all image quality tests. CT number linearity set to R² > 0.990 was acceptable in microCT data acquired at all but three institutions. Acceptable SNR > 36 and noise levels <55 HU were obtained at five of the eleven institutions, where failing scans were acquired with a current-exposure time product of less than 120 mAs. Acceptable spatial resolution (>1.5 lp mm⁻¹ for MTF = 0.2) was obtained at all but four institutions due to their large image voxel size used (>0.275 mm). Ten of the eleven institutions passed the set QA tolerance for geometric accuracy (<1.5%) and nine of the eleven institutions passed the QA tolerance for contrast (>2000 HU for 30 mgI ml⁻¹). We recommend performing imaging QA with 70 kVp, 1.5 mA, 120 s imaging time, 0.20 mm voxel size, and a frame rate of 5 fps for the PXi X-RAD SmART. For the Xstrahl SARRP, we recommend using 60 kVp, 1.0 mA, 240 s imaging time, 0.20 mm voxel size, and 6 fps. These imaging protocols should result in high quality images that pass the set tolerance levels on all systems. Average SAPA computation time for complete QA analysis for a 0.20 mm voxel, 400 slice Shelley phantom microCT data set was less than 20 s. We present image quality assurance recommendations for image-guided small animal radiotherapy systems that can aid researchers in maintaining high image quality, allowing for spatially precise conformal dose delivery to small animals.
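Two of the QA metrics described above, CT-number linearity and SNR/noise in a uniform region, can be sketched as follows; the ROI handling, insert densities, HU values, and the particular SNR definition are illustrative assumptions rather than the SAPA implementation.

```python
# Sketch of two phantom QA metrics: CT-number linearity R^2 and SNR/noise
# from a uniform-material ROI. Example densities and HU values are made up.
import numpy as np

def ct_number_linearity_r2(known_densities, measured_hu):
    """R^2 of a linear fit of mean measured HU against known insert densities."""
    known = np.asarray(known_densities, dtype=float)
    hu = np.asarray(measured_hu, dtype=float)
    slope, intercept = np.polyfit(known, hu, 1)
    residuals = hu - (slope * known + intercept)
    ss_res = np.sum(residuals ** 2)
    ss_tot = np.sum((hu - hu.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

def snr_and_noise(uniform_roi):
    """One common definition: SNR = mean/std of a uniform ROI; noise = std (HU)."""
    roi = np.asarray(uniform_roi, dtype=float)
    return roi.mean() / roi.std(), roi.std()

densities = [0.30, 0.96, 1.12, 1.82]        # g/cm^3, example insert values
measured = [-700.0, -60.0, 90.0, 900.0]     # mean HU in each insert ROI
print("R^2:", ct_number_linearity_r2(densities, measured))
print("SNR, noise:", snr_and_noise(np.random.default_rng(2).normal(50, 30, (50, 50))))
```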
A support vector machine based test for incongruence between sets of trees in tree space
2012-01-01
Background The increased use of multi-locus data sets for phylogenetic reconstruction has increased the need to determine whether a set of gene trees significantly deviate from the phylogenetic patterns of other genes. Such unusual gene trees may have been influenced by other evolutionary processes such as selection, gene duplication, or horizontal gene transfer. Results Motivated by this problem we propose a nonparametric goodness-of-fit test for two empirical distributions of gene trees, and we developed the software GeneOut to estimate a p-value for the test. Our approach maps trees into a multi-dimensional vector space and then applies support vector machines (SVMs) to measure the separation between two sets of pre-defined trees. We use a permutation test to assess the significance of the SVM separation. To demonstrate the performance of GeneOut, we applied it to the comparison of gene trees simulated within different species trees across a range of species tree depths. Applied directly to sets of simulated gene trees with large sample sizes, GeneOut was able to detect very small differences between two set of gene trees generated under different species trees. Our statistical test can also include tree reconstruction into its test framework through a variety of phylogenetic optimality criteria. When applied to DNA sequence data simulated from different sets of gene trees, results in the form of receiver operating characteristic (ROC) curves indicated that GeneOut performed well in the detection of differences between sets of trees with different distributions in a multi-dimensional space. Furthermore, it controlled false positive and false negative rates very well, indicating a high degree of accuracy. Conclusions The non-parametric nature of our statistical test provides fast and efficient analyses, and makes it an applicable test for any scenario where evolutionary or other factors can lead to trees with different multi-dimensional distributions. The software GeneOut is freely available under the GNU public license. PMID:22909268
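A compact sketch of the core idea described above: measure how well an SVM separates two labeled sets of tree vectors and assess significance with a permutation test. The tree-to-vector mapping is not shown, and the classifier, cross-validation scheme, and synthetic data are assumptions for illustration rather than the GeneOut implementation.

```python
# Sketch: SVM separation between two sets of (precomputed) tree vectors,
# with a label-permutation test for significance.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

def svm_separation(X, y, cv=5):
    """Mean cross-validated accuracy of an SVM separating the two labeled sets."""
    return cross_val_score(SVC(kernel="linear"), X, y, cv=cv).mean()

def permutation_p_value(X, y, n_perm=200, seed=0):
    rng = np.random.default_rng(seed)
    observed = svm_separation(X, y)
    null = [svm_separation(X, rng.permutation(y)) for _ in range(n_perm)]
    return observed, (np.sum(np.array(null) >= observed) + 1) / (n_perm + 1)

rng = np.random.default_rng(3)
X = np.vstack([rng.normal(0.0, 1, (30, 10)), rng.normal(0.8, 1, (30, 10))])
y = np.array([0] * 30 + [1] * 30)
print(permutation_p_value(X, y, n_perm=50))
```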
Hannon, Peggy A.; Helfrich, Christian D.; Chan, K. Gary; Allen, Claire L.; Hammerback, Kristen; Kohn, Marlana J.; Parrish, Amanda T.; Weiner, Bryan J.; Harris, Jeffrey R.
2016-01-01
Purpose To develop a theory-based questionnaire to assess readiness for change in small workplaces adopting wellness programs. Design In developing our scale, we first tested items via “think-aloud” interviews. We tested the revised items in a cross-sectional quantitative telephone survey. Setting Small workplaces (20–250 employees) in low-wage industries. Subjects Decision-makers representing small workplaces in King County, Washington (think-aloud interviews, n=9) and the United States (telephone survey, n=201). Measures We generated items for each construct in Weiner’s theory of organizational readiness for change. We also measured workplace characteristics and current implementation of workplace wellness programs. Analysis We assessed reliability by coefficient alpha for each of the readiness questionnaire subscales. We tested the association of all subscales with employers’ current implementation of wellness policies, programs, and communications, and conducted a path analysis to test the associations in the theory of organizational readiness to change. Results Each of the readiness subscales exhibited acceptable internal reliability (coefficient alpha range = .75–.88) and was positively associated with wellness program implementation (p <.05). The path analysis was consistent with the theory of organizational readiness to change, except change efficacy did not predict change-related effort. Conclusion We developed a new questionnaire to assess small workplaces’ readiness to adopt and implement evidence-based wellness programs. Our findings also provide empirical validation of Weiner’s theory of readiness for change. PMID:26389975
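A small sketch of the internal-reliability statistic reported above (coefficient alpha) computed for one subscale; the simulated item responses are placeholders, not the survey data.

```python
# Sketch of coefficient (Cronbach's) alpha for one questionnaire subscale.
import numpy as np

def cronbach_alpha(items):
    """items: (n_respondents, n_items) array of item scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1.0 - item_vars / total_var)

rng = np.random.default_rng(4)
latent = rng.normal(size=(201, 1))                          # shared "readiness" factor
responses = latent + rng.normal(scale=0.7, size=(201, 5))   # five correlated items
print(round(cronbach_alpha(responses), 2))
```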
Hannon, Peggy A; Helfrich, Christian D; Chan, K Gary; Allen, Claire L; Hammerback, Kristen; Kohn, Marlana J; Parrish, Amanda T; Weiner, Bryan J; Harris, Jeffrey R
2017-01-01
To develop a theory-based questionnaire to assess readiness for change in small workplaces adopting wellness programs. In developing our scale, we first tested items via "think-aloud" interviews. We tested the revised items in a cross-sectional quantitative telephone survey. The study setting comprised small workplaces (20-250 employees) in low-wage industries. Decision-makers representing small workplaces in King County, Washington (think-aloud interviews, n = 9), and the United States (telephone survey, n = 201) served as study subjects. We generated items for each construct in Weiner's theory of organizational readiness for change. We also measured workplace characteristics and current implementation of workplace wellness programs. We assessed reliability by coefficient alpha for each of the readiness questionnaire subscales. We tested the association of all subscales with employers' current implementation of wellness policies, programs, and communications, and conducted a path analysis to test the associations in the theory of organizational readiness to change. Each of the readiness subscales exhibited acceptable internal reliability (coefficient alpha range, .75-.88) and was positively associated with wellness program implementation ( p < .05). The path analysis was consistent with the theory of organizational readiness to change, except change efficacy did not predict change-related effort. We developed a new questionnaire to assess small workplaces' readiness to adopt and implement evidence-based wellness programs. Our findings also provide empirical validation of Weiner's theory of readiness for change.
A Maximum Entropy Test for Evaluating Higher-Order Correlations in Spike Counts
Onken, Arno; Dragoi, Valentin; Obermayer, Klaus
2012-01-01
Evaluating the importance of higher-order correlations of neural spike counts has been notoriously hard. A large number of samples are typically required in order to estimate higher-order correlations and resulting information theoretic quantities. In typical electrophysiology data sets with many experimental conditions, however, the number of samples in each condition is rather small. Here we describe a method that makes it possible to quantify evidence for higher-order correlations in exactly these cases. We construct a family of reference distributions: maximum entropy distributions, which are constrained only by marginals and by linear correlations as quantified by the Pearson correlation coefficient. We devise a Monte Carlo goodness-of-fit test, which tests - for a given divergence measure of interest - whether the experimental data lead to the rejection of the null hypothesis that they were generated by one of the reference distributions. Applying our test to artificial data shows that the effects of higher-order correlations on these divergence measures can be detected even when the number of samples is small. Subsequently, we apply our method to spike count data which were recorded with multielectrode arrays from the primary visual cortex of an anesthetized cat during an adaptation experiment. Using mutual information as a divergence measure we find that there are spike count bin sizes at which the maximum entropy hypothesis can be rejected for a substantial number of neuronal pairs. These results demonstrate that higher-order correlations can matter when estimating information theoretic quantities in V1. They also show that our test is able to detect their presence in typical in-vivo data sets, where the number of samples is too small to estimate higher-order correlations directly. PMID:22685392
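A simplified sketch of a Monte Carlo goodness-of-fit test in the spirit described above: compare the observed divergence from a reference distribution with divergences of samples drawn from that reference. The Gaussian-copula reference used here only mimics a fixed marginal scale and linear correlation and is a crude stand-in for the maximum-entropy construction in the paper.

```python
# Sketch of a Monte Carlo goodness-of-fit test against a reference
# distribution; the reference sampler below is a simplifying assumption.
import numpy as np

EDGES = [np.arange(-0.5, 11.5, 1.0)] * 2   # shared bin edges for 2-D count histograms

def divergence(sample_a, sample_b):
    """Symmetrized KL between empirical 2-D histograms (small pseudocount added)."""
    pa, _ = np.histogramdd(sample_a, bins=EDGES)
    pb, _ = np.histogramdd(sample_b, bins=EDGES)
    pa = (pa + 1e-3).ravel(); pa /= pa.sum()
    pb = (pb + 1e-3).ravel(); pb /= pb.sum()
    return 0.5 * np.sum(pa * np.log(pa / pb)) + 0.5 * np.sum(pb * np.log(pb / pa))

def mc_gof_pvalue(observed, sampler, n_mc=100, seed=0):
    rng = np.random.default_rng(seed)
    reference = sampler(5000, rng)                 # large sample stands in for the reference
    d_obs = divergence(observed, reference)
    d_null = [divergence(sampler(len(observed), rng), reference) for _ in range(n_mc)]
    return (np.sum(np.array(d_null) >= d_obs) + 1) / (n_mc + 1)

def copula_sampler(n, rng, rho=0.3):
    """Crude reference: integer counts with a fixed scale and linear correlation."""
    z = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=n)
    return np.clip(np.round(3 + 2 * z), 0, 10)

observed = copula_sampler(100, np.random.default_rng(5))
print(mc_gof_pvalue(observed, copula_sampler))
```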
36 CFR 223.103 - Award of small business set-aside sales.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 36 Parks, Forests, and Public Property 2 2010-07-01 2010-07-01 false Award of small business set....103 Award of small business set-aside sales. If timber is advertised as set aside for competitive bidding by small business concerns, award will be made to the highest bidder who qualifies as a small...
48 CFR 5119.1070-2 - Emerging small business set-aside.
Code of Federal Regulations, 2013 CFR
2013-10-01
... 48 Federal Acquisition Regulations System 7 2013-10-01 2012-10-01 true Emerging small business set... ACQUISITION REGULATIONS SMALL BUSINESS AND SMALL DISADVANTAGED BUSINESS CONCERNS Small Business Competitiveness Demonstration Program 5119.1070-2 Emerging small business set-aside. (a)(S-90) Solicitations for...
2014-02-11
ISS038-E-044916 (11 Feb. 2014) --- A set of NanoRacks CubeSats is photographed by an Expedition 38 crew member after the deployment by the Small Satellite Orbital Deployer (SSOD). The CubeSats program contains a variety of experiments such as Earth observations and advanced electronics testing.
Zörnig, Peter
2015-08-01
We present integer programming models for some variants of the farthest string problem. The number of variables and constraints is substantially less than that of the integer linear programming models known in the literature. Moreover, the solution of the linear programming-relaxation contains only a small proportion of noninteger values, which considerably simplifies the rounding process. Numerical tests have shown excellent results, especially when a small set of long sequences is given.
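For orientation, the sketch below encodes a textbook integer-programming formulation of the farthest string problem (maximize the minimum Hamming distance to the given strings) using the PuLP library; this is an assumed illustration, not the more compact models proposed in the paper.

```python
# Textbook ILP for the farthest string problem, solved with PuLP/CBC.
import pulp

def farthest_string(strings, alphabet="ACGT"):
    length = len(strings[0])
    prob = pulp.LpProblem("farthest_string", pulp.LpMaximize)
    x = pulp.LpVariable.dicts("x", (range(length), list(alphabet)), cat="Binary")
    d = pulp.LpVariable("min_distance", lowBound=0, cat="Integer")
    prob += d                                               # maximize the minimum Hamming distance
    for i in range(length):
        prob += pulp.lpSum(x[i][c] for c in alphabet) == 1  # exactly one letter per position
    for s in strings:
        # Hamming distance to s = number of positions where we do NOT pick s[i]
        prob += pulp.lpSum(1 - x[i][s[i]] for i in range(length)) >= d
    prob.solve(pulp.PULP_CBC_CMD(msg=False))
    result = "".join(c for i in range(length) for c in alphabet if pulp.value(x[i][c]) > 0.5)
    return result, pulp.value(d)

print(farthest_string(["ACGT", "AGGT", "ACGA"]))
```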
A Paper and Plastic Device for Performing Recombinase Polymerase Amplification of HIV DNA
Rohrman, Brittany A.; Richards-Kortum, Rebecca R.
2013-01-01
Despite the importance of early diagnosis and treatment of HIV, only a small fraction of HIV-exposed infants in low- and middle-income countries are tested for the disease. The gold standard for early infant diagnosis, DNA PCR, requires resources that are unavailable in poor settings, and no point-of-care HIV DNA test is currently available. We have developed a device constructed of layers of paper, glass fiber, and plastic that is capable of performing isothermal, enzymatic amplification of HIV DNA. The device is inexpensive, small, light-weight, and easy to assemble. The device stores lyophilized enzymes, facilitates mixing of reaction components, and supports recombinase polymerase amplification in five steps of operation. Using commercially available lateral flow strips as a detection method, we demonstrate the ability of our device to amplify 10 copies of HIV DNA to detectable levels in 15 minutes. Our results suggest that our device, which is designed to be used after DNA extraction from dried-blood spots, may serve in conjunction with lateral flow strips as part of a point-of-care HIV DNA test to be used in low resource settings. PMID:22733333
A paper and plastic device for performing recombinase polymerase amplification of HIV DNA.
Rohrman, Brittany A; Richards-Kortum, Rebecca R
2012-09-07
Despite the importance of early diagnosis and treatment of HIV, only a small fraction of HIV-exposed infants in low- and middle-income countries are tested for the disease. The gold standard for early infant diagnosis, DNA PCR, requires resources that are unavailable in poor settings, and no point-of-care HIV DNA test is currently available. We have developed a device constructed of layers of paper, glass fiber, and plastic that is capable of performing isothermal, enzymatic amplification of HIV DNA. The device is inexpensive, small, light-weight, and easy to assemble. The device stores lyophilized enzymes, facilitates mixing of reaction components, and supports recombinase polymerase amplification in five steps of operation. Using commercially available lateral flow strips as a detection method, we demonstrate the ability of our device to amplify 10 copies of HIV DNA to detectable levels in 15 min. Our results suggest that our device, which is designed to be used after DNA extraction from dried-blood spots, may serve in conjunction with lateral flow strips as part of a point-of-care HIV DNA test to be used in low resource settings.
Economic analysis of ALK testing and crizotinib therapy for advanced non-small-cell lung cancer.
Lu, Shun; Zhang, Jie; Ye, Ming; Wang, Baoai; Wu, Bin
2016-06-01
The economic outcome of crizotinib in advanced non-small-cell lung cancer harboring anaplastic lymphoma kinase rearrangement was investigated. Based on a mathematical model, the economic outcomes of three techniques for testing ALK gene rearrangement combined with crizotinib therapy were evaluated and compared with the traditional regimen. The impact of the crizotinib patient assistance program (PAP) was assessed. Ventana immunohistochemistry, quantitative real-time reverse transcription-polymerase chain reaction, and IHC testing plus fluorescent in situ hybridization confirmation for anaplastic lymphoma kinase testing followed by crizotinib treatment led to incremental cost-effectiveness ratios of US$16,820 and US$223,242, US$24,424 and US$223,271, and US$16,850 and US$254,668 per quality-adjusted life-year gained with and without PAP, respectively. Gene-guided crizotinib therapy might be a cost-effective alternative compared with the traditional regimen in the PAP setting.
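The headline figures above are incremental cost-effectiveness ratios; a minimal sketch of the underlying arithmetic is shown below with placeholder costs and QALYs, not the study's model inputs.

```python
# Minimal ICER arithmetic with placeholder inputs (not the study's values).
def icer(cost_new, qaly_new, cost_old, qaly_old):
    """Incremental cost per quality-adjusted life-year gained."""
    return (cost_new - cost_old) / (qaly_new - qaly_old)

print(icer(cost_new=90_000, qaly_new=1.25, cost_old=55_000, qaly_old=0.95))  # USD per QALY
```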
Zhang, Xinyuan; Zheng, Nan; Rosania, Gus R
2008-09-01
Cell-based molecular transport simulations are being developed to facilitate exploratory cheminformatic analysis of virtual libraries of small drug-like molecules. For this purpose, mathematical models of single cells are built from equations capturing the transport of small molecules across membranes. In turn, physicochemical properties of small molecules can be used as input to simulate intracellular drug distribution, through time. Here, with mathematical equations and biological parameters adjusted so as to mimic a leukocyte in the blood, simulations were performed to analyze steady state, relative accumulation of small molecules in lysosomes, mitochondria, and cytosol of this target cell, in the presence of a homogenous extracellular drug concentration. Similarly, with equations and parameters set to mimic an intestinal epithelial cell, simulations were also performed to analyze steady state, relative distribution and transcellular permeability in this non-target cell, in the presence of an apical-to-basolateral concentration gradient. With a test set of ninety-nine monobasic amines gathered from the scientific literature, simulation results helped analyze relationships between the chemical diversity of these molecules and their intracellular distributions.
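A minimal sketch of a single-cell compartment model in the spirit described above: passive flux across the plasma membrane plus asymmetric lysosomal uptake/release, with the extracellular concentration held fixed. The rate constants are arbitrary assumptions, not the published model's parameters.

```python
# Two-compartment cell model: cytosol and lysosome, fixed extracellular level.
import numpy as np
from scipy.integrate import solve_ivp

C_EXT = 1.0                         # fixed extracellular concentration (arbitrary units)
K_PM = 0.5                          # plasma-membrane transfer rate (1/min)
K_LYS_IN, K_LYS_OUT = 0.8, 0.05     # lysosomal uptake/release; asymmetry mimics ion trapping

def rhs(t, y):
    c_cyt, c_lys = y
    flux_in = K_PM * (C_EXT - c_cyt)                  # across the plasma membrane
    flux_lys = K_LYS_IN * c_cyt - K_LYS_OUT * c_lys   # into/out of the lysosome
    return [flux_in - flux_lys, flux_lys]

sol = solve_ivp(rhs, (0, 200), [0.0, 0.0], t_eval=[200])
print("near-steady-state cytosol, lysosome:", sol.y[:, -1])  # lysosome >> cytosol: accumulation
```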
Learning About Cockpit Automation: From Piston Trainer to Jet Transport
NASA Technical Reports Server (NTRS)
Casner, Stephen M.
2003-01-01
Two experiments explored the idea of providing cockpit automation training to airline-bound student pilots using cockpit automation equipment commonly found in small training airplanes. In a first experiment, pilots mastered a set of tasks and maneuvers using a GPS navigation computer, autopilot, and flight director system installed in a small training airplane. Students were then tested on their ability to complete a similar set of tasks using the cockpit automation system found in a popular jet transport aircraft. Pilots were able to successfully complete 77% of all tasks in the jet transport on their first attempt. An analysis of a control group suggests that the pilots' success was attributable to the application of automation principles they had learned in the small airplane. A second experiment looked at two different ways of delivering small-airplane cockpit automation training: a self-study method and a dual instruction method. The results showed a slight advantage for the self-study method. Overall, the results of the two studies cast a strong vote for the incorporation of cockpit automation training in curricula designed for pilots who will later transition to the jet fleet.
Experimental Optimization of a Free-to-Rotate Wing for Small UAS
NASA Technical Reports Server (NTRS)
Logan, Michael J.; DeLoach, Richard; Copeland, Tiwana; Vo, Steven
2014-01-01
This paper discusses an experimental investigation conducted to optimize a free-to-rotate wing for use on a small unmanned aircraft system (UAS). Although free-to-rotate wings have been used for decades on various small UAS and small manned aircraft, little is known about how to optimize these unusual wings for a specific application. The paper discusses some of the design rationale of the basic wing. In addition, three main parameters were selected for "optimization": wing camber, wing pivot location, and wing center of gravity (c.g.) location. A small apparatus was constructed to enable some simple experimental analysis of these parameters. A design-of-experiments series of tests was first conducted to discern which of the main optimization parameters were most likely to have the greatest impact on the outputs of interest, namely, some measure of "stability", some measure of the lift being generated at the neutral position, and how quickly the wing "recovers" from an upset. A second set of tests was conducted to develop a response-surface numerical representation of these outputs as functions of the three primary inputs. The response-surface numerical representations were then used to develop an "optimum" within the trade space investigated. The results of the optimization were then tested experimentally to validate the predictions.
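A brief sketch of the response-surface step described above: fit a quadratic surface to designed-experiment data in three coded factors and read off a candidate optimum on a grid. Factor names, data, and the true underlying surface are illustrative assumptions, not the wing-test measurements.

```python
# Quadratic response-surface fit over three coded factors, then grid search
# for a candidate optimum. Data are simulated for illustration.
import numpy as np

def quadratic_design_matrix(X):
    """Columns: 1, x1, x2, x3, x1^2, x2^2, x3^2, x1*x2, x1*x3, x2*x3."""
    x1, x2, x3 = X.T
    return np.column_stack([np.ones(len(X)), x1, x2, x3,
                            x1**2, x2**2, x3**2, x1*x2, x1*x3, x2*x3])

rng = np.random.default_rng(6)
X = rng.uniform(-1, 1, (30, 3))   # coded camber, pivot location, c.g. location (assumed factors)
y = 1 - (X[:, 0] - 0.2)**2 - 0.5*(X[:, 1] + 0.1)**2 - 0.3*X[:, 2]**2 + rng.normal(0, 0.02, 30)

coef, *_ = np.linalg.lstsq(quadratic_design_matrix(X), y, rcond=None)

# Evaluate the fitted surface on a coarse grid and pick the best point.
grid = np.array(np.meshgrid(*[np.linspace(-1, 1, 21)] * 3)).reshape(3, -1).T
best = grid[np.argmax(quadratic_design_matrix(grid) @ coef)]
print("predicted optimum (coded units):", np.round(best, 2))
```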
Back-pressure Effect on Shock-Train Location in a Scramjet Engine Isolator
2010-03-01
Side project: making an actuator stand. Figure 8: Main manual shut-off valve. Figure 9: A small... characteristic about this wind tunnel. With the Mach 1.8 nozzle, prior to test runs, the upstream regulator pressure valve (Figure 9) was set at
Baltzer, Lars
2011-06-01
A new concept for protein recognition and binding is highlighted. The conjugation of small organic molecules or short peptides to polypeptides from a designed set provides binder molecules that bind proteins with high affinities, and with selectivities that are equal to those of antibodies. The small organic molecules or peptides need to bind the protein targets but only with modest affinities and selectivities, because conjugation to the polypeptides results in molecules with dramatically improved binder performance. The polypeptides are selected from a set of only sixteen sequences designed to bind, in principle, any protein. The small number of polypeptides used to prepare high-affinity binders contrasts sharply with the huge libraries used in binder technologies based on selection or immunization. Also, unlike antibodies and engineered proteins, the polypeptides have unordered three-dimensional structures and adapt to the proteins to which they bind. Binder molecules for the C-reactive protein, human carbonic anhydrase II, acetylcholine esterase, thymidine kinase 1, phosphorylated proteins, the D-dimer, and a number of antibodies are used as examples to demonstrate that affinities are achieved that are higher than those of the small molecules or peptides by as much as four orders of magnitude. Evaluation by pull-down experiments and ELISA-based tests in human serum show selectivities to be equal to those of antibodies. Small organic molecules and peptides are readily available from pools of endogenous ligands, enzyme substrates, inhibitors or products, from screened small molecule libraries, from phage display, and from mRNA display. The technology is an alternative to established binder concepts for applications in drug development, diagnostics, medical imaging, and protein separation.
48 CFR 2919.502 - Setting aside acquisitions.
Code of Federal Regulations, 2010 CFR
2010-10-01
... whether procurements should be conducted via 8(a) procedures, HUBZone procedures or as small business set-asides. If a reasonable expectation exists that at least two responsible small businesses may submit... PROGRAMS SMALL BUSINESS AND SMALL DISADVANTAGED BUSINESS CONCERNS Set-Asides for Small Business 2919.502...
Cooper, J J; Brayford, M J; Laycock, P A
2014-08-01
A new method is described which can be used to determine the setting times of small amounts of high value bone cements. The test was developed to measure how the setting times of a commercially available synthetic calcium sulfate cement (Stimulan, Biocomposites, UK) in two forms (standard and Rapid Cure) varies with the addition of clinically relevant antibiotics. The importance of being able to accurately quantify these setting times is discussed. The results demonstrate that this new method, which is shown to correlate to the Vicat needle, gives reliable and repeatable data with additional benefits expressed in the article. The majority of antibiotics mixed were found to retard the setting reaction of the calcium sulfate cement.
48 CFR 19.507 - Automatic dissolution of a small business set-aside.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 48 Federal Acquisition Regulations System 1 2011-10-01 2011-10-01 false Automatic dissolution of a small business set-aside. 19.507 Section 19.507 Federal Acquisition Regulations System FEDERAL... Automatic dissolution of a small business set-aside. (a) If a small business set-aside acquisition or...
48 CFR 19.507 - Automatic dissolution of a small business set-aside.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Automatic dissolution of a small business set-aside. 19.507 Section 19.507 Federal Acquisition Regulations System FEDERAL... Automatic dissolution of a small business set-aside. (a) If a small business set-aside acquisition or...
48 CFR 19.507 - Automatic dissolution of a small business set-aside.
Code of Federal Regulations, 2014 CFR
2014-10-01
... 48 Federal Acquisition Regulations System 1 2014-10-01 2014-10-01 false Automatic dissolution of a small business set-aside. 19.507 Section 19.507 Federal Acquisition Regulations System FEDERAL... Automatic dissolution of a small business set-aside. (a) If a small business set-aside acquisition or...
48 CFR 19.507 - Automatic dissolution of a small business set-aside.
Code of Federal Regulations, 2012 CFR
2012-10-01
... 48 Federal Acquisition Regulations System 1 2012-10-01 2012-10-01 false Automatic dissolution of a small business set-aside. 19.507 Section 19.507 Federal Acquisition Regulations System FEDERAL... Automatic dissolution of a small business set-aside. (a) If a small business set-aside acquisition or...
48 CFR 19.507 - Automatic dissolution of a small business set-aside.
Code of Federal Regulations, 2013 CFR
2013-10-01
... 48 Federal Acquisition Regulations System 1 2013-10-01 2013-10-01 false Automatic dissolution of a small business set-aside. 19.507 Section 19.507 Federal Acquisition Regulations System FEDERAL... Automatic dissolution of a small business set-aside. (a) If a small business set-aside acquisition or...
Testing a small UAS for mapping artisanal diamond mining sites in Africa
Malpeli, Katherine C.; Chirico, Peter G.
2015-01-01
Remote sensing technology is advancing at an unprecedented rate. At the forefront of the new technological developments are unmanned aircraft systems (UAS). The advent of small, lightweight, low-cost, and user-friendly UAS is greatly expanding the potential applications of remote sensing technology and improving the set of tools available to researchers seeking to map and monitor terrain from above. In this article, we explore the applications of a small UAS for mapping informal diamond mining sites in Africa. We found that this technology provides aerial imagery of unparalleled resolution in a data-sparse, difficult to access, and remote terrain.
NASA Astrophysics Data System (ADS)
Milroy, Daniel J.; Baker, Allison H.; Hammerling, Dorit M.; Jessup, Elizabeth R.
2018-02-01
The Community Earth System Model Ensemble Consistency Test (CESM-ECT) suite was developed as an alternative to requiring bitwise identical output for quality assurance. This objective test provides a statistical measurement of consistency between an accepted ensemble created by small initial temperature perturbations and a test set of CESM simulations. In this work, we extend the CESM-ECT suite with an inexpensive and robust test for ensemble consistency that is applied to Community Atmospheric Model (CAM) output after only nine model time steps. We demonstrate that adequate ensemble variability is achieved with instantaneous variable values at the ninth step, despite rapid perturbation growth and heterogeneous variable spread. We refer to this new test as the Ultra-Fast CAM Ensemble Consistency Test (UF-CAM-ECT) and demonstrate its effectiveness in practice, including its ability to detect small-scale events and its applicability to the Community Land Model (CLM). The new ultra-fast test facilitates CESM development, porting, and optimization efforts, particularly when used to complement information from the original CESM-ECT suite of tools.
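A rough sketch of the ensemble-consistency idea described above: summarize an accepted ensemble by per-variable statistics and flag a test run whose standardized scores fall outside the ensemble spread. This simple z-score check is a stand-in for, not a reproduction of, the PCA-based CESM-ECT algorithm.

```python
# Simplified ensemble-consistency check: per-variable z-scores of a test run
# against an accepted ensemble. Data and tolerance are illustrative.
import numpy as np

def consistency_check(ensemble, test_run, z_tol=3.5):
    """ensemble: (n_runs, n_vars) summary values; test_run: (n_vars,)."""
    mean = ensemble.mean(axis=0)
    std = ensemble.std(axis=0, ddof=1)
    z = (test_run - mean) / std
    failing = np.where(np.abs(z) > z_tol)[0]
    return len(failing) == 0, failing

rng = np.random.default_rng(7)
ensemble = rng.normal(0.0, 1.0, size=(150, 40))   # e.g. 150 perturbed-initial-condition runs
ok_run = rng.normal(0.0, 1.0, size=40)
bad_run = ok_run.copy(); bad_run[5] += 8.0        # one variable far outside the ensemble spread
print(consistency_check(ensemble, ok_run)[0], consistency_check(ensemble, bad_run)[0])
```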
ERIC Educational Resources Information Center
Li, Yanmei
2012-01-01
In a common-item (anchor) equating design, the common items should be evaluated for item parameter drift. Drifted items are often removed. For a test that contains mostly dichotomous items and only a small number of polytomous items, removing some drifted polytomous anchor items may result in anchor sets that no longer resemble mini-versions of…
Real Time Fault Detection and Diagnostics Using FPGA-Based Architectures
2010-03-01
vector sets sent to the DUT. The testing platform combines a myriad of testing and measuring equipment and work hours onto one small reprogrammable ...recently few reprogrammable devices have been used on spacecraft due to their sensitivity to involuntary reconfiguration due to Single Event Upsets...Determination of Nuclear Yield from Thermal Degradation of Automobile Paint MS Thesis. AFIT/GWM/ENP/10-M10. Wright-Patterson AFB OH: Graduate School of
Small molecule absorption by PDMS in the context of drug response bioassays.
van Meer, B J; de Vries, H; Firth, K S A; van Weerd, J; Tertoolen, L G J; Karperien, H B J; Jonkheijm, P; Denning, C; IJzerman, A P; Mummery, C L
2017-01-08
The polymer polydimethylsiloxane (PDMS) is widely used to build microfluidic devices compatible with cell culture. Whilst convenient in manufacture, PDMS has the disadvantage that it can absorb small molecules such as drugs. In microfluidic devices like "Organs-on-Chip", designed to examine cell behavior and test the effects of drugs, this might impact drug bioavailability. Here we developed an assay to compare the absorption of a test set of four cardiac drugs by PDMS based on measuring the residual non-absorbed compound by High Pressure Liquid Chromatography (HPLC). We showed that absorption was variable and time dependent and not determined exclusively by hydrophobicity as claimed previously. We demonstrated that two commercially available lipophilic coatings and the presence of cells affected absorption. The use of lipophilic coatings may be useful in preventing small molecule absorption by PDMS. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
Genetic Testing in Clinical Settings.
Franceschini, Nora; Frick, Amber; Kopp, Jeffrey B
2018-04-11
Genetic testing is used for screening, diagnosis, and prognosis of diseases consistent with a genetic cause and to guide drug therapy to improve drug efficacy and avoid adverse effects (pharmacogenomics). This In Practice review aims to inform about DNA-related genetic test availability, interpretation, and recommended clinical actions based on results using evidence from clinical guidelines, when available. We discuss challenges that limit the widespread use of genetic information in the clinical care setting, including a small number of actionable genetic variants with strong evidence of clinical validity and utility, and the need for improving the health literacy of health care providers and the public, including for direct-to-consumer tests. Ethical, legal, and social issues and incidental findings also need to be addressed. Because our understanding of genetic factors associated with disease and drug response is rapidly increasing and new genetic tests are being developed that could be adopted by clinicians in the short term, we also provide extensive resources for information and education on genetic testing. Copyright © 2018 National Kidney Foundation, Inc. All rights reserved.
Accounting for measurement error in log regression models with applications to accelerated testing.
Richardson, Robert; Tolley, H Dennis; Evenson, William E; Lunt, Barry M
2018-01-01
In regression settings, parameter estimates will be biased when the explanatory variables are measured with error. This bias can significantly affect modeling goals. In particular, accelerated lifetime testing involves an extrapolation of the fitted model, and a small amount of bias in parameter estimates may result in a significant increase in the bias of the extrapolated predictions. Additionally, bias may arise when the stochastic component of a log regression model is assumed to be multiplicative when the actual underlying stochastic component is additive. To account for these possible sources of bias, a log regression model with measurement error and additive error is approximated by a weighted regression model which can be estimated using Iteratively Re-weighted Least Squares. Using the reduced Eyring equation in an accelerated testing setting, the model is compared to previously accepted approaches to modeling accelerated testing data with both simulations and real data.
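A generic sketch of Iteratively Re-weighted Least Squares for a log-scale regression whose weights depend on the current fit; the weight function and simulated heteroscedastic data are illustrative assumptions, not the weighted model derived in the paper.

```python
# Generic IRLS for a log-scale regression with fit-dependent weights.
import numpy as np

def irls(X, y, weight_fn, n_iter=25):
    beta = np.linalg.lstsq(X, y, rcond=None)[0]       # ordinary least-squares start
    for _ in range(n_iter):
        w = weight_fn(X @ beta)                       # weights from current fitted values
        sw = np.sqrt(w)
        beta = np.linalg.lstsq(sw[:, None] * X, sw * y, rcond=None)[0]
    return beta

rng = np.random.default_rng(8)
stress = rng.uniform(0.8, 1.4, 200)                   # e.g. a coded stress variable (assumed)
X = np.column_stack([np.ones_like(stress), stress])
log_life = 1.0 + 2.5 * stress + rng.normal(0, 0.1 + 0.2 * stress, 200)  # heteroscedastic noise
print(irls(X, log_life, weight_fn=lambda mu: 1.0 / (0.1 + 0.2 * np.abs(mu))**2))
```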
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aldegunde, Manuel, E-mail: M.A.Aldegunde-Rodriguez@warwick.ac.uk; Kermode, James R., E-mail: J.R.Kermode@warwick.ac.uk; Zabaras, Nicholas
This paper presents the development of a new exchange–correlation functional from the point of view of machine learning. Using atomization energies of solids and small molecules, we train a linear model for the exchange enhancement factor using a Bayesian approach which allows for the quantification of uncertainties in the predictions. A relevance vector machine is used to automatically select the most relevant terms of the model. We then test this model on atomization energies and also on bulk properties. The average model provides a mean absolute error of only 0.116 eV for the test points of the G2/97 set but a larger 0.314 eV for the test solids. In terms of bulk properties, the prediction for transition metals and monovalent semiconductors has a very low test error. However, as expected, predictions for types of materials not represented in the training set such as ionic solids show much larger errors.
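The sparse Bayesian regression step can be illustrated with scikit-learn's ARDRegression, which fits a linear model with per-coefficient relevance priors in the spirit of a relevance vector machine; the candidate basis terms and synthetic data below are assumptions, not the exchange-enhancement model or training set used in the paper.

```python
import numpy as np
from sklearn.linear_model import ARDRegression

# Toy stand-in for an enhancement-factor model: a linear expansion in a few
# candidate basis terms, only some of which are actually relevant.
rng = np.random.default_rng(1)
s = rng.uniform(0.0, 3.0, size=(200, 1))                    # assumed scalar input
basis = np.hstack([s, s**2, s**3, np.exp(-s), np.sin(s)])   # candidate terms
true_coef = np.array([0.8, 0.0, 0.0, -0.5, 0.0])            # only two terms matter
y = basis @ true_coef + rng.normal(0, 0.02, size=200)

X_train, X_test = basis[:150], basis[150:]
y_train, y_test = y[:150], y[150:]

# ARDRegression places an individual precision on each coefficient, so
# irrelevant terms are automatically driven toward zero.
model = ARDRegression()
model.fit(X_train, y_train)

mean, std = model.predict(X_test, return_std=True)          # predictive uncertainty
print("coefficients:", np.round(model.coef_, 3))
print("mean absolute error:", np.mean(np.abs(mean - y_test)))
print("mean predictive std:", std.mean())
```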
36 CFR 223.103 - Award of small business set-aside sales.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 36 Parks, Forests, and Public Property 2 2013-07-01 2013-07-01 false Award of small business set... PRODUCTS Timber Sale Contracts Award of Contracts § 223.103 Award of small business set-aside sales. If timber is advertised as set aside for competitive bidding by small business concerns, award will be made...
36 CFR 223.103 - Award of small business set-aside sales.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 36 Parks, Forests, and Public Property 2 2011-07-01 2011-07-01 false Award of small business set... PRODUCTS Timber Sale Contracts Award of Contracts § 223.103 Award of small business set-aside sales. If timber is advertised as set aside for competitive bidding by small business concerns, award will be made...
36 CFR 223.103 - Award of small business set-aside sales.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 36 Parks, Forests, and Public Property 2 2012-07-01 2012-07-01 false Award of small business set... PRODUCTS Timber Sale Contracts Award of Contracts § 223.103 Award of small business set-aside sales. If timber is advertised as set aside for competitive bidding by small business concerns, award will be made...
36 CFR 223.103 - Award of small business set-aside sales.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 36 Parks, Forests, and Public Property 2 2014-07-01 2014-07-01 false Award of small business set... PRODUCTS Timber Sale Contracts Award of Contracts § 223.103 Award of small business set-aside sales. If timber is advertised as set aside for competitive bidding by small business concerns, award will be made...
48 CFR 52.219-7 - Notice of Partial Small Business Set-Aside.
Code of Federal Regulations, 2010 CFR
2010-10-01
... Clauses 52.219-7 Notice of Partial Small Business Set-Aside. As prescribed in 19.508(d), insert the following clause: Notice of Partial Small Business Set-Aside (JUN 2003) (a) Definitions. Small business..., and qualified as a small business under the size standards in this solicitation. (b) General. (1) A...
48 CFR 52.219-7 - Notice of Partial Small Business Set-Aside.
Code of Federal Regulations, 2012 CFR
2012-10-01
... Clauses 52.219-7 Notice of Partial Small Business Set-Aside. As prescribed in 19.508(d), insert the following clause: Notice of Partial Small Business Set-Aside (JUN 2003) (a) Definitions. Small business..., and qualified as a small business under the size standards in this solicitation. (b) General. (1) A...
48 CFR 52.219-7 - Notice of Partial Small Business Set-Aside.
Code of Federal Regulations, 2013 CFR
2013-10-01
... Clauses 52.219-7 Notice of Partial Small Business Set-Aside. As prescribed in 19.508(d), insert the following clause: Notice of Partial Small Business Set-Aside (JUN 2003) (a) Definitions. Small business..., and qualified as a small business under the size standards in this solicitation. (b) General. (1) A...
48 CFR 52.219-6 - Notice of Total Small Business Set-Aside.
Code of Federal Regulations, 2013 CFR
2013-10-01
... Clauses 52.219-6 Notice of Total Small Business Set-Aside. As prescribed in 19.508(c), insert the following clause: Notice of Total Small Business Set-Aside (NOV 2011) (a) Definition. Small business concern... qualified as a small business under the size standards in this solicitation. (b) Applicability. This clause...
48 CFR 52.219-7 - Notice of Partial Small Business Set-Aside.
Code of Federal Regulations, 2014 CFR
2014-10-01
... Clauses 52.219-7 Notice of Partial Small Business Set-Aside. As prescribed in 19.508(d), insert the following clause: Notice of Partial Small Business Set-Aside (JUN 2003) (a) Definitions. Small business..., and qualified as a small business under the size standards in this solicitation. (b) General. (1) A...
48 CFR 52.219-7 - Notice of Partial Small Business Set-Aside.
Code of Federal Regulations, 2011 CFR
2011-10-01
... Clauses 52.219-7 Notice of Partial Small Business Set-Aside. As prescribed in 19.508(d), insert the following clause: Notice of Partial Small Business Set-Aside (JUN 2003) (a) Definitions. Small business..., and qualified as a small business under the size standards in this solicitation. (b) General. (1) A...
48 CFR 52.219-6 - Notice of Total Small Business Set-Aside.
Code of Federal Regulations, 2012 CFR
2012-10-01
... Clauses 52.219-6 Notice of Total Small Business Set-Aside. As prescribed in 19.508(c), insert the following clause: Notice of Total Small Business Set-Aside (NOV 2011) (a) Definition. Small business concern... qualified as a small business under the size standards in this solicitation. (b) Applicability. This clause...
48 CFR 52.219-6 - Notice of Total Small Business Set-Aside.
Code of Federal Regulations, 2014 CFR
2014-10-01
... Clauses 52.219-6 Notice of Total Small Business Set-Aside. As prescribed in 19.508(c), insert the following clause: Notice of Total Small Business Set-Aside (NOV 2011) (a) Definition. Small business concern... qualified as a small business under the size standards in this solicitation. (b) Applicability. This clause...
Effects of Active Sting Damping on Common Research Model Data Quality
NASA Technical Reports Server (NTRS)
Acheson, Michael J.; Balakrishna, S.
2011-01-01
Recent tests using the Common Research Model (CRM) at the Langley National Transonic Facility (NTF) and the Ames 11-foot Transonic Wind Tunnel (11' TWT) produced large sets of data that have been used to examine the effects of active damping on transonic tunnel aerodynamic data quality. In particular, large statistically significant sets of repeat data demonstrate that the active damping system had no apparent effect on drag, lift, and pitching moment repeatability during warm testing conditions, while simultaneously enabling aerodynamic data to be obtained post stall. A small set of cryogenic (high Reynolds number) repeat data was obtained at the NTF and again showed a negligible effect on data repeatability. However, due to a degradation of control power in the active damping system cryogenically, the ability to obtain test data post-stall was not achieved during cryogenic testing. Additionally, comparisons of data repeatability between NTF and 11-ft TWT CRM data led to further (warm) testing at the NTF, which demonstrated that, for a modest increase in data sampling time, a factor of 2-3 improvement in drag and pitching moment repeatability was readily achieved, unrelated to the active damping system.
CONDITIONING CHILDREN FOR SCHOOL. FINAL REPORT.
ERIC Educational Resources Information Center
PRINCE, ALBERT I.
A SET OF BEHAVIORAL PRINCIPLES USED IN INTELLECTUAL REHABILITATION OF A SMALL GROUP OF THIRD GRADERS WITH EDUCATIONAL AND RELATED BEHAVIORAL PROBLEMS WAS EVALUATED. SUBJECTS SELECTED WERE EIGHT THIRD-GRADE STUDENTS AGED 8 TO 10, WHO WERE 1 YEAR BEHIND IN READING AS MEASURED BY A STANDARDIZED ACHIEVEMENT TEST AND 1 YEAR BEHIND IN EITHER SPELLING OR…
Plagiarism under a Magnifying-Glass
ERIC Educational Resources Information Center
Starovoytova, Diana
2017-01-01
This paper embodies the findings from a small part of a larger study on plagiarism at the School of Engineering (SOE). The study is a cross-sectional survey, conducted in an institutional setting. Fifteen senior academic members of staff (N = 15) from the SOE were invited to complete a questionnaire. The questionnaire was pre-tested to ensure its validity…
Automatic control of cryogenic wind tunnels
NASA Technical Reports Server (NTRS)
Balakrishna, S.
1989-01-01
Inadequate Reynolds number similarity in testing of scaled models affects the quality of aerodynamic data from wind tunnels. This is due to scale effects of boundary-layer shock wave interaction which is likely to be severe at transonic speeds. The idea of operation of wind tunnels using test gas cooled to cryogenic temperatures has yielded a quantum jump in the ability to realize full scale Reynolds number flow similarity in small transonic tunnels. In such tunnels, the basic flow control problem consists of obtaining and maintaining the desired test section flow parameters. Mach number, Reynolds number, and dynamic pressure are the three flow parameters that are usually required to be kept constant during the period of model aerodynamic data acquisition. The series of activities involved in modeling, control law development, mechanization of the control laws on a microcomputer, and the performance of a globally stable automatic control system for the 0.3-m Transonic Cryogenic Tunnel (TCT) are discussed. A lumped multi-variable nonlinear dynamic model of the cryogenic tunnel, generation of a set of linear control laws for small perturbations, and a nonlinear control strategy for large set point changes including tunnel trajectory control are described. The details of mechanization of the control laws on a 16-bit microcomputer system, the software features, operator interface, the display, and safety are discussed. The controller is shown to provide globally stable and reliable temperature control to + or - 0.2 K, pressure to + or - 0.07 psi, and Mach number to + or - 0.002 of the set point value. This performance is obtained both during large set point commands as for a tunnel cooldown, and during aerodynamic data acquisition with intrusive activity like geometrical changes in the test section such as angle of attack changes, drag rake movements, wall adaptation, and sidewall boundary-layer removal. Feasibility of the use of an automatic Reynolds number control mode with fixed Mach number control is demonstrated.
Tests for informative cluster size using a novel balanced bootstrap scheme.
Nevalainen, Jaakko; Oja, Hannu; Datta, Somnath
2017-07-20
Clustered data are often encountered in biomedical studies, and to date, a number of approaches have been proposed to analyze such data. However, the phenomenon of informative cluster size (ICS) is a challenging problem, and its presence has an impact on the choice of a correct analysis methodology. For example, Dutta and Datta (2015, Biometrics) presented a number of marginal distributions that could be tested. Depending on the nature and degree of informativeness of the cluster size, these marginal distributions may differ, as do the choices of the appropriate test. In particular, they applied their new test to a periodontal data set where the plausibility of the informativeness was mentioned, but no formal test for the same was conducted. We propose bootstrap tests for testing the presence of ICS. A balanced bootstrap method is developed to successfully estimate the null distribution by merging the re-sampled observations with closely matching counterparts. Relying on the assumption of exchangeability within clusters, the proposed procedure performs well in simulations even with a small number of clusters, at different distributions and against different alternative hypotheses, thus making it an omnibus test. We also explain how to extend the ICS test to a regression setting and thereby enhancing its practical utility. The methodologies are illustrated using the periodontal data set mentioned earlier. Copyright © 2017 John Wiley & Sons, Ltd. Copyright © 2017 John Wiley & Sons, Ltd.
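A much-simplified resampling illustration of testing for informative cluster size is sketched below; the statistic (correlation between cluster size and cluster mean), the null resampling scheme, and the simulated data are assumptions and do not reproduce the authors' balanced bootstrap.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy clustered data: under informative cluster size (ICS), the cluster mean
# depends on the cluster size, which is built into the simulation below.
n_clusters = 40
sizes = rng.integers(2, 10, size=n_clusters)
clusters = [rng.normal(0.05 * k, 1.0, size=k) for k in sizes]

def ics_statistic(sizes, clusters):
    # Correlation between cluster size and cluster mean as a simple ICS measure.
    means = np.array([c.mean() for c in clusters])
    return np.corrcoef(sizes, means)[0, 1]

obs = ics_statistic(sizes, clusters)

# Null resampling: break the size/outcome link by pairing pooled, exchangeable
# observations with randomly re-assigned cluster sizes.
pooled = np.concatenate(clusters)
B = 2000
null = np.empty(B)
for b in range(B):
    perm_sizes = rng.permutation(sizes)
    resampled = [rng.choice(pooled, size=k, replace=True) for k in perm_sizes]
    null[b] = ics_statistic(perm_sizes, resampled)

p_value = np.mean(np.abs(null) >= abs(obs))
print(f"observed statistic {obs:.3f}, bootstrap p-value {p_value:.3f}")
```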
Kim, Mhinjine; Budd, Nadine; Batorsky, Benjamin; Krubiner, Carleigh; Manchikanti, Swathi; Waldrop, Greer; Trude, Angela; Gittelsohn, Joel
2017-01-01
Receptivity to strategies to improve the food environment by increasing access to healthier foods in small food stores is underexplored. We conducted 20 in-depth interviews with small storeowners of different ethnic backgrounds as part of a small-store intervention trial. Store owners perceived barriers and facilitators to purchase, stock, and promote healthy foods. Barriers mentioned included customer preferences for higher fat and sweeter taste and for lower prices; lower wholesaler availability of healthy food; and customers' lack of interest in health. Most store owners thought positively of taste tests, free samples, and communication interventions. However, they varied in terms of their expectations of the effect of these strategies on customers' healthy food purchases. The findings reported add to the limited data on motivating and working with small-store owners in low-income urban settings.
Hyperspectral data discrimination methods
NASA Astrophysics Data System (ADS)
Casasent, David P.; Chen, Xuewen
2000-12-01
Hyperspectral data provides spectral response information that provides detailed chemical, moisture, and other description of constituent parts of an item. These new sensor data are useful in USDA product inspection. However, such data introduce problems such as the curse of dimensionality, the need to reduce the number of features used to accommodate realistic small training set sizes, and the need to employ discriminatory features and still achieve good generalization (comparable training and test set performance). Several two-step methods are compared to a new and preferable single-step spectral decomposition algorithm. Initial results on hyperspectral data for good/bad almonds and for good/bad (aflatoxin infested) corn kernels are presented. The hyperspectral application addressed differs greatly from prior USDA work (PLS) in which the level of a specific channel constituent in food was estimated. A validation set (separate from the test set) is used in selecting algorithm parameters. Threshold parameters are varied to select the best Pc operating point. Initial results show that nonlinear features yield improved performance.
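The parameter-selection step described above (tuning on a validation set kept separate from the test set) can be sketched as follows; the synthetic data, logistic-regression classifier, and decision-threshold sweep are stand-ins for the paper's spectral decomposition algorithm and hyperspectral features.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Synthetic high-dimensional, small-sample data as a stand-in for hyperspectral features.
X, y = make_classification(n_samples=300, n_features=120, n_informative=10, random_state=0)
X_train, X_tmp, y_train, y_tmp = train_test_split(X, y, test_size=0.5, random_state=0)
X_val, X_test, y_val, y_test = train_test_split(X_tmp, y_tmp, test_size=0.5, random_state=0)

clf = LogisticRegression(max_iter=2000).fit(X_train, y_train)

# Sweep the decision threshold on the validation set only.
val_scores = clf.predict_proba(X_val)[:, 1]
thresholds = np.linspace(0.1, 0.9, 81)
val_acc = [accuracy_score(y_val, val_scores >= t) for t in thresholds]
best_t = thresholds[int(np.argmax(val_acc))]

# Report the chosen operating point on the untouched test set.
test_scores = clf.predict_proba(X_test)[:, 1]
print(f"chosen threshold {best_t:.2f}, "
      f"test accuracy {accuracy_score(y_test, test_scores >= best_t):.3f}")
```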
48 CFR 52.219-6 - Notice of Total Small Business Set-Aside.
Code of Federal Regulations, 2010 CFR
2010-10-01
... Clauses 52.219-6 Notice of Total Small Business Set-Aside. As prescribed in 19.508(c), insert the following clause: Notice of Total Small Business Set-Aside (JUN 2003) (a) Definition. Small business concern... qualified as a small business under the size standards in this solicitation. (b) General. (1) Offers are...
48 CFR 52.219-6 - Notice of Total Small Business Set-Aside.
Code of Federal Regulations, 2011 CFR
2011-10-01
... Clauses 52.219-6 Notice of Total Small Business Set-Aside. As prescribed in 19.508(c), insert the following clause: Notice of Total Small Business Set-Aside (JUN 2003) (a) Definition. Small business concern... qualified as a small business under the size standards in this solicitation. (b) General. (1) Offers are...
Libiger, Ondrej; Schork, Nicholas J.
2015-01-01
It is now feasible to examine the composition and diversity of microbial communities (i.e., “microbiomes”) that populate different human organs and orifices using DNA sequencing and related technologies. To explore the potential links between changes in microbial communities and various diseases in the human body, it is essential to test associations involving different species within and across microbiomes, environmental settings and disease states. Although a number of statistical techniques exist for carrying out relevant analyses, it is unclear which of these techniques exhibit the greatest statistical power to detect associations given the complexity of most microbiome datasets. We compared the statistical power of principal component regression, partial least squares regression, regularized regression, distance-based regression, Hill's diversity measures, and a modified test implemented in the popular and widely used microbiome analysis methodology “Metastats” across a wide range of simulated scenarios involving changes in feature abundance between two sets of metagenomic samples. For this purpose, simulation studies were used to change the abundance of microbial species in a real dataset from a published study examining human hands. Each technique was applied to the same data, and its ability to detect the simulated change in abundance was assessed. We hypothesized that a small subset of methods would outperform the rest in terms of the statistical power. Indeed, we found that the Metastats technique modified to accommodate multivariate analysis and partial least squares regression yielded high power under the models and data sets we studied. The statistical power of diversity measure-based tests, distance-based regression and regularized regression was significantly lower. Our results provide insight into powerful analysis strategies that utilize information on species counts from large microbiome data sets exhibiting skewed frequency distributions obtained on a small to moderate number of samples. PMID:26734061
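The simulation strategy of spiking feature abundance and measuring detection rates can be illustrated with a single-feature sketch; the negative-binomial count model, the rank test, and the effect sizes are assumptions and do not reproduce the multivariate Metastats or partial least squares comparisons of the study.

```python
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(3)

def simulate_power(effect, n_per_group=20, n_sim=500, alpha=0.05):
    """Estimate power of a rank test to detect a spiked fold change in one
    skewed (negative-binomial-like) abundance feature."""
    hits = 0
    for _ in range(n_sim):
        base = rng.negative_binomial(2, 0.05, size=n_per_group)                 # group A counts
        spiked = rng.negative_binomial(2, 0.05, size=n_per_group) * effect      # group B, abundance shifted
        _, p = mannwhitneyu(base, spiked, alternative="two-sided")
        hits += p < alpha
    return hits / n_sim

for effect in (1.0, 1.5, 2.0, 3.0):
    print(f"fold change {effect:.1f}: estimated power {simulate_power(effect):.2f}")
```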
ASVCP guidelines: quality assurance for point-of-care testing in veterinary medicine.
Flatland, Bente; Freeman, Kathleen P; Vap, Linda M; Harr, Kendal E
2013-12-01
Point-of-care testing (POCT) refers to any laboratory testing performed outside the conventional reference laboratory and implies close proximity to patients. Instrumental POCT systems consist of small, handheld or benchtop analyzers. These have potential utility in many veterinary settings, including private clinics, academic veterinary medical centers, the community (eg, remote area veterinary medical teams), and for research applications in academia, government, and industry. Concern about the quality of veterinary in-clinic testing has been expressed in published veterinary literature; however, little guidance focusing on POCT is available. Recognizing this void, the ASVCP formed a subcommittee in 2009 charged with developing quality assurance (QA) guidelines for veterinary POCT. Guidelines were developed through literature review and a consensus process. Major recommendations include (1) taking a formalized approach to POCT within the facility, (2) use of written policies, standard operating procedures, forms, and logs, (3) operator training, including periodic assessment of skills, (4) assessment of instrument analytical performance and use of both statistical quality control and external quality assessment programs, (5) use of properly established or validated reference intervals, (6) and ensuring accurate patient results reporting. Where possible, given instrument analytical performance, use of a validated 13s control rule for interpretation of control data is recommended. These guidelines are aimed at veterinarians and veterinary technicians seeking to improve management of POCT in their clinical or research setting, and address QA of small chemistry and hematology instruments. These guidelines are not intended to be all-inclusive; rather, they provide a minimum standard for maintenance of POCT instruments in the veterinary setting. © 2013 American Society for Veterinary Clinical Pathology and European Society for Veterinary Clinical Pathology.
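The statistical quality control recommendation can be illustrated with a minimal sketch of the 13s (1-3s) control rule, which rejects an analytical run when a control result falls outside the established mean ± 3 SD; the control limits and results below are hypothetical.

```python
import numpy as np

def violates_1_3s(control_results, target_mean, target_sd):
    """Flag a run if any control result falls outside mean +/- 3 SD (the 1-3s rule)."""
    z = (np.asarray(control_results, dtype=float) - target_mean) / target_sd
    return bool(np.any(np.abs(z) > 3.0))

# Assumed control-material limits established from prior in-clinic data.
target_mean, target_sd = 5.2, 0.15     # e.g. mmol/L for one chemistry control level

print(violates_1_3s([5.25, 5.10, 5.31], target_mean, target_sd))  # False -> accept run
print(violates_1_3s([5.25, 5.80, 5.31], target_mean, target_sd))  # True  -> reject run, troubleshoot
```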
48 CFR 6.203 - Set-asides for small business concerns.
Code of Federal Regulations, 2014 CFR
2014-10-01
... 6.203 Set-asides for small business concerns. (a) To fulfill the statutory requirements relating to small business concerns, contracting officers may set aside solicitations to allow only such business concerns to compete. This includes contract actions conducted under the Small Business Innovation Research...
48 CFR 6.203 - Set-asides for small business concerns.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 6.203 Set-asides for small business concerns. (a) To fulfill the statutory requirements relating to small business concerns, contracting officers may set aside solicitations to allow only such business concerns to compete. This includes contract actions conducted under the Small Business Innovation Research...
48 CFR 6.203 - Set-asides for small business concerns.
Code of Federal Regulations, 2013 CFR
2013-10-01
... 6.203 Set-asides for small business concerns. (a) To fulfill the statutory requirements relating to small business concerns, contracting officers may set aside solicitations to allow only such business concerns to compete. This includes contract actions conducted under the Small Business Innovation Research...
48 CFR 6.203 - Set-asides for small business concerns.
Code of Federal Regulations, 2012 CFR
2012-10-01
... 6.203 Set-asides for small business concerns. (a) To fulfill the statutory requirements relating to small business concerns, contracting officers may set aside solicitations to allow only such business concerns to compete. This includes contract actions conducted under the Small Business Innovation Research...
48 CFR 6.205 - Set-asides for HUBZone small business concerns.
Code of Federal Regulations, 2012 CFR
2012-10-01
... small business concerns. 6.205 Section 6.205 Federal Acquisition Regulations System FEDERAL ACQUISITION... 6.205 Set-asides for HUBZone small business concerns. (a) To fulfill the statutory requirements... (see 19.1302) may set aside solicitations to allow only qualified HUBZone small business concerns to...
48 CFR 6.205 - Set-asides for HUBZone small business concerns.
Code of Federal Regulations, 2014 CFR
2014-10-01
... small business concerns. 6.205 Section 6.205 Federal Acquisition Regulations System FEDERAL ACQUISITION... 6.205 Set-asides for HUBZone small business concerns. (a) To fulfill the statutory requirements... (see 19.1302) may set aside solicitations to allow only qualified HUBZone small business concerns to...
48 CFR 6.205 - Set-asides for HUBZone small business concerns.
Code of Federal Regulations, 2013 CFR
2013-10-01
... small business concerns. 6.205 Section 6.205 Federal Acquisition Regulations System FEDERAL ACQUISITION... 6.205 Set-asides for HUBZone small business concerns. (a) To fulfill the statutory requirements... (see 19.1302) may set aside solicitations to allow only qualified HUBZone small business concerns to...
48 CFR 6.205 - Set-asides for HUBZone small business concerns.
Code of Federal Regulations, 2011 CFR
2011-10-01
... small business concerns. 6.205 Section 6.205 Federal Acquisition Regulations System FEDERAL ACQUISITION... 6.205 Set-asides for HUBZone small business concerns. (a) To fulfill the statutory requirements... (see 19.1302) may set aside solicitations to allow only qualified HUBZone small business concerns to...
Bachim, Brent L; Gaylord, Thomas K
2005-01-20
A new technique, microinterferometric optical phase tomography, is introduced for use in measuring small, asymmetric refractive-index differences in the profiles of optical fibers and fiber devices. The method combines microscopy-based fringe-field interferometry with parallel projection-based computed tomography to characterize fiber index profiles. The theory relating interference measurements to the projection set required for tomographic reconstruction is given, and discrete numerical simulations are presented for three test index profiles that establish the technique's ability to characterize fiber with small, asymmetric index differences. An experimental measurement configuration and specific interferometry and tomography practices employed in the technique are discussed.
NASA Astrophysics Data System (ADS)
Oda, A.; Yamaotsu, N.; Hirono, S.; Takano, Y.; Fukuyoshi, S.; Nakagaki, R.; Takahashi, O.
2013-08-01
CAMDAS is a conformational search program, through which high-temperature molecular dynamics (MD) calculations are carried out. In this study, the conformational search ability of CAMDAS was evaluated using 281 structurally known protein-ligand complexes as a test set. For the test, the influences of initial settings and initial conformations on the search results were assessed. Using the CAMDAS program, reasonable conformations whose root mean square deviations (RMSDs) in comparison with crystal structures were less than 2.0 Å could be obtained for 96% of the test set even when the worst initial settings were used. The success rate was comparable to that of OMEGA, and the errors of CAMDAS were smaller than those of OMEGA. With CAMDAS, the worst RMSD obtained was around 2.5 Å, whereas the worst value obtained using OMEGA was around 4.0 Å. The results indicate that CAMDAS is a robust and versatile conformational search method and that it can be used for a wide variety of small molecules. In addition, the accuracy of the conformational search in this study was improved by longer MD calculations and multiple MD simulations.
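The evaluation step, computing the RMSD of each generated conformer against the crystal structure and counting successes below 2.0 Å, can be sketched as follows; the coordinates are toy values, atom correspondence is assumed, and optimal superposition is done with the standard Kabsch algorithm.

```python
import numpy as np

def kabsch_rmsd(P, Q):
    """RMSD between two (N, 3) coordinate sets after optimal rigid superposition."""
    P = P - P.mean(axis=0)
    Q = Q - Q.mean(axis=0)
    V, S, Wt = np.linalg.svd(P.T @ Q)
    d = np.sign(np.linalg.det(V @ Wt))
    R = V @ np.diag([1.0, 1.0, d]) @ Wt          # proper rotation (no reflection)
    return float(np.sqrt(np.mean(np.sum((P @ R - Q) ** 2, axis=1))))

rng = np.random.default_rng(4)
crystal = rng.normal(size=(30, 3)) * 3.0          # toy "crystal" ligand coordinates

def random_pose(coords, noise):
    """Perturb the crystal pose, then apply a random rotation and translation."""
    q, _ = np.linalg.qr(rng.normal(size=(3, 3)))
    if np.linalg.det(q) < 0:                      # keep a proper rotation
        q[:, 0] *= -1.0
    return (coords + rng.normal(scale=noise, size=coords.shape)) @ q + rng.normal(size=3)

conformers = [random_pose(crystal, noise) for noise in (0.2, 0.5, 1.0, 1.5, 3.0)]
rmsds = [kabsch_rmsd(c, crystal) for c in conformers]
success = [r < 2.0 for r in rmsds]
print([round(r, 2) for r in rmsds], f"success rate {np.mean(success):.2f}")
```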
Toward Better Physics Labs for Future Biologists
NASA Astrophysics Data System (ADS)
Giannini, John; Moore, Kim; Losert, Wolfgang
2014-03-01
We have developed a set of laboratories and hands on activities to accompany a new two-semester interdisciplinary physics course that has been successfully developed and tested in two small test classes of students at the University of Maryland, College Park (UMD) in 2012-2013, and is currently being used on a wider scale. We have designed the laboratories to be taken accompanying a reformed course in the student's second year, with calculus, biology, and chemistry as prerequisites. This permits the laboratories to include significant content on physics relevant to cellular scales, from chemical interactions to random motion and charge screening in fluids. One major focus of the laboratories is to introduce the students to research-grade equipment and modern physics analysis tools in contexts relevant to biology, while maintaining the pedagogically valuable open-ended laboratory structure of reformed laboratories. Lab development procedures along with some preliminary student results from these two small test classes are discussed.
Benchmarking contactless acquisition sensor reproducibility for latent fingerprint trace evidence
NASA Astrophysics Data System (ADS)
Hildebrandt, Mario; Dittmann, Jana
2015-03-01
Optical, nano-meter range, contactless, non-destructive sensor devices are promising acquisition techniques in crime scene trace forensics, e.g. for digitizing latent fingerprint traces. Before new approaches are introduced in crime investigations, innovations need to be positively tested and quality ensured. In this paper we investigate sensor reproducibility by studying different scans from four sensors: two chromatic white light sensors (CWL600/CWL1mm), one confocal laser scanning microscope, and one NIR/VIS/UV reflection spectrometer. First, we perform intra-sensor reproducibility testing for the CWL600 with a privacy-conforming test set of artificial-sweat printed, computer generated fingerprints. We use 24 different fingerprint patterns as original samples (printing samples/templates) for printing with artificial sweat (physical trace samples) and their acquisition with contactless sensory, resulting in 96 sensor images, called scan or acquired samples. The second test set for inter-sensor reproducibility assessment consists of the first three patterns from the first test set, acquired in two consecutive scans using each device. We suggest using a simple feature set in the spatial and frequency domains, known from signal processing, and test its suitability with six different classifiers that classify scan data into small differences (reproducible) and large differences (non-reproducible). Furthermore, we suggest comparing the classification results with biometric verification scores (calculated with NBIS, with a threshold of 40) as a biometric reproducibility score. The Bagging classifier is in nearly all cases the most reliable classifier in our experiments, and the results are also confirmed by the biometric matching rates.
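A sketch of the classification step is shown below: simple spatial- and frequency-domain features are extracted from pairs of scans and their differences are classified with a Bagging classifier; the synthetic images, the particular features, and the labels are assumptions for illustration.

```python
import numpy as np
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(5)

def features(img):
    """Simple spatial- and frequency-domain descriptors of one scan."""
    spec = np.abs(np.fft.fft2(img))
    return np.array([img.mean(), img.std(),
                     spec[:8, :8].mean(), spec[8:, 8:].mean()])

def pair_features(a, b):
    return np.abs(features(a) - features(b))     # differences between two scans

def make_pair(reproducible):
    """Toy scan pair: reproducible pairs differ only by small noise,
    non-reproducible pairs get an extra structured perturbation."""
    base = rng.normal(size=(64, 64))
    second = base + rng.normal(scale=0.05, size=base.shape)
    if not reproducible:
        second += 0.5 * np.sin(np.linspace(0, 20, 64))[None, :]
    return pair_features(base, second)

X = np.array([make_pair(i % 2 == 0) for i in range(200)])
y = np.array([i % 2 for i in range(200)])        # 0 = reproducible pair, 1 = non-reproducible pair

clf = BaggingClassifier(n_estimators=50, random_state=0)
print("cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```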
On the occurrence of false positives in tests of migration under an isolation with migration model
Hey, Jody; Chung, Yujin; Sethuraman, Arun
2015-01-01
The population genetic study of divergence is often done using a Bayesian genealogy sampler, like those implemented in IMa2 and related programs, and these analyses frequently include a likelihood-ratio test of the null hypothesis of no migration between populations. Cruickshank and Hahn (2014, Molecular Ecology, 23, 3133–3157) recently reported a high rate of false positive test results with IMa2 for data simulated with small numbers of loci under models with no migration and recent splitting times. We confirm these findings and discover that they are caused by a failure of the assumptions underlying likelihood ratio tests that arises when using marginal likelihoods for a subset of model parameters. We also show that for small data sets, with little divergence between samples from two populations, an excellent fit can often be found by a model with a low migration rate and recent splitting time and a model with a high migration rate and a deep splitting time. PMID:26456794
Normanno, Nicola; Pinto, Carmine; Taddei, Gianluigi; Gambacorta, Marcello; Castiglione, Francesca; Barberis, Massimo; Clemente, Claudio; Marchetti, Antonio
2013-06-01
The Italian Association of Medical Oncology (AIOM) and the Italian Society of Pathology and Cytology organized an external quality assessment (EQA) scheme for EGFR mutation testing in non-small-cell lung cancer. Ten specimens, including three small biopsies with known epidermal growth factor receptor (EGFR) mutation status, were validated in three referral laboratories and provided to 47 participating centers. The participants were requested to perform mutational analysis, using their usual method, and to submit results within a 4-week time frame. According to a predefined scoring system, two points were assigned to a correct genotype and zero points to false-negative or false-positive results. The threshold to pass the EQA was set at higher than 18 of 20 points. Two rounds were preplanned. All participating centers submitted the results within the time frame. Polymerase chain reaction (PCR)/sequencing was the main methodology used (n = 37 laboratories), although a few centers did use pyrosequencing (n = 8) or real-time PCR (n = 2). A significant number of analytical errors were observed (n = 20), with a high frequency of false-positive results (n = 16). The lowest scores were obtained for the small biopsies. The 14 of 47 centers (30%) that did not pass the first round, having scores of 18 points or fewer, all used PCR/sequencing, whereas all 10 laboratories using pyrosequencing or real-time PCR passed the first round. Eight laboratories passed the second round. Overall, 41 of 47 centers (87%) passed the EQA. The results of the EQA for EGFR testing in non-small-cell lung cancer suggest that good quality EGFR mutational analysis is performed in Italian laboratories, although differences between testing methods were observed, especially for small biopsies.
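The predefined scoring scheme (two points per correct genotype, zero for false results, pass above 18 of 20 points) can be expressed in a few lines; the specimen genotypes below are hypothetical.

```python
def eqa_score(reported, reference, points_per_correct=2):
    """Score an EQA round: points for each correctly reported EGFR genotype."""
    return sum(points_per_correct for rep, ref in zip(reported, reference) if rep == ref)

# Hypothetical reference genotypes for ten specimens and one lab's reported calls.
reference = ["wt", "L858R", "wt", "ex19del", "wt", "wt", "L858R", "wt", "ex19del", "wt"]
reported  = ["wt", "L858R", "wt", "ex19del", "wt", "L858R", "L858R", "wt", "ex19del", "wt"]  # one false positive

score = eqa_score(reported, reference)
print(f"score {score}/20 ->", "pass" if score > 18 else "fail")   # one error already fails the round
```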
48 CFR 52.219-20 - Notice of Emerging Small Business Set-Aside.
Code of Federal Regulations, 2010 CFR
2010-10-01
... Clauses 52.219-20 Notice of Emerging Small Business Set-Aside. As prescribed in 19.1008(b), insert the following provision: Notice of Emerging Small Business Set-Aside (JAN 1991) Offers or quotations under this acquisition are solicited from emerging small business concerns only. Offers that are not from an emerging...
Vibrational multiconfiguration self-consistent field theory: implementation and test calculations.
Heislbetz, Sandra; Rauhut, Guntram
2010-03-28
A state-specific vibrational multiconfiguration self-consistent field (VMCSCF) approach based on a multimode expansion of the potential energy surface is presented for the accurate calculation of anharmonic vibrational spectra. As a special case of this general approach vibrational complete active space self-consistent field calculations will be discussed. The latter method shows better convergence than the general VMCSCF approach and must be considered the preferred choice within the multiconfigurational framework. Benchmark calculations are provided for a small set of test molecules.
Student Use of Animated Pedagogical Agents in a Middle School Science Inquiry Program
ERIC Educational Resources Information Center
Bowman, Catherine D. D.
2012-01-01
Animated pedagogical agents (APAs) have the potential to provide one-on-one, just-in-time instruction, guidance or mentoring in classrooms where such individualized human interactions may be infeasible. Much current APA research focuses on a wide range of design variables tested with small samples or in laboratory settings, while overlooking…
Using Spatial-Temporal Primitives to Improve Geographic Skills for Preservice Teachers
ERIC Educational Resources Information Center
Kaufman, Martin M.
2004-01-01
An exercise to help improve the geographic skills of preservice teachers was developed and tested during a six-year period on over 500 students. The exercise required these students to map two arrangements of roads and facilities within a small neighborhood. A set of spatial-temporal primitives (place, size, shape, distance, direction,…
40 CFR 1054.635 - What special provisions apply for small-volume engine and equipment manufacturers?
Code of Federal Regulations, 2010 CFR
2010-07-01
... Administration at 13 CFR 121.201) that manufactures nonroad spark-ignition engines or equipment, but you do not... began manufacturing engines before, during, or after 2007. We may set other reasonable conditions that... deterioration factors. See § 1054.240. (4) Waived requirements for production-line testing. See § 1054.301. (5...
Retrieving Essential Material at the End of Lectures Improves Performance on Statistics Exams
ERIC Educational Resources Information Center
Lyle, Keith B.; Crawford, Nicole A.
2011-01-01
At the end of each lecture in a statistics for psychology course, students answered a small set of questions that required them to retrieve information from the same day's lecture. These exercises constituted retrieval practice for lecture material subsequently tested on four exams throughout the course. This technique is called the PUREMEM…
2014-02-11
ISS038-E-044887 (11 Feb. 2014) --- The Small Satellite Orbital Deployer (SSOD), in the grasp of the Kibo laboratory robotic arm, is photographed by an Expedition 38 crew member on the International Space Station as it deploys a set of NanoRacks CubeSats. The CubeSats program contains a variety of experiments such as Earth observations and advanced electronics testing.
2014-02-11
ISS038-E-044889 (11 Feb. 2014) --- The Small Satellite Orbital Deployer (SSOD), in the grasp of the Kibo laboratory robotic arm, is photographed by an Expedition 38 crew member on the International Space Station as it deploys a set of NanoRacks CubeSats. The CubeSats program contains a variety of experiments such as Earth observations and advanced electronics testing.
2014-02-11
ISS038-E-044890 (11 Feb. 2014) --- The Small Satellite Orbital Deployer (SSOD), in the grasp of the Kibo laboratory robotic arm, is photographed by an Expedition 38 crew member on the International Space Station as it deploys a set of NanoRacks CubeSats. The CubeSats program contains a variety of experiments such as Earth observations and advanced electronics testing.
Students in Rural Schools Have Limited Access to Advanced Mathematics Courses. Issue Brief No. 7
ERIC Educational Resources Information Center
Graham, Suzanne E.
2009-01-01
This Carsey brief reveals that students in rural areas and small towns have less access to higher-level mathematics courses than students in urban settings, which results in serious educational consequences, including lower scores on assessment tests and fewer qualified students entering science, technology, engineering, and mathematics (STEM) job…
48 CFR 1452.280-1 - Notice of Indian small business economic enterprise set-aside.
Code of Federal Regulations, 2014 CFR
2014-10-01
... business economic enterprise set-aside. 1452.280-1 Section 1452.280-1 Federal Acquisition Regulations... of Provisions and Clauses 1452.280-1 Notice of Indian small business economic enterprise set-aside... potential offerors. Notice of Indian Small Business Economic Enterprise Set-aside (JUL 2013) Under the Buy...
48 CFR 1452.280-1 - Notice of Indian small business economic enterprise set-aside.
Code of Federal Regulations, 2013 CFR
2013-10-01
... business economic enterprise set-aside. 1452.280-1 Section 1452.280-1 Federal Acquisition Regulations... of Provisions and Clauses 1452.280-1 Notice of Indian small business economic enterprise set-aside... potential offerors. Notice of Indian Small Business Economic Enterprise Set-aside (JUL 2013) Under the Buy...
Bayesian inference for disease prevalence using negative binomial group testing
Pritchard, Nicholas A.; Tebbs, Joshua M.
2011-01-01
Group testing, also known as pooled testing, and inverse sampling are both widely used methods of data collection when the goal is to estimate a small proportion. Taking a Bayesian approach, we consider the new problem of estimating disease prevalence from group testing when inverse (negative binomial) sampling is used. Using different distributions to incorporate prior knowledge of disease incidence and different loss functions, we derive closed form expressions for posterior distributions and resulting point and credible interval estimators. We then evaluate our new estimators, on Bayesian and classical grounds, and apply our methods to a West Nile Virus data set. PMID:21259308
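A hedged numerical sketch of the setting is given below: a grid approximation of the posterior for prevalence p when pools of size k are tested until r positive pools are observed (negative binomial sampling), under a beta prior and an assumed perfect assay; the closed-form estimators derived in the paper are not reproduced, and the design values are illustrative.

```python
import numpy as np
from scipy.stats import beta

# Assumed design: pools of size k tested until r positive pools are observed,
# requiring n pools in total; the assay is assumed perfect (no misclassification).
k, r, n = 10, 5, 60

# Beta prior on individual-level prevalence p.
a_prior, b_prior = 1.0, 9.0

p = np.linspace(1e-6, 0.5, 5000)
theta = 1.0 - (1.0 - p) ** k                       # probability a pool tests positive
log_post = (beta.logpdf(p, a_prior, b_prior)
            + r * np.log(theta) + (n - r) * np.log(1.0 - theta))
post = np.exp(log_post - log_post.max())
post /= np.trapz(post, p)                          # normalize on the grid

mean = np.trapz(p * post, p)
cdf = np.cumsum(post) * (p[1] - p[0])
lo, hi = p[np.searchsorted(cdf, 0.025)], p[np.searchsorted(cdf, 0.975)]
print(f"posterior mean {mean:.4f}, 95% credible interval ({lo:.4f}, {hi:.4f})")
```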
Verma, Rajeshwar P; Matthews, Edwin J
2015-03-01
Evaluation of potential chemical-induced eye injury through irritation and corrosion is required to ensure occupational and consumer safety for industrial, household and cosmetic ingredient chemicals. The historical method for evaluating eye irritant and corrosion potential of chemicals is the rabbit Draize test. However, the Draize test is controversial and its use is diminishing - the EU 7th Amendment to the Cosmetic Directive (76/768/EEC) and recast Regulation now bans marketing of new cosmetics having animal testing of their ingredients and requires non-animal alternative tests for safety assessments. Thus, in silico and/or in vitro tests are advocated. QSAR models for eye irritation have been reported for several small (congeneric) data sets; however, large global models have not been described. This report describes FDA/CFSAN's development of 21 ANN c-QSAR models (QSAR-21) to predict eye irritation using the ADMET Predictor program and a diverse training data set of 2928 chemicals. The 21 models had external (20% test set) and internal validation and average training/verification/test set statistics were: 88/88/85(%) sensitivity and 82/82/82(%) specificity, respectively. The new method utilized multiple artificial neural network (ANN) molecular descriptor selection functionalities to maximize the applicability domain of the battery. The eye irritation models will be used to provide information to fill the critical data gaps for the safety assessment of cosmetic ingredient chemicals. Copyright © 2014 Elsevier Inc. All rights reserved.
Wang, Xuefeng; Lee, Seunggeun; Zhu, Xiaofeng; Redline, Susan; Lin, Xihong
2013-12-01
Family-based genetic association studies of related individuals provide opportunities to detect genetic variants that complement studies of unrelated individuals. Most statistical methods for family association studies for common variants are single marker based, which test one SNP at a time. In this paper, we consider testing the effect of an SNP set, e.g., SNPs in a gene, in family studies, for both continuous and discrete traits. Specifically, we propose a generalized estimating equation (GEE)-based kernel association test, a variance component based testing method, to test for the association between a phenotype and multiple variants in an SNP set jointly using family samples. The proposed approach allows for both continuous and discrete traits, where the correlation among family members is taken into account through the use of an empirical covariance estimator. We derive the theoretical distribution of the proposed statistic under the null and develop analytical methods to calculate the P-values. We also propose an efficient resampling method for correcting for small sample size bias in family studies. The proposed method allows for easily incorporating covariates and SNP-SNP interactions. Simulation studies show that the proposed method properly controls for type I error rates under both random and ascertained sampling schemes in family studies. We demonstrate through simulation studies that our approach has superior performance for association mapping compared to the single-marker-based minimum P-value GEE test for an SNP-set effect over a range of scenarios. We illustrate the application of the proposed method using data from the Cleveland Family GWAS Study. © 2013 WILEY PERIODICALS, INC.
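The variance-component idea behind kernel association tests can be sketched with the usual score-type statistic Q = (y - y_hat)' K (y - y_hat) over a linear kernel of the SNP set; in this simplified illustration significance is assessed by permutation, family correlation is ignored, and the data are simulated, so it is not the GEE-based test of the paper.

```python
import numpy as np

rng = np.random.default_rng(6)

n, n_snps = 300, 15
maf = rng.uniform(0.05, 0.5, size=n_snps)
G = rng.binomial(2, maf, size=(n, n_snps)).astype(float)     # genotypes coded 0/1/2
X = np.column_stack([np.ones(n), rng.normal(size=n)])        # intercept + one covariate

beta_g = np.zeros(n_snps)
beta_g[:3] = 0.25                                            # a few causal SNPs in the set
y = X @ np.array([1.0, 0.5]) + G @ beta_g + rng.normal(size=n)

# Null model: phenotype regressed on covariates only.
resid = y - X @ np.linalg.lstsq(X, y, rcond=None)[0]

K = G @ G.T                                                  # linear kernel over the SNP set
Q_obs = resid @ K @ resid

# Permutation null: shuffle residuals relative to genotypes (unrelated samples assumed).
B = 1000
Q_null = np.empty(B)
for b in range(B):
    r = rng.permutation(resid)
    Q_null[b] = r @ K @ r
p_value = (1 + np.sum(Q_null >= Q_obs)) / (1 + B)
print(f"Q = {Q_obs:.1f}, permutation p-value = {p_value:.3f}")
```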
Single Event Transients in Voltage Regulators for FPGA Power Supply Applications
NASA Technical Reports Server (NTRS)
Poivey, Christian; Sanders, Anthony; Kim, Hak; Phan, Anthony; Forney, Jim; LaBel, Kenneth A.; Karsh, Jeremy; Pursley, Scott; Kleyner, Igor; Katz, Richard
2006-01-01
As with other bipolar analog devices, voltage regulators are known to be sensitive to single event transients (SET). In typical applications, large output capacitors are used to provide noise immunity. Therefore, since SET amplitude and duration are generally small, they are often of secondary importance due to this capacitance filtering. In low voltage applications, however, even small SETs are a concern. Over-voltages may cause destructive conditions. Under-voltages may cause functional interrupts and may also trigger electrical latchup conditions. In addition, internal protection circuits which are affected by load as well as internal thermal effects can also be triggered by heavy ions, causing dropouts or shutdowns ranging from milliseconds to seconds. In FPGA power supply applications, SETs are critical. For example, for the Actel RTAX FPGA family, the core power supply voltage is 1.5 V. The manufacturer specifies an absolute maximum rating of 1.6 V and recommended operating conditions between 1.425 V and 1.575 V. Therefore, according to the manufacturer, any transient of amplitude greater than 75 mV can disrupt normal circuit functions, and overvoltages greater than 100 mV may damage the FPGA. We tested five low-dropout voltage regulators for SET sensitivity under a large range of circuit application conditions.
NASA Astrophysics Data System (ADS)
Bozorgzadeh, Nezam; Yanagimura, Yoko; Harrison, John P.
2017-12-01
The Hoek-Brown empirical strength criterion for intact rock is widely used as the basis for estimating the strength of rock masses. Estimations of the intact rock H-B parameters, namely the empirical constant m and the uniaxial compressive strength σc, are commonly obtained by fitting the criterion to triaxial strength data sets of small sample size. This paper investigates how such small sample sizes affect the uncertainty associated with the H-B parameter estimations. We use Monte Carlo (MC) simulation to generate data sets of different sizes and different combinations of H-B parameters, and then investigate the uncertainty in H-B parameters estimated from these limited data sets. We show that the uncertainties depend not only on the level of variability but also on the particular combination of parameters being investigated. As particular combinations of H-B parameters can informally be considered to represent specific rock types, we argue that, because the minimum number of required samples depends on rock type, it should be chosen to meet an acceptable level of uncertainty in the estimations. Also, a comparison of the results from our analysis with actual rock strength data shows that the probability of obtaining reliable strength parameter estimations using small samples may be very low. We further discuss the impact of this on the ongoing implementation of reliability-based design protocols and conclude with suggestions for improvements in this respect.
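A sketch of the Monte Carlo experiment is given below, generating small synthetic triaxial data sets from the intact-rock Hoek-Brown criterion σ1 = σ3 + σc·(m·σ3/σc + 1)^0.5, refitting m and σc to each, and inspecting how the spread of the estimates changes with sample size; the parameter values and noise model are assumptions.

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(7)

def hoek_brown(sigma3, sigma_c, m):
    """Intact-rock Hoek-Brown criterion: peak sigma1 as a function of sigma3."""
    return sigma3 + sigma_c * np.sqrt(m * sigma3 / sigma_c + 1.0)

sigma_c_true, m_true = 100.0, 15.0     # MPa and dimensionless (assumed "rock type")
cov = 0.10                             # assumed coefficient of variation of strength

def simulate_fit(n_samples):
    sigma3 = rng.uniform(0.0, 30.0, size=n_samples)
    sigma1 = hoek_brown(sigma3, sigma_c_true, m_true) * (1 + cov * rng.normal(size=n_samples))
    popt, _ = curve_fit(hoek_brown, sigma3, sigma1, p0=[80.0, 10.0],
                        bounds=([1.0, 1.0], [500.0, 50.0]))
    return popt                        # (sigma_c_hat, m_hat)

for n_samples in (5, 10, 30):
    fits = np.array([simulate_fit(n_samples) for _ in range(300)])
    print(f"n={n_samples:2d}: sigma_c {fits[:, 0].mean():6.1f} +/- {fits[:, 0].std():5.1f} MPa, "
          f"m {fits[:, 1].mean():5.1f} +/- {fits[:, 1].std():4.1f}")
```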
Goldstein, Elizabeth; Farquhar, Marybeth; Crofton, Christine; Darby, Charles; Garfinkel, Steven
2005-12-01
To describe the developmental process for the CAHPS Hospital Survey. A pilot was conducted in three states with 19,720 hospital discharges. A rigorous, multi-step process was used to develop the CAHPS Hospital Survey. It included a public call for measures, multiple Federal Register notices soliciting public input, a review of the relevant literature, meetings with hospitals, consumers, and survey vendors, cognitive interviews with consumers, a large-scale pilot test in three states, consumer testing, and numerous small-scale field tests. The current version of the CAHPS Hospital Survey has survey items in seven domains, two overall ratings of the hospital, and five items used for adjusting for the mix of patients across hospitals and for analytical purposes. The CAHPS Hospital Survey is a core set of questions that can be administered as a stand-alone questionnaire or combined with a broader set of hospital-specific items.
NASA Technical Reports Server (NTRS)
Hartman, Edwin P; Biermann, David
1938-01-01
Negative thrust and torque data for 2-, 3-, and 4-blade metal propellers having Clark Y and R.A.F. 6 airfoil sections were obtained from tests in the NACA 20-foot tunnel. The propellers were mounted in front of a radial engine nacelle, and the blade-angle settings covered in the tests ranged from 15 degrees to 90 degrees. One propeller was also tested at blade-angle settings of 0, 5, and 10 degrees. A considerable portion of the report deals with the various applications of the negative thrust and torque to flight problems. A controllable propeller is shown to have a number of interesting, and perhaps valuable, uses within the negative thrust and torque range of operation. A small amount of engine-friction data is included to facilitate the application of the propeller data.
48 CFR 5119.1070-2 - Emerging small business set-aside.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 48 Federal Acquisition Regulations System 7 2010-10-01 2010-10-01 false Emerging small business... ARMY ACQUISITION REGULATIONS SMALL BUSINESS AND SMALL DISADVANTAGED BUSINESS CONCERNS Small Business Competitiveness Demonstration Program 5119.1070-2 Emerging small business set-aside. (a)(S-90) Solicitations for...
48 CFR 5119.1070-2 - Emerging small business set-aside.
Code of Federal Regulations, 2014 CFR
2014-10-01
... 48 Federal Acquisition Regulations System 7 2014-10-01 2014-10-01 false Emerging small business... ARMY ACQUISITION REGULATIONS SMALL BUSINESS AND SMALL DISADVANTAGED BUSINESS CONCERNS Small Business Competitiveness Demonstration Program 5119.1070-2 Emerging small business set-aside. (a)(S-90) Solicitations for...
48 CFR 5119.1070-2 - Emerging small business set-aside.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 48 Federal Acquisition Regulations System 7 2011-10-01 2011-10-01 false Emerging small business... ARMY ACQUISITION REGULATIONS SMALL BUSINESS AND SMALL DISADVANTAGED BUSINESS CONCERNS Small Business Competitiveness Demonstration Program 5119.1070-2 Emerging small business set-aside. (a)(S-90) Solicitations for...
48 CFR 5119.1070-2 - Emerging small business set-aside.
Code of Federal Regulations, 2012 CFR
2012-10-01
... 48 Federal Acquisition Regulations System 7 2012-10-01 2012-10-01 false Emerging small business... ARMY ACQUISITION REGULATIONS SMALL BUSINESS AND SMALL DISADVANTAGED BUSINESS CONCERNS Small Business Competitiveness Demonstration Program 5119.1070-2 Emerging small business set-aside. (a)(S-90) Solicitations for...
48 CFR 852.219-10 - VA Notice of total service-disabled veteran-owned small business set-aside.
Code of Federal Regulations, 2012 CFR
2012-10-01
...-disabled veteran-owned small business set-aside. 852.219-10 Section 852.219-10 Federal Acquisition... CLAUSES Texts of Provisions and Clauses 852.219-10 VA Notice of total service-disabled veteran-owned small...-Disabled Veteran-Owned Small Business Set-Aside (DEC 2009) (a) Definition. For the Department of Veterans...
48 CFR 852.219-10 - VA Notice of total service-disabled veteran-owned small business set-aside.
Code of Federal Regulations, 2010 CFR
2010-10-01
...-disabled veteran-owned small business set-aside. 852.219-10 Section 852.219-10 Federal Acquisition... CLAUSES Texts of Provisions and Clauses 852.219-10 VA Notice of total service-disabled veteran-owned small...-Disabled Veteran-Owned Small Business Set-Aside (DEC 2009) (a) Definition. For the Department of Veterans...
48 CFR 852.219-10 - VA Notice of total service-disabled veteran-owned small business set-aside.
Code of Federal Regulations, 2013 CFR
2013-10-01
...-disabled veteran-owned small business set-aside. 852.219-10 Section 852.219-10 Federal Acquisition... CLAUSES Texts of Provisions and Clauses 852.219-10 VA Notice of total service-disabled veteran-owned small...-Disabled Veteran-Owned Small Business Set-Aside (DEC 2009) (a) Definition. For the Department of Veterans...
48 CFR 852.219-10 - VA Notice of total service-disabled veteran-owned small business set-aside.
Code of Federal Regulations, 2011 CFR
2011-10-01
...-disabled veteran-owned small business set-aside. 852.219-10 Section 852.219-10 Federal Acquisition... CLAUSES Texts of Provisions and Clauses 852.219-10 VA Notice of total service-disabled veteran-owned small...-Disabled Veteran-Owned Small Business Set-Aside (DEC 2009) (a) Definition. For the Department of Veterans...
48 CFR 852.219-10 - VA Notice of total service-disabled veteran-owned small business set-aside.
Code of Federal Regulations, 2014 CFR
2014-10-01
...-disabled veteran-owned small business set-aside. 852.219-10 Section 852.219-10 Federal Acquisition... CLAUSES Texts of Provisions and Clauses 852.219-10 VA Notice of total service-disabled veteran-owned small...-Disabled Veteran-Owned Small Business Set-Aside (DEC 2009) (a) Definition. For the Department of Veterans...
Patient Core Data Set. Standard for a longitudinal health/medical record.
Renner, A L; Swart, J C
1997-01-01
Blue Chip Computers Company, in collaboration with Wright State University-Miami Valley College of Nursing and Health, with support from the Agency for Health Care Policy and Research, Public Health Service, completed Small Business Innovation Research (SBIR) work to design a comprehensive, integrated patient information system. The Wright State University consultants undertook the development of a Patient Core Data Set (PCDS) in response to the lack of uniform standards for minimum data sets and the lack of standards for data transfer for continuity of care. The purpose of the Patient Core Data Set is to develop a longitudinal patient health record and medical history using a common set of standard data elements with uniform definitions and coding consistent with the Health Level 7 (HL7) protocol and American Society for Testing and Materials (ASTM) standards. The PCDS, intended for transfer across all patient-care settings, is essential information for clinicians, administrators, researchers, and health policy makers.
Performance Calculations for a Boundary-Layer-Ingesting Fan Stage from Sparse Measurements
NASA Technical Reports Server (NTRS)
Hirt, Stefanie M.; Wolter, John D.; Arend, David J.; Hearn, Tristan A.; Hardin, Larry W.; Gazzaniga, John A.
2018-01-01
A test of the Boundary Layer Ingesting-Inlet / Distortion-Tolerant Fan was completed in NASA Glenn's 8-Foot by 6-Foot supersonic wind tunnel. Inlet and fan performance were measured by surveys using a set of rotating rake arrays upstream and downstream of the fan stage. Surveys were conducted along the 100 percent speed line and a constant exit corrected flow line passing through the aerodynamic design point. These surveys represented only a small fraction of the data collected during the test. For other operating points, data was recorded as snapshots without rotating the rakes which resulted in a sparser set of recorded data. This paper will discuss analysis of these additional, lower measurement density data points to expand our coverage of the fan map. Several techniques will be used to supplement the snapshot data at test conditions where survey data also exists. The supplemented snapshot data will be compared with survey results to assess the quality of the approach. Effective methods will be used to analyze the data set for which only snapshots exist.
Preliminary noise tests of the engine-over-the-wing concept. i: 30 deg - 60 deg flap position
NASA Technical Reports Server (NTRS)
Reshotko, M.; Olsen, W. A.; Dorsch, R. G.
1972-01-01
The results of preliminary acoustic tests of the engine over the wing concept are summarized. The tests were conducted with a small wing section model (32 cm chord) having two flaps set at the landing position, which is 30 and 60 deg respectively. The engine exhaust was simulated by an air jet from a convergent nozzle having a nominal diameter of 5.1 centimeters. Factors investigated for their effect on noise include nozzle location, wing shielding, flap leakage, nozzle shape, exhaust deflectors, and internally generated exhaust noise.
High-speed image transmission via the Advanced Communication Technology Satellite (ACTS)
NASA Astrophysics Data System (ADS)
Bazzill, Todd M.; Huang, H. K.; Thoma, George R.; Long, L. Rodney; Gill, Michael J.
1996-05-01
We are developing a wide area test bed network using the Advanced Communication Technology Satellite (ACTS) from NASA for high speed medical image transmission. The two test sites are the University of California, San Francisco, and the National Library of Medicine. The first phase of the test bed runs over a T1 link (1.544 Mbits/sec) using a Very Small Aperture Terminal. The second phase involves the High Data Rate Terminal via an ATM OC 3C (155 Mbits/sec) connection. This paper describes the experimental set up and some preliminary results from phase 1.
Automated Testcase Generation for Numerical Support Functions in Embedded Systems
NASA Technical Reports Server (NTRS)
Schumann, Johann; Schnieder, Stefan-Alexander
2014-01-01
We present a tool for the automatic generation of test stimuli for small numerical support functions, e.g., code for trigonometric functions, quaternions, filters, or table lookup. Our tool is based on KLEE to produce a set of test stimuli for full path coverage. We use a method of iterative deepening over abstractions to deal with floating-point values. During actual testing the stimuli exercise the code against a reference implementation. We illustrate our approach with results of experiments with low-level trigonometric functions, interpolation routines, and mathematical support functions from an open source UAS autopilot.
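The final step, exercising the code against a reference implementation over a set of stimuli, can be sketched as follows; the stimuli here are simple deterministic samples plus edge cases rather than KLEE-generated inputs, and the polynomial sine approximation is an assumed example of a numerical support function.

```python
import math

def approx_sin(x):
    """Example device-style support function: 7th-order Taylor sine, intended for use near 0."""
    x2 = x * x
    return x * (1.0 - x2 / 6.0 * (1.0 - x2 / 20.0 * (1.0 - x2 / 42.0)))

def test_against_reference(fn, ref, stimuli, tol=1e-4):
    """Return the stimuli where fn deviates from the reference by more than tol."""
    failures = []
    for x in stimuli:
        got, want = fn(x), ref(x)
        if abs(got - want) > tol:
            failures.append((x, got, want))
    return failures

# Stimuli: dense coverage of the intended input range plus a few edge cases.
stimuli = [i * 0.01 for i in range(-157, 158)] + [0.0, -0.0, 1e-9, math.pi / 2]
fails = test_against_reference(approx_sin, math.sin, stimuli)
print(f"{len(fails)} stimuli outside tolerance")
for x, got, want in fails[:5]:
    print(f"  x={x:+.4f}  approx={got:+.6f}  reference={want:+.6f}")
```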
Baucom, Brian R W; Baucom, Katherine J W; Hogan, Jasara N; Crenshaw, Alexander O; Bourne, Stacia V; Crowell, Sheila E; Georgiou, Panayiotis; Goodwin, Matthew S
2018-03-25
Cardiovascular reactivity during spousal conflict is considered to be one of the main pathways for relationship distress to impact physical, mental, and relationship health. However, the magnitude of association between cardiovascular reactivity during laboratory marital conflict and relationship functioning is small and inconsistent given the scope of its importance in theoretical models of intimate relationships. This study tests the possibility that cardiovascular data collected in laboratory settings downwardly bias the magnitude of these associations when compared to measures obtained in naturalistic settings. Ambulatory cardiovascular reactivity data were collected from 20 couples during two relationship conflicts in a research laboratory, two planned relationship conflicts at couples' homes, and two spontaneous relationship conflicts during couples' daily lives. Associations between self-report measures of relationship functioning, individual functioning, and cardiovascular reactivity across settings are tested using multilevel models. Cardiovascular reactivity was significantly larger during planned and spontaneous relationship conflicts in naturalistic settings than during planned relationship conflicts in the laboratory. Similarly, associations with relationship and individual functioning variables were statistically significantly larger for cardiovascular data collected in naturalistic settings than the same data collected in the laboratory. Our findings suggest that cardiovascular reactivity during spousal conflict in naturalistic settings is statistically significantly different from that elicited in laboratory settings both in magnitude and in the pattern of associations with a wide range of inter- and intrapersonal variables. These differences in findings across laboratory and naturalistic physiological responses highlight the value of testing physiological phenomena across interaction contexts in romantic relationships. © 2018 Family Process Institute.
High-Throughput Classification of Radiographs Using Deep Convolutional Neural Networks.
Rajkomar, Alvin; Lingam, Sneha; Taylor, Andrew G; Blum, Michael; Mongan, John
2017-02-01
The study aimed to determine if computer vision techniques rooted in deep learning can use a small set of radiographs to perform clinically relevant image classification with high fidelity. One thousand eight hundred eighty-five chest radiographs on 909 patients obtained between January 2013 and July 2015 at our institution were retrieved and anonymized. The source images were manually annotated as frontal or lateral and randomly divided into training, validation, and test sets. Training and validation sets were augmented to over 150,000 images using standard image manipulations. We then pre-trained a series of deep convolutional networks based on the open-source GoogLeNet with various transformations of the open-source ImageNet (non-radiology) images. These trained networks were then fine-tuned using the original and augmented radiology images. The model with the highest validation accuracy was applied to our institutional test set and a publicly available set. Accuracy was assessed by using the Youden Index to set a binary cutoff for frontal or lateral classification. This retrospective study was IRB approved prior to initiation. A network pre-trained on 1.2 million greyscale ImageNet images and fine-tuned on augmented radiographs was chosen. The binary classification method correctly classified 100 % (95 % CI 99.73-100 %) of both our test set and the publicly available images. Classification was rapid, at 38 images per second. A deep convolutional neural network pre-trained on non-radiological images and fine-tuned on an augmented set of radiographs is effective for highly accurate classification of chest radiograph view type and is a feasible, rapid method for high-throughput annotation.
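As an illustration of the cutoff-selection step, the following sketch computes Youden's J over candidate thresholds on synthetic scores; the data and the threshold grid are assumptions, not the study's pipeline.

```python
import numpy as np

# Synthetic stand-ins: y_true = 1 for frontal, 0 for lateral;
# y_score = a network's predicted probability of "frontal".
rng = np.random.default_rng(0)
y_true = rng.integers(0, 2, size=500)
y_score = np.clip(y_true * 0.8 + rng.normal(0.1, 0.15, size=500), 0, 1)

best_j, best_cut = -1.0, 0.5
for cut in np.linspace(0.01, 0.99, 99):
    pred = (y_score >= cut).astype(int)
    tp = np.sum((pred == 1) & (y_true == 1))
    fn = np.sum((pred == 0) & (y_true == 1))
    tn = np.sum((pred == 0) & (y_true == 0))
    fp = np.sum((pred == 1) & (y_true == 0))
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    j = sens + spec - 1          # Youden's J statistic
    if j > best_j:
        best_j, best_cut = j, cut

print(f"cutoff {best_cut:.2f} maximizes Youden's J = {best_j:.3f}")
```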
Maternal Plasma DNA and RNA Sequencing for Prenatal Testing.
Tamminga, Saskia; van Maarle, Merel; Henneman, Lidewij; Oudejans, Cees B M; Cornel, Martina C; Sistermans, Erik A
2016-01-01
Cell-free DNA (cfDNA) testing has recently become indispensable in diagnostic testing and screening. In the prenatal setting, this type of testing is often called noninvasive prenatal testing (NIPT). With a number of techniques, using either next-generation sequencing or single nucleotide polymorphism-based approaches, fetal cfDNA in maternal plasma can be analyzed to screen for rhesus D genotype, common chromosomal aneuploidies, and increasingly for testing other conditions, including monogenic disorders. With regard to screening for common aneuploidies, challenges arise when implementing NIPT in current prenatal settings. Depending on the method used (targeted or nontargeted), chromosomal anomalies other than trisomy 21, 18, or 13 can be detected, either of fetal or maternal origin, also referred to as unsolicited or incidental findings. For various biological reasons, there is a small chance of having either a false-positive or false-negative NIPT result, or no result, also referred to as a "no-call." Both pre- and posttest counseling for NIPT should include discussing potential discrepancies. Since NIPT remains a screening test, a positive NIPT result should be confirmed by invasive diagnostic testing (either by chorionic villus biopsy or by amniocentesis). As the scope of NIPT is widening, professional guidelines need to discuss the ethics of what to offer and how to offer. In this review, we discuss the current biochemical, clinical, and ethical challenges of cfDNA testing in the prenatal setting and its future perspectives including novel applications that target RNA instead of DNA. © 2016 Elsevier Inc. All rights reserved.
Compressed/reconstructed test images for CRAF/Cassini
NASA Technical Reports Server (NTRS)
Dolinar, S.; Cheung, K.-M.; Onyszchuk, I.; Pollara, F.; Arnold, S.
1991-01-01
A set of compressed, then reconstructed, test images submitted to the Comet Rendezvous Asteroid Flyby (CRAF)/Cassini project is presented as part of its evaluation of near lossless high compression algorithms for representing image data. A total of seven test image files were provided by the project. The seven test images were compressed, then reconstructed with high quality (root mean square error of approximately one or two gray levels on an 8 bit gray scale), using discrete cosine transforms or Hadamard transforms and efficient entropy coders. The resulting compression ratios varied from about 2:1 to about 10:1, depending on the activity or randomness in the source image. This was accomplished without any special effort to optimize the quantizer or to introduce special postprocessing to filter the reconstruction errors. A more complete set of measurements, showing the relative performance of the compression algorithms over a wide range of compression ratios and reconstruction errors, shows that additional compression is possible at a small sacrifice in fidelity.
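The toy sketch below illustrates the transform-coding idea behind such tests (forward DCT, discard small coefficients, reconstruct, report RMS error in gray levels); the image, threshold, and ratio proxy are assumptions and not the CRAF/Cassini algorithms.

```python
import numpy as np
from scipy.fft import dctn, idctn

# Toy 8-bit "image": a smooth ramp plus noise (stand-in for a test image).
rng = np.random.default_rng(1)
img = np.add.outer(np.arange(64), np.arange(64)) + rng.normal(0, 2, (64, 64))
img = np.clip(img, 0, 255)

coeffs = dctn(img, norm="ortho")
keep = np.abs(coeffs) > 5.0            # crude threshold quantizer
recon = idctn(np.where(keep, coeffs, 0.0), norm="ortho")

rms = np.sqrt(np.mean((img - recon) ** 2))       # RMS error in gray levels
ratio = coeffs.size / keep.sum()                 # rough proxy for compression
print(f"RMS error ~ {rms:.2f} gray levels, kept 1/{ratio:.1f} of coefficients")
```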
Hot-Fire Testing of 5N and 22N HPGP Thrusters
NASA Technical Reports Server (NTRS)
Burnside, Christopher G.; Pedersen, Kevin W.; Pierce, Charles W.
2015-01-01
This hot-fire test continues NASA's investigation of green propellant technologies for future missions. To show the potential for green propellants to replace some hydrazine systems in future spacecraft, NASA Marshall Space Flight Center (MSFC) is continuing to embark on hot-fire test campaigns with various green propellant blends. NASA completed hot-fire testing of 5N and 22N HPGP thrusters at the Marshall Space Flight Center’s Component Development Area altitude test stand in April 2015. Both thrusters are ground test articles and not flight-ready units, but are representative of potential flight hardware with a known path towards flight application. The purpose of the 5N testing was to perform facility check-outs and generate a small set of data for comparison to ECAPS and Orbital ATK data sets. The 5N thruster performed as expected, with thrust and propellant flow-rate data similar to previous testing at Orbital ATK. Immediately following the 5N testing, the 22N testing was conducted on the same test stand to demonstrate 22N performance. The results indicate the 22N thruster performed as expected. The results of the hot-fire testing are presented in this paper and presentation.
Koch, Hèlen; van Bokhoven, Marloes A; ter Riet, Gerben; van Alphen-Jager, Jm Tineke; van der Weijden, Trudy; Dinant, Geert-Jan; Bindels, Patrick J E
2009-04-01
Unexplained fatigue is frequently encountered in general practice. Because of the low prior probability of underlying somatic pathology, the positive predictive value of abnormal (blood) test results is limited in such patients. The study objectives were to investigate the relationship between established diagnoses and the occurrence of abnormal blood test results among patients with unexplained fatigue; to survey the effects of the postponement of test ordering on this relationship; and to explore consultation-related determinants of abnormal test results. Cluster randomised trial. General practices of 91 GPs in the Netherlands. GPs were randomised to immediate or postponed blood-test ordering. Patients with new unexplained fatigue were included. Limited and expanded sets of blood tests were ordered either immediately or after 4 weeks. Diagnoses during the 1-year follow-up period were extracted from medical records. Two-by-two tables were generated. To establish independent determinants of abnormal test results, a multivariate logistic regression model was used. Data of 325 patients were analysed (71% women; mean age 41 years). Eight per cent of patients had a somatic illness that was detectable by blood-test ordering. The number of false-positive test results increased in particular in the expanded test set. Patients rarely re-consulted after 4 weeks. Test postponement did not affect the distribution of patients over the two-by-two tables. No independent consultation-related determinants of abnormal test results were found. Results support restricting the number of tests ordered because of the increased risk of false-positive test results from expanded test sets. Although the number of re-consulting patients was small, the data do not refute the advice to postpone blood-test ordering for medical reasons in patients with unexplained fatigue in general practice.
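A short worked example of why a low prior probability limits the positive predictive value of an abnormal result is given below; the 8% prevalence follows the abstract, while the sensitivity and specificity values are illustrative assumptions only.

```python
# Positive predictive value from a two-by-two table.
# Prevalence of 8% is taken from the abstract; the sensitivity and
# specificity values are illustrative assumptions.
prevalence = 0.08
sensitivity = 0.90
specificity = 0.90

n = 1000
diseased = prevalence * n              # 80 patients
healthy = n - diseased                 # 920 patients
tp = sensitivity * diseased            # 72 true positives
fp = (1 - specificity) * healthy       # 92 false positives
ppv = tp / (tp + fp)
print(f"PPV = {ppv:.2f}")              # ~0.44: most positives are false
# Ordering an expanded test set raises the chance of at least one
# false-positive result per patient, the effect the study cautions against.
```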
A goal attainment pain management program for older adults with arthritis.
Davis, Gail C; White, Terri L
2008-12-01
The purpose of this study was to test a pain management intervention that integrates goal setting with older adults (age ≥ 65) living independently in residential settings. This preliminary testing of the Goal Attainment Pain Management Program (GAPMAP) included a sample of 17 adults (mean age 79.29 years) with self-reported pain related to arthritis. Specific study aims were to: 1) explore the use of individual goal setting; 2) determine participants' levels of goal attainment; 3) determine whether changes occurred in the pain management methods used and found to be helpful by GAPMAP participants; and 4) determine whether changes occurred in selected pain-related variables (i.e., experience of living with persistent pain, the expected outcomes of pain management, pain management barriers, and global ratings of perceived pain intensity and success of pain management). Because of the small sample size, both parametric (t test) and nonparametric (Wilcoxon signed rank test) analyses were used to examine differences from pretest to posttest. Results showed that older individuals could successfully participate in setting and attaining individual goals. Thirteen of the 17 participants (76%) met their goals at the expected level or above. Two management methods (exercise and using a heated pool, tub, or shower) were used significantly more often after the intervention, and two methods (exercise and distraction) were identified as significantly more helpful. Two pain-related variables (experience of living with persistent pain and expected outcomes of pain management) revealed significant change, and all of those tested showed overall improvement.
Patterns-Based IS Change Management in SMEs
NASA Astrophysics Data System (ADS)
Makna, Janis; Kirikova, Marite
The majority of information systems change management guidelines and standards are either too abstract or too bureaucratic to be easily applicable in small enterprises. This chapter proposes an approach, a method, and a prototype designed specifically for information systems change management in small and medium enterprises. The approach is based on proven patterns of changes in the set of information systems elements. The set of elements was obtained by theoretical analysis of information systems and business process definitions and enterprise architectures. The patterns were derived from a number of information systems theories and tested in 48 information systems change management projects. The prototype presents and helps to handle three basic change patterns, which help to anticipate the overall scope of changes related to particular elementary changes in an enterprise information system. The use of the prototype requires only basic knowledge of organizational business processes and information management.
Foundational numerical capacities and the origins of dyscalculia.
Butterworth, Brian
2010-12-01
One important cause of very low attainment in arithmetic (dyscalculia) seems to be a core deficit in an inherited foundational capacity for numbers. According to one set of hypotheses, arithmetic ability is built on an inherited system responsible for representing approximate numerosity. One account holds that this is supported by a system for representing exactly a small number (less than or equal to four) of individual objects. In these approaches, the core deficit in dyscalculia lies in either of these systems. An alternative proposal holds that the deficit lies in an inherited system for sets of objects and operations on them (numerosity coding) on which arithmetic is built. I argue that a deficit in numerosity coding, not in the approximate number system or the small number system, is responsible for dyscalculia. Nevertheless, critical tests should involve both longitudinal studies and intervention, and these have yet to be carried out. Copyright © 2010 Elsevier Ltd. All rights reserved.
NASA Technical Reports Server (NTRS)
Fishman, Jack; Gregory, Gerald L.; Sachse, Glen W.; Beck, Sherwin M.; Hill, Gerald F.
1987-01-01
A set of 14 pairs of vertical profiles of ozone and carbon monoxide, obtained with fast-response instrumentation, is presented. Most of these profiles, which were measured in the remote troposphere, also have supporting fast-response dew-point temperature profiles. The data suggest that the continental boundary layer is a source of tropospheric ozone, even in October and November, when photochemical activity should be rather small. In general, the small-scale vertical variability between CO and O3 is in phase. At low latitudes this relationship defines levels in the atmosphere where midlatitude air is being transported to lower latitudes, since lower dew-point temperatures accompany these higher CO and O3 concentrations. A set of profiles which is suggestive of interhemispheric transport is also presented. Independent meteorological analyses support these interpretations.
NASA Astrophysics Data System (ADS)
Santra, Biswajit; Michaelides, Angelos; Scheffler, Matthias
2007-11-01
The ability of several density-functional theory (DFT) exchange-correlation functionals to describe hydrogen bonds in small water clusters (dimer to pentamer) in their global minimum energy structures is evaluated with reference to second order Møller-Plesset perturbation theory (MP2). Errors from basis set incompleteness have been minimized in both the MP2 reference data and the DFT calculations, thus enabling a consistent systematic evaluation of the true performance of the tested functionals. Among all the functionals considered, the hybrid X3LYP and PBE0 functionals offer the best performance and among the nonhybrid generalized gradient approximation functionals, mPWLYP and PBE1W perform best. The popular BLYP and B3LYP functionals consistently underbind and PBE and PW91 display rather variable performance with cluster size.
Santra, Biswajit; Michaelides, Angelos; Scheffler, Matthias
2007-11-14
The ability of several density-functional theory (DFT) exchange-correlation functionals to describe hydrogen bonds in small water clusters (dimer to pentamer) in their global minimum energy structures is evaluated with reference to second order Moller-Plesset perturbation theory (MP2). Errors from basis set incompleteness have been minimized in both the MP2 reference data and the DFT calculations, thus enabling a consistent systematic evaluation of the true performance of the tested functionals. Among all the functionals considered, the hybrid X3LYP and PBE0 functionals offer the best performance and among the nonhybrid generalized gradient approximation functionals, mPWLYP and PBE1W perform best. The popular BLYP and B3LYP functionals consistently underbind and PBE and PW91 display rather variable performance with cluster size.
Microbiological testing of Skylab foods.
NASA Technical Reports Server (NTRS)
Heidelbaugh, N. D.; Mcqueen, J. L.; Rowley, D. B.; Powers, E. M.; Bourland, C. T.
1973-01-01
Review of some of the unique food microbiology problems and problem-generating circumstances the Skylab manned space flight program involves. The situations these problems arise from include: extended storage times, variations in storage temperatures, no opportunity to resupply or change foods after launch of the Skylab Workshop, first use of frozen foods in space, first use of a food-warming device in weightlessness, relatively small size of production lots requiring statistically valid sampling plans, and use of food as an accurately controlled part in a set of sophisticated life science experiments. Consideration of all of these situations produced the need for definite microbiological tests and test limits. These tests are described along with the rationale for their selection. Reported test results show good compliance with the test limits.
Langley Wind Tunnel Data Quality Assurance-Check Standard Results
NASA Technical Reports Server (NTRS)
Hemsch, Michael J.; Grubb, John P.; Krieger, William B.; Cler, Daniel L.
2000-01-01
A framework for statistical evaluation, control, and improvement of wind tunnel measurement processes is presented. The methodology is adapted from elements of the Measurement Assurance Plans developed by the National Bureau of Standards (now the National Institute of Standards and Technology) for standards and calibration laboratories. The present methodology is based on the notions of statistical quality control (SQC) together with check standard testing and a small number of customer repeat-run sets. The results of check standard and customer repeat-run sets are analyzed using the statistical control chart methods of Walter A. Shewhart, long familiar to the SQC community. Control chart results are presented for various measurement processes in five facilities at Langley Research Center. The processes include test section calibration, force and moment measurements with a balance, and instrument calibration.
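A minimal individuals-chart sketch of the Shewhart approach is shown below; the check-standard values are synthetic, and the moving-range estimate of sigma is one common convention rather than necessarily the one used at Langley.

```python
import numpy as np

# Synthetic check-standard results (e.g., repeated drag-coefficient checks).
rng = np.random.default_rng(2)
x = rng.normal(loc=0.0250, scale=0.0004, size=30)

# Individuals chart: center line and 3-sigma limits estimated from the
# average moving range (a common Shewhart convention; d2 = 1.128 for n = 2).
mr = np.abs(np.diff(x))
sigma_hat = mr.mean() / 1.128
center = x.mean()
ucl, lcl = center + 3 * sigma_hat, center - 3 * sigma_hat

out_of_control = np.where((x > ucl) | (x < lcl))[0]
print(f"center {center:.5f}, limits [{lcl:.5f}, {ucl:.5f}], "
      f"{out_of_control.size} point(s) out of control")
```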
ERIC Educational Resources Information Center
Pugh, Geoff; Telhaj, Shqiponje
2008-01-01
Social capital theory, recent developments in the theory of identity and a small econometric literature all suggest positive attainment effects from faith schooling. To test this hypothesis, the authors use a unique data set on Flemish secondary school students from the 1999 repeat of the Third International Mathematics and Science Study to…
ERIC Educational Resources Information Center
Shimada, Shoko; And Others
The purpose of this study was to cross-sectionally and longitudinally examine the developmental process of search behavior in infancy. Subjects were 23 Japanese normal infants (11 males and 12 females) who were individually tested once a month from the age of six to 13 months in laboratory settings. Small toys and three white opaque cubic boxes…
Impact of Group Size on Classroom On-Task Behavior and Work Productivity in Children with ADHD
ERIC Educational Resources Information Center
Hart, Katie C.; Massetti, Greta M.; Fabiano, Gregory A.; Pariseau, Meaghan E.; Pelham, William E., Jr.
2011-01-01
This study sought to systematically examine the academic behavior of children with ADHD in different instructional contexts in an analogue classroom setting. A total of 33 children with ADHD participated in a reading comprehension activity followed by a testing period and were randomly assigned within days to either small-group instruction,…
2014-02-11
ISS038-E-044883 (11 Feb. 2014) --- The Small Satellite Orbital Deployer (SSOD), in the grasp of the Kibo laboratory robotic arm, is photographed by an Expedition 38 crew member on the International Space Station as it begins the deployment of a set of NanoRacks CubeSats. The CubeSats program contains a variety of experiments such as Earth observations and advanced electronics testing.
2014-02-11
ISS038-E-044994 (11 Feb. 2014) --- The Small Satellite Orbital Deployer (SSOD), in the grasp of the Kibo laboratory robotic arm, is photographed by an Expedition 38 crew member on the International Space Station prior to the deployment of a set of NanoRacks CubeSats. The CubeSats program contains a variety of experiments such as Earth observations and advanced electronics testing.
Code of Federal Regulations, 2011 CFR
2011-10-01
... for Women-Owned Small Business Concerns Eligible Under the Women-Owned Small Business Program. 52.219... Notice of Total Set-Aside for Women-Owned Small Business Concerns Eligible Under the Women-Owned Small... Women-Owned Small Business Concerns Eligible Under the Women-Owned Small Business Program (APR 2011) (a...
48 CFR 819.7006 - Veteran-owned small business set-aside procedures.
Code of Federal Regulations, 2010 CFR
2010-10-01
... OF VETERANS AFFAIRS SOCIOECONOMIC PROGRAMS SMALL BUSINESS PROGRAMS Service-Disabled Veteran-Owned and Veteran-Owned Small Business Acquisition Program 819.7006 Veteran-owned small business set-aside... 48 Federal Acquisition Regulations System 5 2010-10-01 2010-10-01 false Veteran-owned small...
48 CFR 819.7006 - Veteran-owned small business set-aside procedures.
Code of Federal Regulations, 2013 CFR
2013-10-01
... OF VETERANS AFFAIRS SOCIOECONOMIC PROGRAMS SMALL BUSINESS PROGRAMS Service-Disabled Veteran-Owned and Veteran-Owned Small Business Acquisition Program 819.7006 Veteran-owned small business set-aside... 48 Federal Acquisition Regulations System 5 2013-10-01 2013-10-01 false Veteran-owned small...
48 CFR 819.7006 - Veteran-owned small business set-aside procedures.
Code of Federal Regulations, 2011 CFR
2011-10-01
... OF VETERANS AFFAIRS SOCIOECONOMIC PROGRAMS SMALL BUSINESS PROGRAMS Service-Disabled Veteran-Owned and Veteran-Owned Small Business Acquisition Program 819.7006 Veteran-owned small business set-aside... 48 Federal Acquisition Regulations System 5 2011-10-01 2011-10-01 false Veteran-owned small...
48 CFR 819.7006 - Veteran-owned small business set-aside procedures.
Code of Federal Regulations, 2014 CFR
2014-10-01
... OF VETERANS AFFAIRS SOCIOECONOMIC PROGRAMS SMALL BUSINESS PROGRAMS Service-Disabled Veteran-Owned and Veteran-Owned Small Business Acquisition Program 819.7006 Veteran-owned small business set-aside... 48 Federal Acquisition Regulations System 5 2014-10-01 2014-10-01 false Veteran-owned small...
Arias, Ana; de Vasconcelos, Rafaela Andrade; Hernández, Alexis; Peters, Ove A
2017-03-01
The purpose of this study was to assess the ex vivo torsional performance of a novel rotary system in small root canals after 2 different glide path preparations. Each independent canal of 8 mesial roots of mandibular molars was randomly assigned to achieve a reproducible glide path with a new set of either PathFile #1 (Dentsply Maillefer, Ballaigues, Switzerland) and #2 or ProGlider (Dentsply Maillefer) after negotiation with a 10 K-file. After glide path preparation, root canals in both groups were shaped with the same sequence of ProTaper Gold (Dentsply Tulsa Dental Specialties, Tulsa, OK) following the directions for use recommended by the manufacturer. A total of 16 new sets of each instrument of the ProTaper Gold (PTG) system were used. The tests were run in a standardized fashion in a torque-testing platform. Peak torque (Ncm) and force (N) were registered during the shaping procedure and compared with Student t tests after normal distribution of data was confirmed. No significant differences were found for any of the instruments in peak torque or force after the 2 different glide path preparations (P > .05). Data presented in this study also serve as a basis for the recommended torque for the use of PTG instruments. Under the conditions of this study, differences in the torsional performance of PTG rotary instruments after 2 different glide path preparations could not be shown. The different geometry of glide path rotary systems seemed to have no effect on peak torque and force induced by PTG rotary instruments when shaping small root canals in extracted teeth. Copyright © 2016 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.
Impact of the Curve Diameter and Laser Settings on Laser Fiber Fracture.
Haddad, Mattieu; Emiliani, Esteban; Rouchausse, Yann; Coste, Frederic; Doizi, Steeve; Berthe, Laurent; Butticé, Salvatore; Somani, Bhaskar; Traxer, Olivier
2017-09-01
To analyze the risk factors for laser fiber fractures when deflected to form a curve, including laser settings, size of the laser fiber, and the fiber bending diameter. Single-use 272 and 365 μm fibers (Rocamed ® , Monaco) were employed along with a holmium laser (Rocamed). Five different fiber curve diameters were tested: 9, 12, 15, 18, and 20 mm. Fragmentation and dusting settings were used at a theoretical power of 7.5 W. The laser was activated for 5 minutes and the principal judgment criterion was fiber fracture. Every test for each parameter, bending diameter, and fiber size combinations was repeated 10 times. With dusting settings, fibers broke more frequently at a curved diameter of 9 mm for both 272 and 365 μm fibers (p = 0.037 and 0.006, respectively). Using fragmentation settings, fibers broke more frequently at 12 mm for 272 μm and 15 mm for 365 μm (p = 0.007 and 0.033, respectively). Short pulse and high energy were significant risk factors for fiber fracture using the 365 μm fibers (p = 0.02), but not for the 272 μm fibers (p = 0.35). Frequency was not a risk factor for fiber rupture. Fiber diameters also seemed to be involved in the failure with a higher number of broken fibers for the 365 μm fibers, but this was not statistically significant when compared with the 272 μm fibers (p > 0.05). Small-core fibers are more resistant than large-core fibers as lower bending diameters (<9 mm) are required to break smaller fibers. In acute angles, the use of small-core fibers, at a low energy and long-pulse (dusting) setting, will reduce the risk of fiber rupture.
Portrait of a small population of boreal toads (Anaxyrus boreas)
Muths, Erin; Scherer, Rick D.
2011-01-01
Much attention has been given to the conservation of small populations, those that are small because of decline, and those that are naturally small. Small populations are of particular interest because ecological theory suggests that they are vulnerable to the deleterious effects of environmental, demographic, and genetic stochasticity as well as natural and human-induced catastrophes. However, testing theory and developing applicable conservation measures for small populations is hampered by sparse data. This lack of information is frequently driven by computational issues with small data sets that can be confounded by the impacts of stressors. We present estimates of demographic parameters from a small population of Boreal Toads (Anaxyrus boreas) that has been surveyed since 2001 by using capture-recapture methods. Estimates of annual adult survival probability are high relative to other Boreal Toad populations, whereas estimates of recruitment rate are low. Despite using simple models, clear patterns emerged from the analyses, suggesting that population size is constrained by low recruitment of adults and is declining slowly. These patterns provide insights that are useful in developing management directions for this small population, and this study serves as an example of the potential for small populations to yield robust and useful information despite sample size constraints.
48 CFR 206.203 - Set-asides for small business concerns.
Code of Federal Regulations, 2011 CFR
2011-10-01
... Competition After Exclusion of Sources 206.203 Set-asides for small business concerns. (b) Also no separate... 48 Federal Acquisition Regulations System 3 2011-10-01 2011-10-01 false Set-asides for small business concerns. 206.203 Section 206.203 Federal Acquisition Regulations System DEFENSE ACQUISITION...
48 CFR 206.203 - Set-asides for small business concerns.
Code of Federal Regulations, 2012 CFR
2012-10-01
... Competition After Exclusion of Sources 206.203 Set-asides for small business concerns. (b) Also no separate... 48 Federal Acquisition Regulations System 3 2012-10-01 2012-10-01 false Set-asides for small business concerns. 206.203 Section 206.203 Federal Acquisition Regulations System DEFENSE ACQUISITION...
48 CFR 206.203 - Set-asides for small business concerns.
Code of Federal Regulations, 2014 CFR
2014-10-01
... Competition After Exclusion of Sources 206.203 Set-asides for small business concerns. (b) Also no separate... 48 Federal Acquisition Regulations System 3 2014-10-01 2014-10-01 false Set-asides for small business concerns. 206.203 Section 206.203 Federal Acquisition Regulations System DEFENSE ACQUISITION...
48 CFR 206.203 - Set-asides for small business concerns.
Code of Federal Regulations, 2013 CFR
2013-10-01
... Competition After Exclusion of Sources 206.203 Set-asides for small business concerns. (b) Also no separate... 48 Federal Acquisition Regulations System 3 2013-10-01 2013-10-01 false Set-asides for small business concerns. 206.203 Section 206.203 Federal Acquisition Regulations System DEFENSE ACQUISITION...
Fiolka, Tom; Dressman, Jennifer
2018-03-01
Various types of two stage in vitro testing have been used in a number of experimental settings. In addition to its application in quality control and for regulatory purposes, two-stage in vitro testing has also been shown to be a valuable technique to evaluate the supersaturation and precipitation behavior of poorly soluble drugs during drug development. The so-called 'transfer model', which is an example of two-stage testing, has provided valuable information about the in vivo performance of poorly soluble, weakly basic drugs by simulating the gastrointestinal drug transit from the stomach into the small intestine with a peristaltic pump. The evolution of the transfer model has resulted in various modifications of the experimental model set-up. Concomitantly, various research groups have developed simplified approaches to two-stage testing to investigate the supersaturation and precipitation behavior of weakly basic drugs without the necessity of using a transfer pump. Given the diversity among the various two-stage test methods available today, a more harmonized approach needs to be taken to optimize the use of two stage testing at different stages of drug development. © 2018 Royal Pharmaceutical Society.
Zhang, Xinyuan; Zheng, Nan
2008-01-01
Cell-based molecular transport simulations are being developed to facilitate exploratory cheminformatic analysis of virtual libraries of small drug-like molecules. For this purpose, mathematical models of single cells are built from equations capturing the transport of small molecules across membranes. In turn, physicochemical properties of small molecules can be used as input to simulate intracellular drug distribution, through time. Here, with mathematical equations and biological parameters adjusted so as to mimic a leukocyte in the blood, simulations were performed to analyze steady state, relative accumulation of small molecules in lysosomes, mitochondria, and cytosol of this target cell, in the presence of a homogenous extracellular drug concentration. Similarly, with equations and parameters set to mimic an intestinal epithelial cell, simulations were also performed to analyze steady state, relative distribution and transcellular permeability in this non-target cell, in the presence of an apical-to-basolateral concentration gradient. With a test set of ninety-nine monobasic amines gathered from the scientific literature, simulation results helped analyze relationships between the chemical diversity of these molecules and their intracellular distributions. Electronic supplementary material The online version of this article (doi:10.1007/s10822-008-9194-7) contains supplementary material, which is available to authorized users. PMID:18338229
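A heavily simplified sketch of the single-cell transport idea is given below: first-order exchange between a fixed extracellular medium, the cytosol, and a lysosome that favors uptake (ion trapping). The rate constants and trapping factor are arbitrary assumptions, not parameters from the published model.

```python
from scipy.integrate import solve_ivp

# Simplified sketch of cell-based transport: first-order exchange between
# a fixed extracellular medium, cytosol, and lysosome.  All rate constants
# are arbitrary assumptions.
C_ext = 1.0                        # extracellular concentration (held fixed)
k_in, k_out = 0.5, 0.5             # plasma-membrane exchange (1/h)
k_lys_in, k_lys_out = 0.3, 0.03    # lysosome favors uptake (ion trapping)

def rhs(t, y):
    c_cyt, c_lys = y
    dc_cyt = (k_in * C_ext - k_out * c_cyt
              - k_lys_in * c_cyt + k_lys_out * c_lys)
    dc_lys = k_lys_in * c_cyt - k_lys_out * c_lys
    return [dc_cyt, dc_lys]

sol = solve_ivp(rhs, (0, 200), [0.0, 0.0])
c_cyt, c_lys = sol.y[:, -1]
print(f"steady-state cytosol/extracellular ~ {c_cyt / C_ext:.2f}, "
      f"lysosome/cytosol ~ {c_lys / c_cyt:.1f}")
```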
MABAL: a Novel Deep-Learning Architecture for Machine-Assisted Bone Age Labeling.
Mutasa, Simukayi; Chang, Peter D; Ruzal-Shapiro, Carrie; Ayyala, Rama
2018-02-05
Bone age assessment (BAA) is a commonly performed diagnostic study in pediatric radiology to assess skeletal maturity. The most commonly utilized method for assessment of BAA is the Greulich and Pyle method (Pediatr Radiol 46.9:1269-1274, 2016; Arch Dis Child 81.2:172-173, 1999) atlas. The evaluation of BAA can be a tedious and time-consuming process for the radiologist. As such, several computer-assisted detection/diagnosis (CAD) methods have been proposed for automation of BAA. Classical CAD tools have traditionally relied on hard-coded algorithmic features for BAA which suffer from a variety of drawbacks. Recently, the advent and proliferation of convolutional neural networks (CNNs) has shown promise in a variety of medical imaging applications. There have been at least two published applications of using deep learning for evaluation of bone age (Med Image Anal 36:41-51, 2017; JDI 1-5, 2017). However, current implementations are limited by a combination of both architecture design and relatively small datasets. The purpose of this study is to demonstrate the benefits of a customized neural network algorithm carefully calibrated to the evaluation of bone age utilizing a relatively large institutional dataset. In doing so, this study will aim to show that advanced architectures can be successfully trained from scratch in the medical imaging domain and can generate results that outperform any existing proposed algorithm. The training data consisted of 10,289 images of different skeletal age examinations, 8909 from the hospital Picture Archiving and Communication System at our institution and 1383 from the public Digital Hand Atlas Database. The data was separated into four cohorts, one each for male and female children above the age of 8, and one each for male and female children below the age of 10. The testing set consisted of 20 radiographs of each 1-year-age cohort from 0 to 1 years to 14-15+ years, half male and half female. The testing set included left-hand radiographs done for bone age assessment, trauma evaluation without significant findings, and skeletal surveys. A 14 hidden layer-customized neural network was designed for this study. The network included several state of the art techniques including residual-style connections, inception layers, and spatial transformer layers. Data augmentation was applied to the network inputs to prevent overfitting. A linear regression output was utilized. Mean square error was used as the network loss function and mean absolute error (MAE) was utilized as the primary performance metric. MAE accuracies on the validation and test sets for young females were 0.654 and 0.561 respectively. For older females, validation and test accuracies were 0.662 and 0.497 respectively. For young males, validation and test accuracies were 0.649 and 0.585 respectively. Finally, for older males, validation and test set accuracies were 0.581 and 0.501 respectively. The female cohorts were trained for 900 epochs each and the male cohorts were trained for 600 epochs. An eightfold cross-validation set was employed for hyperparameter tuning. Test error was obtained after training on a full data set with the selected hyperparameters. Using our proposed customized neural network architecture on our large available data, we achieved an aggregate validation and test set mean absolute errors of 0.637 and 0.536 respectively. To date, this is the best published performance on utilizing deep learning for bone age assessment. 
Our results support our initial hypothesis that customized, purpose-built neural networks provide improved performance over networks derived from pre-trained imaging data sets. We build on that initial work by showing that the addition of state-of-the-art techniques such as residual connections and inception architecture further improves prediction accuracy. This is important because the current assumption for use of residual and/or inception architectures is that a large pre-trained network is required for successful implementation given the relatively small datasets in medical imaging. Instead we show that a small, customized architecture incorporating advanced CNN strategies can indeed be trained from scratch, yielding significant improvements in algorithm accuracy. It should be noted that for all four cohorts, testing error outperformed validation error. One reason for this is that our ground truth for our test set was obtained by averaging two pediatric radiologist reads compared to our training data for which only a single read was used. This suggests that despite relatively noisy training data, the algorithm could successfully model the variation between observers and generate estimates that are close to the expected ground truth.
Culen, Martin; Rezacova, Anna; Jampilek, Josef; Dohnal, Jiri
2013-09-01
Development of new pharmaceutical compounds and dosage forms often requires in vitro dissolution testing with the closest similarity to the human gastrointestinal (GI) tract. To create such conditions, one needs a suitable dissolution apparatus and the appropriate data on the human GI physiology. This review discusses technological approaches applicable in biorelevant dissolutions as well as the physiology of stomach and small intestine in both fasted and fed state, that is, volumes of contents, transit times for water/food and various solid oral dosage forms, pH, osmolality, surface tension, buffer capacity, and concentrations of bile salts, phospholipids, enzymes, and Ca(2+) ions. The information is aimed to provide clear suggestions on how these conditions should be set in a dynamic biorelevant dissolution test. Copyright © 2013 Wiley Periodicals, Inc.
Sweeney, Sedona; Mosha, Jacklin F; Terris-Prestholt, Fern; Sollis, Kimberly A; Kelly, Helen; Changalucha, John; Peeling, Rosanna W
2014-08-01
To determine the costs of Rapid Syphilis Test (RSTs) as compared with rapid plasma reagin (RPR) when implemented in a Tanzanian setting, and to determine the relative impact of a quality assurance (QA) system on the cost of RST implementation. The incremental costs for RPR and RST screening programmes in existing antenatal care settings in Geita District, Tanzania were collected for 9 months in subsequent years from nine health facilities that varied in size, remoteness and scope of antenatal services. The costs per woman tested and treated were estimated for each facility. A sensitivity analysis was constructed to determine the impact of parameter and model uncertainty. In surveyed facilities, a total of 6362 women were tested with RSTs compared with 224 tested with RPR. The range of unit costs was $1.76-$3.13 per woman screened and $12.88-$32.67 per woman treated. Unit costs for the QA system came to $0.51 per woman tested, of which 50% were attributed to salaries and transport for project personnel. Our results suggest that rapid syphilis diagnostics are very inexpensive in this setting and can overcome some critical barriers to ensuring universal access to syphilis testing and treatment. The additional costs for implementation of a quality system were found to be relatively small, and could be reduced through alterations to the programme design. Given the potential for a quality system to improve quality of diagnosis and care, we recommend that QA activities be incorporated into RST roll-out. Published by Oxford University Press in association with The London School of Hygiene and Tropical Medicine © The Author 2013; all rights reserved.
Involvement of astrocyte metabolic coupling in Tourette syndrome pathogenesis.
de Leeuw, Christiaan; Goudriaan, Andrea; Smit, August B; Yu, Dongmei; Mathews, Carol A; Scharf, Jeremiah M; Verheijen, Mark H G; Posthuma, Danielle
2015-11-01
Tourette syndrome is a heritable neurodevelopmental disorder whose pathophysiology remains unknown. Recent genome-wide association studies suggest that it is a polygenic disorder influenced by many genes of small effect. We tested whether these genes cluster in cellular function by applying gene-set analysis using expert curated sets of brain-expressed genes in the current largest available Tourette syndrome genome-wide association data set, involving 1285 cases and 4964 controls. The gene sets included specific synaptic, astrocytic, oligodendrocyte and microglial functions. We report association of Tourette syndrome with a set of genes involved in astrocyte function, specifically in astrocyte carbohydrate metabolism. This association is driven primarily by a subset of 33 genes involved in glycolysis and glutamate metabolism through which astrocytes support synaptic function. Our results indicate for the first time that the process of astrocyte-neuron metabolic coupling may be an important contributor to Tourette syndrome pathogenesis.
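For readers unfamiliar with gene-set testing, the sketch below shows a simple hypergeometric over-representation test; the counts are invented, and the study itself used a competitive gene-set analysis of GWAS data rather than this simple contingency approach.

```python
from scipy.stats import hypergeom

# Conceptual over-representation test for a candidate gene set.
# Counts are made up; the study used a competitive gene-set analysis of
# GWAS summary data, not this simple hypergeometric test.
N = 18000      # brain-expressed genes considered
K = 200        # genes in the candidate functional set
n = 500        # genes nominally associated with the phenotype
k = 15         # overlap between the two lists

# P(overlap >= k) under random sampling without replacement
p = hypergeom.sf(k - 1, N, K, n)
print(f"over-representation P = {p:.3g}")
```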
Involvement of astrocyte metabolic coupling in Tourette syndrome pathogenesis
de Leeuw, Christiaan; Goudriaan, Andrea; Smit, August B; Yu, Dongmei; Mathews, Carol A; Scharf, Jeremiah M; Scharf, J M; Pauls, D L; Yu, D; Illmann, C; Osiecki, L; Neale, B M; Mathews, C A; Reus, V I; Lowe, T L; Freimer, N B; Cox, N J; Davis, L K; Rouleau, G A; Chouinard, S; Dion, Y; Girard, S; Cath, D C; Posthuma, D; Smit, J H; Heutink, P; King, R A; Fernandez, T; Leckman, J F; Sandor, P; Barr, C L; McMahon, W; Lyon, G; Leppert, M; Morgan, J; Weiss, R; Grados, M A; Singer, H; Jankovic, J; Tischfield, J A; Heiman, G A; Verheijen, Mark H G; Posthuma, Danielle
2015-01-01
Tourette syndrome is a heritable neurodevelopmental disorder whose pathophysiology remains unknown. Recent genome-wide association studies suggest that it is a polygenic disorder influenced by many genes of small effect. We tested whether these genes cluster in cellular function by applying gene-set analysis using expert curated sets of brain-expressed genes in the current largest available Tourette syndrome genome-wide association data set, involving 1285 cases and 4964 controls. The gene sets included specific synaptic, astrocytic, oligodendrocyte and microglial functions. We report association of Tourette syndrome with a set of genes involved in astrocyte function, specifically in astrocyte carbohydrate metabolism. This association is driven primarily by a subset of 33 genes involved in glycolysis and glutamate metabolism through which astrocytes support synaptic function. Our results indicate for the first time that the process of astrocyte-neuron metabolic coupling may be an important contributor to Tourette syndrome pathogenesis. PMID:25735483
Dooley, Christopher J; Tenore, Francesco V; Gayzik, F Scott; Merkle, Andrew C
2018-04-27
Biological tissue testing is inherently susceptible to the wide range of variability specimen to specimen. A primary resource for encapsulating this range of variability is the biofidelity response corridor or BRC. In the field of injury biomechanics, BRCs are often used for development and validation of both physical, such as anthropomorphic test devices, and computational models. For the purpose of generating corridors, post-mortem human surrogates were tested across a range of loading conditions relevant to under-body blast events. To sufficiently cover the wide range of input conditions, a relatively small number of tests were performed across a large spread of conditions. The high volume of required testing called for leveraging the capabilities of multiple impact test facilities, all with slight variations in test devices. A method for assessing similitude of responses between test devices was created as a metric for inclusion of a response in the resulting BRC. The goal of this method was to supply a statistically sound, objective method to assess the similitude of an individual response against a set of responses to ensure that the BRC created from the set was affected primarily by biological variability, not anomalies or differences stemming from test devices. Copyright © 2018 Elsevier Ltd. All rights reserved.
Development and evaluation of an automatic labeling technique for spring small grains
NASA Technical Reports Server (NTRS)
Crist, E. P.; Malila, W. A. (Principal Investigator)
1981-01-01
A labeling technique is described which seeks to associate a sampling entity with a particular crop or crop group based on similarity of growing season and temporal-spectral patterns of development. Human analysts provide contextual information, after which labeling decisions are made automatically. Results of a test of the technique on a large, multi-year data set are reported. Grain labeling accuracies are similar to those achieved by human analysis techniques, while non-grain accuracies are lower. Recommendations for improvements and implications of the test results are discussed.
Schillaci, Michael A; Schillaci, Mario E
2009-02-01
The use of small sample sizes in human and primate evolutionary research is commonplace. Estimating how well small samples represent the underlying population, however, is not commonplace. Because the accuracy of determinations of taxonomy, phylogeny, and evolutionary process is dependent upon how well the study sample represents the population of interest, characterizing the uncertainty, or potential error, associated with analyses of small sample sizes is essential. We present a method for estimating the probability that the sample mean is within a desired fraction of the standard deviation of the true mean using small (n < 10) or very small (n ≤ 5) sample sizes. This method can be used by researchers to determine post hoc the probability that their sample is a meaningful approximation of the population parameter. We tested the method using a large craniometric data set commonly used by researchers in the field. Given our results, we suggest that sample estimates of the population mean can be reasonable and meaningful even when based on small, and perhaps even very small, sample sizes.
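Under a normality assumption with known standard deviation, one textbook version of such a probability is P(|sample mean − true mean| ≤ k·σ) = 2Φ(k·√n) − 1; the sketch below evaluates it for a few small n and is an illustration of the idea, not necessarily the authors' exact estimator.

```python
from scipy.stats import norm

def prob_within(k, n):
    """P(|sample mean - true mean| <= k * sigma) for a normal population
    with known sigma; a textbook approximation, not necessarily the
    authors' exact estimator."""
    return 2 * norm.cdf(k * n ** 0.5) - 1

for n in (3, 5, 10):
    print(f"n = {n:2d}: P(within 0.5*sigma) = {prob_within(0.5, n):.2f}")
# n = 3: ~0.61,  n = 5: ~0.74,  n = 10: ~0.89
```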
Assessing the Probability that a Finding Is Genuine for Large-Scale Genetic Association Studies
Kuo, Chia-Ling; Vsevolozhskaya, Olga A.; Zaykin, Dmitri V.
2015-01-01
Genetic association studies routinely involve massive numbers of statistical tests accompanied by P-values. Whole genome sequencing technologies increased the potential number of tested variants to tens of millions. The more tests are performed, the smaller P-value is required to be deemed significant. However, a small P-value is not equivalent to small chances of a spurious finding and significance thresholds may fail to serve as efficient filters against false results. While the Bayesian approach can provide a direct assessment of the probability that a finding is spurious, its adoption in association studies has been slow, due in part to the ubiquity of P-values and the automated way they are, as a rule, produced by software packages. Attempts to design simple ways to convert an association P-value into the probability that a finding is spurious have been met with difficulties. The False Positive Report Probability (FPRP) method has gained increasing popularity. However, FPRP is not designed to estimate the probability for a particular finding, because it is defined for an entire region of hypothetical findings with P-values at least as small as the one observed for that finding. Here we propose a method that lets researchers extract probability that a finding is spurious directly from a P-value. Considering the counterpart of that probability, we term this method POFIG: the Probability that a Finding is Genuine. Our approach shares FPRP's simplicity, but gives a valid probability that a finding is spurious given a P-value. In addition to straightforward interpretation, POFIG has desirable statistical properties. The POFIG average across a set of tentative associations provides an estimated proportion of false discoveries in that set. POFIGs are easily combined across studies and are immune to multiple testing and selection bias. We illustrate an application of POFIG method via analysis of GWAS associations with Crohn's disease. PMID:25955023
Assessing the Probability that a Finding Is Genuine for Large-Scale Genetic Association Studies.
Kuo, Chia-Ling; Vsevolozhskaya, Olga A; Zaykin, Dmitri V
2015-01-01
Genetic association studies routinely involve massive numbers of statistical tests accompanied by P-values. Whole genome sequencing technologies increased the potential number of tested variants to tens of millions. The more tests are performed, the smaller P-value is required to be deemed significant. However, a small P-value is not equivalent to small chances of a spurious finding and significance thresholds may fail to serve as efficient filters against false results. While the Bayesian approach can provide a direct assessment of the probability that a finding is spurious, its adoption in association studies has been slow, due in part to the ubiquity of P-values and the automated way they are, as a rule, produced by software packages. Attempts to design simple ways to convert an association P-value into the probability that a finding is spurious have been met with difficulties. The False Positive Report Probability (FPRP) method has gained increasing popularity. However, FPRP is not designed to estimate the probability for a particular finding, because it is defined for an entire region of hypothetical findings with P-values at least as small as the one observed for that finding. Here we propose a method that lets researchers extract probability that a finding is spurious directly from a P-value. Considering the counterpart of that probability, we term this method POFIG: the Probability that a Finding is Genuine. Our approach shares FPRP's simplicity, but gives a valid probability that a finding is spurious given a P-value. In addition to straightforward interpretation, POFIG has desirable statistical properties. The POFIG average across a set of tentative associations provides an estimated proportion of false discoveries in that set. POFIGs are easily combined across studies and are immune to multiple testing and selection bias. We illustrate an application of POFIG method via analysis of GWAS associations with Crohn's disease.
NASA Astrophysics Data System (ADS)
Anderson, Kevin; Lin, Jun T.; Wong, Alexander J.
2017-11-01
Research findings of an experimental and numerical investigation of windage losses in the small annular air gap region between the stator and rotor of a high speed electric motor are presented herein. The experimental set-up is used to empirically measure the windage losses in the motor by measuring torque and rotational speed. The motor rotor spins at roughly 30,000 rpm and the rotor sets up windage losses on the order of 100 W. Axial air flow of 200 L/min is used to cool the motor, thus setting up a pseudo Taylor-Couette Poiseuille type of flow. Details of the experimental test apparatus, instrumentation and data acquisition are given. Experimental data for spin-down (both actively and passively cooled) and calibration of bearing windage losses are discussed. A Computational Fluid Dynamics (CFD) model is developed and used to predict the torque speed curve and windage losses in the motor. The CFD model is correlated with the experimental data. The CFD model is also used to predict the formation of the Taylor-Couette cells in the small gap region of the high speed motor. Results for windage losses, spin-down time constant, bearing losses, and torque of the motor versus cooling air mass flow rate and rotational speed are presented in this study.
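The reported figures already fix the torque scale involved, since windage power is drag torque times angular speed (P = τ·ω); a short check using the abstract's round numbers:

```python
import math

# Windage power equals drag torque times angular speed: P = tau * omega.
# Using the ~100 W loss and ~30,000 rpm quoted in the abstract:
P = 100.0                                  # W
omega = 30000 * 2 * math.pi / 60           # rad/s (~3142 rad/s)
tau = P / omega
print(f"implied windage torque ~ {tau * 1000:.1f} mN*m at 30,000 rpm")
# ~32 mN*m, which sets the resolution the torque measurement must resolve
# on top of bearing losses.
```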
48 CFR 6.206 - Set-asides for service-disabled veteran-owned small business concerns.
Code of Federal Regulations, 2011 CFR
2011-10-01
...-disabled veteran-owned small business concerns. 6.206 Section 6.206 Federal Acquisition Regulations System... After Exclusion of Sources 6.206 Set-asides for service-disabled veteran-owned small business concerns.... 657f), contracting officers may set-aside solicitations to allow only service-disabled veteran-owned...
48 CFR 6.206 - Set-asides for service-disabled veteran-owned small business concerns.
Code of Federal Regulations, 2012 CFR
2012-10-01
...-disabled veteran-owned small business concerns. 6.206 Section 6.206 Federal Acquisition Regulations System... After Exclusion of Sources 6.206 Set-asides for service-disabled veteran-owned small business concerns.... 657f), contracting officers may set-aside solicitations to allow only service-disabled veteran-owned...
48 CFR 6.206 - Set-asides for service-disabled veteran-owned small business concerns.
Code of Federal Regulations, 2013 CFR
2013-10-01
...-disabled veteran-owned small business concerns. 6.206 Section 6.206 Federal Acquisition Regulations System... After Exclusion of Sources 6.206 Set-asides for service-disabled veteran-owned small business concerns.... 657f), contracting officers may set-aside solicitations to allow only service-disabled veteran-owned...
48 CFR 6.206 - Set-asides for service-disabled veteran-owned small business concerns.
Code of Federal Regulations, 2014 CFR
2014-10-01
...-disabled veteran-owned small business concerns. 6.206 Section 6.206 Federal Acquisition Regulations System... After Exclusion of Sources 6.206 Set-asides for service-disabled veteran-owned small business concerns.... 657f), contracting officers may set-aside solicitations to allow only service-disabled veteran-owned...
75 FR 25844 - Class Deviation From FAR 52.219-7, Notice of Partial Small Business Set-Aside
Federal Register 2010, 2011, 2012, 2013, 2014
2010-05-10
... Small Business Set-Aside AGENCY: Defense Logistics Agency, DoD. ACTION: Notice. SUMMARY: This is to...) regarding partial small business set-asides for Defense Logistics Agency (DLA), Defense Energy Support Center (DESC) bulk fuels solicitations and resulting contract awards. DLA is requesting Department of...
76 FR 68032 - Federal Acquisition Regulation; Set-Asides for Small Business
Federal Register 2010, 2011, 2012, 2013, 2014
2011-11-02
... clause 52.219-29, Notice of Set-Aside for Economically Disadvantaged Women-owned Small Business Concerns... disadvantaged women-owned small business (EDWOSB) concerns under 19.1505(b). This includes multiple-award...). (b) The contracting officer shall insert the clause 52.219-30, Notice of Set-Aside for Women-Owned...
Acoustic and aerodynamic testing of a scale model variable pitch fan
NASA Technical Reports Server (NTRS)
Jutras, R. R.; Kazin, S. B.
1974-01-01
A fully reversible pitch scale model fan with variable pitch rotor blades was tested to determine its aerodynamic and acoustic characteristics. The single-stage fan has a design tip speed of 1160 ft/sec (353.568 m/sec) at a bypass pressure ratio of 1.5. Three operating lines were investigated. Test results show that the blade pitch for minimum noise also resulted in the highest efficiency for all three operating lines at all thrust levels. The minimum perceived noise on a 200-ft (60.96 m) sideline was obtained with the nominal nozzle. At 44% of takeoff thrust, the PNL reduction between blade pitch and minimum noise blade pitch is 1.8 PNdB for the nominal nozzle and decreases with increasing thrust. The small nozzle (6% undersized) has the highest efficiency at all part thrust conditions for the minimum noise blade pitch setting; although, the noise is about 1.0 PNdB higher for the small nozzle at the minimum noise blade pitch position.
Testing Small CPAS Parachutes Using HIVAS
NASA Technical Reports Server (NTRS)
Ray, Eric S.; Hennings, Elsa; Bernatovich, Michael A.
2013-01-01
The High Velocity Airflow System (HIVAS) facility at the Naval Air Warfare Center (NAWC) at China Lake was successfully used as an alternative to flight test to determine parachute drag performance of two small Capsule Parachute Assembly System (CPAS) canopies. A similar parachute with known performance was also tested as a control. Realtime computations of drag coefficient were unrealistically low. This is because HIVAS produces a non-uniform flow which rapidly decays from a high central core flow. Additional calibration runs were performed to characterize this flow assuming radial symmetry from the centerline. The flow field was used to post-process effective flow velocities at each throttle setting and parachute diameter using the definition of the momentum flux factor. Because one parachute had significant oscillations, additional calculations were required to estimate the projected flow at off-axis angles. The resulting drag data from HIVAS compared favorably to previously estimated parachute performance based on scaled data from analogous CPAS parachutes. The data will improve drag area distributions in the next version of the CPAS Model Memo.
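The drag-area bookkeeping behind this correction can be sketched as follows; all numbers are placeholders rather than HIVAS data, and the point is simply that an overstated velocity inflates the dynamic pressure and deflates the apparent drag coefficient.

```python
# Parachute drag area from measured load and effective dynamic pressure.
# All values are placeholders, not HIVAS measurements.
rho = 1.05          # air density at the test site, kg/m^3 (assumed)
V_eff = 45.0        # effective velocity after momentum-flux weighting, m/s
F = 900.0           # measured riser load, N

q = 0.5 * rho * V_eff ** 2          # effective dynamic pressure, Pa
CdS = F / q                         # drag area, m^2
print(f"q ~ {q:.0f} Pa, drag area CdS ~ {CdS:.2f} m^2")
# Using the raw centerline velocity instead of V_eff overstates q for a
# decaying core flow, which is why the real-time drag coefficients read low.
```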
Statistical correlation analysis for comparing vibration data from test and analysis
NASA Technical Reports Server (NTRS)
Butler, T. G.; Strang, R. F.; Purves, L. R.; Hershfeld, D. J.
1986-01-01
A theory was developed to compare vibration modes obtained by NASTRAN analysis with those obtained experimentally. Because many more analytical modes can be obtained than experimental modes, the analytical set was treated as expansion functions for putting both sources in comparative form. The dimensional symmetry was developed for three general cases: nonsymmetric whole model compared with a nonsymmetric whole structural test, symmetric analytical portion compared with a symmetric experimental portion, and analytical symmetric portion with a whole experimental test. The theory was coded and a statistical correlation program was installed as a utility. The theory is established with small classical structures.
NASA Astrophysics Data System (ADS)
Yockel, Scott; Mintz, Benjamin; Wilson, Angela K.
2004-07-01
Advanced ab initio [coupled cluster theory through quasiperturbative triple excitations (CCSD(T))] and density functional (B3LYP) computational chemistry approaches were used in combination with the standard and augmented correlation consistent polarized valence basis sets [cc-pVnZ and aug-cc-pVnZ, where n=D(2), T(3), Q(4), and 5] to investigate the energetic and structural properties of small molecules containing third-row (Ga-Kr) atoms. These molecules were taken from the Gaussian-2 (G2) extended test set for third-row atoms. Several different schemes were used to extrapolate the calculated energies to the complete basis set (CBS) limit for CCSD(T) and the Kohn-Sham (KS) limit for B3LYP. Zero-point energy and spin-orbit corrections were included in the results. Overall, CCSD(T) atomization energies, ionization energies, proton affinities, and electron affinities are in good agreement with experiment, within 1.1 kcal/mol when the CBS limit has been determined using a series of two basis sets of at least triple zeta quality. For B3LYP, the overall mean absolute deviation from experiment for these properties and the series of molecules is larger at the KS limit, within 2.3 and 2.6 kcal/mol for the cc-pVnZ and aug-cc-pVnZ basis set series, respectively.
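As background for the extrapolations mentioned above, one widely used two-point scheme (shown purely for illustration; the paper compares several different schemes) fits the correlation-consistent energies to an inverse-cubic form in the basis-set cardinal number n:

```latex
% Two-point inverse-cubic CBS extrapolation (illustrative; not necessarily the
% exact scheme adopted in the study). With energies E(n_1), E(n_2) from
% consecutive cardinal numbers n_1 < n_2 (e.g., triple and quadruple zeta) and
% the ansatz E(n) = E_CBS + A n^{-3}:
\[
  E_{\mathrm{CBS}} \;\approx\; \frac{n_2^{3}\,E(n_2) \;-\; n_1^{3}\,E(n_1)}{n_2^{3} - n_1^{3}}
\]
```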
Efficiency of parallel direct optimization
NASA Technical Reports Server (NTRS)
Janies, D. A.; Wheeler, W. C.
2001-01-01
Tremendous progress has been made at the level of sequential computation in phylogenetics. However, little attention has been paid to parallel computation. Parallel computing is particularly suited to phylogenetics because of the many ways large computational problems can be broken into parts that can be analyzed concurrently. In this paper, we investigate the scaling factors and efficiency of random addition and tree refinement strategies using the direct optimization software, POY, on a small (10 slave processors) and a large (256 slave processors) cluster of networked PCs running Linux. These algorithms were tested on several data sets composed of DNA and morphology ranging from 40 to 500 taxa. Various algorithms in POY show fundamentally different properties within and between clusters. All algorithms are efficient on the small cluster for the 40-taxon data set. On the large cluster, multibuilding exhibits excellent parallel efficiency, whereas parallel building is inefficient. These results are independent of data set size. Branch swapping in parallel shows excellent speed-up for 16 slave processors on the large cluster. However, there is no appreciable speed-up for branch swapping with the further addition of slave processors (>16). This result is independent of data set size. Ratcheting in parallel is efficient with the addition of up to 32 processors in the large cluster. This result is independent of data set size. © 2001 The Willi Hennig Society.
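For readers unfamiliar with the scaling terminology used above, here is a minimal sketch of how parallel speed-up and efficiency are conventionally computed; the timings are invented placeholders, not measurements from the POY runs.

```python
# Standard parallel speed-up and efficiency, as conventionally reported for
# cluster benchmarks like those described above. Timings are placeholders.

def speedup(t_serial: float, t_parallel: float) -> float:
    """Speed-up S = T_serial / T_parallel."""
    return t_serial / t_parallel

def efficiency(t_serial: float, t_parallel: float, n_procs: int) -> float:
    """Efficiency E = S / p; values near 1.0 indicate near-linear scaling."""
    return speedup(t_serial, t_parallel) / n_procs

print(efficiency(t_serial=3600.0, t_parallel=260.0, n_procs=16))  # ~0.87
```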
Genomic Prediction of Seed Quality Traits Using Advanced Barley Breeding Lines
Nielsen, Nanna Hellum; Jahoor, Ahmed; Jensen, Jens Due; Orabi, Jihad; Cericola, Fabio; Edriss, Vahid; Jensen, Just
2016-01-01
Genomic selection was recently introduced in plant breeding. The objective of this study was to develop genomic prediction for important seed quality parameters in spring barley. The aim was to predict breeding values without expensive phenotyping of large sets of lines. A total of 309 advanced spring barley lines, tested at two locations with three replicates each, were phenotyped, and each line was genotyped with the Illumina iSelect 9K barley chip. The population originated from two different breeding sets, which were phenotyped in two different years. Phenotypic measurements considered were: seed size, protein content, protein yield, test weight and ergosterol content. A leave-one-out cross-validation strategy revealed high prediction accuracies ranging between 0.40 and 0.83. Prediction across breeding sets resulted in reduced accuracies compared to the leave-one-out strategy. Furthermore, predicting across full- and half-sib families resulted in reduced prediction accuracies. Additionally, predictions were performed using reduced marker sets and reduced training population sets. In conclusion, using fewer than 200 lines in the training set can result in low prediction accuracy, and the accuracy will then be highly dependent on the family structure of the selected training set. However, the results also indicate that relatively small training sets (200 lines) are sufficient for genomic prediction in commercial barley breeding. In addition, our results indicate that a minimum marker set of 1,000 markers is needed to decrease the risk of low prediction accuracy for some traits or some families. PMID:27783639
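To make the leave-one-out strategy described above concrete, here is a minimal sketch using ridge regression as a generic stand-in for a genomic prediction model. The marker matrix, phenotypes, and the alpha value are synthetic placeholders, not the barley data or the exact model of the study.

```python
# Sketch: leave-one-out cross-validation of a genomic prediction model.
# Ridge regression stands in for the prediction model; data are simulated.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import LeaveOneOut

rng = np.random.default_rng(0)
n_lines, n_markers = 60, 500
X = rng.choice([0.0, 1.0, 2.0], size=(n_lines, n_markers))   # marker genotypes
beta = rng.normal(0.0, 0.1, n_markers)
y = X @ beta + rng.normal(0.0, 1.0, n_lines)                  # phenotype = genetics + noise

preds = np.empty(n_lines)
for train_idx, test_idx in LeaveOneOut().split(X):
    model = Ridge(alpha=10.0).fit(X[train_idx], y[train_idx])
    preds[test_idx] = model.predict(X[test_idx])

accuracy = np.corrcoef(preds, y)[0, 1]   # "prediction accuracy" as a correlation
print(f"LOO prediction accuracy: {accuracy:.2f}")
```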
P.D. Jones; L.R. Schimleck; G.F. Peter; R.F. Daniels; A. Clark
2005-01-01
Preliminary studies based on small sample sets show that near infrared (NIR) spectroscopy has the potential for rapidly estimating many important wood properties. However, if NIR is to be used operationally, then calibrations using several hundred samples from a wide variety of growing conditions need to be developed and their performance tested on samples from new...
ERIC Educational Resources Information Center
Longest, James W.; Gengenback, William H.
The most frequent method of group formation for intensive farm management programs in New York State has been to combine all interested farmers in large groups at the county extension headquarters. This experiment was set up to study the effectiveness of two methods of forming small groups--by sociometric choice or similar characteristics. All…
Discrete-Slots Models of Visual Working-Memory Response Times
Donkin, Christopher; Nosofsky, Robert M.; Gold, Jason M.; Shiffrin, Richard M.
2014-01-01
Much recent research has aimed to establish whether visual working memory (WM) is better characterized by a limited number of discrete all-or-none slots or by a continuous sharing of memory resources. To date, however, researchers have not considered the response-time (RT) predictions of discrete-slots versus shared-resources models. To complement the past research in this field, we formalize a family of mixed-state, discrete-slots models for explaining choice and RTs in tasks of visual WM change detection. In the tasks under investigation, a small set of visual items is presented, followed by a test item in 1 of the studied positions for which a change judgment must be made. According to the models, if the studied item in that position is retained in 1 of the discrete slots, then a memory-based evidence-accumulation process determines the choice and the RT; if the studied item in that position is missing, then a guessing-based accumulation process operates. Observed RT distributions are therefore theorized to arise as probabilistic mixtures of the memory-based and guessing distributions. We formalize an analogous set of continuous shared-resources models. The model classes are tested on individual subjects with both qualitative contrasts and quantitative fits to RT-distribution data. The discrete-slots models provide much better qualitative and quantitative accounts of the RT and choice data than do the shared-resources models, although there is some evidence for “slots plus resources” when memory set size is very small. PMID:24015956
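To illustrate the mixture structure described above, the sketch below simulates change-detection response times as a probabilistic mixture of a memory-based and a guessing-based accumulation process. The drift rates, threshold, slot count, and set size are invented for illustration; they are not the fitted parameters of the study.

```python
# Mixture of a memory-based and a guessing-based evidence accumulator: with
# probability n_slots/set_size the probed item occupies a slot and the
# memory-based process (positive drift) generates the RT; otherwise a
# zero-drift guessing process does. All parameter values are placeholders.
import numpy as np

rng = np.random.default_rng(1)

def accumulator_rt(drift, threshold=1.0, noise_sd=1.0, dt=0.001, t0=0.3):
    """One noisy accumulation-to-bound walk; returns a response time in seconds."""
    evidence, t = 0.0, 0.0
    while abs(evidence) < threshold:
        evidence += drift * dt + noise_sd * np.sqrt(dt) * rng.normal()
        t += dt
    return t0 + t

set_size, n_slots, n_trials = 4, 3, 500
p_in_slot = min(1.0, n_slots / set_size)

rts = np.array([
    accumulator_rt(drift=2.0) if rng.random() < p_in_slot else accumulator_rt(drift=0.0)
    for _ in range(n_trials)
])
print(f"mean simulated RT: {rts.mean():.3f} s")
```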
The modified hole board--measuring behavior, cognition and social interaction in mice and rats.
Labots, Maaike; Van Lith, Hein A; Ohl, Frauke; Arndt, Saskia S
2015-04-08
This protocol describes the modified hole board (mHB), which combines features from a traditional hole board and open field and is designed to measure multiple dimensions of unconditioned behavior in small laboratory mammals (e.g., mice, rats, tree shrews and small primates). This paradigm is a valuable alternative to the use of a behavioral test battery, since a broad behavioral spectrum of an animal's behavioral profile can be investigated in one single test. The apparatus consists of a box, representing the 'protected' area, separated from a group compartment. A board, on which small cylinders are staggered in three lines, is placed in the center of the box, representing the 'unprotected' area of the set-up. The cognitive abilities of the animals can be measured by baiting some cylinders on the board and measuring the working and reference memory. Other unconditioned behavior, such as activity-related-, anxiety-related- and social behavior, can be observed using this paradigm. Behavioral flexibility and the ability to habituate to a novel environment can additionally be observed by subjecting the animals to multiple trials in the mHB, revealing insight into the animals' adaptive capacities. Due to testing order effects in a behavioral test battery, naïve animals should be used for each individual experiment. By testing multiple behavioral dimensions in a single paradigm and thereby circumventing this issue, the number of experimental animals used is reduced. Furthermore, by avoiding social isolation during testing and the need to food-deprive the animals, the mHB represents a behavioral test system that induces very little, if any, stress.
DiCanio, Christian; Nam, Hosung; Whalen, Douglas H.; Timothy Bunnell, H.; Amith, Jonathan D.; García, Rey Castillo
2013-01-01
While efforts to document endangered languages have steadily increased, the phonetic analysis of endangered language data remains a challenge. The transcription of large documentation corpora is, by itself, a tremendous feat. Yet, the process of segmentation remains a bottleneck for research with data of this kind. This paper examines whether a speech processing tool, forced alignment, can facilitate the segmentation task for small data sets, even when the target language differs from the training language. The authors also examined whether a phone set with contextualization outperforms a more general one. The accuracy of two forced aligners trained on English (hmalign and p2fa) was assessed using corpus data from Yoloxóchitl Mixtec. Overall, agreement performance was relatively good, with accuracy at 70.9% within 30 ms for hmalign and 65.7% within 30 ms for p2fa. Segmental and tonal categories influenced accuracy as well. For instance, additional stop allophones in hmalign's phone set aided alignment accuracy. Agreement differences between aligners also corresponded closely with the types of data on which the aligners were trained. Overall, using existing alignment systems was found to have potential for making phonetic analysis of small corpora more efficient, with more allophonic phone sets providing better agreement than general ones. PMID:23967953
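The agreement figures above are tolerance-based boundary comparisons. A minimal sketch of how such a "within 30 ms" rate can be computed is given below; the boundary times are invented placeholders, not the Yoloxóchitl Mixtec corpus data.

```python
# Fraction of automatically aligned segment boundaries that fall within a
# 30 ms tolerance of the hand-labelled boundaries (illustrative values only).
import numpy as np

manual_boundaries = np.array([0.120, 0.350, 0.610, 0.905, 1.240])    # seconds
aligned_boundaries = np.array([0.132, 0.341, 0.655, 0.910, 1.228])   # seconds

tolerance = 0.030  # 30 ms
within_tol = np.abs(aligned_boundaries - manual_boundaries) <= tolerance
print(f"agreement within 30 ms: {100 * within_tol.mean():.1f}%")     # 80.0%
```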
Oligo-branched peptides for tumor targeting: from magic bullets to magic forks.
Falciani, Chiara; Pini, Alessandro; Bracci, Luisa
2009-02-01
Selective targeting of tumor cells is the final goal of research and drug discovery for cancer diagnosis, imaging and therapy. After the invention of hybridoma technology, the concept of the magic bullet was introduced into the field of oncology, referring to the selective killing of tumor cells by specific antibodies. More recently, small molecules and peptides have also been proposed as selective targeting agents. We analyze the state of the art of tumor-selective agents that are presently available and tested in clinical settings. A novel approach, based on 'armed' oligo-branched peptides as tumor-targeting agents, is discussed and compared with existing tumor-selective therapies mediated by antibodies, small molecules or monomeric peptides. Oligo-branched peptides could be novel drugs that combine the advantages of antibodies and small molecules.
Retina verification system based on biometric graph matching.
Lajevardi, Seyed Mehdi; Arakala, Arathi; Davis, Stephen A; Horadam, Kathy J
2013-09-01
This paper presents an automatic retina verification framework based on the biometric graph matching (BGM) algorithm. The retinal vasculature is extracted using a family of matched filters in the frequency domain and morphological operators. Then, retinal templates are defined as formal spatial graphs derived from the retinal vasculature. The BGM algorithm, a noisy graph matching algorithm, robust to translation, non-linear distortion, and small rotations, is used to compare retinal templates. The BGM algorithm uses graph topology to define three distance measures between a pair of graphs, two of which are new. A support vector machine (SVM) classifier is used to distinguish between genuine and imposter comparisons. Using single as well as multiple graph measures, the classifier achieves complete separation on a training set of images from the VARIA database (60% of the data), equaling the state-of-the-art for retina verification. Because the available data set is small, kernel density estimation (KDE) of the genuine and imposter score distributions of the training set are used to measure performance of the BGM algorithm. In the one dimensional case, the KDE model is validated with the testing set. An EER of 0 on the testing set shows that the KDE model is a good fit for the empirical distribution. For the multiple graph measures, a novel combination of the SVM boundary and the KDE model is used to obtain a fair comparison with the KDE model for the single measure. A clear benefit in using multiple graph measures over a single measure to distinguish genuine and imposter comparisons is demonstrated by a drop in theoretical error of between 60% and more than two orders of magnitude.
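As an illustration of the score-distribution modelling described above, the sketch below fits kernel density estimates to simulated genuine and imposter distance scores and locates an equal-error-rate (EER) threshold. The score distributions and threshold grid are assumptions for illustration; they are not derived from the VARIA data or from the BGM distance measures.

```python
# KDE fits to genuine and imposter score samples plus a simple EER estimate.
# Smaller scores are assumed to mean "more similar" (same retina).
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(2)
genuine = rng.normal(0.2, 0.10, 200)    # simulated genuine-comparison distances
imposter = rng.normal(0.8, 0.15, 200)   # simulated imposter-comparison distances

kde_gen, kde_imp = gaussian_kde(genuine), gaussian_kde(imposter)
thresholds = np.linspace(0.0, 1.0, 1001)

# False reject: genuine scores above threshold; false accept: imposter scores below.
frr = np.array([kde_gen.integrate_box_1d(t, np.inf) for t in thresholds])
far = np.array([kde_imp.integrate_box_1d(-np.inf, t) for t in thresholds])
eer_idx = np.argmin(np.abs(frr - far))
print(f"EER ~ {frr[eer_idx]:.3f} at threshold {thresholds[eer_idx]:.2f}")
```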
Plutonium segregation in glassy aerodynamic fallout from a nuclear weapon test
Holliday, K. S.; Dierken, J. M.; Monroe, M. L.; ...
2017-01-11
Our study combines electron microscopy equipped with energy dispersive spectroscopy to probe major element composition and autoradiography to map plutonium in order to examine the spatial relationships between plutonium and fallout composition in aerodynamic glassy fallout from a nuclear weapon test. We interrogated a sample set of 48 individual fallout specimens to reveal that the significant chemical heterogeneity of this sample set could be described with a relatively small number of compositional endmembers. Furthermore, high concentrations of plutonium were never associated with several of the endmember compositions and instead concentrated in the so-called mafic glass endmember. Our result suggests that it is the physical characteristics of the compositional endmembers and not the chemical characteristics of the individual component elements that govern the un-burnt plutonium distribution with respect to major element composition in fallout.
A bootstrap based Neyman-Pearson test for identifying variable importance.
Ditzler, Gregory; Polikar, Robi; Rosen, Gail
2015-04-01
Selection of the most informative features, those that lead to a small loss on future data, is arguably one of the most important steps in classification, data analysis and model selection. Several feature selection (FS) algorithms are available; however, due to noise present in any data set, FS algorithms are typically accompanied by an appropriate cross-validation scheme. In this brief, we propose a statistical hypothesis test derived from the Neyman-Pearson lemma for determining if a feature is statistically relevant. The proposed approach can be applied as a wrapper to any FS algorithm, regardless of the FS criteria used by that algorithm, to determine whether a feature belongs in the relevant set. Perhaps more importantly, this procedure efficiently determines the number of relevant features given an initial starting point. We provide freely available software implementations of the proposed methodology.
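One way to picture the wrapper idea above: run a base feature-selection routine on repeated bootstrap resamples, count how often each feature is selected, and compare the counts against a binomial null in which k of the M features would be selected at random. The sketch below is an illustrative reading of that general scheme, not the authors' exact test statistic or their released software; the data set, base selector, and alpha level are assumptions.

```python
# Bootstrap selection counts compared against a binomial null (illustrative).
import numpy as np
from scipy.stats import binom
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif

rng = np.random.default_rng(3)
X, y = make_classification(n_samples=300, n_features=20, n_informative=5,
                           random_state=0)
n_boot, k, M, alpha = 100, 5, X.shape[1], 0.01

counts = np.zeros(M)
for _ in range(n_boot):
    idx = rng.integers(0, len(y), len(y))                 # bootstrap resample
    selector = SelectKBest(f_classif, k=k).fit(X[idx], y[idx])
    counts[selector.get_support()] += 1

threshold = binom.ppf(1 - alpha, n_boot, k / M)           # null: random choice of k of M
relevant = np.where(counts > threshold)[0]
print("features flagged as relevant:", relevant)
```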
Contextual classification on the massively parallel processor
NASA Technical Reports Server (NTRS)
Tilton, James C.
1987-01-01
Classifiers are often used to produce land cover maps from multispectral Earth observation imagery. Conventionally, these classifiers have been designed to exploit the spectral information contained in the imagery. Very few classifiers exploit the spatial information content of the imagery, and the few that do rarely exploit spatial information content in conjunction with spectral and/or temporal information. A contextual classifier that exploits spatial and spectral information in combination through a general statistical approach was studied. Early test results obtained from an implementation of the classifier on a VAX-11/780 minicomputer were encouraging, but they are of limited meaning because they were produced from small data sets. An implementation of the contextual classifier on the Massively Parallel Processor (MPP) at Goddard is presented that, for the first time, makes it feasible to test the classifier on large data sets.
48 CFR 52.219-27 - Notice of Total Service-Disabled Veteran-Owned Small Business Set-Aside.
Code of Federal Regulations, 2011 CFR
2011-10-01
...-Disabled Veteran-Owned Small Business Set-Aside. 52.219-27 Section 52.219-27 Federal Acquisition... CONTRACT CLAUSES Text of Provisions and Clauses 52.219-27 Notice of Total Service-Disabled Veteran-Owned...-Disabled Veteran-Owned Small Business Set-Aside (MAY 2004) (a) Definition. Service-disabled veteran-owned...
48 CFR 19.503 - Setting aside a class of acquisitions for small business.
Code of Federal Regulations, 2013 CFR
2013-10-01
... only to the (named) contracting office(s) making the determination; and (4) Provide that the set-aside... competitive market conditions that have occurred since the initial approval of the class small business set... a fair market price by the Government or in a change in the capability of small business concerns to...
48 CFR 52.219-27 - Notice of Total Service-Disabled Veteran-Owned Small Business Set-Aside.
Code of Federal Regulations, 2010 CFR
2010-10-01
... management and daily business operations of which are controlled by one or more service-disabled veterans or...-Disabled Veteran-Owned Small Business Set-Aside. 52.219-27 Section 52.219-27 Federal Acquisition... Small Business Set-Aside. As prescribed in 19.1407, insert the following clause: Notice of Total Service...
48 CFR 852.219-11 - VA notice of total veteran-owned small business set-aside.
Code of Federal Regulations, 2010 CFR
2010-10-01
... owned by one or more veterans; (ii) The management and daily business operations of which are controlled...-owned small business set-aside. 852.219-11 Section 852.219-11 Federal Acquisition Regulations System... Provisions and Clauses 852.219-11 VA notice of total veteran-owned small business set-aside. As prescribed in...
NASA Astrophysics Data System (ADS)
Fewtrell, Timothy; Bates, Paul; Horritt, Matthew
2010-05-01
This abstract describes the development of a new set of equations derived from 1D shallow water theory for use in 2D storage cell inundation models. The new equation set is designed to be solved explicitly at very low computational cost, and is here tested against a suite of four analytical and numerical test cases of increasing complexity. In each case the predicted water depths compare favourably to analytical solutions or to benchmark results from the optimally stable diffusive storage cell code of Hunter et al. (2005). For the most complex test, involving the fine spatial resolution simulation of flow in a topographically complex urban area, the Root Mean Squared Difference between the new formulation and the model of Hunter et al. is ~1 cm. However, unlike diffusive storage cell codes, where the stable time step scales with 1/(Δx)², the new equation set developed here represents shallow water wave propagation, and so the stability is controlled by the Courant-Friedrichs-Lewy condition such that the stable time step instead scales with 1/Δx. This allows use of a stable time step that is 1-3 orders of magnitude greater for typical cell sizes than that possible with diffusive storage cell models and results in commensurate reductions in model run times. The maximum speed-up achieved over a diffusive storage cell model was 1120x in these tests, although the actual value seen will depend on model resolution and water depth and surface gradient. Solutions using the new equation set are shown to be relatively grid-independent for the conditions considered, given the numerical diffusion likely at coarse model resolution. In addition, the inertial formulation appears to have an intuitively correct sensitivity to friction; however, small instabilities and increased errors on predicted depth were noted when Manning's n = 0.01. These small instabilities are likely to be a result of the numerical scheme employed, in which friction acts to stabilise the solution; this scheme is nevertheless still widely used in practice. The new equations are likely to find widespread application in many types of flood inundation modelling and should provide a useful additional tool, alongside more established model formulations, for a variety of flood risk management studies.
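For orientation, the Courant-Friedrichs-Lewy constraint mentioned above links the largest stable time step to the grid spacing and the shallow-water wave speed. The sketch below uses the standard form dt <= alpha * dx / sqrt(g * h_max); the alpha and h_max values are placeholders rather than parameters of the cited test cases.

```python
# Standard shallow-water CFL limit on the time step (illustrative values).
import math

g = 9.81       # gravitational acceleration (m/s^2)
alpha = 0.7    # dimensionless stability coefficient (placeholder)
h_max = 2.0    # maximum expected water depth in metres (placeholder)

def cfl_time_step(dx: float) -> float:
    """Largest stable time step for grid spacing dx (metres)."""
    return alpha * dx / math.sqrt(g * h_max)

for dx in (2.0, 10.0, 50.0):
    print(f"dx = {dx:5.1f} m  ->  dt <= {cfl_time_step(dx):6.2f} s")
```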
48 CFR 19.1405 - Service-disabled veteran-owned small business set-aside procedures.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 48 Federal Acquisition Regulations System 1 2011-10-01 2011-10-01 false Service-disabled veteran... System FEDERAL ACQUISITION REGULATION SOCIOECONOMIC PROGRAMS SMALL BUSINESS PROGRAMS Service-Disabled Veteran-Owned Small Business Procurement Program 19.1405 Service-disabled veteran-owned small business set...
48 CFR 19.1405 - Service-disabled veteran-owned small business set-aside procedures.
Code of Federal Regulations, 2014 CFR
2014-10-01
... 48 Federal Acquisition Regulations System 1 2014-10-01 2014-10-01 false Service-disabled veteran... System FEDERAL ACQUISITION REGULATION SOCIOECONOMIC PROGRAMS SMALL BUSINESS PROGRAMS Service-Disabled Veteran-Owned Small Business Procurement Program 19.1405 Service-disabled veteran-owned small business set...
48 CFR 19.1405 - Service-disabled veteran-owned small business set-aside procedures.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Service-disabled veteran... System FEDERAL ACQUISITION REGULATION SOCIOECONOMIC PROGRAMS SMALL BUSINESS PROGRAMS Service-Disabled Veteran-Owned Small Business Procurement Program 19.1405 Service-disabled veteran-owned small business set...
48 CFR 19.1405 - Service-disabled veteran-owned small business set-aside procedures.
Code of Federal Regulations, 2012 CFR
2012-10-01
... 48 Federal Acquisition Regulations System 1 2012-10-01 2012-10-01 false Service-disabled veteran... System FEDERAL ACQUISITION REGULATION SOCIOECONOMIC PROGRAMS SMALL BUSINESS PROGRAMS Service-Disabled Veteran-Owned Small Business Procurement Program 19.1405 Service-disabled veteran-owned small business set...
48 CFR 19.1405 - Service-disabled veteran-owned small business set-aside procedures.
Code of Federal Regulations, 2013 CFR
2013-10-01
... 48 Federal Acquisition Regulations System 1 2013-10-01 2013-10-01 false Service-disabled veteran... System FEDERAL ACQUISITION REGULATION SOCIOECONOMIC PROGRAMS SMALL BUSINESS PROGRAMS Service-Disabled Veteran-Owned Small Business Procurement Program 19.1405 Service-disabled veteran-owned small business set...
48 CFR 19.502-3 - Partial set-asides.
Code of Federal Regulations, 2010 CFR
2010-10-01
... non-set-aside part of the acquisition shall have first priority with respect to negotiations for the... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Partial set-asides. 19.502... SOCIOECONOMIC PROGRAMS SMALL BUSINESS PROGRAMS Set-Asides for Small Business 19.502-3 Partial set-asides. (a...
Evaluation of commercially available small RNASeq library preparation kits using low input RNA.
Yeri, Ashish; Courtright, Amanda; Danielson, Kirsty; Hutchins, Elizabeth; Alsop, Eric; Carlson, Elizabeth; Hsieh, Michael; Ziegler, Olivia; Das, Avash; Shah, Ravi V; Rozowsky, Joel; Das, Saumya; Van Keuren-Jensen, Kendall
2018-05-05
Evolving interest in comprehensively profiling the full range of small RNAs present in small tissue biopsies and in circulating biofluids, and how the profile differs with disease, has launched small RNA sequencing (RNASeq) into more frequent use. However, known biases associated with small RNASeq, compounded by low RNA inputs, have been both a significant concern and a hurdle to widespread adoption. As RNASeq is becoming a viable choice for the discovery of small RNAs in low input samples and more labs are employing it, there should be benchmark datasets to test and evaluate the performance of new sequencing protocols and operators. In a recent publication from the National Institute of Standards and Technology, Pine et al., 2018, the investigators used a commercially available set of three tissues and tested performance across labs and platforms. In this paper, we further tested the performance of low RNA input in three commonly used and commercially available RNASeq library preparation kits: NEB Next, NEXTFlex, and TruSeq small RNA library preparation. We evaluated the performance of the kits at two different sites, using three different tissues (brain, liver, and placenta) with high (1 μg) and low (10 ng) RNA input from tissue samples, or 5.0, 3.0, 2.0, 1.0, 0.5, and 0.2 ml starting volumes of plasma. As there has been a lack of robust validation platforms for differentially expressed miRNAs, we also compared low input RNASeq data with their expression profiles on three different platforms (Abcam Fireplex, HTG EdgeSeq, and Qiagen miRNome). The concordance of RNASeq results on these three platforms was dependent on the RNA expression level; the higher the expression, the better the reproducibility. The results provide an extensive analysis of small RNASeq kit performance using low RNA input, and replication of these data on three downstream technologies.
SOFIA secondary mirror Hindle test analysis
NASA Astrophysics Data System (ADS)
Davis, Paul K.
2003-02-01
The Stratospheric Observatory for Infrared Astronomy (SOFIA) is a NASA facility, nearing completion, consisting of an infrared telescope of 2.5 meter system aperture flying in a modified Boeing 747. Its Cassegrain secondary mirror has recently completed polishing. The SOFIA Project Office at Ames Research Center considered it important to perform an independent analysis of secondary mirror figure. The polishing was controlled by the standard test for a convex hyperboloid, the Hindle test, in a modified form with a meniscus lens partially reflecting on the concave face, rather than a fully reflecting mirror with a central hole. The spacing between this meniscus lens and the secondary mirror was controlled by three peripherally located spacing spheres. This necessitated special analysis to determine what the resulting curvature and conic constant of the mirror would be, if manufacturing imprecisions of the test set-up components were to be taken into account. This set-up was specially programmed, and the resulting hyperboloid calculated for the nominal case, and all extreme cases from the reported error limits on the manufacturing of the components. The results were then verified using the standard program CODE-V of Optical Research Associates. The conclusion is that the secondary mirror has a vertex radius of curvature of 954.05 mm +/- .1 mm (design value: 954.13), and a conic constant of -1.2965 +/- .001 (dimensionless, design value: -1.298). Such small divergences from design are to be expected, and these are within the refocusing ability of SOFIA, and would result in an acceptably small amount of spherical aberration in the image.
Knoll, Florian; Hammernik, Kerstin; Kobler, Erich; Pock, Thomas; Recht, Michael P; Sodickson, Daniel K
2018-05-17
Although deep learning has shown great promise for MR image reconstruction, an open question regarding the success of this approach is its robustness to deviations between training and test data. The goal of this study is to assess the influence of image contrast, SNR, and image content on the generalization of learned image reconstruction, and to demonstrate the potential for transfer learning. Reconstructions were trained from undersampled data using data sets with varying SNR, sampling pattern, image contrast, and synthetic data generated from a public image database. The performance of the trained reconstructions was evaluated on 10 in vivo patient knee MRI acquisitions from 2 different pulse sequences that were not used during training. Transfer learning was evaluated by fine-tuning baseline trainings from synthetic data with a small subset of in vivo MR training data. Deviations in SNR between training and testing led to substantial decreases in reconstruction image quality, whereas image contrast was less relevant. Trainings from heterogeneous training data generalized well toward the test data with a range of acquisition parameters. Trainings from synthetic, non-MR image data showed residual aliasing artifacts, which could be removed by transfer learning-inspired fine-tuning. This study presents insights into the generalization ability of learned image reconstruction with respect to deviations in the acquisition settings between training and testing. It also provides an outlook for the potential of transfer learning to fine-tune trainings to a particular target application using only a small number of training cases. © 2018 International Society for Magnetic Resonance in Medicine.
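The fine-tuning idea mentioned above can be pictured with a generic sketch: a small network "pre-trained" elsewhere is updated for a few epochs on a handful of target-domain cases with a low learning rate. The toy convolutional model, random tensors, and hyperparameters below are placeholders; this is not the variational reconstruction network or the data used in the study.

```python
# Generic transfer-learning loop: fine-tune a pre-trained model on a few
# target-domain examples. Two channels loosely mimic real/imaginary images.
import torch
from torch import nn

model = nn.Sequential(nn.Conv2d(2, 32, 3, padding=1), nn.ReLU(),
                      nn.Conv2d(32, 2, 3, padding=1))   # toy stand-in network

# Pretend pre-training on synthetic data already happened; here we only
# fine-tune on a small "in vivo" set (random tensors as placeholders).
invivo_inputs = torch.randn(8, 2, 64, 64)
invivo_targets = torch.randn(8, 2, 64, 64)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)  # small LR for fine-tuning
loss_fn = nn.MSELoss()

for epoch in range(5):
    optimizer.zero_grad()
    loss = loss_fn(model(invivo_inputs), invivo_targets)
    loss.backward()
    optimizer.step()
    print(f"epoch {epoch}: loss {loss.item():.4f}")
```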
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grimme, Stefan, E-mail: grimme@thch.uni-bonn.de; Brandenburg, Jan Gerit; Bannwarth, Christoph
A density functional theory (DFT) based composite electronic structure approach is proposed to efficiently compute structures and interaction energies in large chemical systems. It is based on the well-known and numerically robust Perdew-Burke-Ernzerhof (PBE) generalized gradient approximation in a modified global hybrid functional with a relatively large amount of non-local Fock-exchange. The orbitals are expanded in Ahlrichs-type valence-double zeta atomic orbital (AO) Gaussian basis sets, which are available for many elements. In order to correct for the basis set superposition error (BSSE) and to account for the important long-range London dispersion effects, our well-established atom-pairwise potentials are used. In the design of the new method, particular attention has been paid to an accurate description of structural parameters in various covalent and non-covalent bonding situations as well as in periodic systems. Together with the recently proposed three-fold corrected (3c) Hartree-Fock method, the new composite scheme (termed PBEh-3c) represents the next member in a hierarchy of “low-cost” electronic structure approaches. They are mainly free of BSSE and account for most interactions in a physically sound and asymptotically correct manner. PBEh-3c yields good results for thermochemical properties in the huge GMTKN30 energy database. Furthermore, the method shows excellent performance for non-covalent interaction energies in small and large complexes. For evaluating its performance on equilibrium structures, a new compilation of standard test sets is suggested. These consist of small (light) molecules, partially flexible, medium-sized organic molecules, molecules comprising heavy main group elements, larger systems with long bonds, 3d-transition metal systems, non-covalently bound complexes (S22 and S66×8 sets), and peptide conformations. For these sets, overall deviations from accurate reference data are smaller than for various other tested DFT methods and reach that of triple-zeta AO basis set second-order perturbation theory (MP2/TZ) level at a tiny fraction of computational effort. Periodic calculations conducted for molecular crystals to test structures (including cell volumes) and sublimation enthalpies indicate very good accuracy competitive to computationally more involved plane-wave based calculations. PBEh-3c can be applied routinely to several hundreds of atoms on a single processor and it is suggested as a robust “high-speed” computational tool in theoretical chemistry and physics.
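As a rough picture of what an atom-pairwise dispersion correction of the kind referred to above looks like, the sketch below sums damped -C6/R^6 contributions over atom pairs. The C6 coefficients, combining rule, damping function, and geometry are deliberately simplified placeholders; they are not the actual D3 parameterization used in PBEh-3c.

```python
# Simplified atom-pairwise dispersion sum: E_disp = -s6 * sum_ij f_damp(R) * C6_ij / R^6
import numpy as np

coords = np.array([[0.0, 0.0, 0.0],     # toy three-atom geometry (Angstrom)
                   [0.0, 0.0, 1.5],
                   [0.0, 1.2, 2.3]])
c6 = np.array([10.0, 20.0, 15.0])       # per-atom C6 placeholders (arbitrary units)
r0, s6 = 3.0, 1.0                        # damping radius and global scaling (placeholders)

e_disp = 0.0
for i in range(len(coords)):
    for j in range(i + 1, len(coords)):
        r = np.linalg.norm(coords[i] - coords[j])
        c6_ij = np.sqrt(c6[i] * c6[j])                            # simple combining rule
        damping = 1.0 / (1.0 + np.exp(-20.0 * (r / r0 - 1.0)))    # Fermi-type damping
        e_disp += -s6 * damping * c6_ij / r ** 6
print(f"pairwise dispersion energy (arbitrary units): {e_disp:.4f}")
```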
A psychometric study of the Test of Everyday Attention for Children in the Chinese setting.
Chan, Raymond C K; Wang, Li; Ye, Jiawen; Leung, Winnie W Y; Mok, Monica Y K
2008-07-01
To explore the psychometric properties of the Test of Everyday Attention for Children (TEA-Ch) in the context of a Chinese setting. Confirmatory factor analysis was conducted to examine the construct validity of the Chinese version of the TEA-Ch among a group of 232 children without attention deficit hyperactivity disorder (ADHD). Test-retest reliability was tested on a random sub-sample of 20 children at a 4-week interval. Clinical discrimination was also examined by comparing children with and without ADHD (22 in each group) on the performances of the TEA-Ch. The current Chinese sample demonstrated a three-factor solution for attentional performance among children without ADHD, namely selective attention, executive control/switch, and sustained attention (χ²(24) = 34.56; RMSEA = .044; p = .075). Moreover, the whole test demonstrated acceptable test-retest reliability at a 4-week interval among a small sub-sample. Children with ADHD performed significantly more poorly than healthy controls in most of the subtests of the TEA-Ch. The results of the present study demonstrate that the test items remain useful in China, a culture very different from that in which the test originated. Finally, the TEA-Ch also presents several advantages when compared to other conventional objective measures of attention.
Which peer teaching methods do medical students prefer?
Jayakumar, Nithish; Srirathan, Danushan; Shah, Rishita; Jakubowska, Agnieszka; Clarke, Andrew; Annan, David; Albasha, Dekan
2016-01-01
The beneficial effects of peer teaching in medical education have been well-described in the literature. However, it is unclear whether students prefer to be taught by peers in small or large group settings. This study's aim was to identify differences in medical students' preferences and perceptions of small-group versus large-group peer teaching. Questionnaires were administered to medical students in Year 3 and Year 4 (first 2 years of clinical training) at one institution in the United Kingdom to identify their experiences and perceptions of small- and large-group peer teaching. For this study, small-group peer teaching was defined as a tutorial, or similar, taught by a peer tutor to a group of 5 students or less. Large-group peer teaching was defined as a lecture, or similar, taught by peer tutors to a group of more than 20 students. Seventy-three students (81% response rate) completed the questionnaires (54% males; median age of 23). Nearly 55% of respondents reported prior exposure to small-group peer teaching but a larger proportion of respondents (86%) had previously attended large-group peer teaching. Of all valid responses, 49% did not have a preference for peer teaching method while 47% preferred small-group peer teaching. The majority of Year 3 students preferred small-group peer teaching to no preference (62.5% vs 37.5%, Fisher's exact test; P = 0.035) whereas most Year 4 students did not report a particular preference. Likert-scale responses showed that the majority of students held negative perceptions about large-group peer teaching, in comparison with small-group peer teaching, with respect to (1) interactivity, (2) a comfortable environment to ask questions, and (3) feedback received. Most respondents in this study did not report a preference for small- versus large-group settings when taught by peers. Year 3 respondents were more likely than Year 4 respondents to prefer small-group peer teaching.
Code of Federal Regulations, 2011 CFR
2011-10-01
... for Economically Disadvantaged Women-Owned Small Business (EDWOSB) Concerns. 52.219-29 Section 52.219... Total Set-Aside for Economically Disadvantaged Women-Owned Small Business (EDWOSB) Concerns. As... Women-Owned Small Business (EDWOSB) Concerns (APR 2011) (a) Definitions. Economically disadvantaged...
Photoacoustic sensor for medical diagnostics
NASA Astrophysics Data System (ADS)
Wolff, Marcus; Groninga, Hinrich G.; Harde, Hermann
2004-03-01
The development of new optical sensor technologies has a major impact on the progress of diagnostic methods. Of the steadily increasing number of non-invasive breath tests, the 13C-Urea Breath Test (UBT) for the detection of Helicobacter pylori is the most prominent. However, many recent developments, like the detection of cancer by breath test, go beyond gastroenterological applications. We present a new detection scheme for breath analysis that employs an especially compact and simple set-up. Photoacoustic Spectroscopy (PAS) represents an offset-free technique that allows for short absorption paths and small sample cells. Using a single-frequency diode laser and taking advantage of acoustical resonances of the sample cell, we performed extremely sensitive and selective measurements. The smart data processing method contributes to the extraordinary sensitivity and selectivity as well. Also, the reasonable acquisition cost and low operational cost make this detection scheme attractive for many biomedical applications. The experimental set-up and data processing method, together with exemplary isotope-selective measurements on carbon dioxide, are presented.
An investigation of new methods for estimating parameter sensitivities
NASA Technical Reports Server (NTRS)
Beltracchi, Todd J.; Gabriele, Gary A.
1988-01-01
Parameter sensitivity is defined as the estimation of changes in the modeling functions and the design variables due to small changes in the fixed parameters of the formulation. There are currently several methods for estimating parameter sensitivities, but they either require difficult-to-obtain second-order information or do not return reliable estimates of the derivatives. Additionally, all the methods assume that the set of active constraints does not change in a neighborhood of the estimation point. If the active set does in fact change, then any extrapolations based on these derivatives may be in error. The objective here is to investigate more efficient new methods for estimating parameter sensitivities when the active set changes. The new method is based on the recursive quadratic programming (RQP) method used in conjunction with a differencing formula to produce estimates of the sensitivities. This is compared to existing methods and is shown to be very competitive in terms of the number of function evaluations required. In terms of accuracy, the method is shown to be equivalent to a modified version of the Kuhn-Tucker method, where the Hessian of the Lagrangian is estimated using the BFS method employed by the RQP algorithm. Initial testing on a test set with known sensitivities demonstrates that the method can accurately calculate the parameter sensitivity. To handle changes in the active set, a deflection algorithm is proposed for those cases where the new set of active constraints remains linearly independent. For those cases where dependencies occur, a directional derivative is proposed. A few simple examples are included for the algorithm, but extensive testing has not yet been performed.
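To show the basic differencing idea behind parameter sensitivities (only the concept; the RQP-based scheme above is designed precisely to avoid re-solving the optimization), here is a toy sketch with an invented constrained problem whose analytical sensitivities are dx*/dp = (-0.5, 0.5) and df*/dp = 4.5 at p = 1.5:

```python
# Forward-difference estimate of parameter sensitivities for a toy problem:
# minimize (x0 - p)^2 + (x1 - 2p)^2 subject to x0 + x1 <= 3, perturb p, re-solve.
import numpy as np
from scipy.optimize import minimize

def solve(p):
    objective = lambda x: (x[0] - p) ** 2 + (x[1] - 2 * p) ** 2
    cons = [{"type": "ineq", "fun": lambda x: 3.0 - x[0] - x[1]}]
    res = minimize(objective, x0=[0.0, 0.0], constraints=cons)
    return res.x, res.fun

p0, dp = 1.5, 1e-3
x_base, f_base = solve(p0)
x_pert, f_pert = solve(p0 + dp)
print("dx*/dp ~", (x_pert - x_base) / dp)   # ~[-0.5, 0.5] for this toy problem
print("df*/dp ~", (f_pert - f_base) / dp)   # ~4.5 for this toy problem
```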
A prototypic small molecule database for bronchoalveolar lavage-based metabolomics
NASA Astrophysics Data System (ADS)
Walmsley, Scott; Cruickshank-Quinn, Charmion; Quinn, Kevin; Zhang, Xing; Petrache, Irina; Bowler, Russell P.; Reisdorph, Richard; Reisdorph, Nichole
2018-04-01
The analysis of bronchoalveolar lavage fluid (BALF) using mass spectrometry-based metabolomics can provide insight into lung diseases, such as asthma. However, the important step of compound identification is hindered by the lack of a small molecule database that is specific for BALF. Here we describe prototypic, small molecule databases derived from human BALF samples (n=117). Human BALF was extracted into lipid and aqueous fractions and analyzed using liquid chromatography mass spectrometry. Following filtering to reduce contaminants and artifacts, the resulting BALF databases (BALF-DBs) contain 11,736 lipid and 658 aqueous compounds. Over 10% of these were found in 100% of samples. Testing the BALF-DBs using nested test sets produced a 99% match rate for lipids and 47% match rate for aqueous molecules. Searching an independent dataset resulted in 45% matching to the lipid BALF-DB compared to <25% when general databases are searched. The BALF-DBs are available for download from MetaboLights. Overall, the BALF-DBs can reduce false positives and improve confidence in compound identification compared to when general databases are used.
Testing self-regulation interventions to increase walking using factorial randomized N-of-1 trials.
Sniehotta, Falko F; Presseau, Justin; Hobbs, Nicola; Araújo-Soares, Vera
2012-11-01
To investigate the suitability of N-of-1 randomized controlled trials (RCTs) as a means of testing the effectiveness of behavior change techniques based on self-regulation theory (goal setting and self-monitoring) for promoting walking in healthy adult volunteers. A series of N-of-1 RCTs in 10 normal and overweight adults ages 19-67 (M = 36.9 years). We randomly allocated 60 days within each individual to text message-prompted daily goal-setting and/or self-monitoring interventions in accordance with a 2 (step-count goal prompt vs. alternative goal prompt) × 2 (self-monitoring: open vs. blinded Omron-HJ-113-E pedometer) factorial design. Aggregated data were analyzed using random intercept multilevel models. Single cases were analyzed individually. The primary outcome was daily pedometer step counts over 60 days. Single-case analyses showed that 4 participants significantly increased walking: 2 on self-monitoring days and 2 on goal-setting days, compared with control days. Six participants did not benefit from the interventions. In aggregated analyses, mean step counts were higher on goal-setting days (8,499.9 vs. 7,956.3) and on self-monitoring days (8,630.3 vs. 7,825.9). Multilevel analyses showed a significant effect of the self-monitoring condition (p = .01), the goal-setting condition approached significance (p = .08), and there was a small linear increase in walking over time (p = .03). N-of-1 randomized trials are a suitable means to test behavioral interventions in individual participants.
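As a sketch of the kind of aggregated analysis described above, the snippet below fits a random-intercept multilevel model to simulated day-level step counts with day-randomized goal-setting and self-monitoring indicators. The data, effect sizes, and exact model call are illustrative assumptions, not the study data or its precise specification.

```python
# Random-intercept multilevel model of day-level step counts (simulated data).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
rows = []
for pid in range(10):                       # 10 participants
    baseline = rng.normal(8000, 1000)       # participant-specific intercept
    for day in range(60):                   # 60 study days each
        goal = int(rng.integers(0, 2))      # randomized goal-prompt day
        monitor = int(rng.integers(0, 2))   # randomized open-pedometer day
        steps = baseline + 400 * goal + 600 * monitor + rng.normal(0, 1500)
        rows.append({"pid": pid, "steps": steps, "goal": goal, "monitor": monitor})
df = pd.DataFrame(rows)

model = smf.mixedlm("steps ~ goal + monitor", df, groups=df["pid"]).fit()
print(model.summary())
```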
Rohde, Katja; Papiorek, Sarah; Lunau, Klaus
2013-03-01
Differences in the concentration of pigments as well as their composition and spatial arrangement cause intraspecific variation in the spectral signature of flowers. Known colour preferences and requirements for flower-constant foraging bees predict different responses to colour variability. In experimental settings, we simulated small variations of unicoloured petals and variations in the spatial arrangement of colours within tricoloured petals using artificial flowers and studied their impact on the colour choices of bumblebees and honeybees. Workers were trained to artificial flowers of a given colour and then given the simultaneous choice between three test colours: either the training colour, one colour of lower and one of higher spectral purity, or the training colour, one colour of lower and one of higher dominant wavelength; in all cases the perceptual contrast between the training colour and the additional test colours was similarly small. Bees preferred artificial test flowers which resembled the training colour, with the exception that they preferred test colours with higher spectral purity over trained colours. Testing the behaviour of bees at artificial flowers displaying a centripetal or centrifugal arrangement of three equally sized colours with small differences in spectral purity, bees did not prefer any type of artificial flowers, but preferentially chose the most spectrally pure area for the first antenna contact at both types of artificial flowers. Our results indicate that innate preferences for flower colours of high spectral purity in pollinators might exert selective pressure on the evolution of flower colours.
NASA Astrophysics Data System (ADS)
Dewalque, Florence; Schwartz, Cédric; Denoël, Vincent; Croisier, Jean-Louis; Forthomme, Bénédicte; Brüls, Olivier
2018-02-01
This paper studies the dynamics of tape springs which are characterised by a highly geometrical nonlinear behaviour including buckling, the formation of folds and hysteresis. An experimental set-up is designed to capture these complex nonlinear phenomena. The experimental data are acquired by means of a 3D motion analysis system combined with a synchronised force plate. Deployment tests show that the motion can be divided into three phases characterised by different types of folds, frequencies of oscillation and damping behaviours. Furthermore, the reproducibility quality of the dynamic and quasi-static results is validated by performing a large number of tests. In parallel, a nonlinear finite element model is developed. The required model parameters are identified based on simple experimental tests such as static deformed configurations and small amplitude vibration tests. In the end, the model proves to be well correlated with the experimental results in opposite-sense bending, while in equal-sense bending, both the experimental set-up and the numerical model are particularly sensitive to the initial conditions.
Stability Properties of the Regular Set for the Navier-Stokes Equation
NASA Astrophysics Data System (ADS)
D'Ancona, Piero; Lucà, Renato
2018-06-01
We investigate the size of the regular set for small perturbations of some classes of strong large solutions to the Navier-Stokes equation. We consider perturbations of the data that are small in suitable weighted L2 spaces but can be arbitrarily large in any translation invariant Banach space. We give similar results in the small data setting.
Arechar, Antonio A; Kouchaki, Maryam; Rand, David G
2018-03-01
We had participants play two sets of repeated Prisoner's Dilemma (RPD) games, one with a large continuation probability and the other with a small continuation probability, as well as Dictator Games (DGs) before and after the RPDs. We find that, regardless of which RPD set is played first, participants typically cooperate when the continuation probability is large and defect when the continuation probability is small. However, there is an asymmetry in behavior when transitioning from one continuation probability to the other. When switching from large to small, transiently higher levels of cooperation are observed in the early games of the small continuation set. Conversely, when switching from small to large, cooperation is immediately high in the first game of the large continuation set. We also observe that response times increase when transitioning between sets of RPDs, except for altruistic participants transitioning into the set of RPDs with long continuation probabilities. These asymmetries suggest a bias in favor of cooperation. Finally, we examine the link between altruism and RPD play. We find that small continuation probability RPD play is correlated with giving in DGs played before and after the RPDs, whereas high continuation probability RPD play is not.
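For background, the pattern reported above (cooperation when the continuation probability is large) matches the standard repeated-game logic: with stage-game payoffs T > R > P > S and continuation probability δ, conditional cooperation such as grim trigger is sustainable only when δ is sufficiently large. This is textbook material offered for orientation, not an analysis taken from the paper:

```latex
% Value of cooperating forever versus defecting once and being punished
% thereafter (grim trigger), with continuation probability \delta:
\[
  \frac{R}{1-\delta} \;\ge\; T + \frac{\delta P}{1-\delta}
  \quad\Longleftrightarrow\quad
  \delta \;\ge\; \frac{T-R}{T-P}.
\]
```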
Community nurses and self-management of blood glucose.
Abbott, S; Burns, J; Gleadell, A; Gunnell, C
2007-01-01
Self-monitoring of blood glucose (SMBG) is commonly recommended to patients with diabetes, although the rationale for this is unclear. This small research project was designed to explore the reasons why nurses working in the community recommend SMBG. Seven interviews were carried out with community nurses caring primarily for housebound patients. Those interviewed believed that a sound evidence-base supported the recommendation that patients test their blood, but not urine, for glucose levels. Though nurses believed in the importance of patient choice and empowerment, the scope for these was limited among housebound patients. There was no evidence that patients understood how to respond to test results, or that comprehensive care planning was normal practice. Although small, this study suggests that nurses working in community settings may need to update their knowledge. It also suggests that a national debate is necessary to disseminate better the evidence about SMBG, and its implications for nursing practice.
The Wireless Motility Capsule: a One-Stop Shop for the Evaluation of GI Motility Disorders.
Saad, Richard J
2016-03-01
The wireless motility and pH capsule (WMC) provides an office-based test to simultaneously assess both regional and whole gut transit. Ingestion of this non-digestible capsule capable of measuring temperature, pH, and the pressure of its immediate surroundings allows for the measurement of gastric, small bowel, and colonic transit times in an ambulatory setting. Approved by the US Food and Drug Administration for the evaluation of suspected conditions of delayed gastric emptying and the evaluation of colonic transit in chronic idiopathic constipation, the WMC should be considered in suspected gastrointestinal motility disorders as it provides a single study capable of simultaneously assessing for regional, multiregional, or generalized motility disorders. Specific indications for testing with the WMC should include the evaluation of suspected cases of gastroparesis, small bowel dysmotility, and slow transit constipation, as well as symptom syndromes suggestive of a multiregional or generalized gastrointestinal transit delay.
Simulated Space Environmental Effects on Thin Film Solar Array Components
NASA Technical Reports Server (NTRS)
Finckenor, Miria; Carr, John; SanSoucie, Michael; Boyd, Darren; Phillips, Brandon
2017-01-01
The Lightweight Integrated Solar Array and Transceiver (LISA-T) experiment consists of thin-film, low mass, low volume solar panels. Given the variety of thin solar cells and cover materials and the lack of environmental protection typically afforded by thick coverglasses, a series of tests were conducted in Marshall Space Flight Center's Space Environmental Effects Facility to evaluate the performance of these materials. Candidate thin polymeric films and nitinol wires used for deployment were also exposed. Simulated space environment exposures were selected based on SSP 30425 rev. B, "Space Station Program Natural Environment Definition for Design" or AIAA Standard S-111A-2014, "Qualification and Quality Requirements for Space Solar Cells." One set of candidate materials were exposed to 5 eV atomic oxygen and concurrent vacuum ultraviolet (VUV) radiation for low Earth orbit simulation. A second set of materials were exposed to 1 MeV electrons. A third set of samples were exposed to 50, 100, 500, and 700 keV energy protons, and a fourth set were exposed to >2,000 hours of near ultraviolet (NUV) radiation. A final set was rapidly thermal cycled between -55 and +125 C. This test series provides data on enhanced power generation, particularly for small satellites with reduced mass and volume resources. Performance versus mass and cost per Watt is discussed.
Simulated Space Environmental Effects on Thin Film Solar Array Components
NASA Technical Reports Server (NTRS)
Finckenor, Miria; Carr, John; SanSoucie, Michael; Boyd, Darren; Phillips, Brandon
2017-01-01
The Lightweight Integrated Solar Array and Transceiver (LISA-T) experiment consists of thin-film, low mass, low volume solar panels. Given the variety of thin solar cells and cover materials and the lack of environmental protection typically afforded by thick coverglasses, a series of tests were conducted in Marshall Space Flight Center's Space Environmental Effects Facility to evaluate the performance of these materials. Candidate thin polymeric films and nitinol wires used for deployment were also exposed. Simulated space environment exposures were selected based on SSP 30425 rev. B, "Space Station Program Natural Environment Definition for Design" or AIAA Standard S-111A-2014, "Qualification and Quality Requirements for Space Solar Cells." One set of candidate materials were exposed to 5 eV atomic oxygen and concurrent vacuum ultraviolet (VUV) radiation for low Earth orbit simulation. A second set of materials were exposed to 1 MeV electrons. A third set of samples were exposed to 50, 100, 500, and 700 keV energy protons, and a fourth set were exposed to >2,000 hours of near ultraviolet (NUV) radiation. A final set was rapidly thermal cycled between -55 and +125 C. This test series provides data on enhanced power generation, particularly for small satellites with reduced mass and volume resources. Performance versus mass and cost per Watt is discussed.
Simulated Space Environmental Effects on Thin Film Solar Array Components
NASA Technical Reports Server (NTRS)
Finckenor, Miria; Carr, John; SanSoucie, Michael; Boyd, Darren; Phillips, Brandon
2017-01-01
The Lightweight Integrated Solar Array and Transceiver (LISA-T) experiment consists of thin-film, low mass, low volume solar panels. Given the variety of thin solar cells and cover materials and the lack of environmental protection afforded by typical thick coverglasses, a series of tests were conducted in Marshall Space Flight Center's Space Environmental Effects Facility to evaluate the performance of these materials. Candidate thin polymeric films and nitinol wires used for deployment were also exposed. Simulated space environment exposures were selected based on SSP 30425 rev. B, "Space Station Program Natural Environment Definition for Design" or AIAA Standard S-111A-2014, "Qualification and Quality Requirements for Space Solar Cells." One set of candidate materials were exposed to 5 eV atomic oxygen and concurrent vacuum ultraviolet (VUV) radiation for low Earth orbit simulation. A second set of materials were exposed to 1 MeV electrons. A third set of samples were exposed to 50, 500, and 750 keV energy protons, and a fourth set were exposed to >2,000 hours of ultraviolet radiation. A final set was rapidly thermal cycled between -50 and +120 C. This test series provides data on enhanced power generation, particularly for small satellites with reduced mass and volume resources. Performance versus mass and cost per Watt is discussed.
Portrait of a small population of boreal toads (anaxyrus boreas)
Muths, E.; Scherer, R. D.
2011-01-01
Much attention has been given to the conservation of small populations, those that are small because of decline, and those that are naturally small. Small populations are of particular interest because ecological theory suggests that they are vulnerable to the deleterious effects of environmental, demographic, and genetic stochasticity as well as natural and human-induced catastrophes. However, testing theory and developing applicable conservation measures for small populations is hampered by sparse data. This lack of information is frequently driven by computational issues with small data sets that can be confounded by the impacts of stressors. We present estimates of demographic parameters from a small population of Boreal Toads (Anaxyrus boreas) that has been surveyed since 2001 by using capture-recapture methods. Estimates of annual adult survival probability are high relative to other Boreal Toad populations, whereas estimates of recruitment rate are low. Despite using simple models, clear patterns emerged from the analyses, suggesting that population size is constrained by low recruitment of adults and is declining slowly. These patterns provide insights that are useful in developing management directions for this small population, and this study serves as an example of the potential for small populations to yield robust and useful information despite sample size constraints. © 2011 The Herpetologists' League, Inc.
On the magnetic polarizability tensor of US coinage
NASA Astrophysics Data System (ADS)
Davidson, John L.; Abdel-Rehim, Omar A.; Hu, Peipei; Marsh, Liam A.; O'Toole, Michael D.; Peyton, Anthony J.
2018-03-01
The magnetic dipole polarizability tensor of a metallic object gives unique information about the size, shape and electromagnetic properties of the object. In this paper, we present a novel method of coin characterization based on the spectroscopic response of the absolute tensor. The experimental measurements are validated using a combination of tests with a small set of bespoke coin surrogates and simulated data. The method is applied to an uncirculated set of US coins. Measured and simulated spectroscopic tensor responses of the coins show significant differences between different coin denominations. The presented results are encouraging as they strongly demonstrate the ability to characterize coins using an absolute tensor approach.
Unified Engineering Software System
NASA Technical Reports Server (NTRS)
Purves, L. R.; Gordon, S.; Peltzman, A.; Dube, M.
1989-01-01
A collection of computer programs that performs diverse functions in prototype engineering. NEXUS, the NASA Engineering Extendible Unified Software system, is a research set of computer programs designed to support the full sequence of activities encountered in NASA engineering projects. This sequence spans preliminary design, design analysis, detailed design, manufacturing, assembly, and testing. It primarily addresses the process of prototype engineering, the task of getting a single copy or a small number of copies of a product to work. Written in FORTRAN 77 and PROLOG.
Risthaus, Tobias; Grimme, Stefan
2013-03-12
A new test set (S12L) containing 12 supramolecular noncovalently bound complexes is presented and used to evaluate seven different methods to account for dispersion in DFT (DFT-D3, DFT-D2, DFT-NL, XDM, dDsC, TS-vdW, M06-L) at different basis set levels against experimental, back-corrected reference energies. This allows conclusions about the performance of each method in an explorative research setting on "real-life" problems. Most DFT methods show satisfactory performance but, due to the largeness of the complexes, almost always require an explicit correction for the nonadditive Axilrod-Teller-Muto three-body dispersion interaction to get accurate results. The necessity of using a method capable of accounting for dispersion is clearly demonstrated in that the two-body dispersion contributions are on the order of 20-150% of the total interaction energy. MP2 and some variants thereof are shown to be insufficient for this while a few tested D3-corrected semiempirical MO methods perform reasonably well. Overall, we suggest the use of this benchmark set as a "sanity check" against overfitting to too small molecular cases.
DOE Office of Scientific and Technical Information (OSTI.GOV)
KRUGER AA; MATLACK KS; GONG W
2011-12-29
This report documents melter and off-gas performance results obtained on the DM1200 HLW Pilot Melter during processing of AZ-101 HLW simulants. The tests reported herein are a subset of six tests from a larger series of tests described in the Test Plan for the work; results from the other tests have been reported separately. The solids contents of the melter feeds were based on the WTP baseline value for the solids content of the feeds from pretreatment, which changed during these tests from 20% to 15% undissolved solids, resulting in tests conducted at two feed solids contents. Based on the results of earlier tests with single outlet 'J' bubblers, initial tests were performed with a total bubbling rate of 65 lpm. The first set of tests (Tests 1A-1E) addressed the effects of skewing this total air flow rate back and forth between the two installed bubblers in comparison to a fixed equal division of flow between them. The second set of tests (2A-2D) addressed the effects of bubbler depth. Subsequently, as the location, type and number of bubbling outlets were varied, the optimum bubbling rate for each was determined. A third (3A-3C) and fourth (8A-8C) set of tests evaluated the effects of alternative bubbler designs with two gas outlets per bubbler instead of one by placing four bubblers in positions simulating multiple-outlet bubblers. Data from the simulated multiple outlet bubblers were used to design bubblers with two outlets for an additional set of tests (9A-9C). Test 9 was also used to determine the effect of small sugar additions to the feed on ruthenium volatility. Another set of tests (10A-10D) evaluated the effects on production rate of spiking the feed with chloride and sulfate. Variables held constant to the extent possible included melt temperature, plenum temperature, cold cap coverage, the waste simulant composition, and the target glass composition. The feed rate was increased to the point that a constant, essentially complete, cold cap was achieved, which was used as an indicator of a maximized feed rate for each test. The first day of each test was used to build the cold cap and decrease the plenum temperature. The remainder of each test was split into two- to six-day segments, each with a different bubbling rate, bubbler orientation, or feed concentration of chloride and sulfur.
48 CFR 8.405-5 - Small business.
Code of Federal Regulations, 2014 CFR
2014-10-01
... 48 Federal Acquisition Regulations System 1 2014-10-01 2014-10-01 false Small business. 8.405-5... REQUIRED SOURCES OF SUPPLIES AND SERVICES Federal Supply Schedules 8.405-5 Small business. (a) Although the...— (i) Set aside orders for any of the small business concerns identified in 19.000(a)(3); and (ii) Set...
48 CFR 8.405-5 - Small business.
Code of Federal Regulations, 2012 CFR
2012-10-01
... 48 Federal Acquisition Regulations System 1 2012-10-01 2012-10-01 false Small business. 8.405-5... REQUIRED SOURCES OF SUPPLIES AND SERVICES Federal Supply Schedules 8.405-5 Small business. (a) Although the...— (i) Set aside orders for any of the small business concerns identified in 19.000(a)(3); and (ii) Set...
48 CFR 8.405-5 - Small business.
Code of Federal Regulations, 2013 CFR
2013-10-01
... 48 Federal Acquisition Regulations System 1 2013-10-01 2013-10-01 false Small business. 8.405-5... REQUIRED SOURCES OF SUPPLIES AND SERVICES Federal Supply Schedules 8.405-5 Small business. (a) Although the...— (i) Set aside orders for any of the small business concerns identified in 19.000(a)(3); and (ii) Set...
48 CFR 19.502-1 - Requirements for setting aside acquisitions.
Code of Federal Regulations, 2010 CFR
2010-10-01
... REGULATION SOCIOECONOMIC PROGRAMS SMALL BUSINESS PROGRAMS Set-Asides for Small Business 19.502-1 Requirements... or class of acquisitions for competition among small businesses when— (1) It is determined to be in... is placed with small business concerns; and the circumstances described in 19.502-2 or 19.502-3(a...
NASA Astrophysics Data System (ADS)
Yang, GuanYa; Wu, Jiang; Chen, ShuGuang; Zhou, WeiJun; Sun, Jian; Chen, GuanHua
2018-06-01
A neural network-based first-principles method for predicting heat of formation (HOF) was previously demonstrated to be able to achieve chemical accuracy in a broad spectrum of target molecules [L. H. Hu et al., J. Chem. Phys. 119, 11501 (2003)]. However, its accuracy deteriorates with the increase in molecular size. A closer inspection reveals a systematic correlation between the prediction error and the molecular size, which appears correctable by further statistical analysis, calling for a more sophisticated machine learning algorithm. Despite the apparent difference between simple and complex molecules, all the essential physical information is already present in a carefully selected set of small molecule representatives. A model that can capture the fundamental physics would be able to predict large and complex molecules from information extracted only from a small molecules database. To this end, a size-independent, multi-step multi-variable linear regression-neural network-B3LYP method is developed in this work, which successfully improves the overall prediction accuracy by training with smaller molecules only. In particular, the calculation errors for larger molecules are drastically reduced to the same magnitudes as those of the smaller molecules. Specifically, the method is based on a 164-molecule database that consists of molecules made of hydrogen and carbon elements. Four molecular descriptors were selected to encode each molecule's characteristics, among them the raw HOF calculated from B3LYP and the molecular size. Upon the size-independent machine learning correction, the mean absolute deviation (MAD) of the B3LYP/6-311+G(3df,2p)-calculated HOF is reduced from 16.58 to 1.43 kcal/mol and from 17.33 to 1.69 kcal/mol for the training and testing sets (small molecules), respectively. Furthermore, the MAD of the testing set (large molecules) is reduced from 28.75 to 1.67 kcal/mol.
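The two-step correction described above can be illustrated compactly. The sketch below is not the authors' implementation; it uses synthetic data, a hypothetical four-descriptor feature set, and scikit-learn stand-ins: a linear regression that absorbs the size-correlated part of the error, followed by a small neural network trained on the residual.

```python
# Sketch of a size-aware machine-learning correction to DFT heats of formation,
# in the spirit of the regression/neural-network scheme described above.
# All data are synthetic and the descriptor set is hypothetical.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
n = 164                                                  # database size quoted in the abstract
n_atoms = rng.integers(2, 40, n)                         # molecular size descriptor
raw_hof = 10.0 * n_atoms + rng.normal(0, 5, n)           # "raw" DFT HOF (synthetic)
true_hof = raw_hof - (0.4 * n_atoms + rng.normal(0, 1, n))  # size-dependent systematic error

X = np.column_stack([raw_hof, n_atoms,
                     rng.normal(size=n), rng.normal(size=n)])  # four descriptors

# Step 1: linear regression removes the size-correlated part of the error.
lin = LinearRegression().fit(X, true_hof)
residual = true_hof - lin.predict(X)

# Step 2: a small neural network learns what is left of the error.
net = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0).fit(X, residual)

corrected = lin.predict(X) + net.predict(X)
print("MAD before correction:", np.mean(np.abs(raw_hof - true_hof)))
print("MAD after correction: ", np.mean(np.abs(corrected - true_hof)))
```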
Code of Federal Regulations, 2011 CFR
2011-01-01
.... (a) Reservation as small business or SDB set-aside. The procuring activity issued a solicitation for... small disadvantaged business (SDB) set-aside prior to offering the requirement to SBA for award as an 8... the requirement be procured as a small business or, if authorized, an SDB set-aside. [63 FR 35739...
Effects of Group Size on Students Mathematics Achievement in Small Group Settings
ERIC Educational Resources Information Center
Enu, Justice; Danso, Paul Amoah; Awortwe, Peter K.
2015-01-01
An ideal group size is hard to obtain in small group settings; hence there are groups with more members than others. The purpose of the study was to find out whether group size has any effects on students' mathematics achievement in small group settings. Two third year classes of the 2011/2012 academic year were selected from two schools in the…
Luechtefeld, Thomas; Maertens, Alexandra; McKim, James M; Hartung, Thomas; Kleensang, Andre; Sá-Rocha, Vanessa
2015-11-01
Supervised learning methods promise to improve integrated testing strategies (ITS), but must be adjusted to handle high dimensionality and dose-response data. ITS approaches are currently fueled by the increasing mechanistic understanding of adverse outcome pathways (AOP) and the development of tests reflecting these mechanisms. Simple approaches to combine skin sensitization data sets, such as weight of evidence, fail due to problems in information redundancy and high dimensionality. The problem is further amplified when potency information (dose/response) of hazards would be estimated. Skin sensitization currently serves as the foster child for AOP and ITS development, as legislative pressures combined with a very good mechanistic understanding of contact dermatitis have led to test development and relatively large high-quality data sets. We curated such a data set and combined a recursive variable selection algorithm to evaluate the information available through in silico, in chemico and in vitro assays. Chemical similarity alone could not cluster chemicals' potency, and in vitro models consistently ranked high in recursive feature elimination. This allows reducing the number of tests included in an ITS. Next, we analyzed with a hidden Markov model that takes advantage of an intrinsic inter-relationship among the local lymph node assay classes, i.e. the monotonous connection between local lymph node assay and dose. The dose-informed random forest/hidden Markov model was superior to the dose-naive random forest model on all data sets. Although balanced accuracy improvement may seem small, this obscures the actual improvement in misclassifications as the dose-informed hidden Markov model strongly reduced " false-negatives" (i.e. extreme sensitizers as non-sensitizer) on all data sets. Copyright © 2015 John Wiley & Sons, Ltd.
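The recursive variable selection step lends itself to a short illustration. The following sketch uses scikit-learn's recursive feature elimination with a random forest on synthetic data; the assay/feature names are hypothetical placeholders, not the curated data set described above.

```python
# Minimal sketch of recursive feature elimination (RFE) with a random forest,
# analogous to the variable-selection step described above. Data are synthetic.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import RFE

rng = np.random.default_rng(1)
features = ["in_silico_logP", "in_chemico_DPRA", "in_vitro_KeratinoSens",
            "in_vitro_hCLAT", "similarity_score"]           # hypothetical assay features
n_chem = 120
X = rng.normal(size=(n_chem, len(features)))
# Synthetic sensitizer class driven mostly by the two "in vitro" columns.
y = (X[:, 2] + X[:, 3] + 0.2 * rng.normal(size=n_chem) > 0).astype(int)

rfe = RFE(RandomForestClassifier(n_estimators=200, random_state=0), n_features_to_select=2)
rfe.fit(X, y)
for name, rank in sorted(zip(features, rfe.ranking_), key=lambda t: t[1]):
    print(f"{name}: rank {rank}")   # rank 1 = retained in the reduced test battery
```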
NASA Astrophysics Data System (ADS)
Grudinin, Sergei; Kadukova, Maria; Eisenbarth, Andreas; Marillet, Simon; Cazals, Frédéric
2016-09-01
The 2015 D3R Grand Challenge provided an opportunity to test our new model for the binding free energy of small molecules, as well as to assess our protocol to predict binding poses for protein-ligand complexes. Our pose predictions were ranked 3-9 for the HSP90 dataset, depending on the assessment metric. For the MAP4K dataset the ranks are very dispersed and equal to 2-35, depending on the assessment metric, which does not provide any insight into the accuracy of the method. The main success of our pose prediction protocol was the re-scoring stage using the recently developed Convex-PL potential. We make a thorough analysis of our docking predictions made with AutoDock Vina and discuss the effect of the choice of rigid receptor templates, the number of flexible residues in the binding pocket, the binding pocket size, and the benefits of re-scoring. However, the main challenge was to predict experimentally determined binding affinities for two blind test sets. Our affinity prediction model consisted of two terms, a pairwise-additive enthalpy, and a non pairwise-additive entropy. We trained the free parameters of the model with a regularized regression using affinity and structural data from the PDBBind database. Our model performed very well on the training set, however, failed on the two test sets. We explain the drawback and pitfalls of our model, in particular in terms of relative coverage of the test set by the training set and missed dynamical properties from crystal structures, and discuss different routes to improve it.
Reactive flow calibration for diaminoazoxyfurazan (DAAF) and comparison with experiment
NASA Astrophysics Data System (ADS)
Johnson, Carl; Francois, Elizabeth Green; Morris, John
2012-03-01
Diaminoazoxyfurazan (DAAF) has a number of desirable properties; it is sensitive to shock while being insensitive to initiation by low level impact or friction, it has a small failure diameter, and its manufacturing process is inexpensive with minimal environmental impact. In light of its unique properties, DAAF based materials have gained interest for possible applications in insensitive munitions. In order to facilitate hydrocode modeling of DAAF and DAAF based formulations, we have developed a set of reactive flow parameters which were calibrated using published experimental data as well as recent experiments at LANL. Hydrocode calculations using the DAAF reactive flow parameters developed in the course of this work were compared to rate stick experiments, small scale gap tests, as well as the Onionskin experiment. Hydrocode calculations were compared directly to streak image results using numerous tracer points in conjunction with an external algorithm to match the data sets. The calculations display a reasonable agreement with experiment with the exception of effects related to shock desensitization of explosive.
Ates, Gamze; Favyts, Dorien; Hendriks, Giel; Derr, Remco; Mertens, Birgit; Verschaeve, Luc; Rogiers, Vera; Y Doktorova, Tatyana
2016-11-01
To ensure safety for humans, it is essential to characterize the genotoxic potential of new chemical entities, such as pharmaceutical and cosmetic substances. In a first tier, a battery of in vitro tests is recommended by international regulatory agencies. However, these tests suffer from inadequate specificity: compounds may be wrongly categorized as genotoxic, resulting in unnecessary, time-consuming, and expensive in vivo follow-up testing. In the last decade, novel assays (notably, reporter-based assays) have been developed in an attempt to overcome these drawbacks. Here, we have investigated the performance of two in vitro reporter-based assays, Vitotox and ToxTracker. A set of reference compounds was selected to span a variety of mechanisms of genotoxic action and applicability domains (e.g., pharmaceutical and cosmetic ingredients). Combining the performance of the two assays, we achieved 93% sensitivity and 79% specificity for prediction of genotoxicity for this set of compounds. Both assays permit quick high-throughput analysis of drug candidates, while requiring only small quantities of the test substances. Our study shows that these two assays, when combined, can be a reliable method for assessment of genotoxicity hazard. Copyright © 2016 Elsevier B.V. All rights reserved.
Multiple Hypothesis Testing for Experimental Gingivitis Based on Wilcoxon Signed Rank Statistics
Preisser, John S.; Sen, Pranab K.; Offenbacher, Steven
2011-01-01
Dental research often involves repeated multivariate outcomes on a small number of subjects for which there is interest in identifying outcomes that exhibit change in their levels over time as well as to characterize the nature of that change. In particular, periodontal research often involves the analysis of molecular mediators of inflammation for which multivariate parametric methods are highly sensitive to outliers and deviations from Gaussian assumptions. In such settings, nonparametric methods may be favored over parametric ones. Additionally, there is a need for statistical methods that control an overall error rate for multiple hypothesis testing. We review univariate and multivariate nonparametric hypothesis tests and apply them to longitudinal data to assess changes over time in 31 biomarkers measured from the gingival crevicular fluid in 22 subjects whereby gingivitis was induced by temporarily withholding tooth brushing. To identify biomarkers that can be induced to change, multivariate Wilcoxon signed rank tests for a set of four summary measures based upon area under the curve are applied for each biomarker and compared to their univariate counterparts. Multiple hypothesis testing methods with choice of control of the false discovery rate or strong control of the family-wise error rate are examined. PMID:21984957
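A minimal sketch of the univariate analysis described above, assuming simulated data with the same dimensions (31 biomarkers, 22 subjects): per-biomarker Wilcoxon signed-rank tests followed by Benjamini-Hochberg control of the false discovery rate.

```python
# Per-biomarker Wilcoxon signed-rank tests with FDR control (Benjamini-Hochberg).
# The data are simulated, not the gingival crevicular fluid measurements.
import numpy as np
from scipy.stats import wilcoxon
from statsmodels.stats.multitest import multipletests

rng = np.random.default_rng(2)
n_subjects, n_biomarkers = 22, 31
baseline = rng.normal(0, 1, size=(n_subjects, n_biomarkers))
induced = baseline + rng.normal(0.3, 1, size=(n_subjects, n_biomarkers))  # induced change

pvals = [wilcoxon(induced[:, j], baseline[:, j]).pvalue for j in range(n_biomarkers)]
reject, p_adj, _, _ = multipletests(pvals, alpha=0.05, method="fdr_bh")
print("biomarkers with FDR-significant change:", int(reject.sum()))
```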
Misquitta, Alston J; Stone, Anthony J; Price, Sarah L
2008-01-01
In part 1 of this two-part investigation we set out the theoretical basis for constructing accurate models of the induction energy of clusters of moderately sized organic molecules. In this paper we use these techniques to develop a variety of accurate distributed polarizability models for a set of representative molecules that include formamide, N-methyl propanamide, benzene, and 3-azabicyclo[3.3.1]nonane-2,4-dione. We have also explored damping, penetration, and basis set effects. In particular, we have provided a way to treat the damping of the induction expansion. Different approximations to the induction energy are evaluated against accurate SAPT(DFT) energies, and we demonstrate the accuracy of our induction models on the formamide-water dimer.
Phillips Laboratory small satellite initiatives
NASA Astrophysics Data System (ADS)
Lutey, Mark K.; Imler, Thomas A.; Davis, Robert J.
1993-09-01
The Phillips Laboratory Space Experiments Directorate, in conjunction with the Air Force Space Test Program (AF STP), Defense Advanced Research and Projects Agency (DARPA) and Strategic Defense Initiative Organization (SDIO), is managing five small satellite program initiatives: Lightweight Exo-Atmospheric Projectile (LEAP) sponsored by SDIO, Miniature Sensor Technology Integration (MSTI) sponsored by SDIO, Technology for Autonomous Operational Survivability (TAOS) sponsored by Phillips Laboratory, TechSat sponsored by SDIO, and the Advanced Technology Standard Satellite Bus (ATSSB) sponsored by DARPA. Each of these spacecraft fulfills a unique set of program requirements. These program requirements range from a short-lived 'one-of-a-kind' mission to the robust multi-mission role. Because of these diverging requirements, each program is driven to use a different design philosophy. But regardless of their design, there is the underlying fact that small satellites do not always equate to small missions. These spacecraft, with their use of or ability to insert new technologies, provide more capabilities and services for their respective payloads, which allows the expansion of their mission role. These varying program efforts culminate in an ATSSB spacecraft bus approach that will support moderate size payloads, up to 500 pounds, in a large set of orbits while satisfying the 'cheaper, faster, better' method of doing business. This technical paper provides an overview of each of the five spacecraft, focusing on the objectives, payoffs, technologies demonstrated, and program status.
Collection of quantitative chemical release field data.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Demirgian, J.; Macha, S.; Loyola Univ.
1999-01-01
Detection and quantitation of chemicals in the environment requires Fourier-transform infrared (FTIR) instruments that are properly calibrated and tested. This calibration and testing requires field testing using matrices that are representative of actual instrument use conditions. Three methods commonly used for developing calibration files and training sets in the field are a closed optical cell or chamber, a large-scale chemical release, and a small-scale chemical release. There is no best method. The advantages and limitations of each method should be considered in evaluating field results. Proper calibration characterizes the sensitivity of an instrument, its ability to detect a component in different matrices, and the quantitative accuracy and precision of the results.
Utilization of ERTS-1 data to monitor and classify eutrophication of inland lakes
NASA Technical Reports Server (NTRS)
Chase, P. E. (Principal Investigator)
1973-01-01
The author has identified the following significant results. Bands 6 and 7 have fine structure as obtained by proper selection of digital levels in processing the CCT's. This is contrary to the imagery density received. This means that the small lakes can be classified in IR for different types of water masses. At least four distinct water masses have been determined for test lakes. They are shoreline, shallow water, and two deep waters. One deep water is patchy and presents difficulty in training set selection. The excellent weather and a completely successful field test form a significant happening. It required 12 orbits over the test area before perfect weather occurred.
YSAR: a compact low-cost synthetic aperture radar
NASA Astrophysics Data System (ADS)
Thompson, Douglas G.; Arnold, David V.; Long, David G.; Miner, Gayle F.; Karlinsey, Thomas W.; Robertson, Adam E.
1997-09-01
The Brigham Young University Synthetic Aperture Radar (YSAR) is a compact, inexpensive SAR system which can be flown on a small aircraft. The system has exhibited a resolution of approximately 0.8 m by 0.8 m in test flights in calm conditions. YSAR has been used to collect data over archeological sites in Israel. Using a relatively low frequency (2.1 GHz), we hope to be able to identify walls or other archeological features to assist in excavation. A large data set of radar and photographic data have been collected over sites at Tel Safi, Qumran, Tel Micnah, and the Zippori National Forest in Israel. We show sample images from the archeological data. We are currently working on improved autofocus algorithms for this data and are developing a small, low-cost interferometric SAR system (YINSAR) for operation from a small aircraft.
NASA Astrophysics Data System (ADS)
Green, David William; Lee, Kenneth Ka-Ho; Watson, Jolanta Anna; Kim, Hyun-Yi; Yoon, Kyung-Sik; Kim, Eun-Jung; Lee, Jong-Min; Watson, Gregory Shaun; Jung, Han-Sung
2017-01-01
The external epithelial surfaces of plants and animals are frequently carpeted with small micro- and nanostructures, which broadens their adaptive capabilities in challenging physical habitats. Hairs and other shaped protuberances manage with excessive water, light contaminants, predators or parasites in innovative ways. We are interested in transferring these intricate architectures onto biomedical devices and daily-life surfaces. Such a project requires a very rapid and accurate small-scale fabrication process not involving lithography. In this study, we describe a simple benchtop biotemplating method using shed gecko lizard skin that generates duplicates that closely replicate the small nanotipped hairs (spinules) that cover the original skin. Synthetic replication of the spinule arrays in popular biomaterials closely matched the natural spinules in length. More significantly, the shape, curvature and nanotips of the synthetic arrays are virtually identical to the natural ones. Despite some small differences, the synthetic gecko skin surface resisted wetting and bacterial contamination at the same level as natural shed skin templates. Such synthetic gecko skin surfaces are excellent platforms to test for bacterial control in clinical settings. We envision testing the biocidal properties of the well-matched templates for fungal spores and viral resistance in biomedicine as well as co/multi-cultures.
NASA Astrophysics Data System (ADS)
Goldberg, Fred; Price, Edward; Robinson, Stephen; Boyd-Harlow, Danielle; McKean, Michael
2012-06-01
We report on the adaptation of the small enrollment, lab and discussion based physical science course, Physical Science and Everyday Thinking (PSET), for a large-enrollment, lecture-style setting. Like PSET, the new Learning Physical Science (LEPS) curriculum was designed around specific principles based on research on learning to meet the needs of nonscience students, especially prospective and practicing elementary and middle school teachers. We describe the structure of the two curricula and the adaptation process, including a detailed comparison of similar activities from the two curricula and a case study of a LEPS classroom implementation. In LEPS, short instructor-guided lessons replace lengthier small group activities, and movies, rather than hands-on investigations, provide the evidence used to support and test ideas. LEPS promotes student peer interaction as an important part of sense making via “clicker” questions, rather than small group and whole class discussions typical of PSET. Examples of student dialog indicate that this format is capable of generating substantive student discussion and successfully enacting the design principles. Field-test data show similar student content learning gains with the two curricula. Nevertheless, because of classroom constraints, some important practices of science that were an integral part of PSET were not included in LEPS.
Gene expression analysis using a highly sensitive DNA microarray for colorectal cancer screening.
Koga, Yoshikatsu; Yamazaki, Nobuyoshi; Takizawa, Satoko; Kawauchi, Junpei; Nomura, Osamu; Yamamoto, Seiichiro; Saito, Norio; Kakugawa, Yasuo; Otake, Yosuke; Matsumoto, Minori; Matsumura, Yasuhiro
2014-01-01
Half of all patients with small, right-sided, non-metastatic colorectal cancer (CRC) have negative results for the fecal occult blood test (FOBT). In the present study, the usefulness of CRC screening with a highly sensitive DNA microarray was evaluated in comparison with that by FOBT using fecal samples. A total of 53 patients with CRC and 61 healthy controls were divided into "training" and "validation sets". For the gene profiling, total RNA extracted from 0.5 g of feces was hybridized to a highly sensitive DNA chip. The expressions of 43 genes were significantly higher in the patients with CRC than in healthy controls (p<0.05). In the training set, the sensitivity and specificity of the DNA chip assay using six genes were 85.4% and 85.2%, respectively. On the other hand, in the validation set, the sensitivity and specificity of the DNA chip assay were 85.2% and 85.7%, respectively. The sensitivities of the DNA chip assay were higher than those of FOBT in cases of the small, right-sided, early-CRC, tumor invading up to the muscularis propria (i.e. surface tumor) subgroups. In particular, the sensitivities of the DNA chip assay in the surface tumor and early-CRC subgroups were significantly higher than those of FOBT (p=0.023 and 0.019, respectively.). Gene profiling assay using a highly sensitive DNA chip was more effective than FOBT at detecting patients with small, right-sided, surface tumor, and early-stage CRC.
Rescorla, Leslie; Ivanova, Masha Y; Achenbach, Thomas M; Begovac, Ivan; Chahed, Myriam; Drugli, May Britt; Emerich, Deisy Ribas; Fung, Daniel S S; Haider, Mariam; Hansson, Kjell; Hewitt, Nohelia; Jaimes, Stefanny; Larsson, Bo; Maggiolini, Alfio; Marković, Jasminka; Mitrović, Dragan; Moreira, Paulo; Oliveira, João Tiago; Olsson, Martin; Ooi, Yoon Phaik; Petot, Djaouida; Pisa, Cecilia; Pomalima, Rolando; da Rocha, Marina Monzani; Rudan, Vlasta; Sekulić, Slobodan; Shahini, Mimoza; de Mattos Silvares, Edwiges Ferreira; Szirovicza, Lajos; Valverde, José; Vera, Luis Anderssen; Villa, Maria Clara; Viola, Laura; Woo, Bernardine S C; Zhang, Eugene Yuqing
2012-12-01
To build on Achenbach, Rescorla, and Ivanova (2012) by (a) reporting new international findings for parent, teacher, and self-ratings on the Child Behavior Checklist, Youth Self-Report, and Teacher's Report Form; (b) testing the fit of syndrome models to new data from 17 societies, including previously underrepresented regions; (c) testing effects of society, gender, and age in 44 societies by integrating new and previous data; (d) testing cross-society correlations between mean item ratings; (e) describing the construction of multisociety norms; (f) illustrating clinical applications. Confirmatory factor analyses (CFAs) of parent, teacher, and self-ratings, performed separately for each society; tests of societal, gender, and age effects on dimensional syndrome scales, DSM-oriented scales, Internalizing, Externalizing, and Total Problems scales; tests of agreement between low, medium, and high ratings of problem items across societies. CFAs supported the tested syndrome models in all societies according to the primary fit index (Root Mean Square Error of Approximation [RMSEA]), but less consistently according to other indices; effect sizes were small-to-medium for societal differences in scale scores, but very small for gender, age, and interactions with society; items received similarly low, medium, or high ratings in different societies; problem scores from 44 societies fit three sets of multisociety norms. Statistically derived syndrome models fit parent, teacher, and self-ratings when tested individually in all 44 societies according to RMSEAs (but less consistently according to other indices). Small to medium differences in scale scores among societies supported the use of low-, medium-, and high-scoring norms in clinical assessment of individual children. Copyright © 2012 American Academy of Child and Adolescent Psychiatry. Published by Elsevier Inc. All rights reserved.
ChemBank: a small-molecule screening and cheminformatics resource database.
Seiler, Kathleen Petri; George, Gregory A; Happ, Mary Pat; Bodycombe, Nicole E; Carrinski, Hyman A; Norton, Stephanie; Brudz, Steve; Sullivan, John P; Muhlich, Jeremy; Serrano, Martin; Ferraiolo, Paul; Tolliday, Nicola J; Schreiber, Stuart L; Clemons, Paul A
2008-01-01
ChemBank (http://chembank.broad.harvard.edu/) is a public, web-based informatics environment developed through a collaboration between the Chemical Biology Program and Platform at the Broad Institute of Harvard and MIT. This knowledge environment includes freely available data derived from small molecules and small-molecule screens and resources for studying these data. ChemBank is unique among small-molecule databases in its dedication to the storage of raw screening data, its rigorous definition of screening experiments in terms of statistical hypothesis testing, and its metadata-based organization of screening experiments into projects involving collections of related assays. ChemBank stores an increasingly varied set of measurements derived from cells and other biological assay systems treated with small molecules. Analysis tools are available and are continuously being developed that allow the relationships between small molecules, cell measurements, and cell states to be studied. Currently, ChemBank stores information on hundreds of thousands of small molecules and hundreds of biomedically relevant assays that have been performed at the Broad Institute by collaborators from the worldwide research community. The goal of ChemBank is to provide life scientists unfettered access to biomedically relevant data and tools heretofore available primarily in the private sector.
NASA Astrophysics Data System (ADS)
Nanni, Ambra; Marigo, Paola; Groenewegen, Martin A. T.; Aringer, Berhard; Girardi, Léo; Pastorelli, Giada; Bressan, Alessandro; Bladh, Sara
2016-07-01
We present our recent investigation aimed at constraining the typical size and optical properties of carbon dust grains in Circumstellar envelopes (CSEs) of carbon-rich stars (C-stars) in the Small Magellanic Cloud (SMC). We applied our recent dust growth model, coupled with a radiative transfer code, to the dusty CSEs of C-stars along the TP-AGB phase, for which we computed spectra and colors. We then compared our modeled colors in the Near and Mid Infrared (NIR and MIR) bands with the observed ones, testing different assumptions in our dust scheme and employing different optical constants data sets for carbon dust. We constrained the optical properties of carbon dust by identifying the combinations of typical grain size and optical constants data set which simultaneously reproduce several colors in the NIR and MIR wavelengths. In particular, the different choices of optical properties and grain size lead to differences in the NIR and MIR colors greater than two magnitudes in some cases. We concluded that the complete set of selected NIR and MIR colors are best reproduced by small grains, with sizes between 0.06 and 0.1 μm, rather than by large grains of 0.2-0.4 μm. The inability of large grains to reproduce NIR and MIR colors is found to be independent of the adopted optical data set, and the deviations between models and observations tend to increase for increasing grain sizes. We also find a possible trend of the typical grain size with mass-loss and/or carbon-excess in the CSEs of these stars. The work presented is preparatory to future studies aimed at calibrating the TP-AGB phase through resolved stellar populations in the framework of the STARKEY project.
Sahi, Kamal; Jackson, Stuart; Wiebe, Edward; Armstrong, Gavin; Winters, Sean; Moore, Ronald; Low, Gavin
2014-02-01
To assess if "liver window" settings improve the conspicuity of small renal cell carcinomas (RCC). Patients were analysed from our institution's pathology-confirmed RCC database that included the following: (1) stage T1a RCCs, (2) an unenhanced computed tomography (CT) abdomen performed ≤ 6 months before histologic diagnosis, and (3) age ≥ 17 years. Patients with multiple tumours, prior nephrectomy, von Hippel-Lindau disease, and polycystic kidney disease were excluded. The unenhanced CT was analysed, and the tumour locations were confirmed by using corresponding contrast-enhanced CT or magnetic resonance imaging studies. Representative single-slice axial, coronal, and sagittal unenhanced CT images were acquired in "soft tissue windows" (width, 400 Hounsfield unit (HU); level, 40 HU) and liver windows (width, 150 HU; level, 88 HU). In addition, single-slice axial, coronal, and sagittal unenhanced CT images of nontumourous renal tissue (obtained from the same cases) were acquired in soft tissue windows and liver windows. These data sets were randomized, unpaired, and were presented independently to 3 blinded radiologists for analysis. The presence or absence of suspicious findings for tumour was scored on a 5-point confidence scale. Eighty-three of 415 patients met the study criteria. Receiver operating characteristics (ROC) analysis, t test analysis, and kappa analysis were used. ROC analysis showed statistically superior diagnostic performance for liver windows compared with soft tissue windows (area under the curve of 0.923 vs 0.879; P = .0002). Kappa statistics showed "good" vs "moderate" agreement between readers for liver windows compared with soft tissue windows. Use of liver windows settings improves the detection of small RCCs on the unenhanced CT. Copyright © 2014 Canadian Association of Radiologists. Published by Elsevier Inc. All rights reserved.
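The window settings quoted above translate directly into a display mapping. The sketch below, using invented voxel values, shows why the narrower liver window (width 150 HU, level 88 HU) devotes more gray levels to small attenuation differences than the soft tissue window (width 400 HU, level 40 HU).

```python
# Applying CT display windows (width/level) to Hounsfield-unit data.
import numpy as np

def apply_window(hu, width, level):
    """Map HU values to a 0-255 display range for a given window width/level."""
    lo, hi = level - width / 2.0, level + width / 2.0
    return np.clip((hu - lo) / (hi - lo), 0.0, 1.0) * 255.0

hu = np.array([20.0, 35.0, 50.0, 70.0, 90.0])   # example unenhanced-CT voxel values
print("soft tissue window:", apply_window(hu, width=400, level=40).round(1))
print("liver window:      ", apply_window(hu, width=150, level=88).round(1))
# The narrower liver window spreads small HU differences over more gray levels,
# which is one reason subtle renal lesions can become more conspicuous.
```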
NASA Astrophysics Data System (ADS)
Han, D. Y.; Cao, P.; Liu, J.; Zhu, J. B.
2017-12-01
Cutter spacing is an essential parameter in the TBM design. However, few efforts have been made to study the optimum cutter spacing incorporating penetration depth. To investigate the influence of pre-set penetration depth and cutter spacing on sandstone breakage and TBM performance, a series of sequential laboratory indentation tests were performed in a biaxial compression state. Effects of parameters including penetration force, penetration depth, chip mass, chip size distribution, groove volume, specific energy and maximum angle of lateral crack were investigated. Results show that the total mass of chips, the groove volume and the observed optimum cutter spacing increase with increasing pre-set penetration depth. It is also found that the total mass of chips could be an alternative means to determine optimum cutter spacing. In addition, analysis of chip size distribution suggests that the mass of large chips is dominated by both cutter spacing and pre-set penetration depth. After fractal dimension analysis, we found that cutter spacing and pre-set penetration depth have negligible influence on the formation of small chips and that small chips are formed due to squeezing of cutters and surface abrasion caused by shear failure. Analysis on specific energy indicates that the observed optimum spacing/penetration ratio is 10 for the sandstone, at which, the specific energy and the maximum angle of lateral cracks are smallest. The findings in this paper contribute to better understanding of the coupled effect of cutter spacing and pre-set penetration depth on TBM performance and rock breakage, and provide some guidelines for cutter arrangement.
2013-01-01
Background Fabry disease is an inborn lysosomal storage disorder which is associated with small fiber neuropathy. We set out to investigate small fiber conduction in Fabry patients using pain-related evoked potentials (PREP). Methods In this case–control study we prospectively studied 76 consecutive Fabry patients for electrical small fiber conduction in correlation with small fiber function and morphology. Data were compared with healthy controls using non-parametric statistical tests. All patients underwent neurological examination and were investigated with pain and depression questionnaires. Small fiber function (quantitative sensory testing, QST), morphology (skin punch biopsy), and electrical conduction (PREP) were assessed and correlated. Patients were stratified for gender and disease severity as reflected by renal function. Results All Fabry patients (31 men, 45 women) had small fiber neuropathy. Men with Fabry disease showed impaired cold (p < 0.01) and warm perception (p < 0.05), while women did not differ from controls. Intraepidermal nerve fiber density (IENFD) was reduced at the lower leg (p < 0.001) and the back (p < 0.05) mainly of men with impaired renal function. When investigating A-delta fiber conduction with PREP, men but not women with Fabry disease had lower amplitudes upon stimulation at face (p < 0.01), hands (p < 0.05), and feet (p < 0.01) compared to controls. PREP amplitudes further decreased with advance in disease severity. PREP amplitudes and warm (p < 0.05) and cold detection thresholds (p < 0.01) at the feet correlated positively in male patients. Conclusion Small fiber conduction is impaired in men with Fabry disease and worsens with advanced disease severity. PREP are well-suited to measure A-delta fiber conduction. PMID:23705943
Predicting a small molecule-kinase interaction map: A machine learning approach
2011-01-01
Background We present a machine learning approach to the problem of protein ligand interaction prediction. We focus on a set of binding data obtained from 113 different protein kinases and 20 inhibitors. It was attained through ATP site-dependent binding competition assays and constitutes the first available dataset of this kind. We extract information about the investigated molecules from various data sources to obtain an informative set of features. Results A Support Vector Machine (SVM) as well as a decision tree algorithm (C5/See5) is used to learn models based on the available features which in turn can be used for the classification of new kinase-inhibitor pair test instances. We evaluate our approach using different feature sets and parameter settings for the employed classifiers. Moreover, the paper introduces a new way of evaluating predictions in such a setting, where different amounts of information about the binding partners can be assumed to be available for training. Results on an external test set are also provided. Conclusions In most of the cases, the presented approach clearly outperforms the baseline methods used for comparison. Experimental results indicate that the applied machine learning methods are able to detect a signal in the data and predict binding affinity to some extent. For SVMs, the binding prediction can be improved significantly by using features that describe the active site of a kinase. For C5, besides diversity in the feature set, alignment scores of conserved regions turned out to be very useful. PMID:21708012
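A minimal sketch of this classification setup, assuming synthetic kinase-inhibitor pair features: an SVM and a decision tree (used here as a stand-in for C5/See5) trained on a design set and scored on held-out test instances.

```python
# Sketch of kinase-inhibitor interaction prediction as binary classification.
# Features and labels are synthetic; the pair count mirrors 113 kinases x 20 inhibitors.
import numpy as np
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(3)
n_pairs, n_features = 113 * 20, 30
X = rng.normal(size=(n_pairs, n_features))
y = (X[:, :5].sum(axis=1) + 0.5 * rng.normal(size=n_pairs) > 0).astype(int)  # binds / does not bind

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
for name, clf in [("SVM (RBF)", SVC(kernel="rbf", C=1.0)),
                  ("decision tree", DecisionTreeClassifier(max_depth=5, random_state=0))]:
    clf.fit(X_tr, y_tr)
    print(name, "test accuracy:", round(accuracy_score(y_te, clf.predict(X_te)), 3))
```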
Arechar, Antonio A.; Kouchaki, Maryam; Rand, David G.
2018-01-01
We had participants play two sets of repeated Prisoner’s Dilemma (RPD) games, one with a large continuation probability and the other with a small continuation probability, as well as Dictator Games (DGs) before and after the RPDs. We find that, regardless of which RPD set is played first, participants typically cooperate when the continuation probability is large and defect when the continuation probability is small. However, there is an asymmetry in behavior when transitioning from one continuation probability to the other. When switching from large to small, transient higher levels of cooperation are observed in the early games of the small continuation set. Conversely, when switching from small to large, cooperation is immediately high in the first game of the large continuation set. We also observe that response times increase when transitioning between sets of RPDs, except for altruistic participants transitioning into the set of RPDs with long continuation probabilities. These asymmetries suggest a bias in favor of cooperation. Finally, we examine the link between altruism and RPD play. We find that small continuation probability RPD play is correlated with giving in DGs played before and after the RPDs, whereas high continuation probability RPD play is not. PMID:29809199
Hsiang, E; Little, K M; Haguma, P; Hanrahan, C F; Katamba, A; Cattamanchi, A; Davis, J L; Vassall, A; Dowdy, D
2016-09-01
Initial cost-effectiveness evaluations of Xpert(®) MTB/RIF for tuberculosis (TB) diagnosis have not fully accounted for the realities of implementation in peripheral settings. To evaluate costs and diagnostic outcomes of Xpert testing implemented at various health care levels in Uganda. We collected empirical cost data from five health centers utilizing Xpert for TB diagnosis, using an ingredients approach. We reviewed laboratory and patient records to assess outcomes at these sites and 10 sites without Xpert. We also estimated incremental cost-effectiveness of Xpert testing; our primary outcome was the incremental cost of Xpert testing per newly detected TB case. The mean unit cost of an Xpert test was US$21 based on a mean monthly volume of 54 tests per site, although unit cost varied widely (US$16-58) and was primarily determined by testing volume. Total diagnostic costs were 2.4-fold higher in Xpert clinics than in non-Xpert clinics; however, Xpert only increased diagnoses by 12%. The diagnostic costs of Xpert averaged US$119 per newly detected TB case, but were as high as US$885 at the center with the lowest volume of tests. Xpert testing can detect TB cases at reasonable cost, but may double diagnostic budgets for relatively small gains, with cost-effectiveness deteriorating with lower testing volumes.
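The two cost quantities reported above follow from simple arithmetic. The sketch below is illustrative only; the fixed/variable cost split and the case counts are hypothetical values, not the Ugandan study data.

```python
# Back-of-the-envelope cost calculations: unit cost at a given test volume, and
# incremental diagnostic cost per additional case detected. All inputs are hypothetical.
def unit_cost(monthly_volume, fixed_monthly=800.0, variable_per_test=6.0):
    """Unit cost falls as fixed monthly costs are spread over more tests."""
    return fixed_monthly / monthly_volume + variable_per_test

def cost_per_extra_case(cost_with, cost_without, cases_with, cases_without):
    """Incremental diagnostic cost per additional case detected."""
    return (cost_with - cost_without) / (cases_with - cases_without)

print("unit cost at 54 tests/month:", round(unit_cost(54), 2))
print("unit cost at 15 tests/month:", round(unit_cost(15), 2))
print("incremental cost per extra case:",
      round(cost_per_extra_case(2400.0, 1000.0, 28, 25), 2))  # illustrative numbers only
```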
13 CFR 121.412 - What are the size procedures for partial small business set-asides?
Code of Federal Regulations, 2010 CFR
2010-01-01
... Requirements for Government Procurement § 121.412 What are the size procedures for partial small business set... portion of a procurement, and is not required to qualify as a small business for the unrestricted portion. ...
Optimization of auxiliary basis sets for the LEDO expansion and a projection technique for LEDO-DFT.
Götz, Andreas W; Kollmar, Christian; Hess, Bernd A
2005-09-01
We present a systematic procedure for the optimization of the expansion basis for the limited expansion of diatomic overlap density functional theory (LEDO-DFT) and report on optimized auxiliary orbitals for the Ahlrichs split valence plus polarization basis set (SVP) for the elements H, Li-F, and Na-Cl. A new method to deal with near-linear dependences in the LEDO expansion basis is introduced, which greatly reduces the computational effort of LEDO-DFT calculations. Numerical results for a test set of small molecules demonstrate the accuracy of electronic energies, structural parameters, dipole moments, and harmonic frequencies. For larger molecular systems the numerical errors introduced by the LEDO approximation can lead to an uncontrollable behavior of the self-consistent field (SCF) process. A projection technique suggested by Löwdin is presented in the framework of LEDO-DFT, which guarantees SCF convergence. Numerical results on some critical test molecules suggest the general applicability of the auxiliary orbitals presented in combination with this projection technique. Timing results indicate that LEDO-DFT is competitive with conventional density fitting methods. © 2005 Wiley Periodicals, Inc.
Paul, Topon Kumar; Iba, Hitoshi
2009-01-01
In order to get a better understanding of different types of cancers and to find the possible biomarkers for diseases, recently, many researchers are analyzing the gene expression data using various machine learning techniques. However, due to a very small number of training samples compared to the huge number of genes and class imbalance, most of these methods suffer from overfitting. In this paper, we present a majority voting genetic programming classifier (MVGPC) for the classification of microarray data. Instead of a single rule or a single set of rules, we evolve multiple rules with genetic programming (GP) and then apply those rules to test samples to determine their labels with majority voting technique. By performing experiments on four different public cancer data sets, including multiclass data sets, we have found that the test accuracies of MVGPC are better than those of other methods, including AdaBoost with GP. Moreover, some of the more frequently occurring genes in the classification rules are known to be associated with the types of cancers being studied in this paper.
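The majority-voting idea can be sketched without a genetic programming engine. Below, bootstrap-trained decision stumps act as a stand-in for the evolved rules; the expression matrix is synthetic, with many more genes than samples as in the microarray setting described.

```python
# Majority voting over an ensemble of simple rules, in the spirit of MVGPC.
# Decision stumps substitute for GP-evolved rules; data are synthetic.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(4)
n_samples, n_genes = 40, 500                       # few samples, many genes
X = rng.normal(size=(n_samples, n_genes))
y = (X[:, 10] - X[:, 42] > 0).astype(int)          # two informative "genes"

rules = []
for k in range(11):                                # odd number of voters avoids ties
    idx = rng.integers(0, n_samples, n_samples)    # bootstrap resample
    rules.append(DecisionTreeClassifier(max_depth=1, random_state=k).fit(X[idx], y[idx]))

votes = np.array([r.predict(X) for r in rules])
majority = (votes.mean(axis=0) > 0.5).astype(int)
print("training accuracy of the majority vote:", (majority == y).mean())
```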
A large-scale deforestation experiment: Effects of patch area and isolation on Amazon birds
Ferraz, G.; Nichols, J.D.; Hines, J.E.; Stouffer, P.C.; Bierregaard, R.O.; Lovejoy, T.E.
2007-01-01
As compared with extensive contiguous areas, small isolated habitat patches lack many species. Some species disappear after isolation; others are rarely found in any small patch, regardless of isolation. We used a 13-year data set of bird captures from a large landscape-manipulation experiment in a Brazilian Amazon forest to model the extinction-colonization dynamics of 55 species and tested basic predictions of island biogeography and metapopulation theory. From our models, we derived two metrics of species vulnerability to changes in isolation and patch area. We found a strong effect of area and a variable effect of isolation on the predicted patch occupancy by birds.
NASA Astrophysics Data System (ADS)
Luo, Jia; Zhang, Min; Zhou, Xiaoling; Chen, Jianhua; Tian, Yuxin
2018-01-01
Taking four main tree species in the Wuling mountain small watershed as research objects, 57 typical sample plots were set up according to stand type, site conditions and community structure. A total of 311 target diameter-class sample trees were selected according to diameter-class groups of different tree-height grades, and the optimal fitting models of tree height and DBH growth of the main tree species were obtained by stem analysis using the Richards, Logistic, Korf, Mitscherlich, Schumacher and Weibull theoretical growth equations; the correlation coefficients of all optimal fitting models reached above 0.9. Evaluation and testing showed that the optimal fitting models possessed good fitting precision and forecast dependability.
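As an illustration of the fitting procedure, the sketch below fits the Richards equation (one of the candidate growth equations named above) to an invented set of height-DBH pairs with nonlinear least squares; the data and starting values are hypothetical.

```python
# Fitting a Richards-type height-DBH curve with nonlinear least squares.
import numpy as np
from scipy.optimize import curve_fit

def richards(d, a, b, c):
    """Richards equation: H = 1.3 + a * (1 - exp(-b * D))**c."""
    return 1.3 + a * (1.0 - np.exp(-b * d)) ** c

dbh = np.array([6, 10, 14, 18, 24, 30, 38, 46], dtype=float)        # cm (invented)
height = np.array([5.1, 7.8, 10.2, 12.0, 14.1, 15.6, 17.0, 17.8])   # m  (invented)

params, _ = curve_fit(richards, dbh, height, p0=[20.0, 0.05, 1.0], maxfev=10000)
pred = richards(dbh, *params)
r = np.corrcoef(height, pred)[0, 1]
print("fitted a, b, c:", np.round(params, 3), " correlation:", round(r, 3))
```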
NASA Astrophysics Data System (ADS)
Beck, Hylke; de Roo, Ad; van Dijk, Albert; McVicar, Tim; Miralles, Diego; Schellekens, Jaap; Bruijnzeel, Sampurno; de Jeu, Richard
2015-04-01
Motivated by the lack of large-scale model parameter regionalization studies, a large set of 3328 small catchments (< 10000 km2) around the globe was used to set up and evaluate five model parameterization schemes at global scale. The HBV-light model was chosen because of its parsimony and flexibility to test the schemes. The catchments were calibrated against observed streamflow (Q) using an objective function incorporating both behavioral and goodness-of-fit measures, after which the catchment set was split into subsets of 1215 donor and 2113 evaluation catchments based on the calibration performance. The donor catchments were subsequently used to derive parameter sets that were transferred to similar grid cells based on a similarity measure incorporating climatic and physiographic characteristics, thereby producing parameter maps with global coverage. Overall, there was a lack of suitable donor catchments for mountainous and tropical environments. The schemes with spatially-uniform parameter sets (EXP2 and EXP3) achieved the worst Q estimation performance in the evaluation catchments, emphasizing the importance of parameter regionalization. The direct transfer of calibrated parameter sets from donor catchments to similar grid cells (scheme EXP1) performed best, although there was still a large performance gap between EXP1 and HBV-light calibrated against observed Q. The schemes with parameter sets obtained by simultaneously calibrating clusters of similar donor catchments (NC10 and NC58) performed worse than EXP1. The relatively poor Q estimation performance achieved by two (uncalibrated) macro-scale hydrological models suggests there is considerable merit in regionalizing the parameters of such models. The global HBV-light parameter maps and ancillary data are freely available via http://water.jrc.ec.europa.eu.
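The direct parameter-transfer scheme (EXP1) can be sketched as a nearest-donor assignment in a standardized attribute space. The code below is a toy version with synthetic donors, grid cells, and attributes, not the published HBV-light parameter maps.

```python
# Donor-to-grid parameter transfer by climatic/physiographic similarity.
# Each cell receives the calibrated parameter set of its most similar donor catchment.
import numpy as np

rng = np.random.default_rng(5)
n_donors, n_cells, n_attr, n_par = 1215, 10, 4, 3          # e.g., aridity, seasonality, slope, forest cover
donor_attr = rng.normal(size=(n_donors, n_attr))
donor_params = rng.uniform(size=(n_donors, n_par))          # calibrated model parameters (synthetic)
cell_attr = rng.normal(size=(n_cells, n_attr))

# Standardize attributes so each descriptor contributes comparably to the distance.
mu, sd = donor_attr.mean(axis=0), donor_attr.std(axis=0)
d_std, c_std = (donor_attr - mu) / sd, (cell_attr - mu) / sd

dist = np.linalg.norm(c_std[:, None, :] - d_std[None, :, :], axis=2)   # cells x donors
nearest = dist.argmin(axis=1)
cell_params = donor_params[nearest]
print("parameter sets assigned to the first three cells:\n", cell_params[:3].round(3))
```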
48 CFR 19.503 - Setting aside a class of acquisitions for small business.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 19.502-3(a). The determination to make a class small business set-aside shall not depend on the... economic production runs or reasonable lots, in the case of a partial class set-aside. (d) The contracting...
Correcting for Optimistic Prediction in Small Data Sets
Smith, Gordon C. S.; Seaman, Shaun R.; Wood, Angela M.; Royston, Patrick; White, Ian R.
2014-01-01
The C statistic is a commonly reported measure of screening test performance. Optimistic estimation of the C statistic is a frequent problem because of overfitting of statistical models in small data sets, and methods exist to correct for this issue. However, many studies do not use such methods, and those that do correct for optimism use diverse methods, some of which are known to be biased. We used clinical data sets (United Kingdom Down syndrome screening data from Glasgow (1991–2003), Edinburgh (1999–2003), and Cambridge (1990–2006), as well as Scottish national pregnancy discharge data (2004–2007)) to evaluate different approaches to adjustment for optimism. We found that sample splitting, cross-validation without replication, and leave-1-out cross-validation produced optimism-adjusted estimates of the C statistic that were biased and/or associated with greater absolute error than other available methods. Cross-validation with replication, bootstrapping, and a new method (leave-pair-out cross-validation) all generated unbiased optimism-adjusted estimates of the C statistic and had similar absolute errors in the clinical data set. Larger simulation studies confirmed that all 3 methods performed similarly with 10 or more events per variable, or when the C statistic was 0.9 or greater. However, with lower events per variable or lower C statistics, bootstrapping tended to be optimistic but with lower absolute and mean squared errors than both methods of cross-validation. PMID:24966219
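The bootstrap optimism correction evaluated above can be sketched in a few lines. The example below uses simulated data and logistic regression; it illustrates the general procedure, not the screening-data analysis itself.

```python
# Bootstrap optimism correction for the C statistic (AUC): refit on resamples,
# measure the gap between bootstrap-sample and original-sample performance,
# and subtract the average gap from the apparent C statistic. Data are simulated.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(6)
n, p = 120, 8                                    # small data set, several predictors
X = rng.normal(size=(n, p))
y = rng.binomial(1, 1 / (1 + np.exp(-(X[:, 0] + 0.5 * X[:, 1]))))

model = LogisticRegression(max_iter=1000).fit(X, y)
apparent = roc_auc_score(y, model.predict_proba(X)[:, 1])

optimism = []
for _ in range(200):
    idx = rng.integers(0, n, n)                  # bootstrap resample
    m = LogisticRegression(max_iter=1000).fit(X[idx], y[idx])
    auc_boot = roc_auc_score(y[idx], m.predict_proba(X[idx])[:, 1])
    auc_orig = roc_auc_score(y, m.predict_proba(X)[:, 1])
    optimism.append(auc_boot - auc_orig)

print("apparent C statistic:", round(apparent, 3))
print("optimism-corrected C statistic:", round(apparent - float(np.mean(optimism)), 3))
```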
SMR Re-Scaling and Modeling for Load Following Studies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hoover, K.; Wu, Q.; Bragg-Sitton, S.
2016-11-01
This study investigates the creation of a new set of scaling parameters for the Oregon State University Multi-Application Small Light Water Reactor (MASLWR) scaled thermal hydraulic test facility. As part of a study being undertaken by Idaho National Lab involving nuclear reactor load following characteristics, full power operations need to be simulated, and therefore properly scaled. Presented here is the scaling analysis and plans for RELAP5-3D simulation.
SPANISH PEAKS PRIMITIVE AREA, MONTANA.
Calkins, James A.; Pattee, Eldon C.
1984-01-01
A mineral survey of the Spanish Peaks Primitive Area, Montana, disclosed a small low-grade deposit of demonstrated chromite and asbestos resources. The chances for discovery of additional chrome resources are uncertain and the area has little promise for the occurrence of other mineral or energy resources. A reevaluation, sampling at depth, and testing for possible extensions of the Table Mountain asbestos and chromium deposit should be undertaken in the light of recent interpretations regarding its geologic setting.
Development of a Detonation Diffuser
2014-03-27
The detonation frequency is adjustable from 8 Hz to 40 Hz, and the ignition can be set to operate in “burst mode,” firing for a predetermined number of cycles... resistance were tried, but the strain on the windows caused the coating to fracture. Without a scratch-resistant coating, the windows regularly suffered... abrasion from the test articles. The heat from local explosions did burn away a small amount of the window surface.
Comparison of Sample and Detection Quantification Methods for Salmonella Enterica from Produce
NASA Technical Reports Server (NTRS)
Hummerick, M. P.; Khodadad, C.; Richards, J. T.; Dixit, A.; Spencer, L. M.; Larson, B.; Parrish, C., II; Birmele, M.; Wheeler, Raymond
2014-01-01
The purpose of this study was to identify and optimize fast and reliable sampling and detection methods for pathogens that may be present on produce grown in small vegetable production units on the International Space Station (ISS), which constitutes a field setting. Microbiological testing is necessary before astronauts are allowed to consume produce grown on the ISS, where two vegetable production units, Lada and Veggie, are currently deployed.
Once-through integral system (OTIS): Final report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gloudemans, J R
1986-09-01
A scaled experimental facility, designated the once-through integral system (OTIS), was used to acquire post-small break loss-of-coolant accident (SBLOCA) data for benchmarking system codes. OTIS was also used to investigate the application of the Abnormal Transient Operating Guidelines (ATOG) used in the Babcock and Wilcox (B and W) designed nuclear steam supply system (NSSS) during the course of an SBLOCA. OTIS was a single-loop facility with a plant to model power scale factor of 1686. OTIS maintained the key elevations, approximate component volumes, and loop flow resistances, and simulated the major component phenomena of a B and W raised-loop nuclear plant. A test matrix consisting of 15 tests divided into four categories was performed. The largest group contained 10 tests and was defined to parametrically obtain an extensive set of plant-typical experimental data for code benchmarking. Parameters such as leak size, leak location, and high-pressure injection (HPI) shut-off head were individually varied. The remaining categories were specified to study the impact of the ATOGs (2 tests), to note the effect of guard heater operation on observed phenomena (2 tests), and to provide a data set for comparison with previous test experience (1 test). A summary of the test results and a detailed discussion of Test 220100 are presented. Test 220100 was the nominal or reference test for the parametric studies. This test was performed with a scaled 10-cm² leak located in the cold leg suction piping.
Analysis and flight evaluation of a small, fixed-wing aircraft equipped with hinged plate spoilers
NASA Technical Reports Server (NTRS)
Olcott, J. W.; Sackel, E.; Ellis, D. R.
1981-01-01
The results of a four phase effort to evaluate the application of hinged plate spoilers/dive brakes to a small general aviation aircraft are presented. The test vehicle was a single engine light aircraft modified with an experimental set of upper surface spoilers and lower surface dive brakes similar to the type used on sailplanes. The lift, drag, stick free stability, trim, and dynamic response characteristics of four different spoiler/dive brake configurations were determined. Tests also were conducted, under a wide range of flight conditions and with pilots of various experience levels, to determine the most favorable methods of spoiler control and to evaluate how spoilers might best be used during the approach and landing task. The effects of approach path angle, approach airspeed, and pilot technique using throttle/spoiler integrated control were investigated for day, night, VFR, and IFR approaches and landings. The test results indicated that spoilers offered significant improvements in the vehicle's performance and flying qualities for all elements of the approach and landing task, provided a suitable method of control was available.
Detection of explosive cough events in audio recordings by internal sound analysis.
Rocha, B M; Mendes, L; Couceiro, R; Henriques, J; Carvalho, P; Paiva, R P
2017-07-01
We present a new method for the discrimination of explosive cough events, which is based on a combination of spectral content descriptors and pitch-related features. After the removal of near-silent segments, a vector of event boundaries is obtained and a proposed set of 9 features is extracted for each event. Two data sets, recorded using electronic stethoscopes and comprising a total of 46 healthy subjects and 13 patients, were employed to evaluate the method. The proposed feature set is compared to three other sets of descriptors: a baseline, a combination of both sets, and an automatic selection of the best 10 features from both sets. The combined feature set yields good results on the cross-validated database, attaining a sensitivity of 92.3±2.3% and a specificity of 84.7±3.3%. Besides, this feature set seems to generalize well when it is trained on a small data set of patients, with a variety of respiratory and cardiovascular diseases, and tested on a bigger data set of mostly healthy subjects: a sensitivity of 93.4% and a specificity of 83.4% are achieved in those conditions. These results demonstrate that complementing the proposed feature set with a baseline set is a promising approach.
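The pipeline described above (silence removal, event-boundary extraction, per-event features) could be prototyped along the following lines. This is a hedged sketch, not the authors' method: the energy-based segmentation, the particular descriptors (duration, spectral centroid, zero-crossing rate), and all thresholds are assumptions standing in for the proposed 9-feature set.

```python
import numpy as np

def detect_events(x, sr, frame_ms=20, energy_thresh_db=-40):
    """Crude event-boundary detector: frames above an energy threshold
    (relative to the loudest frame) form candidate events."""
    n = int(sr * frame_ms / 1000)
    frames = x[: len(x) // n * n].reshape(-1, n)
    e_db = 10 * np.log10(np.mean(frames ** 2, axis=1) + 1e-12)
    active = e_db > e_db.max() + energy_thresh_db
    events, start = [], None
    # collect contiguous runs of active frames as (start_sample, end_sample)
    for k, a in enumerate(np.append(active, False)):
        if a and start is None:
            start = k
        elif not a and start is not None:
            events.append((start * n, k * n))
            start = None
    return events

def event_features(x, sr, start, end):
    """A few spectral/temporal descriptors for one event."""
    seg = x[start:end]
    spec = np.abs(np.fft.rfft(seg))
    freqs = np.fft.rfftfreq(len(seg), 1 / sr)
    centroid = np.sum(freqs * spec) / (np.sum(spec) + 1e-12)
    zcr = np.mean(np.abs(np.diff(np.sign(seg)))) / 2
    return {"duration_s": (end - start) / sr,
            "spectral_centroid_hz": centroid,
            "zcr": zcr}

# toy usage on a synthetic burst embedded in silence
sr = 8000
x = np.zeros(sr)
x[2000:2400] = np.random.default_rng(1).normal(scale=0.5, size=400)
for s, e in detect_events(x, sr):
    print(event_features(x, sr, s, e))
```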
Role of the Middle Lumbar Fascia on Spinal Mechanics: A Human Biomechanical Assessment.
Ranger, Tom A; Newell, Nicolas; Grant, Caroline A; Barker, Priscilla J; Pearcy, Mark J
2017-04-15
Biomechanical experiment. The aims of the present study were to test the effect of fascial tension on lumbar segmental axial rotation and lateral flexion and the effect of the angle of fascial attachment. Tension in the middle layer of lumbar fascia has been demonstrated to affect mechanical properties of lumbar segmental flexion and extension in the neutral zone. The effect of tension on segmental axial rotation and lateral flexion has, however, not been investigated. Seven unembalmed lumbar spines were divided into segments and mounted for testing. A 6 degree-of-freedom robotic testing facility was used to displace the segments in each anatomical plane (flexion-extension, lateral bending, and axial rotation) with force and moment data recorded by a load cell positioned beneath the test specimen. Tests were performed with and without a 20 N fascia load and the subsequent forces and moments were compared. In addition, forces and moments were compared when the specimens were held in a set position and the fascia loading angle was varied. A fascial tension of 20 N had no measurable effect on the forces or moments measured when the specimens were displaced in any plane of motion (P > 0.05). When 20 N of fascial load were applied to motion segments in a set position small segmental forces and moments were measured. Changing the angle of the fascial load did not significantly alter these measurements. Application of a 20 N fascial load did not produce a measureable effect on the mechanics of a motion segment, even though it did produce small measurable forces and moments on the segments when in a fixed position. Results from the present study are inconsistent with previous studies, suggesting that further investigation using multiple testing protocols and different loading conditions is required to determine the effects of fascial loading on spinal segment behavior. N/A.
West, Howard
2017-09-01
The current standard of care for molecular marker testing in patients with advanced non-small cell lung cancer (NSCLC) has been evolving over several years and is a product of the quality of the evidence supporting a targeted therapy for a specific molecular marker, the pre-test probability of that marker in the population, and the magnitude of benefit seen with that treatment. Among the markers with one or more matched targeted therapies, only a few are clearly worth prioritizing for detection in the first-line setting, such that they supplant other first-line alternatives, and then only in a subset of patients, as currently defined by NSCLC histology. Specifically, this currently includes testing for an activating epidermal growth factor receptor (EGFR) mutation or an anaplastic lymphoma kinase (ALK) or ROS1 rearrangement. This article reviews the history and data supporting the prioritization of these markers in patients with non-squamous NSCLC, a histologically selected population in whom the probability of these markers, combined with the anticipated efficacy of targeted therapies against them, is high enough to favor these treatments in the first-line setting. In reviewing the evidence supporting this very limited core subset of the most valuable molecular markers to detect in the initial workup of such patients, we can also see the criteria that other actionable markers need to meet in order to be widely recognized as reliably valuable enough to warrant prioritization in the initial workup of advanced NSCLC as well.
Performance of Small Pore Microchannel Plates
NASA Technical Reports Server (NTRS)
Siegmund, O. H. W.; Gummin, M. A.; Ravinett, T.; Jelinsky, S. R.; Edgar, M.
1995-01-01
Small pore size microchannel plates (MCP's) are needed to satisfy the requirements for future high resolution small and large format detectors for astronomy. MCP's with pore sizes in the range 5 micron to 8 micron are now being manufactured, but they are of limited availability and are of small size. We have obtained sets of Galileo 8 micron and 6.5 micron MCP's, and Philips 6 micron and 7 micron pore MCP's, and compared them to our larger pore MCP Z stacks. We have tested back to back MCP stacks of four of these MCP's and achieved gains greater than 2 × 10^7 with pulse height distributions of less than 40% FWHM, and background rates of less than 0.3 events sec^-1 cm^-2. Local counting rates up to approx. 100 events/pore/sec have been attained with little drop of the MCP gain. The bare MCP quantum efficiencies are somewhat lower than those expected, however. Flat field images are characterized by an absence of MCP fixed pattern noise.
Teng, Shizhu; Jia, Qiaojuan; Huang, Yijian; Chen, Liangcao; Fei, Xufeng; Wu, Jiaping
2015-10-01
Sporadic cases occurring in small geographic units can lead to extreme incidence values because of the small population bases, which would distort the analysis of actual incidence. This study introduced a method of hierarchical clustering and partitioning regionalization, which integrates areas with small populations into larger areas with sufficient population using a Geographic Information System (GIS), based on the principles of spatial continuity and geographical similarity (homogeneity test). The method was applied in spatial epidemiology using a data set of thyroid cancer incidence in Yiwu, Zhejiang province, between 2010 and 2013. Thyroid cancer incidence data were more reliable and stable in the newly regionalized areas. Hotspot analysis (Getis-Ord) of the incidence in the new areas indicated obvious case clustering in the central area of Yiwu. This method can effectively address the problem of small population bases in small geographic units in the spatial epidemiological analysis of thyroid cancer incidence and can be used for other diseases and in other areas.
48 CFR 819.7006 - Veteran-owned small business set-aside procedures.
Code of Federal Regulations, 2012 CFR
2012-10-01
... 48 Federal Acquisition Regulations System 5 2012-10-01 2012-10-01 false Veteran-owned small business set-aside procedures. 819.7006 Section 819.7006 Federal Acquisition Regulations System DEPARTMENT OF VETERANS AFFAIRS SOCIOECONOMIC PROGRAMS SMALL BUSINESS PROGRAMS Service-Disabled Veteran-Owned and...
Learning and liking an artificial musical system: Effects of set size and repeated exposure
Loui, Psyche; Wessel, David
2009-01-01
We report an investigation of humans' musical learning ability using a novel musical system. We designed an artificial musical system based on the Bohlen-Pierce scale, a scale very different from Western music. Melodies were composed from chord progressions in the new scale by applying the rules of a finite-state grammar. After exposing participants to sets of melodies, we conducted listening tests to assess learning, including recognition tests, generalization tests, and subjective preference ratings. In Experiment 1, participants were presented with 15 melodies 27 times each. Forced choice results showed that participants were able to recognize previously encountered melodies and generalize their knowledge to new melodies, suggesting internalization of the musical grammar. Preference ratings showed no differentiation among familiar, new, and ungrammatical melodies. In Experiment 2, participants were given 10 melodies 40 times each. Results showed superior recognition but unsuccessful generalization. Additionally, preference ratings were significantly higher for familiar melodies. Results from the two experiments suggest that humans can internalize the grammatical structure of a new musical system following exposure to a sufficiently large set size of melodies, but musical preference results from repeated exposure to a small number of items. This dissociation between grammar learning and preference will be further discussed. PMID:20151034
Huang, Yen-Tsung; Pan, Wen-Chi
2016-06-01
Causal mediation modeling has become a popular approach for studying the effect of an exposure on an outcome through a mediator. However, current methods are not applicable to the setting with a large number of mediators. We propose a testing procedure for mediation effects of high-dimensional continuous mediators. We characterize the marginal mediation effect, the multivariate component-wise mediation effects, and the L2 norm of the component-wise effects, and develop a Monte-Carlo procedure for evaluating their statistical significance. To accommodate the setting with a large number of mediators and a small sample size, we further propose a transformation model using the spectral decomposition. Under the transformation model, mediation effects can be estimated using a series of regression models with a univariate transformed mediator, and examined by our proposed testing procedure. Extensive simulation studies are conducted to assess the performance of our methods for continuous and dichotomous outcomes. We apply the methods to analyze genomic data investigating the effect of microRNA miR-223 on a dichotomous survival status of patients with glioblastoma multiforme (GBM). We identify nine gene ontology sets with expression values that significantly mediate the effect of miR-223 on GBM survival. © 2015, The International Biometric Society.
Sykes, Lynn R.; Cifuentes, Inés L.
1984-01-01
Magnitudes of the larger Soviet underground nuclear weapons tests from the start of the Threshold Test Ban Treaty in 1976 through 1982 are determined for short- and long-period seismic waves. Yields are calculated from the surface wave magnitude for those explosions at the eastern Kazakh test site that triggered a small-to-negligible component of tectonic stress and are used to calibrate a body wave magnitude-yield relationship that can be used to determine the sizes of other explosions at that test site. The results confirm that a large bias, related to differential attenuation of P waves, exists between Nevada and Central Asia. The yields of the seven largest Soviet explosions are nearly identical and are close to 150 kilotons, the limit set by the Threshold Treaty. PMID:16593440
Grudinin, Sergei; Garkavenko, Maria; Kazennov, Andrei
2017-05-01
A new method called Pepsi-SAXS is presented that calculates small-angle X-ray scattering profiles from atomistic models. The method is based on the multipole expansion scheme and is significantly faster compared with other tested methods. In particular, using the Nyquist-Shannon-Kotelnikov sampling theorem, the multipole expansion order is adapted to the size of the model and the resolution of the experimental data. It is argued that by using the adaptive expansion order, this method has the same quadratic dependence on the number of atoms in the model as the Debye-based approach, but with a much smaller prefactor in the computational complexity. The method has been systematically validated on a large set of over 50 models collected from the BioIsis and SASBDB databases. Using a laptop, it was demonstrated that Pepsi-SAXS is about seven, 29 and 36 times faster compared with CRYSOL, FoXS and the three-dimensional Zernike method in SAStbx, respectively, when tested on data from the BioIsis database, and is about five, 21 and 25 times faster compared with CRYSOL, FoXS and SAStbx, respectively, when tested on data from SASBDB. On average, Pepsi-SAXS demonstrates comparable accuracy in terms of χ² to CRYSOL and FoXS when tested on BioIsis and SASBDB profiles. Together with a small allowed variation of adjustable parameters, this demonstrates the effectiveness of the method. Pepsi-SAXS is available at http://team.inria.fr/nano-d/software/pepsi-saxs.
SU-F-BRD-10: Lung IMRT Planning Using Standardized Beam Bouquet Templates
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yuan, L; Wu, Q J.; Yin, F
2014-06-15
Purpose: We investigate the feasibility of choosing from a small set of standardized templates of beam bouquets (i.e., entire beam configuration settings) for lung IMRT planning to improve planning efficiency and quality consistency, and also to facilitate automated planning. Methods: A set of beam bouquet templates is determined by learning from the beam angle settings in 60 clinical lung IMRT plans. A k-medoids cluster analysis method is used to classify the beam angle configurations into clusters. The value of the average silhouette width is used to determine the ideal number of clusters. The beam arrangements in each medoid of the resulting clusters are taken as the standardized beam bouquet for the cluster, with the corresponding case taken as the reference case. The resulting set of beam bouquet templates was used to re-plan 20 cases randomly selected from the database and the dosimetric quality of the plans was evaluated against the corresponding clinical plans by a paired t-test. The template for each test case was manually selected by a planner based on the match between the test and reference cases. Results: The dosimetric parameters (mean±S.D. in percentage of prescription dose) of the plans using 6 beam bouquet templates and those of the clinical plans, respectively, and the p-values (in parentheses) are: lung Dmean: 18.8±7.0, 19.2±7.0 (0.28), esophagus Dmean: 32.0±16.3, 34.4±17.9 (0.01), heart Dmean: 19.2±16.5, 19.4±16.6 (0.74), spinal cord D2%: 47.7±18.8, 52.0±20.3 (0.01), PTV dose homogeneity (D2%-D99%): 17.1±15.4, 20.7±12.2 (0.03). The esophagus Dmean, cord D2% and PTV dose homogeneity are statistically better in the plans using the standardized templates, but the improvements (<5%) may not be clinically significant. The other dosimetric parameters are not statistically different. Conclusion: It's feasible to use a small number of standardized beam bouquet templates (e.g. 6) to generate plans with quality comparable to that of clinical plans. Partially supported by NIH/NCI under grant #R21CA161389 and a master research grant by Varian Medical System.
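To make the clustering step above concrete, the sketch below runs a basic k-medoids (PAM-style) clustering of beam-angle sets under a circular angle distance and scores candidate cluster counts with the silhouette width. It is an illustrative toy, not the study's implementation; the distance definition, the tiny plan set, and the alternating medoid-update scheme are assumptions.

```python
import numpy as np
from sklearn.metrics import silhouette_score

def k_medoids(D, k, n_iter=100, seed=0):
    """Basic k-medoids on a precomputed distance matrix D (PAM-style alternation)."""
    rng = np.random.default_rng(seed)
    medoids = rng.choice(len(D), k, replace=False)
    for _ in range(n_iter):
        labels = np.argmin(D[:, medoids], axis=1)          # assign to nearest medoid
        new = np.array([
            np.where(labels == c)[0][
                np.argmin(D[np.ix_(labels == c, labels == c)].sum(axis=1))
            ]                                              # member minimizing within-cluster distance
            for c in range(k)
        ])
        if np.array_equal(np.sort(new), np.sort(medoids)):
            break
        medoids = new
    return medoids, np.argmin(D[:, medoids], axis=1)

# hypothetical beam-angle sets (degrees) for a handful of plans, circular distance
plans = np.array([[0, 40, 180, 220, 300],
                  [10, 50, 170, 210, 310],
                  [20, 90, 160, 250, 330],
                  [0, 45, 175, 215, 305],
                  [30, 95, 165, 245, 335]])

def plan_dist(a, b):
    d = np.abs(a - b) % 360
    return np.sum(np.minimum(d, 360 - d))

D = np.array([[plan_dist(a, b) for b in plans] for a in plans])

for k in (2, 3):
    medoids, labels = k_medoids(D, k)
    print(k, round(silhouette_score(D, labels, metric="precomputed"), 3), plans[medoids])
```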
Reducing animal experimentation in foot-and-mouth disease vaccine potency tests.
Reeve, Richard; Cox, Sarah; Smitsaart, Eliana; Beascoechea, Claudia Perez; Haas, Bernd; Maradei, Eduardo; Haydon, Daniel T; Barnett, Paul
2011-07-26
The World Organisation for Animal Health (OIE) Terrestrial Manual and the European Pharmacopoeia (EP) still prescribe live challenge experiments for foot-and-mouth disease virus (FMDV) immunogenicity and vaccine potency tests. However, the EP allows for other validated tests for the latter, and specifically in vitro tests if a "satisfactory pass level" has been determined; serological replacements are also currently in use in South America. Much research has therefore focused on validating both ex vivo and in vitro tests to replace live challenge. However, insufficient attention has been given to the sensitivity and specificity of the "gold standard" in vivo test being replaced, despite this information being critical to determining what should be required of its replacement. This paper aims to redress this imbalance by examining the current live challenge tests and their associated statistics and determining the confidence that we can have in them, thereby setting a standard for candidate replacements. It determines that the statistics associated with the current EP PD50 test are inappropriate given our domain knowledge, but that the OIE test statistics are satisfactory. However, it has also identified a new set of live animal challenge test regimes that provide similar sensitivity and specificity to all of the currently used OIE tests using fewer animals (16 including controls), and can also provide further savings in live animal experiments in exchange for small reductions in sensitivity and specificity. Copyright © 2011 Elsevier Ltd. All rights reserved.
48 CFR 719.270 - Small business policies.
Code of Federal Regulations, 2013 CFR
2013-10-01
... Office of Small and Disadvantaged Business Utilization (SDB) except those exempted by 719.271-6(a). (e... personnel to increase awards to small firms. The goals will be set by SDB after consultation with the... between SDB and the contracting officer concerning: (1) A recommended set-aside, or (2) a request for...
48 CFR 719.270 - Small business policies.
Code of Federal Regulations, 2012 CFR
2012-10-01
... Office of Small and Disadvantaged Business Utilization (SDB) except those exempted by 719.271-6(a). (e... personnel to increase awards to small firms. The goals will be set by SDB after consultation with the... between SDB and the contracting officer concerning: (1) A recommended set-aside, or (2) a request for...
48 CFR 719.270 - Small business policies.
Code of Federal Regulations, 2014 CFR
2014-10-01
... Office of Small and Disadvantaged Business Utilization (SDB) except those exempted by 719.271-6(a). (e... personnel to increase awards to small firms. The goals will be set by SDB after consultation with the... between SDB and the contracting officer concerning: (1) A recommended set-aside, or (2) a request for...
48 CFR 719.270 - Small business policies.
Code of Federal Regulations, 2011 CFR
2011-10-01
... Office of Small and Disadvantaged Business Utilization (SDB) except those exempted by 719.271-6(a). (e... personnel to increase awards to small firms. The goals will be set by SDB after consultation with the... between SDB and the contracting officer concerning: (1) A recommended set-aside, or (2) a request for...
48 CFR 19.203 - Relationship among small business programs.
Code of Federal Regulations, 2011 CFR
2011-10-01
... not proceed with a small business set-aside and purchases on an unrestricted basis, the contracting... agrees to its release in accordance with 13 CFR parts 124, 125 and 126. (d) Small business set-asides have priority over acquisitions using full and open competition. See requirements for establishing a...
PARENT Quick Blind Round-Robin Test Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Braatz, Brett G.; Heasler, Patrick G.; Meyer, Ryan M.
The U.S. Nuclear Regulatory Commission has established the Program to Assess the Reliability of Emerging Nondestructive Techniques (PARENT), whose goal is to investigate the effectiveness of current and novel nondestructive examination procedures and techniques to find flaws in nickel-alloy welds and base materials. This is to be done by conducting a series of open and blind international round-robin tests on a set of piping components that include large-bore dissimilar metal welds, small-bore dissimilar metal welds, and bottom-mounted instrumentation penetration welds. The blind testing is being conducted in two segments, one called Quick-Blind and the other called Blind. The Quick-Blind testing and destructive analysis of the test blocks have been completed. This report describes the four Quick-Blind test blocks used, summarizes their destructive analysis, gives an overview of the nondestructive evaluation (NDE) techniques applied, provides an analysis of the inspection data, and presents the conclusions drawn.
Using string invariants for prediction searching for optimal parameters
NASA Astrophysics Data System (ADS)
Bundzel, Marek; Kasanický, Tomáš; Pinčák, Richard
2016-02-01
We have developed a novel prediction method based on string invariants. The method does not require learning, but a small set of parameters must be set to achieve optimal performance. We have implemented an evolutionary algorithm for this parametric optimization. We tested the method on artificial and real-world data and compared its performance to statistical methods and to a number of artificial intelligence methods, using the data and results of a prediction competition as a benchmark. The results show that the method performs well in single-step prediction, but its performance for multiple-step prediction needs to be improved. The method works well for a wide range of parameters.
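As an illustration of the kind of evolutionary parameter search mentioned above, the sketch below evolves a box-bounded parameter vector against an arbitrary error function. It is a generic sketch, not the authors' algorithm; the (mu+lambda)-style selection, Gaussian mutation, and the toy objective are assumptions.

```python
import numpy as np

def evolve(fitness, bounds, pop_size=30, generations=50, sigma=0.1, seed=0):
    """Minimal (mu+lambda)-style evolutionary search over a box-bounded parameter space."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds).T
    pop = rng.uniform(lo, hi, size=(pop_size, len(bounds)))
    for _ in range(generations):
        scores = np.array([fitness(p) for p in pop])
        parents = pop[np.argsort(scores)[: pop_size // 2]]                    # keep the best half
        children = parents + rng.normal(0, sigma * (hi - lo), parents.shape)  # Gaussian mutation
        pop = np.clip(np.vstack([parents, children]), lo, hi)
    scores = np.array([fitness(p) for p in pop])
    return pop[np.argmin(scores)], scores.min()

# toy objective standing in for a prediction-error measure of the predictor's parameters
best, err = evolve(lambda p: (p[0] - 0.3) ** 2 + (p[1] - 1.7) ** 2,
                   bounds=[(0, 1), (0, 3)])
print(best, err)
```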
Predicting "Hot" and "Warm" Spots for Fragment Binding.
Rathi, Prakash Chandra; Ludlow, R Frederick; Hall, Richard J; Murray, Christopher W; Mortenson, Paul N; Verdonk, Marcel L
2017-05-11
Computational fragment mapping methods aim to predict hotspots on protein surfaces where small fragments will bind. Such methods are popular for druggability assessment as well as structure-based design. However, to date researchers developing or using such tools have had no clear way of assessing the performance of these methods. Here, we introduce the first diverse, high quality validation set for computational fragment mapping. The set contains 52 diverse examples of fragment binding "hot" and "warm" spots from the Protein Data Bank (PDB). Additionally, we describe PLImap, a novel protocol for fragment mapping based on the Protein-Ligand Informatics force field (PLIff). We evaluate PLImap against the new fragment mapping test set, and compare its performance to that of simple shape-based algorithms and fragment docking using GOLD. PLImap is made publicly available from https://bitbucket.org/AstexUK/pli .
Point-of-Care Test Equipment for Flexible Laboratory Automation.
You, Won Suk; Park, Jae Jun; Jin, Sung Moon; Ryew, Sung Moo; Choi, Hyouk Ryeol
2014-08-01
Blood tests are among the core clinical laboratory tests for diagnosing patients. In hospitals, an automated process called total laboratory automation, which relies on a set of sophisticated equipment, is normally adopted for blood tests. Because the total laboratory automation system typically requires a large footprint and a significant amount of power, slim and easy-to-move blood test equipment is needed for specific demands such as emergency departments or small local clinics. In this article, we present a point-of-care test system that provides flexibility and portability at low cost. First, the system components, including a reagent tray, dispensing module, microfluidic disk rotor, and photometry scanner, and their functions are explained. Then, a scheduler algorithm that provides the point-of-care test platform with an efficient test schedule to reduce test time is introduced. Finally, the results of diagnostic tests are presented to evaluate the system. © 2014 Society for Laboratory Automation and Screening.
A Maximum-Likelihood Approach to Force-Field Calibration.
Zaborowski, Bartłomiej; Jagieła, Dawid; Czaplewski, Cezary; Hałabis, Anna; Lewandowska, Agnieszka; Żmudzińska, Wioletta; Ołdziej, Stanisław; Karczyńska, Agnieszka; Omieczynski, Christian; Wirecki, Tomasz; Liwo, Adam
2015-09-28
A new approach to the calibration of the force fields is proposed, in which the force-field parameters are obtained by maximum-likelihood fitting of the calculated conformational ensembles to the experimental ensembles of training system(s). The maximum-likelihood function is composed of logarithms of the Boltzmann probabilities of the experimental conformations, calculated with the current energy function. Because the theoretical distribution is given in the form of the simulated conformations only, the contributions from all of the simulated conformations, with Gaussian weights in the distances from a given experimental conformation, are added to give the contribution to the target function from this conformation. In contrast to earlier methods for force-field calibration, the approach does not suffer from the arbitrariness of dividing the decoy set into native-like and non-native structures; however, if such a division is made instead of using Gaussian weights, application of the maximum-likelihood method results in the well-known energy-gap maximization. The computational procedure consists of cycles of decoy generation and maximum-likelihood-function optimization, which are iterated until convergence is reached. The method was tested with Gaussian distributions and then applied to the physics-based coarse-grained UNRES force field for proteins. The NMR structures of the tryptophan cage, a small α-helical protein, determined at three temperatures (T = 280, 305, and 313 K) by Hałabis et al. ( J. Phys. Chem. B 2012 , 116 , 6898 - 6907 ), were used. Multiplexed replica-exchange molecular dynamics was used to generate the decoys. The iterative procedure exhibited steady convergence. Three variants of optimization were tried: optimization of the energy-term weights alone and use of the experimental ensemble of the folded protein only at T = 280 K (run 1); optimization of the energy-term weights and use of experimental ensembles at all three temperatures (run 2); and optimization of the energy-term weights and the coefficients of the torsional and multibody energy terms and use of experimental ensembles at all three temperatures (run 3). The force fields were subsequently tested with a set of 14 α-helical and two α + β proteins. Optimization run 1 resulted in better agreement with the experimental ensemble at T = 280 K compared with optimization run 2 and in comparable performance on the test set but poorer agreement of the calculated folding temperature with the experimental folding temperature. Optimization run 3 resulted in the best fit of the calculated ensembles to the experimental ones for the tryptophan cage but in much poorer performance on the training set, suggesting that use of a small α-helical protein for extensive force-field calibration resulted in overfitting of the data for this protein at the expense of transferability. The optimized force field resulting from run 2 was found to fold 13 of the 14 tested α-helical proteins and one small α + β protein with the correct topologies; the average structures of 10 of them were predicted with accuracies of about 5 Å C(α) root-mean-square deviation or better. Test simulations with an additional set of 12 α-helical proteins demonstrated that this force field performed better on α-helical proteins than the previous parametrizations of UNRES. 
The proposed approach is applicable to any problem of maximum-likelihood parameter estimation when the contributions to the maximum-likelihood function cannot be evaluated at the experimental points and the dimension of the configurational space is too high to construct histograms of the experimental distributions.
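A hedged reconstruction of the target function described above, written from the abstract alone, might look as follows; the normalization and all symbols are illustrative assumptions rather than the authors' exact expressions.

```latex
% Illustrative sketch only.
% x_i^exp : experimental conformations, x_j^sim : simulated (decoy) conformations,
% U_theta : current energy function, w_ij : Gaussian weight in the distance d(x_i^exp, x_j^sim).
\begin{align}
  L(\theta) &= \sum_{i} \ln P_\theta\!\left(x_i^{\mathrm{exp}}\right), &
  P_\theta\!\left(x_i^{\mathrm{exp}}\right) &\approx
    \frac{\sum_{j} w_{ij}\, e^{-U_\theta(x_j^{\mathrm{sim}})/k_B T}}
         {\sum_{j} e^{-U_\theta(x_j^{\mathrm{sim}})/k_B T}}, &
  w_{ij} &= \exp\!\left(-\frac{d\!\left(x_i^{\mathrm{exp}}, x_j^{\mathrm{sim}}\right)^{2}}{2\sigma^{2}}\right).
\end{align}
```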
2015-01-01
Systematic analysis and interpretation of the large number of tandem mass spectra (MS/MS) obtained in metabolomics experiments is a bottleneck in discovery-driven research. MS/MS mass spectral libraries are small compared to all known small molecule structures and are often not freely available. MS2Analyzer was therefore developed to enable user-defined searches of thousands of spectra for mass spectral features such as neutral losses, m/z differences, and product and precursor ions from MS/MS spectra in MSP/MGF files. The software is freely available at http://fiehnlab.ucdavis.edu/projects/MS2Analyzer/. As the reference query set, 147 literature-reported neutral losses and their corresponding substructures were collected. This set was tested for accuracy of linking neutral loss analysis to substructure annotations using 19,329 accurate mass tandem mass spectra of structurally known compounds from the NIST11 MS/MS library. Validation studies showed that 92.1 ± 6.4% of 13 typical neutral losses, such as acetylations, cysteine conjugates, or glycosylations, correctly annotate the associated substructures, while the absence of mass spectral features does not necessarily imply the absence of such substructures. Use of this tool has been successfully demonstrated for complex lipids in microalgae. PMID:25263576
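The core neutral-loss query can be illustrated with a few lines of code: scan pairs of peaks (including the precursor) for mass differences matching a user-defined loss list within a tolerance. This is a minimal sketch, not MS2Analyzer itself; the loss masses, tolerance, and example spectrum are assumptions.

```python
def find_neutral_losses(precursor_mz, product_mzs, losses, tol=0.01):
    """Report which user-defined neutral losses (in Da) appear in an MS/MS spectrum,
    either as precursor-minus-product differences or between pairs of product ions."""
    peaks = [precursor_mz] + sorted(product_mzs, reverse=True)
    hits = []
    for name, mass in losses.items():
        for i, hi in enumerate(peaks):
            for lo in peaks[i + 1:]:
                if abs((hi - lo) - mass) <= tol:
                    hits.append((name, round(hi, 4), round(lo, 4)))
    return hits

# hypothetical query set: a few of the literature-reported losses mentioned above
losses = {"H2O": 18.0106, "acetyl (C2H2O)": 42.0106, "hexose (C6H10O5)": 162.0528}
print(find_neutral_losses(449.108, [431.097, 287.055, 269.045, 153.018], losses))
```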
Rapid parameterization of small molecules using the Force Field Toolkit.
Mayne, Christopher G; Saam, Jan; Schulten, Klaus; Tajkhorshid, Emad; Gumbart, James C
2013-12-15
The inability to rapidly generate accurate and robust parameters for novel chemical matter continues to severely limit the application of molecular dynamics simulations to many biological systems of interest, especially in fields such as drug discovery. Although the release of generalized versions of common classical force fields, for example, General Amber Force Field and CHARMM General Force Field, have posited guidelines for parameterization of small molecules, many technical challenges remain that have hampered their wide-scale extension. The Force Field Toolkit (ffTK), described herein, minimizes common barriers to ligand parameterization through algorithm and method development, automation of tedious and error-prone tasks, and graphical user interface design. Distributed as a VMD plugin, ffTK facilitates the traversal of a clear and organized workflow resulting in a complete set of CHARMM-compatible parameters. A variety of tools are provided to generate quantum mechanical target data, setup multidimensional optimization routines, and analyze parameter performance. Parameters developed for a small test set of molecules using ffTK were comparable to existing CGenFF parameters in their ability to reproduce experimentally measured values for pure-solvent properties (<15% error from experiment) and free energy of solvation (±0.5 kcal/mol from experiment). Copyright © 2013 Wiley Periodicals, Inc.
Albrecht, Johanna S; Bubenzer-Busch, Sarah; Gallien, Anne; Knospe, Eva Lotte; Gaber, Tilman J; Zepf, Florian D
2017-01-01
The aim of this approach was to conduct a structured electroencephalography-based neurofeedback training program for children and adolescents with attention-deficit hyperactivity disorder (ADHD) using slow cortical potentials with an intensive first (almost daily sessions) and second phase of training (two sessions per week) and to assess aspects of attentional performance. A total of 24 young patients with ADHD participated in the 20-session training program. During phase I of training (2 weeks, 10 sessions), participants were trained on weekdays. During phase II, neurofeedback training occurred twice per week (5 weeks). The patients' inattention problems were measured at three assessment time points before (pre, T0) and after (post, T1) the training and at a 6-month follow-up (T2); the assessments included neuropsychological tests (Alertness and Divided Attention subtests of the Test for Attentional Performance; Sustained Attention Dots and Shifting Attentional Set subtests of the Amsterdam Neuropsychological Test) and questionnaire data (inattention subscales of the so-called Fremdbeurteilungsbogen für Hyperkinetische Störungen and Child Behavior Checklist/4-18 [CBCL/4-18]). All data were analyzed retrospectively. The mean auditive reaction time in a Divided Attention task decreased significantly from T0 to T1 (medium effect), which was persistent over time and also found for a T0-T2 comparison (larger effects). In the Sustained Attention Dots task, the mean reaction time was reduced from T0-T1 and T1-T2 (small effects), whereas in the Shifting Attentional Set task, patients were able to increase the number of trials from T1-T2 and significantly diminished the number of errors (T1-T2 & T0-T2, large effects). First positive but very small effects and preliminary results regarding different parameters of attentional performance were detected in young individuals with ADHD. The limitations of the obtained preliminary data are the rather small sample size, the lack of a control group/a placebo condition and the open-label approach because of the clinical setting and retrospective analysis. The value of the current approach lies in providing pilot data for future studies involving larger samples.
Threshold setting by the surround of cat retinal ganglion cells.
Barlow, H B; Levick, W R
1976-08-01
1. The slope of curves relating the log increment threshold to log background luminance in cat retinal ganglion cells is affected by the area and duration of the test stimulus, as it is in human psychophysical experiments. 2. Using large area, long duration stimuli, the slopes average 0.82 and approach close to 1 (Weber's Law) in the steepest cases. Small stimuli gave an average of 0.53 for on-centre units using brief stimuli, and 0.56 for off-centre units, using long stimuli. Slopes under 0.5 (square root law) were not found over an extended range of luminances. 3. On individual units the slope was generally greater for larger and longer test stimuli, but no unit showed the full extent of change from slope of 0.5 to slope of 1. 4. The above differences hold for objective measures of quantum/spike ratio, as well as for thresholds either judged by ear or assessed by calculation. 5. The steeper slope of the curves for large area, long duration test stimuli compared with small, long duration stimuli, is associated with the increased effectiveness of antagonism from the surround at high backgrounds. This change may be less pronounced in off-centre units, one of which (probably transient Y-type) showed no difference of slope, and gave parallel area-threshold curves at widely separated background luminances, confirming the importance of differential surround effectiveness in changing the slope of the curves. 6. In on-centre units, the increased relative effectiveness of the surround is associated with the part of the raised background light that falls on the receptive field centre. 7. It is suggested that the variable surround functions as a zero-offset control that sets the threshold excitation required for generating impulses, and that this is separate from gain-setting adaptive mechanisms. This may be how ganglion cells maintain high incremental sensitivity in spite of a strong maintained excitatory drive that would otherwise cause compressive response non-linearities.
Liu, Jiamin; Kabadi, Suraj; Van Uitert, Robert; Petrick, Nicholas; Deriche, Rachid; Summers, Ronald M.
2011-01-01
Purpose: Surface curvatures are important geometric features for the computer-aided analysis and detection of polyps in CT colonography (CTC). However, the general kernel approach for curvature computation can yield erroneous results for small polyps and for polyps that lie on haustral folds. Those erroneous curvatures will reduce the performance of polyp detection. This paper presents an analysis of interpolation's effect on curvature estimation for thin structures and its application to computer-aided detection of small polyps in CTC. Methods: The authors demonstrated that a simple technique, image interpolation, can improve the accuracy of curvature estimation for thin structures and thus significantly improve the sensitivity of small polyp detection in CTC. Results: Our experiments showed that the merits of interpolation included more accurate curvature values for simulated data and isolation of polyps near folds for clinical data. After testing on a large clinical data set, it was observed that linear, quadratic B-spline, and cubic B-spline interpolations all significantly improved the sensitivity of small polyp detection. Conclusions: Image interpolation can improve the accuracy of curvature estimation for thin structures and thus improve the computer-aided detection of small polyps in CTC. PMID:21859029
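The interpolation-before-curvature idea can be illustrated in two dimensions: upsample a coarsely sampled shape with cubic B-spline interpolation and then estimate iso-contour curvature from image derivatives. This is a simplified 2-D sketch, not the paper's 3-D CTC pipeline; the disc phantom, the smoothing, and the scipy-based upsampling are assumptions.

```python
import numpy as np
from scipy.ndimage import zoom, gaussian_filter

def isocontour_curvature(img):
    """2-D iso-contour curvature from image derivatives:
    kappa = (Ixx*Iy^2 - 2*Ixy*Ix*Iy + Iyy*Ix^2) / (Ix^2 + Iy^2)^(3/2)."""
    Iy, Ix = np.gradient(img)
    Ixy, Ixx = np.gradient(Ix)
    Iyy, _ = np.gradient(Iy)
    denom = (Ix ** 2 + Iy ** 2) ** 1.5 + 1e-12
    return (Ixx * Iy ** 2 - 2 * Ixy * Ix * Iy + Iyy * Ix ** 2) / denom

# a coarsely sampled disc: the curvature of its boundary should approach 1/radius
n, r = 24, 6.0
yy, xx = np.mgrid[0:n, 0:n]
disc = ((xx - n / 2) ** 2 + (yy - n / 2) ** 2 < r ** 2).astype(float)

for factor in (1, 4):                       # 1 = original grid, 4 = cubic B-spline upsampling
    up = gaussian_filter(zoom(disc, factor, order=3), sigma=1.0)
    k = isocontour_curvature(up)
    edge = np.abs(up - 0.5) < 0.2           # samples near the boundary iso-level
    print(factor, np.median(np.abs(k[edge])) * factor)  # rescaled to original pixel units
```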
Utecht, Joseph; Brochhausen, Mathias; Judkins, John; Schneider, Jodi; Boyce, Richard D
2017-01-01
In this research we aim to demonstrate that an ontology-based system can categorize potential drug-drug interaction (PDDI) evidence items into complex types based on a small set of simple questions. Such a method could increase the transparency and reliability of PDDI evidence evaluation, while also reducing the variations in content and seriousness ratings present in PDDI knowledge bases. We extended the DIDEO ontology with 44 formal evidence type definitions. We then manually annotated the evidence types of 30 evidence items. We tested an RDF/OWL representation of answers to a small number of simple questions about each of these 30 evidence items and showed that automatic inference can determine the detailed evidence types based on this small number of simpler questions. These results show proof-of-concept for a decision support infrastructure that frees the evidence evaluator from mastering relatively complex written evidence type definitions.
Effective phase function of light scattered at small angles by polydisperse particulate media
NASA Astrophysics Data System (ADS)
Turcu, I.
2008-06-01
Particles with typical dimensions larger than the light wavelength and relative refractive indexes close to one scatter light mainly in the forward direction, where the scattered light intensity has a narrow peak. For particulate media meeting these requirements, the light scattered at small angles in a far-field detection set-up can be described analytically by an effective phase function (EPF), even in the multiple scattering regime. The EPF model, originally built for monodispersed systems, has been extended to polydispersed media. The main ingredients are the replacement of the single-particle phase function and of the optical thickness with their corresponding averaged values. Using a Gamma particle size distribution (PSD) as a test model, the effect of polydispersity was systematically investigated. An increase in the average radius and/or in the PSD standard deviation leads to a decrease in the angular spreading of the small-angle scattered light.
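A hedged sketch of the polydisperse averaging described above (not the paper's exact notation): the single-particle phase function and the optical thickness are replaced by size-distribution averages weighted by the scattering cross-section, with a Gamma distribution as the PSD. All symbols are illustrative assumptions.

```latex
% p(theta; r): single-particle phase function, sigma_s(r): scattering cross-section,
% f(r): Gamma particle-size distribution (shape a, scale b), N: number density, L: path length.
\begin{align}
  \langle p(\theta) \rangle
    &= \frac{\int_0^\infty f(r)\,\sigma_s(r)\,p(\theta; r)\,\mathrm{d}r}
            {\int_0^\infty f(r)\,\sigma_s(r)\,\mathrm{d}r},
  &
  f(r) &= \frac{r^{a-1} e^{-r/b}}{b^{a}\,\Gamma(a)},
  &
  \langle \tau \rangle &= N L \int_0^\infty f(r)\,\sigma_s(r)\,\mathrm{d}r .
\end{align}
```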
Increasing the Complexity of the Illumination May Reduce Gloss Constancy
Wendt, Gunnar; Faul, Franz
2017-01-01
We examined in which way gradual changes in the geometric structure of the illumination affect the perceived glossiness of a surface. The test stimuli were computer-generated three-dimensional scenes with a single test object that was illuminated by three point light sources, whose relative positions in space were systematically varied. In the first experiment, the subjects were asked to adjust the microscale smoothness of a match object illuminated by a single light source such that it has the same perceived glossiness as the test stimulus. We found that small changes in the structure of the light field can induce dramatic changes in perceived glossiness and that this effect is modulated by the microscale smoothness of the test object. The results of a second experiment indicate that the degree of overlap of nearby highlights plays a major role in this effect: Whenever the degree of overlap in a group of highlights is so large that they perceptually merge into a single highlight, the glossiness of the surface is systematically underestimated. In addition, we examined the predictability of the smoothness settings by a linear model that is based on a set of four different global image statistics. PMID:29250308
Multiphasic Health Testing in the Clinic Setting
LaDou, Joseph
1971-01-01
The economy of automated multiphasic health testing (AMHT) activities patterned after the high-volume Kaiser program can be realized in low-volume settings. AMHT units have been operated at daily volumes of 20 patients in three separate clinical environments. These programs have displayed economics entirely compatible with cost figures published by the established high-volume centers. This experience, plus the expanding capability of small, general purpose, digital computers (minicomputers), indicates that a group of six or more physicians generating 20 laboratory appraisals per day can economically justify a completely automated multiphasic health testing facility. This system would reside in the clinic or hospital where it is used and can be configured to perform analyses such as electrocardiography, generate laboratory reports, and communicate with large computer systems in university medical centers. Experience indicates that the most effective means of implementing these benefits of automation is to make them directly available to the medical community with the physician playing the central role. Economic justification of a dedicated computer through low-volume health testing then allows, as a side benefit, automation of administrative as well as other diagnostic activities—for example, patient billing, computer-aided diagnosis, and computer-aided therapeutics. PMID:4935771
The False Security of Blind Dates
Cimino, J.J.
2012-01-01
Background The reuse of clinical data for research purposes requires methods for the protection of personal privacy. One general approach is the removal of personal identifiers from the data. A frequent part of this anonymization process is the removal of times and dates, which we refer to as “chrononymization.” While this step can make the association with identified data (such as public information or a small sample of patient information) more difficult, it comes at a cost to the usefulness of the data for research. Objectives We sought to determine whether removal of dates from common laboratory test panels offers any advantage in protecting such data from re-identification. Methods We obtained a set of results for 5.9 million laboratory panels from the National Institutes of Health’s (NIH) Biomedical Translational Research Information System (BTRIS), selected a random set of 20,000 panels from the larger source sets, and then identified all matches between the sets. Results We found that while removal of dates could hinder the re-identification of a single test result, such removal had almost no effect when entire panels were used. Conclusions Our results suggest that reliance on chrononymization provides a false sense of security for the protection of laboratory test results. As a result of this study, the NIH has chosen to rely on policy solutions, such as strong data use agreements, rather than removal of dates when reusing clinical data for research purposes. PMID:23646086
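The matching experiment can be mimicked with a small sketch: index one copy of the panels by their result values (with or without the collection date) and count how many sampled panels map to exactly one record. This is an illustrative toy, not the BTRIS analysis; the synthetic 7-analyte panels and the match criterion are assumptions.

```python
import random
from collections import defaultdict

def match_rate(reference_panels, sample_panels, use_dates=True):
    """Fraction of sampled panels whose key matches exactly one reference panel."""
    key = lambda p: ((p["date"],) if use_dates else ()) + tuple(p["results"])
    index = defaultdict(int)
    for p in reference_panels:
        index[key(p)] += 1
    unique = sum(1 for p in sample_panels if index[key(p)] == 1)
    return unique / len(sample_panels)

# toy data: a 7-analyte panel of continuous-valued results is effectively unique
# on its own, so removing the collection date barely changes re-identifiability
random.seed(0)
reference = [{"date": f"2011-{1 + i % 12:02d}-15",
              "results": tuple(round(random.gauss(100, 15), 1) for _ in range(7))}
             for i in range(100000)]
sample = random.sample(reference, 2000)
print("with dates:   ", match_rate(reference, sample, use_dates=True))
print("dates removed:", match_rate(reference, sample, use_dates=False))
```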
Factors affecting the transformation of a pyritic tailing: scaled-up column tests.
García, C; Ballester, A; González, F; Blázquez, M L
2005-02-14
Two different methods for predicting the quality of the water draining from a pyritic tailing are compared; for this, a static test (ABA test) and a kinetic test in large columns were chosen. The differing results obtained in the two experimental set-ups show the need for care in selecting both the appropriate predictive method and the conclusions and extrapolations derived from it. The tailing chosen for the weathering tests (previously tested in shake flasks and in small weathering columns) was a pyritic residue produced in a flotation plant of complex polymetallic sulphides (Huelva, Spain). The ABA test was a modification of the conventional ABA test reported in the literature, the modification being the milder conditions employed in the digestion phase. For the column tests, two identical methacrylate columns (150 cm high and 15 cm in diameter) were used to study the chemical and microbiological processes controlling the leaching of pyrite. The results obtained in the two tests were very different. The static test predicted a strong potential acidity for the tailing. In contrast, the pH of the effluents draining from the columns reached values of only 5 units, with the concentrations of metals (<600 mg/L) and sulphate ions (<17,000 mg/L) being very small and far from the values of a typical acid mine drainage. Consequently, the static test may overestimate the potential acidity of the tailing, whereas the large columns may become saturated with water, displacing oxygen and inhibiting the microbial activity necessary to catalyse mineral oxidation.
NASA Technical Reports Server (NTRS)
Schwendemann, M. F.
1981-01-01
A 0.165-scale isolated inlet model was tested in the NASA Lewis Research Center 8-ft by 6-ft Supersonic Wind Tunnel. Ramp boundary layer control was provided by tangential blowing from a row of holes in an aft-facing step set into the ramp surface. Testing was performed at Mach numbers from 1.36 to 1.96 using both cold and heated air in the blowing system. Stable inlet flow was achieved at all Mach numbers. Blowing hole geometry was found to be significant at 1.96M. Blowing air temperature was found to have only a small effect on system performance. High blowing levels were required at the most severe test conditions.
An examination of the challenges influencing science instruction in Florida elementary classrooms
NASA Astrophysics Data System (ADS)
North, Stephanie Gwinn
It has been shown that the mechanical properties of thin films tend to differ from their bulk counterparts. Specifically, the bulge and microtensile testing of thin films used in MEMS have revealed that these films demonstrate an inverse relationship between thickness and strength. A film dimension is not a material property, but it evidently does affect the mechanical performance of materials at very small thicknesses. A hypothetical explanation for this phenomenon is that as the thickness dimension of the film decreases, it is statistically less likely that imperfections exist in the material. It would require a very small thickness (or volume) to limit imperfections in a material, which is why this phenomenon is seen in films with thicknesses on the order of 100 nm to a few microns. Another hypothesized explanation is that the surface tension that exists in bulk material also exists in thin films but has a greater impact at such a small scale. The goal of this research is to identify a theoretical prediction of the strength of thin films based on its microstructural properties such as grain size and film thickness. This would minimize the need for expensive and complicated tests such as the bulge and microtensile tests. In this research, data was collected from the bulge and microtensile testing of copper, aluminum, gold, and polysilicon free-standing thin films. Statistical testing of this data revealed a definitive inverse relationship between thickness and strength, as well as between grain size and strength, as expected. However, due to a lack of a standardized method for either test, there were significant variations in the data. This research compares and analyzes the methods used by other researchers to develop a suggested set of instructions for a standardized bulge test and standardized microtensile test. The most important parameters to be controlled in each test were found to be strain rate, temperature, film deposition method, film length, and strain measurement.
NASA Astrophysics Data System (ADS)
Laiti, L.; Mallucci, S.; Piccolroaz, S.; Bellin, A.; Zardi, D.; Fiori, A.; Nikulin, G.; Majone, B.
2018-03-01
Assessing the accuracy of gridded climate data sets is highly relevant to climate change impact studies, since evaluation, bias correction, and statistical downscaling of climate models commonly use these products as reference. Among all impact studies, those addressing hydrological fluxes are the most affected by errors and biases plaguing these data. This paper introduces a framework, termed the Hydrological Coherence Test (HyCoT), for assessing the hydrological coherence of gridded data sets with hydrological observations. HyCoT provides a framework for excluding meteorological forcing data sets that do not comply with observations, as a function of the particular goal at hand. The proposed methodology allows one to falsify the hypothesis that a given data set is coherent with hydrological observations on the basis of the performance of hydrological modeling measured by a metric selected by the modeler. HyCoT is demonstrated in the Adige catchment (southeastern Alps, Italy) for streamflow analysis, using a distributed hydrological model. The comparison covers the period 1989-2008 and includes five gridded daily meteorological data sets: E-OBS, MSWEP, MESAN, APGD, and ADIGE. The analysis highlights that APGD and ADIGE, the data sets with the highest effective resolution, display similar spatiotemporal precipitation patterns and produce the largest hydrological efficiency indices. Lower performance is observed for E-OBS, MESAN, and MSWEP, especially in small catchments. HyCoT reveals deficiencies in the representation of spatiotemporal patterns of gridded climate data sets, which cannot be corrected by simply rescaling the meteorological forcing fields, as often done in bias correction of climate model outputs. We recommend this framework to assess the hydrological coherence of gridded data sets to be used in large-scale hydroclimatic studies.
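A minimal sketch of the screening idea, assuming the Nash–Sutcliffe efficiency as the modeller-chosen metric and an arbitrary acceptance threshold; the data-set names and synthetic streamflow are purely illustrative.

```python
import numpy as np

def nse(sim, obs):
    """Nash–Sutcliffe efficiency of simulated vs. observed streamflow."""
    sim, obs = np.asarray(sim, float), np.asarray(obs, float)
    return 1 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

def coherence_screen(datasets_q, obs_q, threshold=0.5):
    """Reject forcing data sets whose driven simulation falls below a chosen
    efficiency threshold; the metric and threshold are the modeller's choice."""
    results = {}
    for name, q in datasets_q.items():
        score = nse(q, obs_q)
        results[name] = (score, score >= threshold)
    return results

# toy example with synthetic streamflow driven by three hypothetical forcing data sets
rng = np.random.default_rng(3)
obs = 10 + 5 * np.sin(np.linspace(0, 12 * np.pi, 365)) + rng.normal(0, 1, 365)
sims = {"dataset_A": obs + rng.normal(0, 1, 365),   # coherent forcing
        "dataset_B": obs + rng.normal(0, 4, 365),   # noisier forcing
        "dataset_C": np.full(365, obs.mean())}      # no dynamics at all
for name, (score, keep) in coherence_screen(sims, obs).items():
    print(f"{name}: NSE={score:.2f} keep={keep}")
```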
Sander, Ulrich; Lubbe, Nils
2018-04-01
Intersection accidents are frequent and harmful. The accident types 'straight crossing path' (SCP), 'left turn across path - oncoming direction' (LTAP/OD), and 'left turn across path - lateral direction' (LTAP/LD) represent around 95% of all intersection accidents and one third of all police-reported car-to-car accidents in Germany. The European New Car Assessment Program (Euro NCAP) has announced that intersection scenarios will be included in its rating from 2020; however, how these scenarios are to be tested has not been defined. This study investigates whether clustering methods can be used to identify a small number of test scenarios sufficiently representative of the accident dataset to evaluate Intersection Automated Emergency Braking (AEB). Data from the German In-Depth Accident Study (GIDAS) and the GIDAS-based Pre-Crash Matrix (PCM) from 1999 to 2016, containing 784 SCP and 453 LTAP/OD accidents, were analyzed with principal component methods to identify variables that account for the relevant total variances of the sample. Three different clustering methods were applied to each of the accident types: two similarity-based approaches, namely Hierarchical Clustering (HC) and Partitioning Around Medoids (PAM), and the probability-based Latent Class Clustering (LCC). The optimum number of clusters was derived for HC and PAM with the silhouette method; the PAM algorithm was initiated both with randomly selected starting medoids and with medoids taken from HC. For LCC, the Bayesian Information Criterion (BIC) was used to determine the optimal number of clusters. Test scenarios were defined from the optimal cluster medoids, weighted by their real-life representation in GIDAS. The set of variables used for clustering was further varied to investigate the influence of variable type and character. We quantified how accurately each cluster variation represents real-life AEB performance using pre-crash simulations with PCM data and a generic algorithm for AEB intervention. The use of different sets of clustering variables resulted in substantially different numbers of clusters. The stability of the resulting clusters increased when categorical variables were prioritized over continuous ones. For every set of cluster variables, there was strong in-cluster variance of avoided versus non-avoided accidents for the specified Intersection AEB, and the medoids did not predict the most common Intersection AEB behavior in each cluster. Despite thorough analysis using various cluster methods and variable sets, it was impossible to reduce the diversity of intersection accidents to a set of test scenarios without compromising the ability to predict the real-life performance of Intersection AEB. Although this does not imply that other methods cannot succeed, we observed that small changes in the definition of a scenario resulted in a different avoidance outcome. We therefore suggest using limited physical testing to validate more extensive virtual simulations when evaluating vehicle safety. Copyright © 2018 Elsevier Ltd. All rights reserved.
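The following sketch illustrates the general workflow of cluster-based scenario selection described above, hierarchical clustering with the silhouette criterion followed by extraction of cluster medoids, on synthetic data. It is not the study's code, and the toy accident descriptors (speeds, crossing angle) are assumptions made only so the example runs.

```python
# Illustrative sketch: pick a cluster count with the silhouette criterion and
# report each cluster's medoid as a candidate test scenario (synthetic data).
import numpy as np
from scipy.spatial.distance import cdist
from sklearn.cluster import AgglomerativeClustering
from sklearn.datasets import make_blobs
from sklearn.metrics import silhouette_score
from sklearn.preprocessing import StandardScaler

# Toy accident descriptors standing in for GIDAS/PCM variables
# (e.g. ego speed, opponent speed, crossing angle).
X, _ = make_blobs(n_samples=300, centers=4, n_features=3, random_state=1)
X = StandardScaler().fit_transform(X)

# Choose the number of clusters with the silhouette criterion.
best_k, best_labels, best_score = None, None, -1.0
for k in range(2, 8):
    labels = AgglomerativeClustering(n_clusters=k).fit_predict(X)
    score = silhouette_score(X, labels)
    if score > best_score:
        best_k, best_labels, best_score = k, labels, score

# Medoid = the member of each cluster with minimal total distance to its cluster.
print(f"selected k = {best_k} (silhouette = {best_score:.2f})")
for c in range(best_k):
    members = X[best_labels == c]
    medoid = members[np.argmin(cdist(members, members).sum(axis=1))]
    print(f"cluster {c}: n = {len(members)}, medoid = {np.round(medoid, 2)}")
```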
Whole blood FPR1 mRNA expression predicts both non‐small cell and small cell lung cancer
Vachani, Anil; Pass, Harvey I.; Rom, William N.; Ryden, Kirk; Weiss, Glen J.; Hogarth, D. K.; Runger, George; Richards, Donald; Shelton, Troy; Mallery, David W.
2018-01-01
While long‐term survival rates for early‐stage lung cancer are high, most cases are diagnosed in later stages that can negatively impact survival rates. We aim to design a simple, single biomarker blood test for early‐stage lung cancer that is robust to preclinical variables and can be readily implemented in the clinic. Whole blood was collected in PAXgene tubes from a training set of 29 patients, and a validation set of 260 patients, of which samples from 58 patients were prospectively collected in a clinical trial specifically for our study. After RNA was extracted, the expressions of FPR1 and a reference gene were quantified by an automated one‐step Taqman RT‐PCR assay. Elevated levels of FPR1 mRNA in whole blood predicted lung cancer status with a sensitivity of 55% and a specificity of 87% on all validation specimens. The prospectively collected specimens had a significantly higher 68% sensitivity and 89% specificity. Results from patients with benign nodules were similar to healthy volunteers. No meaningful correlation was present between our test results and any clinical characteristic other than lung cancer diagnosis. FPR1 mRNA levels in whole blood can predict the presence of lung cancer. Using this as a reflex test for positive lung cancer screening computed tomography scans has the potential to increase the positive predictive value. This marker can be easily measured in an automated process utilizing off‐the‐shelf equipment and reagents. Further work is justified to explain the source of this biomarker. PMID:29313979
Matoušková, Petra; Bártíková, Hana; Boušová, Iva; Hanušová, Veronika; Szotáková, Barbora; Skálová, Lenka
2014-01-01
Obesity and the metabolic syndrome are an increasing health problem worldwide. Among other approaches, nutritional intervention using phytochemicals is an important method for the treatment and prevention of this disease. Recent studies have shown that certain phytochemicals can alter the expression of specific genes and microRNAs (miRNAs) that play a fundamental role in the pathogenesis of obesity. For studying obesity and its treatment, monosodium glutamate (MSG)-injected mice, which develop central obesity, insulin resistance and liver lipid accumulation, are a frequently used animal model. To understand the mechanism of phytochemical action in obese animals, the study of selected gene expression together with miRNA quantification is extremely important. For this purpose, real-time quantitative PCR is a sensitive and reproducible method, but it depends entirely on proper normalization. The aim of the present study was to identify appropriate reference genes for mRNA and miRNA quantification in MSG mice treated with green tea catechins, potential anti-obesity phytochemicals. Two sets of reference genes were tested: the first set contained seven genes commonly used for normalization of messenger RNA; the second set of candidate reference genes included ten small RNAs for normalization of miRNA. The expression stability of these reference genes was tested upon treatment of mice with catechins using the geNorm, NormFinder and BestKeeper algorithms. The selected normalizers for mRNA quantification were tested and validated on the expression of quinone oxidoreductase, a biotransformation enzyme known to be modified by catechins. The effect of the selected normalizers for miRNA quantification was tested on two obesity- and diabetes-related miRNAs, miR-221 and miR-29b, respectively. Finally, the combinations B2M/18S/HPRT1 and miR-16/sno234 were validated as optimal reference genes for mRNA and miRNA quantification in liver, and 18S/RPlP0/HPRT1 and sno234/miR-186 in small intestine, of MSG mice. These reference genes will be used for mRNA and miRNA normalization in further studies of green tea catechin action in obese mice.
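For readers unfamiliar with expression-stability ranking, the sketch below implements a geNorm-style stability measure M (the mean standard deviation of a gene's pairwise log2 expression ratios with the other candidates), where a lower M indicates a more stable reference gene. It is a simplified stand-in for the geNorm/NormFinder/BestKeeper tools named above, run on synthetic data.

```python
# Illustrative geNorm-style stability measure (not the geNorm software itself).
import numpy as np

def genorm_like_M(expr):
    """expr: samples x genes matrix of relative expression quantities (> 0)."""
    log_expr = np.log2(np.asarray(expr, float))
    n_genes = log_expr.shape[1]
    M = np.empty(n_genes)
    for j in range(n_genes):
        # log2 ratios of gene j to every other candidate, per sample
        ratios = log_expr[:, [j]] - np.delete(log_expr, j, axis=1)
        M[j] = ratios.std(axis=0, ddof=1).mean()
    return M

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    stable = 2 ** rng.normal(0.0, 0.05, size=(12, 3))    # three well-behaved candidates
    unstable = 2 ** rng.normal(0.0, 0.6, size=(12, 1))   # one poorly behaved candidate
    expr = np.hstack([stable, unstable])
    print(np.round(genorm_like_M(expr), 3))  # the last gene should score worst (highest M)
```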
2014-02-11
ISS038-E-045009 (11 Feb. 2014) --- The Small Satellite Orbital Deployer (SSOD), in the grasp of the Kibo laboratory robotic arm, is photographed by an Expedition 38 crew member on the International Space Station as it deploys a set of NanoRacks CubeSats. The CubeSats program contains a variety of experiments such as Earth observations and advanced electronics testing. Station solar array panels, Earth's horizon and the blackness of space provide the backdrop for the scene.
Generalized Phenomenological Cyclic Stress-Strain-Strength Characterization of Granular Media.
1984-09-02
could be fitted to a comprehensive data set. Unfortunately, such equipment is not available at present, and most researchers still rely on the ... notably, Lade and Duncan (1975), using a comprehensive series of test data obtained from a true triaxial device (Lade, 1973), have suggested that failure ... [Fragment of a table contrasting conditions prior to failure and at failure: shear strain (low vs. indeterminate), deformation (small vs. large), void ratio (any e vs. e critical), grain ...]
Allen, Jennifer D.; Pérez, John E.; Tom, Laura; Leyva, Bryan; Diaz, Daisy; Torres, Maria Idali
2013-01-01
We assessed the feasibility, acceptability, and initial impact of a church-based educational program to promote breast, cervical, and colorectal cancer screening among Latinas ages 18 and over. We used a one-group pre/post evaluation within a low-income, Latino Baptist church in Boston, MA. Participants completed interviewer-administered assessments at baseline and at the end of the six-month intervention. Under the guidance of a patient navigator (PN), women from the church (peer health advisors, or PHAs) were trained to deliver evidence-based screening interventions, including one-to-one outreach, small group education, client reminders, and reduction of structural barriers to screening. The PN and PHAs also implemented a health fair and the pastor integrated health information into regular sermons. At pre-intervention, nearly half of the sample did not meet screening guidelines. The majority (97%, n = 35) of those who completed the post-intervention assessment participated in intervention activities. Two-thirds (67%) reported talking with the PN or PHAs about health issues. Participation in small group education sessions was highest (72%), with health fairs (61%), and goal setting (50%) also being popular activities. Fourteen percent also reported receiving help from the PN to access screening tests. This study supports the feasibility and acceptability of churches as a setting to promote cancer screening among Latinas. PMID:24132541
The realization of temperature controller for small resistance measurement system
NASA Astrophysics Data System (ADS)
Sobecki, Jakub; Walendziuk, Wojciech; Idzkowski, Adam
2017-08-01
This paper concerns the construction and experimental testing of a temperature stabilization system for circuits that measure small resistance increments. After the system is switched on, the PCB heats up, and the long-term temperature drift alters the measurement result. The aim of this work is to reduce the time needed for the measurement system to reach a constant nominal temperature, which would shorten the time required for measurements in the steady state. Moreover, the influence of temperatures above the nominal value on the measurement results was tested and the heating curve was obtained. During operation the circuit spontaneously heats up to about 32 °C and takes about 1200 s to reach steady state. Implementing a USART terminal on the PC together with an NI USB-6341 data acquisition card makes it easier to record the temperature and resistance data in digital form and to process them further; it also allows the regulator settings to be changed. This paper presents sample measurement results for several temperature values and the characteristics of the temperature and resistance changes over time, together with their comparison with the output values. The object identification is accomplished by means of the Ziegler-Nichols method. The algorithm for determining the step-response parameters and example computations of the regulator settings are included, together with example characteristics of the object regulation.
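As a reminder of how regulator settings follow from an identified step response, the sketch below applies the classic Ziegler-Nichols open-loop (reaction-curve) rules to a first-order-plus-dead-time characterization. The gain, dead time, and time constant values are hypothetical and do not come from the paper.

```python
# Illustrative Ziegler-Nichols open-loop (reaction-curve) PID tuning.
def zn_open_loop_pid(K, L, T):
    """Classic Ziegler-Nichols PID settings from reaction-curve parameters:
    K = process gain, L = apparent dead time, T = time constant."""
    Kp = 1.2 * T / (K * L)
    Ti = 2.0 * L   # integral time
    Td = 0.5 * L   # derivative time
    return Kp, Ti, Td

if __name__ == "__main__":
    # Hypothetical heater identification: 2 degC per unit drive, 60 s dead time,
    # 400 s time constant (values chosen only for illustration).
    Kp, Ti, Td = zn_open_loop_pid(K=2.0, L=60.0, T=400.0)
    print(f"Kp = {Kp:.2f}, Ti = {Ti:.0f} s, Td = {Td:.0f} s")
```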
Small-Scale Experiments.10-gallon drum experiment summary
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rosenberg, David M.
2015-02-05
A series of sub-scale (10-gallon) drum experiments were conducted to characterize the reactivity, heat generation, and gas generation of mixtures of chemicals believed to be present in the drum (68660) known to have breached in association with the radiation release event at the Waste Isolation Pilot Plant (WIPP) on February 14, 2014, at a scale expected to be large enough to replicate the environment in that drum but small enough to be practical, safe, and cost effective. These tests were not intended to replicate all the properties of drum 68660 or the event that led to its breach, or to validate a particular hypothesis of the release event. They were intended to observe, in a controlled environment and with suitable diagnostics, the behavior of simple mixtures of chemicals in order to determine whether they could support reactivity that could result in ignition or whether some other ingredient or event would be necessary. There is a significant amount of uncertainty about the exact composition of the drum; a limited sub-set of known components was identified, reviewed with Technical Assessment Team (TAT) members, and used in these tests. This set of experiments was intended to provide a framework for postulating realistic, data-supported hypotheses for processes that occur in a “68660-like” configuration, not to definitively prove what actually occurred in 68660.
Park, Bo-Yong; Lee, Mi Ji; Lee, Seung-Hak; Cha, Jihoon; Chung, Chin-Sang; Kim, Sung Tae; Park, Hyunjin
2018-01-01
Migraineurs show an increased load of white matter hyperintensities (WMHs) and more rapid deep WMH progression. Previous methods for WMH segmentation have limited efficacy to detect small deep WMHs. We developed a new fully automated detection pipeline, DEWS (DEep White matter hyperintensity Segmentation framework), for small and superficially-located deep WMHs. A total of 148 non-elderly subjects with migraine were included in this study. The pipeline consists of three components: 1) white matter (WM) extraction, 2) WMH detection, and 3) false positive reduction. In WM extraction, we adjusted the WM mask to re-assign misclassified WMHs back to WM using many sequential low-level image processing steps. In WMH detection, the potential WMH clusters were detected using an intensity based threshold and region growing approach. For false positive reduction, the detected WMH clusters were classified into final WMHs and non-WMHs using the random forest (RF) classifier. Size, texture, and multi-scale deep features were used to train the RF classifier. DEWS successfully detected small deep WMHs with a high positive predictive value (PPV) of 0.98 and true positive rate (TPR) of 0.70 in the training and test sets. Similar performance of PPV (0.96) and TPR (0.68) was attained in the validation set. DEWS showed a superior performance in comparison with other methods. Our proposed pipeline is freely available online to help the research community in quantifying deep WMHs in non-elderly adults.
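The sketch below illustrates only the third, false-positive-reduction stage of such a pipeline: a random forest classifier separating candidate WMH clusters into WMHs and non-WMHs from cluster-level features, with PPV and TPR reported on a held-out split. The features and labels are synthetic stand-ins; this is not DEWS itself.

```python
# Illustrative false-positive-reduction stage with a random forest (synthetic data).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import precision_score, recall_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
n = 1000
# Hypothetical cluster-level features (e.g. size, mean intensity, texture measures).
features = rng.normal(size=(n, 5))
# Synthetic ground truth: WMH (1) vs. non-WMH (0) candidate clusters.
labels = (features[:, 0] + 0.5 * features[:, 1] + rng.normal(0, 1, n) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(features, labels, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
pred = clf.predict(X_te)

print("PPV:", round(precision_score(y_te, pred), 2))  # positive predictive value
print("TPR:", round(recall_score(y_te, pred), 2))     # true positive rate / sensitivity
```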
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-15
... reinstate an appeals process for decisions concerning recomputation of Small Business Set-Aside shares... to purchase a fair proportion of National Forest System timber offered for sale. Under the program... businesses every 5 years based on the actual volume of sawtimber that has been purchased by small businesses...
Minică, Camelia C; Dolan, Conor V; Hottenga, Jouke-Jan; Willemsen, Gonneke; Vink, Jacqueline M; Boomsma, Dorret I
2013-05-01
When phenotypic, but no genotypic, data are available for relatives of participants in genetic association studies, previous research has shown that family-based imputed genotypes can boost statistical power when included in such studies. Here, using simulations, we compared the performance of two statistical approaches suitable for modeling imputed genotype data: the mixture approach, which involves the full distribution of the imputed genotypes, and the dosage approach, where the mean of the conditional distribution features as the imputed genotype. Simulations were run by varying sibship size, the size of the phenotypic correlations among siblings, imputation accuracy and the minor allele frequency of the causal SNP. Furthermore, as imputing sibling data and extending the model to sibships of size two or greater requires modeling the familial covariance matrix, we inquired whether model misspecification affects power. Finally, the results obtained via simulations were empirically verified in two datasets, one with a continuous phenotype (height) and one with a dichotomous phenotype (smoking initiation). Across the settings considered, the mixture and dosage approaches are equally powerful and both produce unbiased parameter estimates. In addition, the likelihood-ratio test in the linear mixed model appears to be robust to the considered misspecification of the background covariance structure, given low to moderate phenotypic correlations among siblings. The empirical results show that including imputed sibling genotypes in association analysis does not always result in a larger test statistic; the actual test statistic may drop in value because of small effect sizes. That is, if the power benefit is small, so that the change in the distribution of the test statistic under the alternative is relatively small, the probability of obtaining a smaller test statistic is greater. As genetic effects are typically hypothesized to be small, in practice the decision on whether family-based imputation can be used as a means to increase power should be informed by prior power calculations and by consideration of the background correlation.
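To make the dosage approach concrete, the sketch below computes the imputed genotype as the mean of its conditional distribution (the expected allele count) and uses it as an ordinary covariate in a least-squares association model. The genotype probabilities and phenotype are simulated; this is not the authors' analysis code.

```python
# Illustrative "dosage" genotype: expected allele count from imputed genotype
# probabilities, used as a covariate in a simple association model.
import numpy as np

def dosage(genotype_probs):
    """genotype_probs: n x 3 array of P(g=0), P(g=1), P(g=2) per individual."""
    probs = np.asarray(genotype_probs, float)
    return probs @ np.array([0.0, 1.0, 2.0])

if __name__ == "__main__":
    rng = np.random.default_rng(4)
    probs = rng.dirichlet(alpha=[2, 3, 1], size=500)   # imputed genotype distributions
    g = dosage(probs)
    beta_true = 0.2
    y = beta_true * g + rng.normal(0, 1, size=500)     # simple continuous phenotype
    # Least-squares association on the dosage covariate (intercept + dosage)
    Xmat = np.column_stack([np.ones_like(g), g])
    beta_hat = np.linalg.lstsq(Xmat, y, rcond=None)[0]
    print("estimated effect of dosage:", round(beta_hat[1], 3))
```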
Bonessio, N; Pereira, E S J; Lomiento, G; Arias, A; Bahia, M G A; Buono, V T L; Peters, O A
2015-05-01
To validate torsional analysis, based on finite elements, of WaveOne instruments against in vitro tests and to model the effects of different nickel-titanium (NiTi) materials. WaveOne reciprocating instruments (Small, Primary and Large, n = 8 each, M-Wire) were tested under torsion according to standard ISO 3630-1. Torsional profiles including torque and angle at fracture were determined. Test conditions were reproduced through Finite Element Analysis (FEA) simulations based on micro-CT scans at 10-μm resolution; results were compared to experimental data using analysis of variance and two-sided one sample t-tests. The same simulation was performed on virtual instruments with identical geometry and load condition, based on M-Wire or conventional NiTi alloy. Torsional profiles from FEA simulations were in significant agreement with the in vitro results. Therefore, the models developed in this study were accurate and able to provide reliable simulation of the torsional performance. Stock NiTi files under torsional tests had up to 44.9%, 44.9% and 44.1% less flexibility than virtual M-Wire files at small deflections for Small, Primary and Large instruments, respectively. As deflection levels increased, the differences in flexibility between the two sets of simulated instruments decreased until fracture. Stock NiTi instruments had a torsional fracture resistance up to 10.3%, 8.0% and 7.4% lower than the M-Wire instruments, for the Small, Primary and Large file, respectively. M-Wire instruments benefitted primarily through higher material flexibility while still at low deflection levels, compared with conventional NiTi alloy. At fracture, the instruments did not take complete advantage of the enhanced fractural resistance of the M-Wire material, which determines only limited improvements of the torsional performance. © 2014 International Endodontic Journal. Published by John Wiley & Sons Ltd.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Maurer, Simon A.; Clin, Lucien; Ochsenfeld, Christian, E-mail: christian.ochsenfeld@uni-muenchen.de
2014-06-14
Our recently developed QQR-type integral screening is introduced into our Cholesky-decomposed pseudo-densities Møller-Plesset perturbation theory of second order (CDD-MP2) method. We use the resolution-of-the-identity (RI) approximation in combination with efficient integral transformations employing sparse matrix multiplications. The RI-CDD-MP2 method shows an asymptotic cubic scaling behavior with system size and a small prefactor that results in an early crossover to conventional methods for both small and large basis sets. We also explore the use of local fitting approximations, which allow the scaling behavior to be reduced further for very large systems. The reliability of our method is demonstrated on test sets for interaction and reaction energies of medium-sized systems and on a diverse selection from our own benchmark set for total energies of larger systems. Timings on DNA systems show that fast calculations for systems with more than 500 atoms are feasible using a single processor core. Parallelization extends the range of accessible system sizes on one computing node with multiple cores to more than 1000 atoms in a double-zeta basis and more than 500 atoms in a triple-zeta basis.
Using Metaphors to Explain Molecular Testing to Cancer Patients.
Pinheiro, Ana P M; Pocock, Rachel H; Dixon, Margie D; Shaib, Walid L; Ramalingam, Suresh S; Pentz, Rebecca D
2017-04-01
Molecular testing to identify targetable molecular alterations is routine practice for several types of cancer. Explaining the underlying molecular concepts can be difficult, and metaphors historically have been used in medicine to provide a common language between physicians and patients. Although previous studies have highlighted the use and effectiveness of metaphors to help explain germline genetic concepts to the general public, this study is the first to describe the use of metaphors to explain molecular testing to cancer patients in the clinical setting. Oncologist-patient conversations about molecular testing were recorded, transcribed verbatim, and coded. If a metaphor was used, patients were asked to explain it and assess its helpfulness. Sixty-six patients participated. Nine oncologists used metaphors to describe molecular testing; 25 of 66 (38%) participants heard a metaphor, 13 of 25 (52%) were questioned, 11 of 13 (85%) demonstrated understanding and reported the metaphor as being useful. Seventeen metaphors (bus driver, boss, switch, battery, circuit, broken light switch, gas pedal, key turning off an engine, key opening a lock, food for growth, satellite and antenna, interstate, alternate circuit, traffic jam, blueprint, room names, Florida citrus) were used to explain eight molecular testing terms (driver mutations, targeted therapy, hormones, receptors, resistance, exon specificity, genes, and cancer signatures). Because metaphors have proven to be a useful communication tool in other settings, these 17 metaphors may be useful for oncologists to adapt to their own setting to explain molecular testing terms. The Oncologist 2017;22:445-449 Implications for Practice: This article provides a snapshot of 17 metaphors that proved useful in describing 8 complicated molecular testing terms at 3 sites. As complex tumor sequencing becomes standard of care in clinics and widely used in clinical research, the use of metaphors may prove a useful communication tool, as it has in other settings. Although this study had a small sample, almost all of the patients who were exposed to metaphors in explaining molecular testing reported it as being helpful to their understanding. These 17 metaphors are examples of potentially useful communication tools that oncologists can adapt to their own practice. © AlphaMed Press 2017.
Impact resistance and prescription compliance with AS/NZS 1337.6:2010.
Dain, Stephen J; Ngo, Thao P T; Cheng, Brian B
2013-09-01
Australian/New Zealand Standard 1337.6 deals with prescription eye protection and has been in place since 2007. There have been many standards marking licences granted since then. The issue of the worst-case situations for assessment in a certification scheme, in particular -1.50 m(-1) lenses, has been the subject of discussion in Standards Australia/Standards New Zealand Committee SF-006. Given that a body of data from testing exists, this was explored to advise the Committee. Data from testing 40 sets of prescription eye protectors were analysed retrospectively for compliance with the impact and refractive power requirements in 2010-11. The testing had been carried out according to the methods of AS/NZS 1337.6:2007 under the terms and conditions of the accreditation of the Optics & Radiometry Laboratory by the National Association of Testing Authorities. No eye protector failed the low-impact resistance test. Failure rates of 1.6 per cent (two of the 40 sets) to the medium impact test and 1.6 per cent (three of the sets) to the medium impact test in the elevated temperature stability test were seen. These are too small for useful statistical analysis. Only -1.50 m(-1) lenses were in all failing sets and these lenses were over-represented in the failures and borderlines, especially compared with the +1.50 D lenses. Failures in prismatic power were equally distributed over all prescriptions. This over-representation of -1.50 m(-1) lenses was not related to the ocular/lens material or to the company manufacturing the eye protectors. The proposal is made that glazing lenses tightly to ensure they are retained in the frame on impact may result in unwanted refractive power in those lenses most prone to flex. These data support the proposal that -1.50 m(-1) lenses should form part of a worst-case testing regime in a certification scheme. © 2012 The Authors. Clinical and Experimental Optometry © 2012 Optometrists Association Australia.
Entrepreneurialism and health-promoting retail food environments in Canadian city-regions.
Mah, Catherine L; Hasdell, Rebecca; Minaker, Leia M; Soo, Stephanie D; Cook, Brian; Demaio, Alessandro R
2017-09-02
The retail sector is a dynamic and challenging component of contemporary food systems with an important influence on population health and nutrition. Global consensus is clear that policy and environmental changes in retail food environments are essential to promote healthier diets and reduce the burden of obesity and non-communicable diseases. In this article, we explore entrepreneurialism as a form of social change-making within retail food environments, focusing on small food businesses. Small businesses face structural barriers within food systems. However, conceptual work in multiple disciplines and evidence from promising health interventions tested in small stores suggest that these retail places may have a dual role in health promotion: settings to strengthen regional economies and social networks, and consumer environments to support healthier diets. We will discuss empirical examples of health-promoting entrepreneurialism based on two sets of in-depth interviews we conducted with public health intervention actors in Toronto, Canada, and food entrepreneurs and city-region policy actors in St. John's, Canada. We will explore the practices of entrepreneurialism in the retail food environment and examine the implications for population health interventions. We contend that entrepreneurialism is important to understand on its own and also as a dimension of population health intervention context. A growing social scientific literature offers a multifaceted lens through which we might consider entrepreneurialism not only as a set of personal characteristics but also as a practice in networked and intersectoral cooperation for public and population health. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Ver Donck, L; Lammers, W J E P; Moreaux, B; Smets, D; Voeten, J; Vekemans, J; Schuurkes, J A J; Coulie, B
2006-03-01
Myoelectric recordings from the intestines in conscious animals have been limited to a few electrode sites with relatively large inter-electrode distances. The aim of this project was to increase the number of recording sites to allow high-resolution reconstruction of the propagation of myoelectrical signals. Sets of six unipolar electrodes, positioned in a 3x2 array, were constructed. A silver ring close to each set served as the reference electrodes. Inter-electrode distances varied from 4 to 8 mm. Electrode sets, to a maximum of 4, were implanted in various configurations allowing recording from 24 sites simultaneously. Four sets of 6 electrodes each were implanted successfully in 11 female Beagles. Implantation sites evaluated were the upper small intestine (n=10), the lower small intestine (n=4) and the stomach (n=3). The implants remained functional for 7.2 months (median; range 1.4-27.3 months). Recorded signals showed slow waves at regular intervals and spike potentials. In addition, when the sets were positioned close together, it was possible to re-construct the propagation of individual slow waves, to determine their direction of propagation and to calculate their propagation velocity. No signs or symptoms of interference with normal GI-function were observed in the tested animals. With this approach, it is possible to implant 24 extracellular electrodes on the serosal surface of the intestines without interfering with its normal physiology. This approach makes it possible to study the electrical activities of the GI system at high resolution in vivo in the conscious animal.
Setterbo, Jacob J.; Chau, Anh; Fyhrie, Patricia B.; Hubbard, Mont; Upadhyaya, Shrini K.; Symons, Jennifer E.; Stover, Susan M.
2012-01-01
Background Racetrack surface is a risk factor for racehorse injuries and fatalities. Current research indicates that race surface mechanical properties may be influenced by material composition, moisture content, temperature, and maintenance. Race surface mechanical testing in a controlled laboratory setting would allow for objective evaluation of dynamic properties of surface and factors that affect surface behavior. Objective To develop a method for reconstruction of race surfaces in the laboratory and validate the method by comparison with racetrack measurements of dynamic surface properties. Methods Track-testing device (TTD) impact tests were conducted to simulate equine hoof impact on dirt and synthetic race surfaces; tests were performed both in situ (racetrack) and using laboratory reconstructions of harvested surface materials. Clegg Hammer in situ measurements were used to guide surface reconstruction in the laboratory. Dynamic surface properties were compared between in situ and laboratory settings. Relationships between racetrack TTD and Clegg Hammer measurements were analyzed using stepwise multiple linear regression. Results Most dynamic surface property setting differences (racetrack-laboratory) were small relative to surface material type differences (dirt-synthetic). Clegg Hammer measurements were more strongly correlated with TTD measurements on the synthetic surface than the dirt surface. On the dirt surface, Clegg Hammer decelerations were negatively correlated with TTD forces. Conclusions Laboratory reconstruction of racetrack surfaces guided by Clegg Hammer measurements yielded TTD impact measurements similar to in situ values. The negative correlation between TTD and Clegg Hammer measurements confirms the importance of instrument mass when drawing conclusions from testing results. Lighter impact devices may be less appropriate for assessing dynamic surface properties compared to testing equipment designed to simulate hoof impact (TTD). Potential Relevance Dynamic impact properties of race surfaces can be evaluated in a laboratory setting, allowing for further study of factors affecting surface behavior under controlled conditions. PMID:23227183
48 CFR 206.203 - Set-asides for small business concerns.
Code of Federal Regulations, 2010 CFR
2010-10-01
... REGULATIONS SYSTEM, DEPARTMENT OF DEFENSE ACQUISITION PLANNING COMPETITION REQUIREMENTS Full and Open Competition After Exclusion of Sources 206.203 Set-asides for small business concerns. (b) Also no separate...
Vortex generator design for aircraft inlet distortion as a numerical optimization problem
NASA Technical Reports Server (NTRS)
Anderson, Bernhard H.; Levy, Ralph
1991-01-01
Aerodynamic compatibility of aircraft/inlet/engine systems is a difficult design problem for aircraft that must operate in many different flight regimes. Takeoff, subsonic cruise, supersonic cruise, transonic maneuvering, and high-altitude loiter each place different constraints on inlet design. Vortex generators, small wing-like sections mounted on the inside surfaces of the inlet duct, are used to control flow separation and engine face distortion. The design of vortex generator installations in an inlet is defined as a problem addressable by numerical optimization techniques. A performance parameter is suggested to account for both inlet distortion and total pressure loss at a series of design flight conditions. The resulting optimization problem is difficult since some of the design parameters take on integer values. If numerical procedures could be used to reduce multimillion-dollar development test programs to a small set of verification tests, numerical optimization could have a significant impact on both the cost and the elapsed time to design new aircraft.
Nichols, James D.; Pollock, Kenneth H.; Hines, James E.
1984-01-01
The robust design of Pollock (1982) was used to estimate parameters of a Maryland M. pennsylvanicus population. Closed model tests provided strong evidence of heterogeneity of capture probability, and model Mh (Otis et al., 1978) was selected as the most appropriate model for estimating population size. The Jolly-Seber model goodness-of-fit test indicated rejection of the model for this data set, and the Mh estimates of population size were all higher than the Jolly-Seber estimates. Both of these results are consistent with the evidence of heterogeneous capture probabilities. The authors thus used Mh estimates of population size, Jolly-Seber estimates of survival rate, and estimates of birth-immigration based on a combination of the population size and survival rate estimates. Advantages of the robust design estimates for certain inference procedures are discussed, and the design is recommended for future small mammal capture-recapture studies directed at estimation.
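As one illustration of estimation under heterogeneous capture probabilities, the sketch below applies a first-order jackknife estimator, a commonly used estimator for model Mh, to a simulated capture-history matrix. It is a toy example, not the analysis reported in the abstract.

```python
# Toy sketch: first-order jackknife estimate of population size under model Mh
# (heterogeneous capture probabilities), from a capture-history matrix.
import numpy as np

def jackknife1_Mh(capture_history):
    """capture_history: 0/1 array, rows = captured individuals, cols = occasions."""
    H = np.asarray(capture_history, int)
    S = H.shape[0]                          # distinct individuals ever captured
    t = H.shape[1]                          # number of trapping occasions
    f1 = int(np.sum(H.sum(axis=1) == 1))    # individuals captured exactly once
    return S + (t - 1) / t * f1

if __name__ == "__main__":
    rng = np.random.default_rng(5)
    true_N, occasions = 200, 6
    p = rng.beta(2, 8, size=true_N)                  # heterogeneous capture probabilities
    hist = rng.random((true_N, occasions)) < p[:, None]
    hist = hist[hist.any(axis=1)]                    # keep only animals seen at least once
    print("captured:", hist.shape[0],
          "jackknife estimate:", round(jackknife1_Mh(hist), 1))
```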
Verification of a Remaining Flying Time Prediction System for Small Electric Aircraft
NASA Technical Reports Server (NTRS)
Hogge, Edward F.; Bole, Brian M.; Vazquez, Sixto L.; Celaya, Jose R.; Strom, Thomas H.; Hill, Boyd L.; Smalling, Kyle M.; Quach, Cuong C.
2015-01-01
This paper addresses the problem of building trust in online predictions of a battery powered aircraft's remaining available flying time. A set of ground tests is described that make use of a small unmanned aerial vehicle to verify the performance of remaining flying time predictions. The algorithm verification procedure described here uses a fully functional vehicle that is restrained to a platform for repeated run-to-functional-failure experiments. The vehicle under test is commanded to follow a predefined propeller RPM profile in order to create battery demand profiles similar to those expected in flight. The fully integrated aircraft is repeatedly operated until the charge stored in powertrain batteries falls below a specified lower-limit. The time at which the lower-limit on battery charge is crossed is then used to measure the accuracy of remaining flying time predictions. Accuracy requirements are considered in this paper for an alarm that warns operators when remaining flying time is estimated to fall below a specified threshold.
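A minimal sketch of the verification bookkeeping described above: the run-to-functional-failure "truth" is the interpolated time at which battery charge first crosses the lower limit, against which an online remaining-flying-time prediction can be scored. The discharge profile, limit, and prediction below are hypothetical, not flight data.

```python
# Illustrative scoring of a remaining-flying-time prediction against the
# measured time of a battery lower-limit crossing (synthetic data).
import numpy as np

def time_of_threshold_crossing(t, charge, lower_limit):
    """Linearly interpolate the first time `charge` falls below `lower_limit`."""
    below = np.nonzero(charge < lower_limit)[0]
    if below.size == 0:
        return None
    i = below[0]
    if i == 0:
        return t[0]
    frac = (charge[i - 1] - lower_limit) / (charge[i - 1] - charge[i])
    return t[i - 1] + frac * (t[i] - t[i - 1])

if __name__ == "__main__":
    t = np.linspace(0, 600, 601)        # seconds
    charge = 2.2 - 0.003 * t            # hypothetical Ah discharge profile
    actual = time_of_threshold_crossing(t, charge, lower_limit=0.5)
    predicted_remaining_at_t0 = 560.0   # hypothetical prediction made at t = 0
    print("actual time to lower limit:", round(actual, 1), "s")
    print("prediction error:", round(predicted_remaining_at_t0 - actual, 1), "s")
```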
Harte, Philip T.
1994-01-01
Proper discretization of a ground-water-flow field is necessary for the accurate simulation of ground-water flow by models. Although discretization guidelines are available to ensure numerical stability, current guidelines are flexible enough (particularly in vertical discretization) to allow for some ambiguity in model results. Testing of two common types of vertical-discretization schemes (the horizontal and the nonhorizontal-model-layer approach) was done to simulate sloping hydrogeologic units characteristic of New England. Differences between the results of model simulations using these two approaches are small. Numerical errors associated with the use of nonhorizontal model layers are small (4 percent), even though this discretization technique does not adhere to the strict formulation of the finite-difference method. It was concluded that vertical discretization by means of the nonhorizontal-layer approach has advantages in representing the hydrogeologic units tested and in simplicity of model-data input. In addition, vertical distortion of model cells by this approach may improve the representation of shallow flow processes.
NASA Astrophysics Data System (ADS)
Krygier, Michael; Crowley, Christopher J.; Schatz, Michael F.; Grigoriev, Roman O.
2017-11-01
As suggested by recent theoretical and experimental studies, fluid turbulence can be described as a walk between neighborhoods of unstable nonchaotic solutions of the Navier-Stokes equation known as exact coherent structures (ECS). Finding ECS in an experimentally-accessible setting is the first step toward rigorous testing of the dynamical role of ECS in 3D turbulence. We found several ECS (both relative periodic orbits and relative equilibria) in a weakly turbulent regime of small-aspect-ratio Taylor-Couette flow with counter-rotating cylinders. This talk will discuss how the geometry of these solutions guides the evolution of turbulent flow in the simulations. This work is supported by the Army Research Office (Contract # W911NF-15-1-0471).
Predicting the helix packing of globular proteins by self-correcting distance geometry.
Mumenthaler, C; Braun, W
1995-05-01
A new self-correcting distance geometry method for predicting the three-dimensional structure of small globular proteins was assessed with a test set of 8 helical proteins. With the knowledge of the amino acid sequence and the helical segments, our completely automated method calculated the correct backbone topology of six proteins. The accuracy of the predicted structures ranged from 2.3 A to 3.1 A for the helical segments compared to the experimentally determined structures. For two proteins, the predicted constraints were not restrictive enough to yield a conclusive prediction. The method can be applied to all small globular proteins, provided the secondary structure is known from NMR analysis or can be predicted with high reliability.
Silvey, Garry M.; Lobach, David F.; Macri, Jennifer M.; Hunt, Megan; Kacmaz, Roje O.; Lee, Paul P.
2006-01-01
Collecting clinical data directly from clinicians is a challenge. Many standard development environments designed to expedite the creation of user interfaces for electronic healthcare applications do not provide acceptable components for satisfying the requirements for collecting and displaying clinical data at the point of care on the tablet computer. Through an iterative design and testing approach using think-aloud sessions in the eye care setting, we were able to identify and resolve several user interface issues. Issues that we discovered and subsequently resolved included checkboxes that were too small to be selectable with a stylus, radio buttons that could not be unselected, and font sizes that were too small to be read at arm’s length. PMID:17238715
Lee, Kichol; Casali, John G
2017-01-01
To design a test battery and conduct a proof-of-concept experiment of a test method that can be used to measure the detection performance afforded by military advanced hearing protection devices (HPDs) and tactical communication and protective systems (TCAPS). The detection test was conducted with each of the four loudspeakers located at front, right, rear and left of the participant. Participants wore 2 in-ear-type TCAPS, 1 earmuff-type TCAPS, a passive Combat Arms Earplug in its "open" or pass-through setting and an EB-15LE™ electronic earplug. Devices with electronic gain systems were tested under two gain settings: "unity" and "max". Testing without any device (open ear) was conducted as a control. Ten participants with audiometric requirements of 25 dBHL or better at 500, 1000, 2000, 4000, 8000 Hz in both ears. Detection task performance varied with different signals and speaker locations. The test identified performance differences among certain TCAPS and protectors, and the open ear. A computer-controlled detection subtest of the Detection-Recognition/Identification-Localisation-Communication (DRILCOM) test battery was designed and implemented. Tested in a proof-of-concept experiment, it showed statistically-significant sensitivity to device differences in detection effects with the small sample of participants (10). This result has important implications for selection and deployment of TCAPS and HPDs on soldiers and workers in dynamic situations.
Rose, Bonnie E; Hill, Walter E; Umholtz, Robert; Ransom, Gerri M; James, William O
2002-06-01
The Food Safety and Inspection Service (FSIS) issued Pathogen Reduction; Hazard Analysis and Critical Control Point (HACCP) Systems; Final Rule (the PR/HACCP rule) on 25 July 1996. To verify that industry PR/HACCP systems are effective in controlling the contamination of raw meat and poultry products with human disease-causing bacteria, this rule sets product-specific Salmonella performance standards that must be met by slaughter establishments and establishments producing raw ground products. These performance standards are based on the prevalence of Salmonella as determined from the FSIS's nationwide microbial baseline studies and are expressed in terms of the maximum number of Salmonella-positive samples that are allowed in a given sample set. From 26 January 1998 through 31 December 2000, federal inspectors collected 98,204 samples and 1,502 completed sample sets for Salmonella analysis from large, small, and very small establishments that produced at least one of seven raw meat and poultry products: broilers, market hogs, cows and bulls, steers and heifers, ground beef, ground chicken, and ground turkey. Salmonella prevalence in most of the product categories was lower after the implementation of PR/HACCP than in pre-PR/HACCP baseline studies and surveys conducted by the FSIS. The results of 3 years of testing at establishments of all sizes combined show that >80% of the sample sets met the following Salmonella prevalence performance standards: 20.0% for broilers, 8.7% for market hogs, 2.7% for cows and bulls, 1.0% for steers and heifers, 7.5% for ground beef, 44.6% for ground chicken, and 49.9% for ground turkey. The decreased Salmonella prevalences may partly reflect industry improvements, such as improved process control, incorporation of antimicrobial interventions, and increased microbial-process control monitoring, in conjunction with PR/HACCP implementation.
Summary of small group discussions: Detection of forest degradation drivers
Patricia Manley
2013-01-01
Workshop participants were asked to address sets of questions in small group discussions, which were subsequently brought to the entire group for discussion. The first set of questions was directed at identifying a set of degradation activities that could be a primary focus for developing or refining methods and techniques for monitoring:
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-10
... Acquisition Regulation; Small Business Set Asides for Research and Development Contracts AGENCY: Department of... Acquisition Regulation (FAR) to clarify that contracting officers shall set aside acquisitions for research..., and NASA are proposing to amend the Federal Acquisition Regulation (FAR) to revise paragraph (b)(2) of...
Spontaneous Analog Number Representations in 3-Year-Old Children
ERIC Educational Resources Information Center
Cantlon, Jessica F.; Safford, Kelley E.; Brannon, Elizabeth M.
2010-01-01
When enumerating small sets of elements nonverbally, human infants often show a set-size limitation whereby they are unable to represent sets larger than three elements. This finding has been interpreted as evidence that infants spontaneously represent small numbers with an object-file system instead of an analog magnitude system (Feigenson,…
Summary of small group discussions: Monitoring objectives and thresholds
Patricia Manley
2013-01-01
Workshop participants were asked to address sets of questions in small group discussions, which were subsequently brought to the entire group for discussion. The second set of questions was directed at identifying monitoring objectives and thresholds for the degradation activities that could be a primary focus for developing or refining methods and techniques for monitoring:
48 CFR 6.206 - Set-asides for service-disabled veteran-owned small business concerns.
Code of Federal Regulations, 2010 CFR
2010-10-01
... FEDERAL ACQUISITION REGULATION ACQUISITION PLANNING COMPETITION REQUIREMENTS Full and Open Competition After Exclusion of Sources 6.206 Set-asides for service-disabled veteran-owned small business concerns...
Safeguard: Progress and Test Results for a Reliable Independent On-Board Safety Net for UAS
NASA Technical Reports Server (NTRS)
Young, Steven D.; Dill, Evan T.; Hayhurst, Kelly J.; Gilabert, Russell V.
2017-01-01
As demands increase to use unmanned aircraft systems (UAS) for a broad spectrum of commercial applications, regulatory authorities are examining how to safely integrate them without compromising safety or disrupting traditional airspace operations. For small UAS, several operational rules have been established; e.g., do not operate beyond visual line-of-sight, do not fly within five miles of a commercial airport, do not fly above 400 feet above ground level. Enforcing these rules is challenging for UAS, as evidenced by the number of incident reports received by the Federal Aviation Administration (FAA). This paper reviews the development of an onboard system - Safeguard - designed to monitor and enforce conformance to a set of operational rules defined prior to flight (e.g., geospatial stay-out or stay-in regions, speed limits, and altitude constraints). Unlike typical geofencing or geo-limitation functions, Safeguard operates independently of the off-the-shelf UAS autopilot and is designed in a way that can be realized by a small set of verifiable functions to simplify compliance with existing standards for safety-critical systems (e.g. for spacecraft and manned commercial transportation aircraft systems). A framework is described that decouples the system from any other devices on the UAS as well as introduces complementary positioning source(s) for applications that require integrity and availability beyond what can be provided by the Global Positioning System (GPS). This paper summarizes the progress and test results for Safeguard research and development since presentation of the design concept at the 35th Digital Avionics Systems Conference (DASC '16). Significant accomplishments include completion of software verification and validation in accordance with NASA standards for spacecraft systems (to Class B), development of improved hardware prototypes, development of a simulation platform that allows for hardware-in-the-loop testing and fast-time Monte Carlo evaluations, and flight testing on multiple air vehicles. Integration testing with NASA's UAS Traffic Management (UTM) service-oriented architecture was also demonstrated.
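To illustrate the kind of small, independently verifiable conformance functions described above, the sketch below checks a vehicle state against a stay-in polygon, an altitude ceiling, and a speed limit. The polygon, limits, and state values are hypothetical; this is not the Safeguard implementation.

```python
# Illustrative conformance checks: stay-in region, altitude ceiling, speed limit.
def point_in_polygon(x, y, polygon):
    """Ray-casting point-in-polygon test; polygon is a list of (x, y) vertices."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

def conforms(state, stay_in, max_alt_m, max_speed_mps):
    checks = {
        "inside_stay_in": point_in_polygon(state["x"], state["y"], stay_in),
        "below_altitude_limit": state["alt"] <= max_alt_m,
        "below_speed_limit": state["speed"] <= max_speed_mps,
    }
    return all(checks.values()), checks

if __name__ == "__main__":
    stay_in = [(0, 0), (100, 0), (100, 100), (0, 100)]           # metres, hypothetical
    state = {"x": 40.0, "y": 120.0, "alt": 90.0, "speed": 12.0}  # outside the polygon
    ok, detail = conforms(state, stay_in, max_alt_m=121.9, max_speed_mps=20.0)
    print(ok, detail)
```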
Characterization of Turbulent Open Channel Flow in a Full-Scale Spiral Corrugated Culvert
DOE Office of Scientific and Technical Information (OSTI.GOV)
Guensch, Greg R.; Richmond, Marshall C.; Tritico, Hans
2004-02-26
A micro-Acoustic Doppler Velocimeter (ADV) was used to characterize the three-dimensional velocity and turbulence characteristics in a full-scale culvert with spiral corrugations. The culvert was set up in a test bed constructed to examine juvenile salmon passage success in various culvert types. The test culvert was 12.2 m long and 1.83 m in diameter, and set at a 1.14% slope. The corrugations were 2.54 cm deep by 7.62 cm peak to peak with a 5° right-handed pitch. Tailwater elevation was adjustable with a stop-log system and was set slightly above the water surface level at the culvert exit. Cross-sectional grids of ADV measurements were taken at discharges of 28, 57, 113, 227, and 453 lps at 9 locations within the culvert barrel and just inside the headwater and tailwater tanks. Results indicated that the spiral corrugations induced asymmetries in the velocity and turbulence distributions. These asymmetries caused a Reduced Velocity Zone (RVZ) on the right side of the culvert as seen looking upstream, which could aid small fish during upstream passage. Velocity and turbulence levels in the RVZ were found to be less than in mid channel or on the left side of the culvert, and the difference became greater at increased flow rates. Lateral and vertical velocity components were very small relative to the axial component, while lateral and vertical turbulence intensities were comparable to the axial component. Inlet loss coefficients were calculated as well and ranged from 0.32 to 0.42. Relationships between the average velocity and the velocity and turbulence intensity in the RVZ were developed, which may be useful for evaluating whether the barrel of a culvert is passable for juvenile fish.
Geometric Limitations Of Ultrasonic Measurements
NASA Astrophysics Data System (ADS)
von Nicolai, C.; Schilling, F.
2006-12-01
Laboratory experiments are a key for interpreting seismic field observations. Owing to their applicability in many experimental set-ups, ultrasonic measurements are commonly used in the geosciences to determine the elastic properties of minerals and rocks. The quality, and thus the usefulness, of ultrasonic data, however, strongly depends on the sample geometry and the wavelength of the sound wave. Two factors, the diameter-to-wavelength ratio and the diameter-to-length ratio, are believed to be the essential parameters affecting ultrasonic signal quality. In this study, we determined the restricting dimensional parameters under well-defined conditions to test the validity of published assumptions. Using commercial ultrasonic transducers, a number of experiments were conducted on aluminium, alumina, and acrylic glass rods of varying diameter (30-10 mm) and constant length. At each diameter, compressional wave travel times were measured by the pulse-transmission method, and ultrasonic wave velocities were calculated from the observed travel times. One additional experiment was performed with a series of square-shaped aluminium blocks in order to investigate the effect of the geometry of the sample's cross-sectional area. The experimental results show that the simple diameter-to-wavelength ratios are not valid even under idealized experimental conditions, and a more complex relation has to be taken into account. As the diameter decreases, the direct P-wave phase is increasingly interfered with and weakened by sidewall reflections. At very small diameters, compressional waves are replaced by bar waves and P-wave signals become unresolvable. Considering the suppression of both effects, a critical D/λ ratio was determined and compared to experimental set-ups from various publications. These tests indicate that some published and cited data derived from small-diameter set-ups are outside the range of physical possibility.
The impact of worksite wellness in a small business setting.
Merrill, Ray M; Aldana, Steven G; Vyhlidal, Tonya P; Howe, Greg; Anderson, David R; Whitmer, R William
2011-02-01
This study evaluates the level of participation and effectiveness of a worksite wellness program in a small business setting. Three years of wellness participation and risk data from Lincoln Industries was analyzed. All Lincoln Industry employees participated in at least some level of wellness programming. Significant improvements in body fat, blood pressure, and flexibility were observed across time. The largest improvements in risk were seen among older employees and those with the highest baseline values. This small business was able to improve the health of the entire workforce population by integrating wellness deeply into their culture and operations. Replication of this program in other small business settings could have a large impact on public health since 60 million adults in the United States work in small businesses.
Brecko, Jonathan; Mathys, Aurore; Dekoninck, Wouter; Leponce, Maurice; VandenSpiegel, Didier; Semal, Patrick
2014-01-01
Abstract In this manuscript we present a focus stacking system, composed of commercial photographic equipment. The system is inexpensive compared to high-end commercial focus stacking solutions. We tested this system and compared the results with several different software packages (CombineZP, Auto-Montage, Helicon Focus and Zerene Stacker). We tested our final stacked picture with a picture obtained from two high-end focus stacking solutions: a Leica MZ16A with DFC500 and a Leica Z6APO with DFC290. Zerene Stacker and Helicon Focus both provided satisfactory results. However, Zerene Stacker gives the user more possibilities in terms of control of the software, batch processing and retouching. The outcome of the test on high-end solutions demonstrates that our approach performs better in several ways. The resolution of the tested extended focus pictures is much higher than those from the Leica systems. The flash lighting inside the Ikea closet creates an evenly illuminated picture, without struggling with filters, diffusers, etc. The largest benefit is the price of the set-up which is approximately € 3,000, which is 8 and 10 times less than the LeicaZ6APO and LeicaMZ16A set-up respectively. Overall, this enables institutions to purchase multiple solutions or to start digitising the type collection on a large scale even with a small budget. PMID:25589866
Collaborative essay testing: group work that counts.
Gallagher, Peggy A
2009-01-01
Because much of a nurse's work is accomplished through working in groups, nursing students need an understanding of group process as well as opportunities to problem-solve in groups. Despite an emphasis on group activities as critical for classroom learning, there is a lack of evidence in the nursing literature that describes collaborative essay testing as a teaching strategy. In this class, nursing students worked together in small groups to answer examination questions before submitting a common set of answers. In a follow-up survey, students reported that collaborative testing was a positive experience (e.g., promoting critical thinking, confidence in knowledge, and teamwork). Faculty were excited by the lively dialog heard during the testing in what appeared to be an atmosphere of teamwork. Future efforts could include providing nursing students with direct instruction on group process and more opportunities to work and test collaboratively.
Significance tests for functional data with complex dependence structure.
Staicu, Ana-Maria; Lahiri, Soumen N; Carroll, Raymond J
2015-01-01
We propose an L2-norm based global testing procedure for the null hypothesis that multiple group mean functions are equal, for functional data with complex dependence structure. Specifically, we consider the setting of functional data with a multilevel structure of the form groups-clusters or subjects-units, where the unit-level profiles are spatially correlated within the cluster, and the cluster-level data are independent. Orthogonal series expansions are used to approximate the group mean functions and the test statistic is estimated using the basis coefficients. The asymptotic null distribution of the test statistic is developed, under mild regularity conditions. To our knowledge this is the first work that studies hypothesis testing, when data have such complex multilevel functional and spatial structure. Two small-sample alternatives, including a novel block bootstrap for functional data, are proposed, and their performance is examined in simulation studies. The paper concludes with an illustration of a motivating experiment.
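As a toy illustration of forming an L2-type statistic from basis coefficients, the sketch below projects each curve onto a small cosine basis, measures the between-group spread of the mean coefficient vectors, and calibrates it with a naive label permutation. This ignores the multilevel spatial dependence and is not the authors' statistic, asymptotic theory, or block bootstrap.

```python
# Toy L2-type statistic for comparing group mean functions via basis coefficients.
import numpy as np

def basis(tgrid, K=6):
    return np.column_stack([np.cos(np.pi * k * tgrid) for k in range(K)])

def l2_stat(curves, groups, B):
    # Per-curve basis coefficients by least squares, then between-group spread.
    coefs = np.linalg.lstsq(B, curves.T, rcond=None)[0].T
    overall = coefs.mean(axis=0)
    return sum(
        np.sum(groups == g) * np.sum((coefs[groups == g].mean(axis=0) - overall) ** 2)
        for g in np.unique(groups)
    )

rng = np.random.default_rng(8)
t = np.linspace(0, 1, 101)
B = basis(t)
n_per_group = 30
g0 = np.sin(2 * np.pi * t) + rng.normal(0, 0.3, size=(n_per_group, t.size))
g1 = np.sin(2 * np.pi * t) + 0.3 * t + rng.normal(0, 0.3, size=(n_per_group, t.size))
curves = np.vstack([g0, g1])
groups = np.repeat([0, 1], n_per_group)

observed = l2_stat(curves, groups, B)
perm = [l2_stat(curves, rng.permutation(groups), B) for _ in range(500)]
print("approximate p-value:", np.mean(np.array(perm) >= observed))
```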
2010-01-01
Background MicroRNAs (miRNAs) are a class of small non-coding regulatory RNAs that regulate gene expression by guiding target mRNA cleavage or translational inhibition. MiRNAs can have large-scale regulatory effects on development and stress response in plants. Results To test whether miRNAs play roles in regulating the response to powdery mildew infection and heat stress in wheat, we used Solexa high-throughput sequencing to clone the small RNAs from wheat leaves infected by the preponderant physiological strain Erysiphe graminis f. sp. tritici (Egt) or subjected to heat stress treatment. A total of 153 miRNAs were identified, belonging to 51 known and 81 novel miRNA families. We found that 24 and 12 miRNAs were responsive to powdery mildew infection and heat stress, respectively. We further predicted that 149 target genes were potentially regulated by the novel wheat miRNAs. Conclusions Our results indicated that a diverse set of wheat miRNAs was responsive to powdery mildew infection and heat stress and could function in wheat responses to both biotic and abiotic stresses. PMID:20573268
Neural network approach to proximity effect corrections in electron-beam lithography
NASA Astrophysics Data System (ADS)
Frye, Robert C.; Cummings, Kevin D.; Rietman, Edward A.
1990-05-01
The proximity effect, caused by electron beam backscattering during resist exposure, is an important concern in writing submicron features. It can be compensated by appropriate local changes in the incident beam dose, but computation of the optimal correction usually requires a prohibitively long time. We present an example of such a computation on a small test pattern, which we performed by an iterative method. We then used this solution as a training set for an adaptive neural network. After training, the network computed the same correction as the iterative method, but in a much shorter time. Correcting the image with a software based neural network resulted in a decrease in the computation time by a factor of 30, and a hardware based network enhanced the computation speed by more than a factor of 1000. Both methods had an acceptably small error of 0.5% compared to the results of the iterative computation. Additionally, we verified that the neural network correctly generalized the solution of the problem to include patterns not contained in its training set.
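The 1990 network and its training data are not reproduced here; the following is only a minimal sketch of the general idea, with made-up features (a 3x3 neighbourhood of local pattern density) and a made-up dose law standing in for the iterative solution that would serve as the training target.

```python
# Hypothetical sketch: learn an iteratively computed proximity-effect dose
# correction from local pattern-density features, then reuse the fast model.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Toy "pattern": fraction of exposed area in each cell of a 3x3 neighbourhood.
local_density = rng.uniform(0.0, 1.0, size=(5000, 9))

# Stand-in for the slow iterative solution: a dose factor that compensates
# backscatter (an invented smooth function of neighbourhood density).
target_dose = 1.0 + 0.5 * local_density.mean(axis=1) ** 2

net = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
net.fit(local_density, target_dose)

# Fast correction for a new pattern neighbourhood.
new_patch = rng.uniform(0.0, 1.0, size=(1, 9))
print("predicted dose factor:", net.predict(new_patch)[0])
```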
Optimal Sparse Upstream Sensor Placement for Hydrokinetic Turbines
NASA Astrophysics Data System (ADS)
Cavagnaro, Robert; Strom, Benjamin; Ross, Hannah; Hill, Craig; Polagye, Brian
2016-11-01
Accurate measurement of the flow field incident upon a hydrokinetic turbine is critical for performance evaluation during testing and setting boundary conditions in simulation. Additionally, turbine controllers may leverage real-time flow measurements. Particle image velocimetry (PIV) is capable of rendering a flow field over a wide spatial domain in a controlled, laboratory environment. However, PIV's lack of suitability for natural marine environments, high cost, and intensive post-processing diminish its potential for control applications. Conversely, sensors such as acoustic Doppler velocimeters (ADVs), are designed for field deployment and real-time measurement, but over a small spatial domain. Sparsity-promoting regression analysis such as LASSO is utilized to improve the efficacy of point measurements for real-time applications by determining optimal spatial placement for a small number of ADVs using a training set of PIV velocity fields and turbine data. The study is conducted in a flume (0.8 m2 cross-sectional area, 1 m/s flow) with laboratory-scale axial and cross-flow turbines. Predicted turbine performance utilizing the optimal sparse sensor network and associated regression model is compared to actual performance with corresponding PIV measurements.
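A minimal sketch of the sensor-selection idea follows, with entirely synthetic data standing in for the PIV snapshots and turbine measurements; the grid size, noise levels and use of LassoCV are illustrative assumptions, not the study's actual configuration.

```python
# Illustrative sketch (hypothetical data): use LASSO to pick a few candidate
# point-measurement locations, out of many PIV grid points, that best predict
# turbine performance.
import numpy as np
from sklearn.linear_model import LassoCV

rng = np.random.default_rng(1)
n_snapshots, n_grid_points = 200, 400

# Rows: PIV velocity snapshots; columns: candidate measurement locations.
U = rng.normal(1.0, 0.1, size=(n_snapshots, n_grid_points))
# Turbine power coefficient driven (artificially) by a few upstream locations.
true_locs = [10, 57, 123]
cp = 0.35 + 0.2 * U[:, true_locs].mean(axis=1) + rng.normal(0, 0.01, n_snapshots)

model = LassoCV(cv=5).fit(U, cp)
selected = np.flatnonzero(np.abs(model.coef_) > 1e-4)
print("candidate ADV locations (grid indices):", selected[:10])
```

The nonzero coefficients indicate where a small number of ADVs could be placed so that their real-time readings, fed through the regression model, approximate the information carried by the full PIV field.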
Trail making task performance in inpatients with anorexia nervosa and bulimia nervosa.
Vall, Eva; Wade, Tracey D
2015-07-01
Set-shifting inefficiencies have been consistently identified in adults with anorexia nervosa (AN). It is less clear to what degree similar inefficiencies are present in those with bulimia nervosa (BN). It is also unknown whether perfectionism is related to set-shifting performance. We employed a commonly used set-shifting measure, the Trail Making Test (TMT), to compare the performance of inpatients with AN and BN with a healthy control sample. We also investigated whether perfectionism predicted TMT scores. Only the BN sample showed significantly suboptimal performance, while the AN sample was indistinguishable from controls on all measures. There were no differences between the AN subtypes (restrictive or binge/purge), but group sizes were small. Higher personal standards perfectionism was associated with better TMT scores across groups. Higher concern over mistakes perfectionism predicted better accuracy in the BN sample. Further research into the set-shifting profile of individuals with BN or binge/purge behaviours is needed. Copyright © 2015 John Wiley & Sons, Ltd and Eating Disorders Association.
Evaluation of supercapacitors for space applications under thermal vacuum conditions
NASA Astrophysics Data System (ADS)
Chin, Keith C.; Green, Nelson W.; Brandon, Erik J.
2018-03-01
Commercially available supercapacitor cells from three separate vendors were evaluated for use in a space environment using thermal vacuum (Tvac) testing. Standard commercial cells are not hermetically sealed, but feature crimp or double seam seals between the header and the can, which may not maintain an adequate seal under vacuum. Cells were placed in a small vacuum chamber, and cycled between three separate temperature set points. Charging and discharging of cells was executed following each temperature soak, to confirm there was no significant impact on performance. A final electrical performance check, visual inspection and mass check following testing were also performed, to confirm the integrity of the cells had not been compromised during exposure to thermal cycling under vacuum. All cells tested were found to survive this testing protocol and exhibited no significant impact on electrical performance.
Janet, Jon Paul; Kulik, Heather J
2017-11-22
Machine learning (ML) of quantum mechanical properties shows promise for accelerating chemical discovery. For transition metal chemistry where accurate calculations are computationally costly and available training data sets are small, the molecular representation becomes a critical ingredient in ML model predictive accuracy. We introduce a series of revised autocorrelation functions (RACs) that encode relationships of the heuristic atomic properties (e.g., size, connectivity, and electronegativity) on a molecular graph. We alter the starting point, scope, and nature of the quantities evaluated in standard ACs to make these RACs amenable to inorganic chemistry. On an organic molecule set, we first demonstrate superior standard AC performance to other presently available topological descriptors for ML model training, with mean unsigned errors (MUEs) for atomization energies on set-aside test molecules as low as 6 kcal/mol. For inorganic chemistry, our RACs yield 1 kcal/mol ML MUEs on set-aside test molecules in spin-state splitting in comparison to 15-20× higher errors for feature sets that encode whole-molecule structural information. Systematic feature selection methods including univariate filtering, recursive feature elimination, and direct optimization (e.g., random forest and LASSO) are compared. Random-forest- or LASSO-selected subsets 4-5× smaller than the full RAC set produce sub- to 1 kcal/mol spin-splitting MUEs, with good transferability to metal-ligand bond length prediction (0.004-5 Å MUE) and redox potential on a smaller data set (0.2-0.3 eV MUE). Evaluation of feature selection results across property sets reveals the relative importance of local, electronic descriptors (e.g., electronegativity, atomic number) in spin-splitting and distal, steric effects in redox potential and bond lengths.
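The sketch below illustrates the flavour of the feature-selection comparison on synthetic data; the descriptor matrix, target property and subset sizes are placeholders rather than the actual RAC set or spin-splitting data described above.

```python
# Hypothetical sketch: select a compact descriptor subset with random-forest
# importances and with LASSO, then compare subset sizes.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LassoCV

rng = np.random.default_rng(2)
X = rng.normal(size=(300, 150))          # 150 stand-in RAC-like descriptors
y = X[:, :5] @ np.array([3.0, -2.0, 1.5, 1.0, -0.5]) + rng.normal(0, 0.5, 300)

rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)
top_rf = np.argsort(rf.feature_importances_)[::-1][:30]   # keep ~1/5 of features

lasso = LassoCV(cv=5).fit(X, y)
top_lasso = np.flatnonzero(np.abs(lasso.coef_) > 1e-6)

print("random-forest subset size:", len(top_rf))
print("LASSO subset size:", len(top_lasso))
```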
Students' perceptions of laboratory science careers: changing ideas with an education module.
Haun, Daniel; Leach, Argie; Lawrence, Louann; Jarreau, Patsy
2005-01-01
To assess the effectiveness of a Web-based education module in changing students' perceptions of laboratory science careers. Perception was measured with a short examination and then a Web-based exercise was presented. Following the exercise, the test was administered again. Frequency data from the pre-test and post-test were compared for changes in perception. The correlated pre-test/post-test pairs were also examined for opinion changes and these were analyzed for significance. Large parochial high schools in New Orleans, Louisiana. A small team visited the schools during their appointed class times for biology. Study participants were high school biology students in grades 9-10. Two hundred forty-five students participated (149 male and 96 female). A Web-based exercise on blood film examination was presented to the students in a classroom setting (www.mclno.org/labpartners/index_03.htm). The exercise contained focused messages about: (1) the numbers of healthcare workers acquiring AIDS from on-the-job exposure and (2) common career paths available to the laboratory science workforce. The outcome measures were the shift in perception of: (1) which medical service generates the most diagnostic data; (2) which professional group performs laboratory tests; (3) the risk of acquiring AIDS while working in the healthcare setting; (4) interest in a science-related career; and (5) how much education is required to work in a science-related field. The intervention significantly shifted perception in all areas measured except that of interest in a science-related career. Many students perceive that the risk of acquiring AIDS while working in the healthcare setting is "high". Web-based presentations and similar partnerships with science teachers can change perceptions that might lead to increased interest in clinical laboratory science careers.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pražnikar, Jure; University of Primorska; Turk, Dušan, E-mail: dusan.turk@ijs.si
2014-12-01
The maximum-likelihood free-kick target, which calculates model error estimates from the work set and a randomly displaced model, proved superior in the accuracy and consistency of refinement of crystal structures compared with the maximum-likelihood cross-validation target, which calculates error estimates from the test set and the unperturbed model. The refinement of a molecular model is a computational procedure by which the atomic model is fitted to the diffraction data. The commonly used target in the refinement of macromolecular structures is the maximum-likelihood (ML) function, which relies on the assessment of model errors. The current ML functions rely on cross-validation. They utilize phase-error estimates that are calculated from a small fraction of diffraction data, called the test set, that are not used to fit the model. An approach has been developed that uses the work set to calculate the phase-error estimates in the ML refinement from simulating the model errors via the random displacement of atomic coordinates. It is called ML free-kick refinement as it uses the ML formulation of the target function and is based on the idea of freeing the model from the model bias imposed by the chemical energy restraints used in refinement. This approach for the calculation of error estimates is superior to the cross-validation approach: it reduces the phase error and increases the accuracy of molecular models, is more robust, provides clearer maps and may use a smaller portion of data for the test set for the calculation of Rfree or may leave it out completely.
Luongo, Francisco J.; Zimmerman, Chris A.; Horn, Meryl E.
2016-01-01
Sequential patterns of prefrontal activity are believed to mediate important behaviors, e.g., working memory, but it remains unclear exactly how they are generated. In accordance with previous studies of cortical circuits, we found that prefrontal microcircuits in young adult mice spontaneously generate many more stereotyped sequences of activity than expected by chance. However, the key question of whether these sequences depend on a specific functional organization within the cortical microcircuit, or emerge simply as a by-product of random interactions between neurons, remains unanswered. We observed that correlations between prefrontal neurons do follow a specific functional organization—they have a small-world topology. However, until now it has not been possible to directly link small-world topologies to specific circuit functions, e.g., sequence generation. Therefore, we developed a novel analysis to address this issue. Specifically, we constructed surrogate data sets that have identical levels of network activity at every point in time but nevertheless represent various network topologies. We call this method shuffling activity to rearrange correlations (SHARC). We found that only surrogate data sets based on the actual small-world functional organization of prefrontal microcircuits were able to reproduce the levels of sequences observed in actual data. As expected, small-world data sets contained many more sequences than surrogate data sets with randomly arranged correlations. Surprisingly, small-world data sets also outperformed data sets in which correlations were maximally clustered. Thus the small-world functional organization of cortical microcircuits, which effectively balances the random and maximally clustered regimes, is optimal for producing stereotyped sequential patterns of activity. PMID:26888108
Fetal well-being: nonimaging assessment and the biophysical profile.
Jackson, G M; Forouzan, I; Cohen, A W
1991-01-01
All of the testing methods described above are very good at predicting continued fetal health when test results are reassuring. Each test also suffers from a very poor ability to predict compromise when results are abnormal. Thus, the primary value of antepartum fetal monitoring is in identifying those pregnancies that do not require immediate intervention and may be allowed to continue. Certainly, all pregnant women (regardless of risk status) should monitor fetal movement as part of their fetal surveillance. For patients at risk, a variety of testing schemes are available using combinations of the NST, CST and BPP. There are several reasons for using the NST as the primary testing method for those at risk. Even a small antenatal testing area can accommodate three or four FHR monitors, and a single antenatal testing nurse can perform several NSTs at a time. Because the BPP requires an ultrasound machine and a trained technician to perform, and because only one BPP can be done at a time, many obstetricians who do their own in-office fetal testing are unable to adopt BPP testing as their primary means of surveillance. Additionally, it is more economical to use the NST than the BPP for first-line testing. Assuming charges of $150 and $300 for the NST and BPP, respectively, and assuming that 20% of NSTs are nonreactive and require a BPP for second-line testing, the weekly cost of testing 100 patients is $21,000 for the NST and $37,500 for the BPP. This increase-in-testing cost must be balanced against the small improvement in perinatal mortality rates achieved with the use of the BPP. Because it must be performed in a hospital setting and takes an average of 90 minutes to complete, the CST is more expensive and time-consuming than either the NST or BPP and it is less frequently used as the primary method of fetal testing. In the past the CST was the most commonly used secondary test after a nonreactive NST, but use of the BPP in this situation has now become commonplace. Although the CST still has an important role in fetal testing, the BPP is better suited for use in this setting because of its technical ease and low incidence of abnormal results. Thus, many centers use the NST as the primary mode of testing for the fetus at risk, often with a sonographic assessment of AFV.(ABSTRACT TRUNCATED AT 400 WORDS)
Meghezi, Sébastien; Couet, Frédéric; Chevallier, Pascale; Mantovani, Diego
2012-01-01
Vascular tissue engineering focuses on the replacement of diseased small-diameter blood vessels with a diameter less than 6 mm, for which adequate substitutes still do not exist. One approach to vascular tissue engineering is to culture vascular cells on a scaffold in a bioreactor. The bioreactor establishes pseudophysiological conditions for culture (culture medium, 37°C, mechanical stimulation). Collagen gels are widely used as scaffolds for tissue regeneration due to their biological properties; however, they exhibit low mechanical properties. Mechanical characterization of these scaffolds requires establishing testing conditions that reflect the conditions set in the bioreactor. The effects of different parameters used during mechanical testing on the collagen gels were evaluated in terms of mechanical and viscoelastic properties. Thus, a factorial experiment was adopted, and three relevant factors were considered: temperature (23°C or 37°C), hydration (aqueous saline solution or air), and mechanical preconditioning (with or without). Statistical analyses showed significant effects of these factors on the mechanical properties, which were assessed by tensile tests as well as stress relaxation tests; the latter provide a more consistent understanding of the gels' viscoelastic properties. Therefore, performing mechanical analyses on hydrogels requires setting an adequate environment in terms of temperature and aqueous saline solution, as well as choosing the adequate test. PMID:22844285
Zheng, Ling-Ling; Xu, Wei-Lin; Liu, Shun; Sun, Wen-Ju; Li, Jun-Hao; Wu, Jie; Yang, Jian-Hua; Qu, Liang-Hu
2016-07-08
tRNA-derived small RNA fragments (tRFs) are one class of small non-coding RNAs derived from transfer RNAs (tRNAs). tRFs play important roles in cellular processes and are involved in multiple cancers. High-throughput small RNA (sRNA) sequencing experiments can detect all the cellular expressed sRNAs, including tRFs. However, distinguishing genuine tRFs from RNA fragments generated by random degradation remains a major challenge. In this study, we developed an integrated web-based computing system, tRF2Cancer, to accurately identify tRFs from sRNA deep-sequencing data and evaluate their expression in multiple cancers. The binomial test was introduced to evaluate whether reads from a small RNA-seq data set represent tRFs or degraded fragments. A classification method was then used to annotate the types of tRFs based on their sites of origin in pre-tRNA or mature tRNA. We applied the pipeline to analyze 10 991 data sets from 32 types of cancers and identified thousands of expressed tRFs. A tool called 'tRFinCancer' was developed to facilitate the users to inspect the expression of tRFs across different types of cancers. Another tool called 'tRFBrowser' shows both the sites of origin and the distribution of chemical modification sites in tRFs on their source tRNA. The tRF2Cancer web server is available at http://rna.sysu.edu.cn/tRFfinder/. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.
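The exact statistical model used by tRF2Cancer is not detailed here; the following minimal example only illustrates how a binomial test can flag enrichment of reads in a short tRNA window relative to a uniform-degradation null, with made-up read counts.

```python
# Minimal illustration (assumed model): test whether reads mapping to a short
# window of a tRNA are enriched relative to uniform random degradation.
from scipy.stats import binomtest

trna_length = 75          # nt
window_length = 20        # putative tRF region, e.g. a 5' fragment
total_reads = 500         # reads mapped anywhere on this tRNA
reads_in_window = 310     # reads falling inside the window

# Under uniform degradation, a read lands in the window with p = 20/75.
p_uniform = window_length / trna_length
result = binomtest(reads_in_window, n=total_reads, p=p_uniform, alternative="greater")
print(f"enrichment p-value: {result.pvalue:.3g}")
```

A small p-value here would favour a genuine, precisely processed tRF over random degradation at that locus.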
How To Set Up Your Own Small Business. Study Guide.
ERIC Educational Resources Information Center
American Inst. of Small Business, Minneapolis, MN.
This study guide is intended for use with the separately available entrepreneurship education text "How To Set Up Your Own Business." The guide includes student exercises that have been designed to accompany chapters dealing with the following topics: determining whether or not to set up a small business, doing market research, forecasting sales,…
ERIC Educational Resources Information Center
Peterson, Ken
An instructor's manual and student activity guide on small appliance repair are provided in this set of prevocational education materials which focuses on the vocational area of trade and industry. (This set of materials is one of ninety-two prevocational education sets arranged around a cluster of seven vocational offerings: agriculture, home…
Effect of missing data on multitask prediction methods.
de la Vega de León, Antonio; Chen, Beining; Gillet, Valerie J
2018-05-22
There has been a growing interest in multitask prediction in chemoinformatics, helped by the increasing use of deep neural networks in this field. This technique is applied to multitarget data sets, where compounds have been tested against different targets, with the aim of developing models to predict a profile of biological activities for a given compound. However, multitarget data sets tend to be sparse; i.e., not all compound-target combinations have experimental values. There has been little research on the effect of missing data on the performance of multitask methods. We have used two complete data sets to simulate sparseness by removing data from the training set. Different models to remove the data were compared. These sparse sets were used to train two different multitask methods, deep neural networks and Macau, which is a Bayesian probabilistic matrix factorization technique. Results from both methods were remarkably similar and showed that the performance decrease caused by missing data is initially small but accelerates once large amounts of data have been removed. This work provides a first approximation to assess how much data is required to produce good performance in multitask prediction exercises.
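A sketch of the sparsity-simulation step is given below with synthetic activity values; the matrix dimensions and the simple uniform-random removal model are assumptions for illustration (the study compared several removal models), and the multitask model training itself is omitted.

```python
# Sketch of the sparsity simulation step (hypothetical data): start from a
# complete compound x target matrix and blank out a chosen fraction of entries.
import numpy as np

rng = np.random.default_rng(3)
n_compounds, n_targets = 1000, 12
activities = rng.normal(6.0, 1.0, size=(n_compounds, n_targets))  # e.g. pIC50s

def make_sparse(full, fraction_missing, rng):
    """Return a copy with `fraction_missing` of entries set to NaN at random."""
    sparse = full.copy()
    mask = rng.random(full.shape) < fraction_missing
    sparse[mask] = np.nan
    return sparse

for frac in (0.1, 0.5, 0.9):
    sparse = make_sparse(activities, frac, rng)
    print(f"{frac:.0%} removed -> {np.isnan(sparse).mean():.2%} missing")
```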
Williams, Thomas N
2015-09-23
Each year, at least 280,000 children are born with sickle cell disease (SCD) in resource-limited settings. For cost, logistic and political reasons, the availability of SCD testing is limited in such settings and consequently 50-90 % of affected children die undiagnosed before their fifth birthday. The recent development of a point of care method for the diagnosis of SCD - the Sickle SCAN™ device - could afford such children the prompt access to appropriate services that has transformed the outlook for affected children in resource-rich areas. In research published in BMC Medicine, Kanter and colleagues describe a small but carefully conducted study involving 208 children and adults, in which they found that by using Sickle SCAN™ it was possible to diagnose the common forms of SCD with 99 % sensitivity and 99 % specificity, in under 5 minutes. If repeatable both in newborn babies and under real-life conditions, and if marketed at an affordable price, Sickle SCAN™ could revolutionize the survival prospects for children born with SCD in resource-limited areas. Please see related article: http://dx.doi.org/10.1186/s12916-015-0473-6.
NASA Astrophysics Data System (ADS)
Fisher, W. P., Jr.; Elbaum, B.; Coulter, A.
2010-07-01
Reliability coefficients indicate the proportion of total variance attributable to differences among measures separated along a quantitative continuum by a testing, survey, or assessment instrument. Reliability is usually considered to be influenced by both the internal consistency of a data set and the number of items, though textbooks and research papers rarely evaluate the extent to which these factors independently affect the data in question. Probabilistic formulations of the requirements for unidimensional measurement separate consistency from error by modelling individual response processes instead of group-level variation. The utility of this separation is illustrated via analyses of small sets of simulated data, and of subsets of data from a 78-item survey of over 2,500 parents of children with disabilities. Measurement reliability ultimately concerns the structural invariance specified in models requiring sufficient statistics, parameter separation, unidimensionality, and other qualities that historically have made quantification simple, practical, and convenient for end users. The paper concludes with suggestions for a research program aimed at focusing measurement research more on the calibration and wide dissemination of tools applicable to individuals, and less on the statistical study of inter-variable relations in large data sets.
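As a side illustration of the textbook point that reliability reflects both item consistency and test length (not the probabilistic modelling approach advocated above), the Spearman-Brown relation makes the length dependence explicit; the base reliability value below is an arbitrary example.

```python
# Illustration only: Spearman-Brown relation showing how reliability changes
# with test length for a fixed level of item consistency.
def spearman_brown(reliability_one_form, length_factor):
    k, r = length_factor, reliability_one_form
    return k * r / (1.0 + (k - 1.0) * r)

base = 0.70          # reliability of the original instrument (example value)
for k in (0.5, 1, 2, 4):
    print(f"length x{k}: reliability ~ {spearman_brown(base, k):.2f}")
```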
Artificial neural network and classical least-squares methods for neurotransmitter mixture analysis.
Schulze, H G; Greek, L S; Gorzalka, B B; Bree, A V; Blades, M W; Turner, R F
1995-02-01
Identification of individual components in biological mixtures can be a difficult problem regardless of the analytical method employed. In this work, Raman spectroscopy was chosen as a prototype analytical method due to its inherent versatility and applicability to aqueous media, making it useful for the study of biological samples. Artificial neural networks (ANNs) and the classical least-squares (CLS) method were used to identify and quantify the Raman spectra of the small-molecule neurotransmitters and mixtures of such molecules. The transfer functions used by a network, as well as the architecture of a network, played an important role in the ability of the network to identify the Raman spectra of individual neurotransmitters and the Raman spectra of neurotransmitter mixtures. Specifically, networks using sigmoid and hyperbolic tangent transfer functions generalized better from the mixtures in the training data set to those in the testing data sets than networks using sine functions. Networks with connections that permit the local processing of inputs generally performed better than other networks on all the testing data sets, and better than the CLS method of curve fitting, on novel spectra of some neurotransmitters. The CLS method was found to perform well on noisy, shifted, and difference spectra.
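For the CLS baseline, the idea is to model a measured mixture spectrum as a linear combination of pure-component reference spectra and solve for the mixing coefficients by least squares. The sketch below uses synthetic Gaussian-peak "spectra" in place of real Raman data.

```python
# Sketch of classical least-squares (CLS) mixture analysis on synthetic spectra.
import numpy as np

rng = np.random.default_rng(4)
wavenumbers = np.linspace(400, 1800, 700)

def peak(center, width):
    return np.exp(-0.5 * ((wavenumbers - center) / width) ** 2)

# Pure-component "Raman" spectra (stand-ins for individual neurotransmitters).
components = np.stack([peak(600, 15) + 0.4 * peak(1200, 20),
                       peak(950, 18) + 0.6 * peak(1500, 25),
                       peak(700, 12) + 0.5 * peak(1350, 22)], axis=1)

true_conc = np.array([0.2, 0.5, 0.3])
mixture = components @ true_conc + rng.normal(0, 0.01, wavenumbers.size)

# CLS: least-squares fit of the mixture onto the pure-component spectra.
est_conc, *_ = np.linalg.lstsq(components, mixture, rcond=None)
print("estimated concentrations:", np.round(est_conc, 3))
```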
Protecting Privacy Using k-Anonymity
El Emam, Khaled; Dankar, Fida Kamal
2008-01-01
Objective There is increasing pressure to share health information and even make it publicly available. However, such disclosures of personal health information raise serious privacy concerns. To alleviate such concerns, it is possible to anonymize the data before disclosure. One popular anonymization approach is k-anonymity. There have been no evaluations of the actual re-identification probability of k-anonymized data sets. Design Through a simulation, we evaluated the re-identification risk of k-anonymization and three different improvements on three large data sets. Measurement Re-identification probability is measured under two different re-identification scenarios. Information loss is measured by the commonly used discernability metric. Results For one of the re-identification scenarios, k-Anonymity consistently over-anonymizes data sets, with this over-anonymization being most pronounced with small sampling fractions. Over-anonymization results in excessive distortions to the data (i.e., high information loss), making the data less useful for subsequent analysis. We found that a hypothesis testing approach provided the best control over re-identification risk and reduces the extent of information loss compared to baseline k-anonymity. Conclusion Guidelines are provided on when to use the hypothesis testing approach instead of baseline k-anonymity. PMID:18579830
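As a minimal, hypothetical illustration of what the k in k-anonymity requires (every combination of quasi-identifier values must be shared by at least k records), consider the toy table below; the column names and values are invented.

```python
# Minimal check of k-anonymity on a toy table: every combination of the
# quasi-identifiers must occur at least k times.
import pandas as pd

records = pd.DataFrame({
    "age_band":  ["30-39", "30-39", "30-39", "40-49", "40-49", "40-49"],
    "post_code": ["K1A",   "K1A",   "K1A",   "K2B",   "K2B",   "K2B"],
    "diagnosis": ["A",     "B",     "A",     "C",     "A",     "B"],
})

def satisfies_k_anonymity(df, quasi_identifiers, k):
    group_sizes = df.groupby(quasi_identifiers).size()
    return bool((group_sizes >= k).all())

print(satisfies_k_anonymity(records, ["age_band", "post_code"], k=3))  # True
print(satisfies_k_anonymity(records, ["age_band", "post_code"], k=4))  # False
```

Achieving a larger k typically requires coarser generalization of the quasi-identifiers, which is exactly the information-loss trade-off the study quantifies.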
ARCADE small-scale docking mechanism for micro-satellites
NASA Astrophysics Data System (ADS)
Boesso, A.; Francesconi, A.
2013-05-01
The development of on-orbit autonomous rendezvous and docking (ARD) capabilities represents a key point for a number of appealing mission scenarios that include activities of on-orbit servicing, automated assembly of modular structures and active debris removal. As of today, especially in the field of micro-satellite ARD, many fundamental technologies are still missing or require further development and micro-gravity testing. In this framework, the University of Padova, Centre of Studies and Activities for Space (CISAS), developed the Autonomous Rendezvous Control and Docking Experiment (ARCADE), a technology demonstrator intended to fly aboard a BEXUS stratospheric balloon. The goal was to design, build and test, in critical environment conditions, a proximity relative navigation system, a custom-made reaction wheel and a small-size docking mechanism. The ARCADE docking mechanism was designed against a comprehensive set of requirements and can be classified as small-scale, central, gender-mating and unpressurized. The extensive use of commercial components makes it low-cost and simple to manufacture. Last, it features good tolerance to off-nominal docking conditions and a by-design soft-docking capability. The final design was extensively verified to be compliant with its requirements by means of numerical simulations and physical testing. In detail, the dynamic behaviour of the mechanism in both nominal and off-nominal conditions was assessed with the multibody dynamics analysis software MD ADAMS 2010, and functional tests were carried out within the fully integrated ARCADE experiment to ensure the docking system's efficacy and to highlight possible issues. The most relevant results of the study will be presented and discussed in the conclusion of this paper.
An efficient transport solver for tokamak plasmas
Park, Jin Myung; Murakami, Masanori; St. John, H. E.; ...
2017-01-03
A simple approach to efficiently solve a coupled set of 1-D diffusion-type transport equations with a stiff transport model for tokamak plasmas is presented based on the 4th order accurate Interpolated Differential Operator scheme along with a nonlinear iteration method derived from a root-finding algorithm. Here, numerical tests using the Trapped Gyro-Landau-Fluid model show that the presented high order method provides an accurate transport solution using a small number of grid points with robust nonlinear convergence.
Measurement of αΩ in Ω- → ΛK- Decays
NASA Astrophysics Data System (ADS)
Lu, Lan-Chun; Chan, A.; Chen, Y. C.; Ho, C.; Teng, P. K.; Choong, W. S.; Fu, Y.; Gidal, G.; Gu, P.; Jones, T.; Luk, K. B.; Turko, B.; Zyla, P.; James, C.; Volk, J.; Felix, J.; Burnstein, R. A.; Chakravorty, A.; Kaplan, D. M.; Lederman, L. M.; Luebke, W.; Rajaram, D.; Rubin, H. A.; Solomey, N.; Torun, Y.; White, C. G.; White, S. L.; Leros, N.; Perroud, J.-P.; Gustafson, H. R.; Longo, M. J.; Lopez, F.; Park, H. K.; Jenkins, M.; Clark, K.; Dukes, E. C.; Durandet, C.; Holmstrom, T.; Huang, M.; Lu, L. C.; Hypercp Collaboration
2003-07-01
The HyperCP experiment (E871) at Fermilab has collected the largest sample of hyperon decays in the world. With a data set of over a million Ω- → ΛK- decays we have measured the product αΩαΛ, from which we have extracted αΩ. This preliminary result indicates that αΩ is small, but non-zero. Prospects for a test of CP symmetry by comparing the α parameters in Ω- and Ω¯+ decays will be discussed.
24th International Symposium on Ballistics
2008-09-26
Production sample dimensions were 0.3 x 0.05 m. Test set-up: gas gun; 5.5 mm diameter steel spheres and sabot; velocity measuring systems; high speed rate... Oilwell perforators – small-caliber shaped charges – create the pathway for oil or gas to flow from the reservoir rock into the wellbore. Deep, clean... Background geomechanics considerations: in-situ stresses ("total..., overburden, tectonic), pore fluid pressure, pore fluid type (liquid vs. gas).
Acoustic detection of air shower cores
NASA Technical Reports Server (NTRS)
Gao, X.; Liu, Y.; Du, S.
1985-01-01
At an altitude of 1890m, a pre-test with an Air shower (AS) core selector and a small acoustic array set up in an anechoic pool with a volume of 20x7x7 cu m was performed, beginning in Aug. 1984. In analyzing the waveforms recorded during the effective working time of 186 hrs, three acoustic signals which cannot be explained as from any source other than AS cores were obtained, and an estimation of related parameters was made.
Development of a Low-Emission Spray Combustor for Emulsified Crude Oil
2017-03-03
has been studied for diesel [11,12] and gas turbine [13,14] engines for some time, but none of these fuels contain the wide range of hydrocarbons... measurements. We conducted the tests on the flight deck and discovered that the constantly shifting wind moved the exhaust plume away from the plume... when the wind speed was very low. This small period of about one hour, and the required set-up time, severely limited our ability to conduct
2015-10-01
first tested on sera from healthy patients (n=20), systemic lupus erythematosus patients (n=29) and systemic sclerosis patients (n=20). Patients... with lupus and systemic sclerosis both had an increase in circulating soluble cadherin-11 levels that was statistically significant over levels seen... level. These data suggest that cadherin-11 levels are increased in patients with systemic sclerosis and lupus. These data use a small set of samples
Genetic Evolution of Shape-Altering Programs for Supersonic Aerodynamics
NASA Technical Reports Server (NTRS)
Kennelly, Robert A., Jr.; Bencze, Daniel P. (Technical Monitor)
2002-01-01
Two constrained shape optimization problems relevant to aerodynamics are solved by genetic programming, in which a population of computer programs evolves automatically under pressure of fitness-driven reproduction and genetic crossover. Known optimal solutions are recovered using a small, naive set of elementary operations. Effectiveness is improved through use of automatically defined functions, especially when one of them is capable of a variable number of iterations, even though the test problems lack obvious exploitable regularities. An attempt at evolving new elementary operations was only partially successful.
2003-04-15
KENNEDY SPACE CENTER, FLA. -- In the Payload Hazardous Servicing Facility, the lander petals of the Mars Exploration Rover 2 (MER-2) have been reopened to allow technicians access to one of the spacecraft's circuit boards. A concern arose during prelaunch testing regarding how the spacecraft interprets signals sent from its main computer to peripherals in the cruise stage, lander and small deep space transponder. The MER Mission consists of two identical rovers set to launch in June 2003. The problem will be fixed on both rovers.
2003-04-15
KENNEDY SPACE CENTER, FLA. -- In the Payload Hazardous Servicing Facility, technicians reopen the lander petals of the Mars Exploration Rover 2 (MER-2) to allow access to one of the spacecraft's circuit boards. A concern arose during prelaunch testing regarding how the spacecraft interprets signals sent from its main computer to peripherals in the cruise stage, lander and small deep space transponder. The MER Mission consists of two identical rovers set to launch in June 2003. The problem will be fixed on both rovers.
An analysis of a digital variant of the Trail Making Test using machine learning techniques.
Dahmen, Jessamyn; Cook, Diane; Fellows, Robert; Schmitter-Edgecombe, Maureen
2017-01-01
The goal of this work is to develop a digital version of a standard cognitive assessment, the Trail Making Test (TMT), and assess its utility. This paper introduces a novel digital version of the TMT together with a machine learning based approach to assess its capabilities. Using digital Trail Making Test (dTMT) data collected from older adult participants (N = 54) as feature sets, we use machine learning techniques to analyze the utility of the dTMT and evaluate the insights provided by the digital features. Predicted TMT scores correlate well with clinical digital test scores (r = 0.98) and paper time-to-completion scores (r = 0.65). Predicted TICS scores exhibited a small correlation with clinically derived TICS scores (r = 0.12 Part A, r = 0.10 Part B). Predicted FAB scores exhibited a small correlation with clinically derived FAB scores (r = 0.13 Part A, r = 0.29 Part B). Digitally derived features were also used to predict diagnosis (AUC of 0.65). Our findings indicate that the dTMT is capable of measuring the same aspects of cognition as the paper-based TMT. Furthermore, the dTMT's additional data may help monitor other cognitive processes not captured by the paper-based TMT alone.
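A hypothetical sketch of this style of evaluation follows: predict a paper-based score from digital features by cross-validation and report the Pearson correlation. The feature values, the simulated paper scores and the choice of a random-forest regressor are placeholders, not the study's actual pipeline.

```python
# Hypothetical sketch: predict a paper-based score from digital test features
# and report the cross-validated correlation with the clinical score.
import numpy as np
from scipy.stats import pearsonr
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(5)
n_participants, n_features = 54, 12
digital_features = rng.normal(size=(n_participants, n_features))
paper_score = digital_features[:, 0] * 20 + 60 + rng.normal(0, 5, n_participants)

predicted = cross_val_predict(RandomForestRegressor(random_state=0),
                              digital_features, paper_score, cv=5)
r, _ = pearsonr(predicted, paper_score)
print(f"cross-validated correlation with paper score: r = {r:.2f}")
```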
On fractography of shallow and deep HY-100 cracked bend specimens
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yuan, D.W.; Zarzour, J.F.; Kleinosky, M.J.
1994-12-01
The influence of shallow cracks on the fracture behavior of structural components has been studied extensively in recent years. Finite element analyses have indicated dramatic differences in the crack-tip stress states between shallow and deep cracked bend specimens. In this study, an experimental program was carried out to investigate the fracture behavior of HY-100 steel containing various initial flaw depths. Four a/w ratios ranging from 0.05 to 0.5 were chosen for the notched three-point bend tests. Test results showed that higher fracture toughness values are associated with specimens having shorter surface cracks. Also, fractographic studies indicated that two sets of dimples are present for the a/w = 0.5 specimen, and one set of equiaxed dimples for the a/w = 0.05 specimen near the crack initiation zone. As the crack grows, an increase in the volume fraction of the small dimples was observed. Finally, the study showed that the characteristic features of the fracture surfaces can be correlated with the previous numerical predictions.
Generating Focused Molecule Libraries for Drug Discovery with Recurrent Neural Networks
2017-01-01
In de novo drug design, computational strategies are used to generate novel molecules with good affinity to the desired biological target. In this work, we show that recurrent neural networks can be trained as generative models for molecular structures, similar to statistical language models in natural language processing. We demonstrate that the properties of the generated molecules correlate very well with the properties of the molecules used to train the model. In order to enrich libraries with molecules active toward a given biological target, we propose to fine-tune the model with small sets of molecules, which are known to be active against that target. Against Staphylococcus aureus, the model reproduced 14% of 6051 hold-out test molecules that medicinal chemists designed, whereas against Plasmodium falciparum (Malaria), it reproduced 28% of 1240 test molecules. When coupled with a scoring function, our model can perform the complete de novo drug design cycle to generate large sets of novel molecules for drug discovery. PMID:29392184
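A minimal sketch of a character-level recurrent "language model" over SMILES strings is shown below; the toy corpus, network sizes and training loop are illustrative assumptions, and the transfer-learning (fine-tuning on actives) and scoring stages described above are omitted.

```python
# Minimal sketch (assumed architecture, toy data): a character-level LSTM
# trained to predict the next SMILES character, the kind of generative model
# that can later be fine-tuned on a small set of actives.
import torch
import torch.nn as nn

smiles = ["CCO", "c1ccccc1", "CC(=O)O", "CCN(CC)CC"]          # toy corpus
chars = sorted({ch for s in smiles for ch in s} | {"^", "$"})  # ^ start, $ end
stoi = {c: i for i, c in enumerate(chars)}

def encode(s):
    return torch.tensor([stoi[c] for c in "^" + s + "$"], dtype=torch.long)

class SmilesRNN(nn.Module):
    def __init__(self, vocab, emb=32, hidden=64):
        super().__init__()
        self.embed = nn.Embedding(vocab, emb)
        self.lstm = nn.LSTM(emb, hidden, batch_first=True)
        self.out = nn.Linear(hidden, vocab)

    def forward(self, x):
        h, _ = self.lstm(self.embed(x))
        return self.out(h)

model = SmilesRNN(len(chars))
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
loss_fn = nn.CrossEntropyLoss()

for epoch in range(50):                       # tiny demo training loop
    total = 0.0
    for s in smiles:
        seq = encode(s)
        x, y = seq[:-1].unsqueeze(0), seq[1:]  # predict the next character
        logits = model(x).squeeze(0)
        loss = loss_fn(logits, y)
        opt.zero_grad()
        loss.backward()
        opt.step()
        total += loss.item()

print("final average loss:", total / len(smiles))
```

Sampling new molecules would then proceed character by character from the start token, and fine-tuning would simply continue this training loop on the small set of known actives.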