Sample records for computer scientist prog

  1. Using Electrically-evoked Compound Action Potentials to Estimate Perceptive Levels in Experienced Adult Cochlear Implant Users.

    PubMed

    Joly, Charles-Alexandre; Péan, Vincent; Hermann, Ruben; Seldran, Fabien; Thai-Van, Hung; Truy, Eric

    2017-10-01

The cochlear implant (CI) fitting level prediction accuracy of the electrically-evoked compound action potential (ECAP) should be enhanced by the addition of demographic data in models. No accurate automated fitting of CIs based on ECAP has yet been proposed. We recorded ECAPs in 45 adults who had been using MED-EL CIs for more than 11 months and collected the most comfortable loudness level (MCL) used for CI fitting (prog-MCL), perception thresholds (meas-THR), and MCLs (meas-MCL) measured with the stimulation used for ECAP recording. Linear mixed models taking into account cochlear site factors were computed to explain prog-MCL, meas-MCL, and meas-THR. Cochlear region and ECAP threshold were predictors of all three levels. In addition, significant predictors were the ECAP amplitude for the prog-MCL and the duration of deafness for the prog-MCL and the meas-THR. Estimations were most accurate for the meas-THR, followed by the meas-MCL, and finally the prog-MCL. These results show that 1) ECAP thresholds are more closely related to the perception threshold than to the comfort level, 2) predictions are more accurate when inter-subject and cochlear region variations are considered, and 3) differences between the stimulations used for ECAP recording and for CI fitting make it difficult to accurately predict the prog-MCL from the ECAP recording. Predicted prog-MCL could be used as a basis for fitting but should be applied with care to avoid any uncomfortable or painful stimulation.

  2. Efficient Computations and Representations of Visible Surfaces.

    DTIC Science & Technology

    1979-12-01

position as stated. The smooth contour generator may lie along a sharp ridge, for instance. Richards & Stevens ...From understanding computation to understanding neural circuitry. Neurosci. Res. Prog. Bull. 13, 470-488. Metelli, F. 1970 An algebraic development of

  3. The 1984 ARI Survey of Army Recruits: Codebook for Summer 84 USAR and ARNG Survey Respondents

    DTIC Science & Technology

    1986-05-01

THE FOLLOWING PROGRAMS OR PROGRAMMING TYPES ON TV: NBA BASKETBALL . RAW DATA ICARD #1 COLS ILENGTH ... SAS...LEAG BASEBALL REG SEAS 249 T259 WATCH TV PROG:MJR LEAG BASEBALL PLAYOFFS 250 T260 WATCH TV PROG:WORLD SERIES 251 T261 WATCH TV PROG:NBA BASKETBALL 252...T262 WATCH TV PROG:COLLEGE BASKETBALL 253 T263 WATCH TV PROG:NHL HOCKEY 254 T264 WATCH TV PROG:PROFESSIONAL WRESTLING 255 T265 WATCH TV PROG:CAR RACES

  4. The 1984 ARI Survey of Army Recruits: Tabular Description of NPS Army Reserve Accessions. Volume 2

    DTIC Science & Technology

    1986-05-01

12 PROB. 0.1867 I 186 T261 — WATCH TV PROG:NBA BASKETBALL MARK ONE LETTER FOR EACH OF THE FOLLOWING PROGRAMS OR PROGRAMMING TYPES ON TV: NBA ...Major league baseball — regular season games 105. Major league baseball playoffs 106. World Series 107. NBA basketball 108. College basketball 109...BASEBALL PLAYOFFS WATCH TV PROG:WORLD SERIES WATCH TV PROG:NBA BASKETBALL WATCH TV PROG:COLLEGE BASKETBALL WATCH TV PROG:NHL HOCKEY WATCH TV

  5. The 1984 ARI Survey of Army Recruits: Tabular Description of NPS (active) Army Accessions. Volume 1

    DTIC Science & Technology

    1986-05-01

WATCH TV PROG MJR LEAG BASEBALL PLAYOFFS 232-233 WATCH TV PROG WORLD SERIES 234-235 WATCH TV PROG NBA BASKETBALL 236-237 WATCH TV PROG COLLEGE...PROG:NBA BASKETBALL DO YOU WATCH ANY OF THE FOLLOWING PROGRAMS OR PROGRAMMING TYPES ON TV? - NBA BASKETBALL , 1 - REGULARLY TURN ON THE TV TO WATCH...107. NBA basketball 108. College basketball 109. NHL hockey 110. Professional wrestling 111. Car races 112. Golf tournaments 113. Tennis

  6. The 1985 ARI Survey of Army Recruits: Tabular Description of NPS (active) Army Accessions. Volume 1

    DTIC Science & Technology

    1987-04-01

T261 -- WATCH TV PROG:NBA BASKETBALL DO YOU WATCH ANY OF THE FOLLOWING PROGRAMS OR PROGRAMMING TYPES ON REGULAR TV STATIONS? - NBA BASKETBALL . 1...326-327 T261 WATCH TV PROG:NBA BASKETBALL 328-329 T262 WATCH TV PROG:COLLEGE BASKETBALL 330-331 T263 WATCH TV PROG:NHL HOCKEY 332-333 T264 WATCH TV...T262 -- WATCH TV PROG:COLLEGE BASKETBALL DO YOU WATCH ANY OF THE FOLLOWING PROGRAMS OR PROGRAMMING TYPES ON REGULAR TV STATIONS? - COLLEGE BASKETBALL . 1

  7. Effect of Progesterone on Cerebral Vasospasm and Neurobehavioral Outcomes in a Rodent Model of Subarachnoid Hemorrhage.

    PubMed

    Turan, Nefize; Miller, Brandon A; Huie, J Russell; Heider, Robert A; Wang, Jun; Wali, Bushra; Yousuf, Seema; Ferguson, Adam R; Sayeed, Iqbal; Stein, Donald G; Pradilla, Gustavo

    2018-02-01

    Subarachnoid hemorrhage (SAH) induces widespread inflammation leading to cellular injury, vasospasm, and ischemia. Evidence suggests that progesterone (PROG) can improve functional recovery in acute brain injury owing to its anti-inflammatory and neuroprotective properties, which could also be beneficial in SAH. We hypothesized that PROG treatment attenuates inflammation-mediated cerebral vasospasm and microglial activation, improves synaptic connectivity, and ameliorates functional recovery after SAH. We investigated the effect of PROG in a cisternal SAH model in adult male C57BL/6 mice. Neurobehavioral outcomes were evaluated using rotarod latency and grip strength tests. Basilar artery perimeter, α-amino-3-hydroxy-5-methyl-4-isoxazolepropionic acid glutamate receptor 1 (GluR1)/synaptophysin colocalization, and Iba-1 immunoreactivity were quantified histologically. PROG (8 mg/kg) significantly improved rotarod latency at day 6 and grip strength at day 9. PROG-treated mice had significantly reduced basilar artery vasospasm at 24 hours. GluR1/synaptophysin colocalization, indicative of synaptic GluR1, was significantly reduced in the SAH+Vehicle group at 24 hours, and PROG treatment significantly attenuated this reduction. PROG treatment significantly reduced microglial cell activation and proliferation in cerebellum and cortex but not in the brainstem at 10 days. PROG treatment ameliorated cerebral vasospasm, reduced microglial activation, restored synaptic GluR1 localization, and improved neurobehavioral performance in a murine model of SAH. These results provide a rationale for further translational testing of PROG therapy in SAH. Copyright © 2017 Elsevier Inc. All rights reserved.

  8. Screening for strains with 11α-hydroxylase activity for 17α-hydroxy progesterone biotransformation.

    PubMed

    Gao, Qian; Qiao, Yuqian; Shen, Yanbing; Wang, Min; Wang, Xibo; Liu, Yang

    2017-08-01

Various corticosteroids are prepared by using 11α,17α-diOH-progesterone (11α,17α-diOH-PROG) as an important intermediate and raw material. Hence, strains that can improve the yields of 11α,17α-diOH-PROG should be screened. Cunninghamella elegans CICC40250 was singled out from five common 11α-hydroxylation strains. The reaction parameters of 11α,17α-diOH-PROG production were also investigated. C. elegans CICC40250 could efficiently catalyze the hydroxylation of 17α-hydroxy progesterone (17α-OH-PROG) at the C-11α position. This strain could also effectively convert 11α,17α-diOH-PROG at high substrate concentrations (up to 30 g/L). After the coenzyme precursor glucose was added, the rate of 11α,17α-diOH-PROG formation reached 84.2%, which was 11.4% higher than that of the control group. Our study established a simple and feasible mechanism to increase 11α,17α-diOH-PROG production levels. This mechanism involves C. elegans CICC40250, which can be efficiently applied to induce the biotransformation of 17α-OH-PROG through its hydroxylation biocatalytic ability. Copyright © 2017. Published by Elsevier Inc.

  9. Transdermal absorption of natural progesterone from alcoholic gel formulations with hydrophilic surfactant.

    PubMed

    Matsui, Rakan; Ueda, Osamu; Uchida, Shinya; Namiki, Noriyuki

    2015-06-01

The aim of this study was to evaluate the in vitro skin permeation and in vivo transdermal absorption of natural progesterone (Prog) from alcoholic gel-based transdermal formulations containing Prog stably dissolved at a concentration of 3%. The 3% Prog gel formulations were prepared with water, ethanol, 1,3-butylene glycol, carboxyvinylpolymer, diisopropanolamine, polyoxyethylene (2) oleylether, and benzyl alcohol. Gel formulations containing different hydrophilic surfactants and isopropyl myristate or propylene glycol dicaprylate (PGDC) as oily solvents were applied in an in vitro permeation study through excised rat skin under non-occlusive conditions. Gel formulations containing polyoxyethylene (20) oleylether (Oleth-20) as the hydrophilic surfactant and PGDC were applied in in vivo single- and repeated-dose transdermal absorption studies in rats under non-occlusive conditions. Evaluation of the gel formulations in the in vitro skin permeation study revealed a high flux of Prog from the formulations containing Oleth-20 and Oleth-20 with PGDC. The single- and repeated-dose in vivo transdermal absorption studies confirmed that good plasma levels of Prog were achieved and maintained by the gel formulation containing Oleth-20 and PGDC. The ethanolic gel formulation containing Oleth-20 and PGDC appeared able to maintain a high activity of Prog and a high diffusivity or solubility of Prog in the epidermis in a practical formulation application.

  10. Recent advances in the multimodel hydrologic ensemble forecasting using the HydroProg system in the Nysa Klodzka river basin (southwestern Poland)

    NASA Astrophysics Data System (ADS)

    Niedzielski, Tomasz; Mizinski, Bartlomiej; Swierczynska-Chlasciak, Malgorzata

    2017-04-01

The HydroProg system, a real-time multimodel hydrologic ensemble system developed at the University of Wroclaw (Poland) within the framework of research grant no. 2011/01/D/ST10/04171 financed by the National Science Centre of Poland, was experimentally launched in 2013 in the Nysa Klodzka river basin (southwestern Poland). Since that time the system has been working operationally to provide water level predictions in real time. At present, depending on the hydrologic gauge, up to eight hydrologic models are run. They are data- and physically-based solutions, with the majority of them being data-based. The paper reports on the performance of the implementation of the HydroProg system for the basin in question. We focus on several high-flow episodes and discuss the skill of the individual models in forecasting them. In addition, we present the performance of the multimodel ensemble solution. We also introduce a new prognosis determined in the following way: for a given lead time, we select the most skillful prediction (from the set of all individual models running at a given gauge and their multimodel ensemble) using the performance statistics computed operationally in real time as a function of lead time.
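The per-lead-time selection described above can be sketched in a few lines; the function, data layout, and use of a single error statistic below are illustrative assumptions, not HydroProg's actual code.

```python
def select_best_prediction(forecasts, errors):
    """For each lead time, pick the forecast whose operational error
    statistic (e.g. RMSE, hypothetical here) is smallest.

    forecasts: {model_name: {lead_time: predicted_stage}}
    errors:    {model_name: {lead_time: error_statistic}}
    Returns:   {lead_time: (model_name, predicted_stage)}
    """
    best = {}
    lead_times = next(iter(forecasts.values())).keys()
    for t in lead_times:
        # choose the model (or the ensemble) with the smallest error at t
        name = min(errors, key=lambda m: errors[m][t])
        best[t] = (name, forecasts[name][t])
    return best

# toy water-level forecasts (cm) for two lead times
forecasts = {
    "model_a":  {1: 120.0, 2: 125.0},
    "model_b":  {1: 118.0, 2: 131.0},
    "ensemble": {1: 119.0, 2: 127.0},
}
errors = {
    "model_a":  {1: 4.0, 2: 2.5},
    "model_b":  {1: 2.0, 2: 6.0},
    "ensemble": {1: 3.0, 2: 3.0},
}
print(select_best_prediction(forecasts, errors))
# {1: ('model_b', 118.0), 2: ('model_a', 125.0)}
```

The point of the scheme is that the "best" source may change with lead time: a data-based model may win at short horizons while the ensemble wins further out.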

  11. Modeling of Stability of Electrostatic and Magnetostatic Systems

    DTIC Science & Technology

    2017-06-01

unlimited. 13. SUPPLEMENTARY NOTES 14. ABSTRACT Electromagnetic systems undergo a variety of different instabilities. A broad class of those...15. SUBJECT TERMS electromagnetism, morphological instabilities, computational algorithm, gradient minimization, morphology patterns, motion by mean...Nordmark AB. Magnetic field and current are zero inside ideal conductors. Prog Electromagn Res B. 2011;(27):187–212. 4. Stratton JA. Electromagnetic theory

  12. Gender difference in the effect of progesterone on neonatal hypoxic/ischemic brain injury in mouse.

    PubMed

    Dong, Shuyu; Zhang, Qian; Kong, Delian; Zhou, Chao; Zhou, Jie; Han, Jingjing; Zhou, Yan; Jin, Guoliang; Hua, Xiaodong; Wang, Jun; Hua, Fang

    2018-08-01

    This study investigated the effects of progesterone (PROG) on neonatal hypoxic/ischemic (NHI) brain injury, the differences in effects between genders, and the underlying mechanisms. NHI brain injury was established in both male and female neonatal mice induced by occlusion of the left common carotid artery followed by hypoxia. The mice were treated with PROG or vehicle. Fluoro-Jade B staining (F-JB), long term behavior testing, and brain magnetic resonance image (MRI) were applied to evaluate neuronal death, neurological function, and brain damage. The underlying molecular mechanisms were also investigated by Western blots. The results showed that, in the male mice, administration of PROG significantly reduced neuronal death, improved the learning and memory function impaired by cerebral HI, decreased infarct size, and maintained the thickness of the cortex after cerebral HI. PROG treatment, however, did not show significant neuroprotective effects on female mice subjected to HI. In addition, the data demonstrated a gender difference in the expression of tumor necrosis factor receptor 1 (TNFR1), TNF receptor associated factor 6 (TRAF6), Fas associated protein with death domain (FADD), and TIR-domain-containing adapter-inducing interferon-β (TRIF) between males and females. Our results indicated that treatment with PROG had beneficial effects on NHI injured brain in acute stage and improved the long term cognitive function impaired by cerebral HI in male mice. In addition, the activation of TNF and TRIF mediated signaling in response to cerebral HI and the treatment of PROG varied between genders, which highly suggested that gender differences should be emphasized in evaluating neonatal HI brain injury and PROG effects, as well as the underlying mechanisms. Copyright © 2018 Elsevier Inc. All rights reserved.

  13. Thermodynamic and conformational analysis of the interaction between antibody binding proteins and IgG.

    PubMed

    Tanwar, Neetu; Munde, Manoj

    2018-06-01

    Studying interaction of IgG with bacterial proteins such as proA (Protein A) and proG is essential for development in the areas of drug discovery and biotechnology. Some solution studies in the past have hinted at the possibility of variable binding ratios for IgG with proA and proG. Since earlier crystallographic studies focussed mostly on monomeric complexes, the knowledge about the binding interfaces and protein conformational changes involved in multimeric complexes is scarce. In this paper, we observed that single proA molecule was able to bind to three IgG molecules (1:3, proA:IgG) in ITC accentuating the presence of conformational flexibility in proA, corroborated also by CD results. By contrast, proG binds with 1:1 stoichiometry to IgG, which also involves key structural rearrangement within the binding interface of IgG-proG complex, confirmed by fluorescence KI quenching study. It is implicit from CD and fluorescence results that IgG does not undergo any significant conformational changes, which further suggests that proA and proG dictate the phenomenon of recognition in antibody complexes. ANS as a hydrophobic probe helped in revealing the distinctive antibody binding mechanism of proA and proG. Additionally, the binding competition experiments using ITC established that proA and proG cannot bind IgG concurrently. Copyright © 2018. Published by Elsevier B.V.

  14. The 1984 ARI Survey of Army Recruits. Codebook for Summer 84 Active Army Survey Respondents

    DTIC Science & Technology

    1986-05-01

ARMY SURVEY RESPONDENTS T261 - DO YOU WATCH ANY OF THE FOLLOWING PROGRAMS OR PROGRAMMING TYPES ON TV? - NBA BASKETBALL . RAW DATA ICARD #1 COLS ILENGTHII... BASKETBALL 280 T262 WATCH TV PROG:COLLEGE BASKETBALL 281 T263 WATCH TV PROG:NHL HOCKEY 282 T264 WATCH TV PROG:PROFESSIONAL WRESTLING 283 T265 WATCH TV...SURVEY RESPONDENTS T262 - DO YOU WATCH ANY OF THE FOLLOWING PROGRAMS OR PROGRAMMING TYPES ON TV? - COLLEGE BASKETBALL . RAW DATA ICARD #1 COLS ILENGTHII

  15. Semi-automatic handling of meteorological ground measurements using WeatherProg: prospects and practical implications

    NASA Astrophysics Data System (ADS)

    Langella, Giuliano; Basile, Angelo; Bonfante, Antonello; De Mascellis, Roberto; Manna, Piero; Terribile, Fabio

    2016-04-01

WeatherProg is a computer program for the semi-automatic handling of data measured at ground stations within a climatic network. The program performs a set of tasks ranging from gathering raw point-based sensor measurements to the production of digital climatic maps. Originally the program was developed as the baseline asynchronous engine for weather records management within the SOILCONSWEB Project (LIFE08 ENV/IT/000408), in which daily and hourly data were used to run water balance models in the soil-plant-atmosphere continuum or pest simulation models. WeatherProg can be configured to automatically perform the following main operations: 1) data retrieval; 2) data decoding and ingestion into a database (e.g. SQL based); 3) data checking to recognize missing and anomalous values (using a set of differently combined checks, including logical, climatological, spatial, temporal, and persistence checks); 4) infilling of data flagged as missing or anomalous (deterministic or statistical methods); 5) spatial interpolation based on alternative/comparative methods such as inverse distance weighting, iterative regression kriging, and weighted least squares regression (based on physiography), using an approach similar to PRISM; 6) data ingestion into a geodatabase (e.g. PostgreSQL+PostGIS or rasdaman). There is an increasing demand for digital climatic maps both for research and development (there is a gap between the majority of scientific modelling approaches, which require digital climate maps, and the gauged measurements) and for practical applications (e.g. the need to improve the management of weather records, which in turn raises the support provided to farmers). The demand is particularly burdensome considering the requirement to handle climatic data at the daily (e.g. in soil hydrological modelling) or even the hourly time step (e.g. risk modelling in phytopathology).
The key advantage of WeatherProg is its ability to perform all the required operations and calculations automatically, except where human interaction is needed on specific issues (such as deciding whether a measurement is an anomaly or not, according to the temporal and spatial variations detected at contiguous points). The program runs from the command line and shows peculiar characteristics in cascade modelling within different contexts belonging to agriculture, phytopathology, and the environment. In particular, it can be a powerful tool for setting up cutting-edge regional web services based on weather information. Indeed, it can support territorial agencies in charge of meteorological and phytopathological bulletins.
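Of the interpolation methods listed in step 5, inverse distance weighting is the simplest to illustrate. The sketch below is a generic IDW estimator under assumed planar coordinates, not WeatherProg's implementation:

```python
import math

def idw(stations, target, power=2.0):
    """Inverse-distance-weighted estimate at `target`.

    stations: list of ((x, y), value) pairs, e.g. gauge temperatures
    target:   (x, y) location to estimate
    power:    distance exponent (2 is the common choice)
    """
    num = den = 0.0
    for (x, y), value in stations:
        d = math.hypot(x - target[0], y - target[1])
        if d == 0.0:
            return value  # exact hit on a station: use its value
        w = 1.0 / d ** power
        num += w * value
        den += w
    return num / den

# toy example: three gauges, estimate midway between the first two
stations = [((0.0, 0.0), 10.0), ((2.0, 0.0), 20.0), ((0.0, 5.0), 12.0)]
print(round(idw(stations, (1.0, 0.0)), 2))
```

In a real network this step runs only after the checking and infilling stages, so that anomalous gauge values do not propagate into the maps.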

  16. Progesterone neuroprotection in the Wobbler mouse, a genetic model of spinal cord motor neuron disease.

    PubMed

    Gonzalez Deniselle, María Claudia; López-Costa, Juan José; Saavedra, Jorge Pecci; Pietranera, Luciana; Gonzalez, Susana L; Garay, Laura; Guennoun, Rachida; Schumacher, Michael; De Nicola, Alejandro F

    2002-12-01

    Motor neuron degeneration characterizes the spinal cord of patients with amyotrophic lateral sclerosis and the Wobbler mouse mutant. Considering that progesterone (PROG) provides neuroprotection in experimental ischemia and injury, its potential role in neurodegeneration was studied in the murine model. Two-month-old symptomatic Wobbler mice were left untreated or received sc a 20-mg PROG implant for 15 days. Both light and electron microscopy of Wobbler mice spinal cord showed severely affected motor neurons with profuse cytoplasmic vacuolation of the endoplasmic reticulum and/or Golgi apparatus and ruptured mitochondria with damaged cristae, a profile indicative of a type II cytoplasmic form of cell death. In contrast to untreated mice, neuropathology was less severe in Wobbler mice receiving PROG; including a reduction of vacuolation and of the number of vacuolated cells and better conservation of the mitochondrial ultrastructure. In biochemical studies, we determined the mRNA for the alpha3 subunit of Na,K-ATPase, a neuronal enzyme controlling ion fluxes, neurotransmission, membrane potential, and nutrient uptake. In untreated Wobbler mice, mRNA levels in motor neurons were reduced by half compared to controls, whereas PROG treatment of Wobbler mice restored the expression of alpha3 subunit Na,K-ATPase mRNA. Therefore, PROG was able to rescue motor neurons from degeneration, based on recovery of histopathological abnormalities and of mRNA levels of the sodium pump. However, because the gene mutation in Wobbler mice is still unknown, further studies are needed to unveil the action of PROG and the mechanism of neuronal death in this genetic model of neurodegeneration.

  17. Geovisualization in the HydroProg web map service

    NASA Astrophysics Data System (ADS)

    Spallek, Waldemar; Wieczorek, Malgorzata; Szymanowski, Mariusz; Niedzielski, Tomasz; Swierczynska, Malgorzata

    2016-04-01

The HydroProg system, built at the University of Wroclaw (Poland) within the framework of research project no. 2011/01/D/ST10/04171 financed by the National Science Centre of Poland, has been designed for computing predictions of river stages in real time on the basis of multimodelling. This experimental system works on the upper Nysa Klodzka basin (SW Poland) above the gauge in the town of Bardo, with a catchment area of 1744 square kilometres. The system operates in association with the Local System for Flood Monitoring of Klodzko County (LSOP) and produces hydrograph prognoses as well as inundation predictions. For presenting the up-to-date predictions and their statistics in online mode, a dedicated real-time web map service has been designed. Geovisualisation in the HydroProg map service covers: interactive maps of the study area, interactive spaghetti hydrographs of water level forecasts along with observed river stages, and animated images of inundation. The LSOP network offers a high spatial and temporal resolution of observations, as the sampling interval is 15 minutes. The main environmental elements related to hydrological modelling are shown on the main map. This includes elevation data (hillshading and hypsometric tints), rivers and reservoirs, as well as catchment boundaries. Furthermore, we added main towns, roads, and political and administrative boundaries for better map understanding. The web map was designed as a multi-scale representation, with levels of detail and zooming according to the scales 1:100 000, 1:250 000 and 1:500 000. Observations of water level in LSOP are shown on interactive hydrographs for each gauge. Additionally, predictions and some of their statistical characteristics (such as prediction errors and Nash-Sutcliffe efficiency) are shown for selected gauges. Finally, predictions of inundation are presented on animated maps, which have been added for four experimental sites.
The HydroProg system is a strictly scientific project, but the web map service has been designed for all web users. The main objective of the paper is to present the design process of the web map service, following cartographic and graphic principles.

  18. dLocAuth: a dynamic multifactor authentication scheme for mCommerce applications using independent location-based obfuscation

    NASA Astrophysics Data System (ADS)

    Kuseler, Torben; Lami, Ihsan A.

    2012-06-01

This paper proposes a new technique to obfuscate an authentication-challenge program (named LocProg) using randomly generated data together with a client's current location in real time. LocProg can be used to enable any handset application on mobile devices (e.g. mCommerce on smartphones) that requires authentication with a remote authenticator (e.g. a bank). The motivation of this novel technique is to a) enhance security against replay attacks, which currently relies on real-time nonce(s), and b) add a new security factor, namely location verified by two independent sources, to challenge/response methods for authentication. To assure a secure live transaction, thus reducing the possibility of replay and other remote attacks, the authors have devised a novel technique to obtain the client's location from two independent sources: GPS on the client's side and the cellular network on the authenticator's side. The algorithm of LocProg is based on obfuscating "random elements plus a client's data" with a location-based key generated on the bank's side. LocProg is then sent to the client and is designed so that it will automatically integrate into the target application on the client's handset. The client can then de-obfuscate LocProg if s/he is within a certain range around the location calculated by the bank and if the correct personal data is supplied. LocProg also has features to protect against trial-and-error attacks. Analysis of LocAuth's security (trust, threat and system models) and trials based on a prototype implementation (on the Android platform) prove the viability and novelty of LocAuth.
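The location-keyed obfuscation idea can be illustrated with a toy sketch. The SHA-256 key derivation, coordinate quantization, and XOR stream below are assumptions chosen for illustration, not the actual LocProg algorithm: the key is derived from coordinates rounded to a coarse grid cell, so the bank's network-derived fix and the client's GPS fix agree as long as both fall in the same cell.

```python
import hashlib
from itertools import cycle

def location_key(lat, lon, nonce, precision=2):
    """Derive a key from coordinates quantized to `precision` decimal
    places (~1 km at 2 places), plus a per-transaction nonce, so two
    independent fixes within the same cell yield the same key."""
    cell = f"{lat:.{precision}f},{lon:.{precision}f}|{nonce}"
    return hashlib.sha256(cell.encode()).digest()

def xor_obfuscate(data: bytes, key: bytes) -> bytes:
    # symmetric XOR stream: applying the same key twice restores the data
    return bytes(b ^ k for b, k in zip(data, cycle(key)))

# bank side: obfuscate the challenge with a key tied to the expected location
payload = b"random-elements-plus-client-data"
key = location_key(51.11, 17.03, nonce=42)
blob = xor_obfuscate(payload, key)

# client side: a fix in the same grid cell reproduces the key exactly
client_key = location_key(51.112, 17.031, nonce=42)
assert xor_obfuscate(blob, client_key) == payload
```

A fix outside the cell (or a wrong nonce) derives a different key, so de-obfuscation fails; a production scheme would use an authenticated cipher rather than bare XOR.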

  19. Cytogenetic and oxidative status of human lymphocytes after exposure to clinically relevant concentrations of antimalarial drugs atovaquone and proguanil hydrochloride in vitro.

    PubMed

    Dinter, Domagoj; Gajski, Goran; Domijan, Ana-Marija; Garaj-Vrhovac, Vera

    2015-12-01

Atovaquone (ATO) and proguanil hydrochloride (PROG) is the fixed combination for the prevention and treatment of Plasmodium falciparum malaria. As safe and effective antimalarial drugs are needed in both the treatment and the prophylaxis of malaria, this study was performed to investigate their possible cyto/genotoxic potential towards human lymphocytes and the possible mechanism responsible for it. Two different concentrations of ATO and PROG were used with and without S9 metabolic activation. The concentrations used were those found in human plasma when a fixed-dose combination of ATO and PROG was used: 2950/130 ng/mL after prophylactic treatment and 11 800/520 ng/mL after treatment of malaria, respectively. Possible cellular and DNA-damaging effects were evaluated by cell viability and alkaline comet assays, while oxidative stress potential was evaluated by the formamidopyrimidine-DNA glycosylase (Fpg)-modified comet assay, in addition to measuring malondialdehyde and glutathione levels. According to our results, the ATO/PROG combination displayed only weak cyto/genotoxic potential towards human lymphocytes with no impact on oxidative stress parameters, suggesting that oxidative stress is not implicated in their mechanism of action towards human lymphocytes. Given that the key portion of the damaging effects was induced after S9 metabolic activation, it can be presumed that the principal metabolite of PROG, cycloguanil, had the greatest impact. The obtained results indicate that the ATO/PROG combination is relatively safe for consumption from the aspect of cyto/genotoxicity, especially if used for prophylactic treatment. Nevertheless, further cytogenetic research and regular patient monitoring are needed to minimize the risk of adverse events, especially among frequent travellers. © 2015 Société Française de Pharmacologie et de Thérapeutique.

  20. JCL (Job Control Language) Procedures to Run the Hull Code on the Cyber 205 Computer Installed on CSIRONET.

    DTIC Science & Technology

    1986-11-01

START THE RUN>>> USERNUIDNUPW. CHARGEGROUPNPID. SETJOB, DC=NO. COMMENT. GET OR ATTACH THE INPUT DATA TO GO TO VSOS. GET, INDATA=DATFILE/NA. IFE...NtPW. CHARGEGROUPNPID. SETTL, 200. SETJOB, DC=NO. COMMENT. RUN SAIL ON NOS TO GENERATE THE MAIN PROGRAM. PURGE, SAILOUT/NA. PURGE, PROG-PROBLEMID...NOSPASS. CHARGEDFCDFCPR.F. SETJOB, DC=NO. COMMENT. GET OR ATTACH THE INPUT DATA TO GO TO VSOS. GET, INDATA=MYDATA/NA. IFE. .NOT.FILE(INDATA.AS) .DOATT

  1. 77 FR 12792 - Notice of Forest Service Land Management Plans To Be Amended To Incorporate Greater Sage-Grouse...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-02

    ... Basin Region: Web site: http://www.blm.gov/wo/st/en/prog/more/sagegrouse/western.html . Email: sagewest...: Lauren Mermejo, Great Basin Region Project Manager, telephone 775-861-6400; address 1340 Financial.../sagegrouse/eastern.html , and for the Great Basin Region at http://www.blm.gov/wo/st/en/prog/more/sagegrouse...

  2. Structural and Kinetic Basis of Steroid 17α,20-Lyase Activity in Teleost Fish Cytochrome P450 17A1 and Its Absence in Cytochrome P450 17A2*

    PubMed Central

    Pallan, Pradeep S.; Nagy, Leslie D.; Lei, Li; Gonzalez, Eric; Kramlinger, Valerie M.; Azumaya, Caleigh M.; Wawrzak, Zdzislaw; Waterman, Michael R.; Guengerich, F. Peter; Egli, Martin

    2015-01-01

    Cytochrome P450 (P450) 17A enzymes play a critical role in the oxidation of the steroids progesterone (Prog) and pregnenolone (Preg) to glucocorticoids and androgens. In mammals, a single enzyme, P450 17A1, catalyzes both 17α-hydroxylation and a subsequent 17α,20-lyase reaction with both Prog and Preg. Teleost fish contain two 17A P450s; zebrafish P450 17A1 catalyzes both 17α-hydroxylation and lyase reactions with Prog and Preg, and P450 17A2 is more efficient in pregnenolone 17α-hydroxylation but does not catalyze the lyase reaction, even in the presence of cytochrome b5. P450 17A2 binds all substrates and products, although more loosely than P450 17A1. Pulse-chase and kinetic spectral experiments and modeling established that the two-step P450 17A1 Prog oxidation is more distributive than the Preg reaction, i.e. 17α-OH product dissociates more prior to the lyase step. The drug orteronel selectively blocked the lyase reaction of P450 17A1 but only in the case of Prog. X-ray crystal structures of zebrafish P450 17A1 and 17A2 were obtained with the ligand abiraterone and with Prog for P450 17A2. Comparison of the two fish P450 17A-abiraterone structures with human P450 17A1 (DeVore, N. M., and Scott, E. E. (2013) Nature 482, 116–119) showed only a few differences near the active site, despite only ∼50% identity among the three proteins. The P450 17A2 structure differed in four residues near the heme periphery. These residues may allow the proposed alternative ferric peroxide mechanism for the lyase reaction, or residues removed from the active site may allow conformations that lead to the lyase activity. PMID:25533464

  3. Implicit Theories of Creativity in Computer Science in the United States and China

    ERIC Educational Resources Information Center

    Tang, Chaoying; Baer, John; Kaufman, James C.

    2015-01-01

    To study implicit concepts of creativity in computer science in the United States and mainland China, we first asked 308 Chinese computer scientists for adjectives that would describe a creative computer scientist. Computer scientists and non-computer scientists from China (N = 1069) and the United States (N = 971) then rated how well those…

  4. The monoamine-oxidase B inhibitor deprenyl increases selection of high-effort activity in rats tested on a progressive ratio/chow feeding choice procedure: Implications for treating motivational dysfunctions.

    PubMed

    Yohn, Samantha E; Reynolds, Shanika; Tripodi, Giuseppe; Correa, Merce; Salamone, John D

    2018-04-16

Motivated behaviors often are characterized by a high degree of behavioral activation and work output, and organisms frequently make effort-related decisions based upon cost/benefit analyses. Moreover, people with depression and other disorders frequently show effort-related motivational symptoms, such as anergia, psychomotor retardation, and fatigue. Tasks measuring effort-related choice are being used as animal models of these motivational symptoms. The present studies characterized the ability of the monoamine oxidase-B (MAO-B) inhibitor deprenyl (selegiline) to enhance selection of high-effort lever pressing in rats tested on a concurrent progressive ratio (PROG)/chow feeding choice task. Deprenyl is widely used as an antiparkinsonian drug, but it also has been shown to have antidepressant effects in humans, and to induce antidepressant-like effects in traditional rodent models of depression. Systemic administration of deprenyl (1.5-12.0 mg/kg IP) shifted choice behavior, significantly increasing markers of PROG lever pressing at a moderate dose (6.0 mg/kg), and decreasing chow intake at 6.0 and 12.0 mg/kg. Intracranial injections of deprenyl into the nucleus accumbens (2.0 and 4.0 μg) also increased PROG lever pressing and decreased chow intake. Microdialysis studies showed that the dose of deprenyl that was effective at increasing PROG lever pressing (6.0 mg/kg) also significantly elevated extracellular dopamine in the nucleus accumbens. Thus, similar to the well-known antidepressant bupropion, deprenyl is capable of increasing selection of high-effort PROG lever pressing at doses that increase extracellular dopamine in the nucleus accumbens. These studies have implications for the potential use of MAO-B inhibitors as treatments for the motivational symptoms of depression and Parkinsonism. Copyright © 2018 Elsevier B.V. All rights reserved.

  5. The VMAT-2 Inhibitor Tetrabenazine Affects Effort-Related Decision Making in a Progressive Ratio/Chow Feeding Choice Task: Reversal with Antidepressant Drugs

    PubMed Central

    Randall, Patrick A.; Lee, Christie A.; Nunes, Eric J.; Yohn, Samantha E.; Nowak, Victoria; Khan, Bilal; Shah, Priya; Pandit, Saagar; Vemuri, V. Kiran; Makriyannis, Alex; Baqi, Younis; Müller, Christa E.; Correa, Merce; Salamone, John D.

    2014-01-01

    Behavioral activation is a fundamental feature of motivation, and organisms frequently make effort-related decisions based upon evaluations of reinforcement value and response costs. Furthermore, people with major depression and other disorders often show anergia, psychomotor retardation, fatigue, and alterations in effort-related decision making. Tasks measuring effort-based decision making can be used as animal models of the motivational symptoms of depression, and the present studies characterized the effort-related effects of the vesicular monoamine transport (VMAT-2) inhibitor tetrabenazine. Tetrabenazine induces depressive symptoms in humans, and also preferentially depletes dopamine (DA). Rats were assessed using a concurrent progressive ratio (PROG)/chow feeding task, in which they can either lever press on a PROG schedule for preferred high-carbohydrate food, or approach and consume a less-preferred lab chow that is freely available in the chamber. Previous work has shown that the DA antagonist haloperidol reduced PROG work output on this task, but did not reduce chow intake, effects that differed substantially from those of reinforcer devaluation or appetite suppressant drugs. The present work demonstrated that tetrabenazine produced an effort-related shift in responding on the PROG/chow procedure, reducing lever presses, highest ratio achieved and time spent responding, but not reducing chow intake. Similar effects were produced by administration of the subtype selective DA antagonists ecopipam (D1) and eticlopride (D2), but not by the cannabinoid CB1 receptor neutral antagonist and putative appetite suppressant AM 4413, which suppressed both lever pressing and chow intake. The adenosine A2A antagonist MSX-3, the antidepressant and catecholamine uptake inhibitor bupropion, and the MAO-B inhibitor deprenyl, all reversed the impairments induced by tetrabenazine. This work demonstrates the potential utility of the PROG/chow procedure as a rodent model of the effort-related deficits observed in depressed patients. PMID:24937131

  6. The VMAT-2 inhibitor tetrabenazine affects effort-related decision making in a progressive ratio/chow feeding choice task: reversal with antidepressant drugs.

    PubMed

    Randall, Patrick A; Lee, Christie A; Nunes, Eric J; Yohn, Samantha E; Nowak, Victoria; Khan, Bilal; Shah, Priya; Pandit, Saagar; Vemuri, V Kiran; Makriyannis, Alex; Baqi, Younis; Müller, Christa E; Correa, Merce; Salamone, John D

    2014-01-01

    Duplicate of record 5 above (PubMed version of the same article).

  7. Developing the next generation of diverse computer scientists: the need for enhanced, intersectional computing identity theory

    NASA Astrophysics Data System (ADS)

    Rodriguez, Sarah L.; Lehman, Kathleen

    2017-10-01

    This theoretical paper explores the need for enhanced, intersectional computing identity theory for the purpose of developing a diverse group of computer scientists for the future. Greater theoretical understanding of the identity formation process specifically for computing is needed in order to understand how students come to understand themselves as computer scientists. To ensure that the next generation of computer scientists is diverse, this paper presents a case for examining identity development intersectionally, understanding the ways in which women and underrepresented students may have difficulty identifying as computer scientists and be systematically oppressed in their pursuit of computer science careers. Through a review of the available scholarship, this paper suggests that creating greater theoretical understanding of the computing identity development process will inform the way in which educational stakeholders consider computer science practices and policies.

  8. Calculations of Intersection Cross-Slip Activation Energies in FCC Metals Using Nudged Elastic Band Method

    DTIC Science & Technology

    2011-08-01

    References (excerpt): [1] Puschl W. Prog Mater Sci 2002;47:415. [2] Jackson PJ. Prog Mater Sci 1985;29:139. [3] Rao S, Parthasarathy TA, Woodward C. Philos Mag A

  9. Impact of Viral Infection on Absorption and Scattering Properties of Marine Bacteria and Phytoplankton

    DTIC Science & Technology

    2001-09-30

    References (excerpt): Opt. Eng. 2963: 260-265. Bratbak, G., J. K. Egge, and M. Heldal. 1993. Viral mortality of the marine alga Emiliania huxleyi (Haptophyceae) and termination of algal blooms. Mar. Ecol. Prog. Ser. 93: 39-48. Bratbak, G., W. Wilson, and M. Heldal. 1996. Viral control of Emiliania huxleyi… relation to Emiliania huxleyi blooms: a mechanism of DMSP release? Mar. Ecol. Prog. Ser. 128: 133-142. Brussaard, C. P. D., R. S. Kempers, A. J

  10. Elastin hydrolysate derived from fish enhances proliferation of human skin fibroblasts and elastin synthesis in human skin fibroblasts and improves the skin conditions.

    PubMed

    Shiratsuchi, Eri; Nakaba, Misako; Yamada, Michio

    2016-03-30

    Recent studies have shown that certain peptides significantly improve skin conditions, such as skin elasticity and the moisture content of the skin of healthy women. This study aimed to investigate the effects of elastin hydrolysate on human skin. Proliferation and elastin synthesis were evaluated in human skin fibroblasts exposed to elastin hydrolysate and prolyl-glycine (Pro-Gly), which is present in human blood after elastin hydrolysate ingestion. We also performed an ingestion test with elastin hydrolysate in humans and evaluated skin condition. Elastin hydrolysate and Pro-Gly enhanced the proliferation of fibroblasts and elastin synthesis. Maximal proliferation response was observed at 25 ng mL⁻¹ Pro-Gly. Ingestion of elastin hydrolysate improved skin condition, such as elasticity, number of wrinkles, and blood flow. Elasticity improved by 4% in the elastin hydrolysate group compared with 2% in the placebo group. Therefore, elastin hydrolysate activates human skin fibroblasts and has beneficial effects on skin conditions. © 2015 Society of Chemical Industry.

  11. Design of Computer-Related Workstations in Relation to Job Functions and Productivity.

    DTIC Science & Technology

    1984-12-01

    Workstation adequacy ratings by job function (Inadequate / Neutral / Adequate): Management 36.6% / 17.3% / 46.2%; Computer Prog. 45.1 / 25.3 / 29.7; Systems Analyst 44.6 / 20.0 / 35.4; Functional Analyst… There are many variables which affect facility design and layout. In a professional computer-related environment, satisfaction with…

  12. Recruitment of Foreigners in the Market for Computer Scientists in the United States

    PubMed Central

    Bound, John; Braga, Breno; Golden, Joseph M.

    2016-01-01

    We present and calibrate a dynamic model that characterizes the labor market for computer scientists. In our model, firms can recruit computer scientists from recently graduated college students, from STEM workers working in other occupations or from a pool of foreign talent. Counterfactual simulations suggest that wages for computer scientists would have been 2.8–3.8% higher, and the number of Americans employed as computer scientists would have been 7.0–13.6% higher in 2004 if firms could not hire more foreigners than they could in 1994. In contrast, total CS employment would have been 3.8–9.0% lower, and consequently output smaller. PMID:27170827

  13. MiR-29b affects the secretion of PROG and promotes the proliferation of bovine corpus luteum cells

    PubMed Central

    Zhang, Li-Qun; Sun, Xu-Lei; Luo, Dan; Fu, Yao; Gao, Yan; Zhang, Jia-Bao

    2018-01-01

    The regulatory role of miRNAs has been explored in ovarian cells, and their effects on gonadal development, apoptosis, ovulation, steroid production and corpus luteum (CL) development have been revealed. In this study, we analyzed the expression of miR-29b at different stages of bovine CL development and predicted the target genes of miR-29b. We confirmed that miR-29b reduces the expression of the oxytocin receptor (OXTR), affects progesterone (PROG) secretion and regulates the function of the CL. RT-PCR showed that the expression of miR-29b was significantly higher in functional CL phases than in the regressed CL phase. Immunohistochemistry showed that OXTR was expressed in both large and small CL cells and was mainly located in the cell membrane and cytoplasm of these cells. We analyzed the expression levels of OXTR and found that transfection with a miR-29b mimic decreased OXTR expression, but transfection with the inhibitor had a limited effect on the expression of the OXTR protein. At the same time, the secretion of PROG was significantly increased in the miR-29b mimic-transfected group. We also analyzed the effect of miR-29b on the apoptosis of CL cells. Finally, we found that miR-29b could promote the proliferation of bovine CL cells. In conclusion, we found that miR-29b reduces the expression of OXTR and can promote PROG secretion and the proliferation of CL cells via OXTR. PMID:29617446

  14. 2017 ISCB Accomplishment by a Senior Scientist Award: Pavel Pevzner

    PubMed Central

    Fogg, Christiana N.; Kovats, Diane E.; Berger, Bonnie

    2017-01-01

    The International Society for Computational Biology (ISCB) recognizes an established scientist each year with the Accomplishment by a Senior Scientist Award for significant contributions he or she has made to the field. This award honors scientists who have contributed to the advancement of computational biology and bioinformatics through their research, service, and education work. Pavel Pevzner, PhD, Ronald R. Taylor Professor of Computer Science and Director of the NIH Center for Computational Mass Spectrometry at University of California, San Diego, has been selected as the winner of the 2017 Accomplishment by a Senior Scientist Award. The ISCB awards committee, chaired by Dr. Bonnie Berger of the Massachusetts Institute of Technology, selected Pevzner as the 2017 winner. Pevzner will receive his award and deliver a keynote address at the 2017 Intelligent Systems for Molecular Biology-European Conference on Computational Biology joint meeting (ISMB/ECCB 2017) held in Prague, Czech Republic from July 21-July 25, 2017. ISMB/ECCB is a biennial joint meeting that brings together leading scientists in computational biology and bioinformatics from around the globe. PMID:28713548

  15. Interactive visualization of Earth and Space Science computations

    NASA Technical Reports Server (NTRS)

    Hibbard, William L.; Paul, Brian E.; Santek, David A.; Dyer, Charles R.; Battaiola, Andre L.; Voidrot-Martinez, Marie-Francoise

    1994-01-01

    Computers have become essential tools for scientists simulating and observing nature. Simulations are formulated as mathematical models but are implemented as computer algorithms to simulate complex events. Observations are also analyzed and understood in terms of mathematical models, but the number of these observations usually dictates that we automate analyses with computer algorithms. In spite of their essential role, computers are also barriers to scientific understanding. Unlike hand calculations, automated computations are invisible and, because of the enormous numbers of individual operations in automated computations, the relation between an algorithm's input and output is often not intuitive. This problem is illustrated by the behavior of meteorologists responsible for forecasting weather. Even in this age of computers, many meteorologists manually plot weather observations on maps, then draw isolines of temperature, pressure, and other fields by hand (special pads of maps are printed for just this purpose). Similarly, radiologists use computers to collect medical data but are notoriously reluctant to apply image-processing algorithms to that data. To these scientists with life-and-death responsibilities, computer algorithms are black boxes that increase rather than reduce risk. The barrier between scientists and their computations can be bridged by techniques that make the internal workings of algorithms visible and that allow scientists to experiment with their computations. Here we describe two interactive systems developed at the University of Wisconsin-Madison Space Science and Engineering Center (SSEC) that provide these capabilities to Earth and space scientists.

  16. Theory and High-Energy-Density Laser Experiments Relevant to Accretion Processes in Cataclysmic Variables

    NASA Astrophysics Data System (ADS)

    Krauland, Christine; Drake, R.; Loupias, B.; Falize, E.; Busschaert, C.; Ravasio, A.; Yurchak, R.; Pelka, A.; Koenig, M.; Kuranz, C. C.; Plewa, T.; Huntington, C. M.; Kaczala, D. N.; Klein, S.; Sweeney, R.; Villete, B.; Young, R.; Keiter, P. A.

    2012-05-01

    We present results from high-energy-density (HED) laboratory experiments that explore the contribution of radiative shock waves to the evolving dynamics of the cataclysmic variable (CV) systems in which they reside. CVs can be classified under two main categories, non-magnetic and magnetic. In the process of accretion, both types involve strongly radiating shocks that provide the main source of radiation in the binary systems. This radiation can cause varying structure to develop depending on the optical properties of the material on either side of the shock. The ability of high-intensity lasers to create large energy densities in targets of millimeter-scale volume makes it feasible to create similar radiative shocks in the laboratory. We provide an overview of both CV systems and their connection to the laboratory experiments designed and performed at two laser facilities. Available data and accompanying simulations will likewise be shown. Funded by the NNSA-DS and SC-OFES Joint Prog. in High-Energy-Density Lab. Plasmas, by the Nat. Laser User Facility Prog. in NNSA-DS and by the Predictive Sci. Acad. Alliances Prog. in NNSA-ASC, under grant numbers DE-FG52-09NA29548, DE-FG52-09NA29034, and DE-FC52-08NA28616.

  17. Reverse Radiative Shock Experiments Relevant to Accreting Stream-Disk Impact in Interacting Binaries

    NASA Astrophysics Data System (ADS)

    Krauland, Christine; Drake, R. P.; Kuranz, C. K.; Huntington, C. M.; Grosskopf, M. J.; Marion, D. C.; Young, R.; Plewa, T.

    2011-05-01

    In many Cataclysmic Binary systems, mass transfer onto an accretion disk produces a 'hot spot' where the infalling supersonic flow obliquely strikes the rotating accretion disk. This collision region has many ambiguities as a radiation hydrodynamic system, but shock development in the infalling flow can be modeled. Depending upon conditions, it has been argued (Armitage & Livio, ApJ 493, 898) that the shocked region may be optically thin, thick, or intermediate, which has the potential to significantly alter the hot spot's structure and emissions. We report the first experimental attempt to produce colliding flows that create a radiative reverse shock at the Omega-60 laser facility. Obtaining a radiative reverse shock in the laboratory requires producing a sufficiently fast flow (> 100 km/s) within a material whose opacity is large enough to produce energetically significant emission from experimentally achievable layers. We will discuss the experimental design, the available data, and our astrophysical context. Funded by the NNSA-DS and SC-OFES Joint Prog. in High-Energy-Density Lab. Plasmas, by the Nat. Laser User Facility Prog. in NNSA-DS and by the Predictive Sci. Acad. Alliances Prog. in NNSA-ASC, under grant numbers DE-FG52-09NA29548, DE-FG52-09NA29034, and DE-FC52-08NA28616.

  18. Climate@Home: Crowdsourcing Climate Change Research

    NASA Astrophysics Data System (ADS)

    Xu, C.; Yang, C.; Li, J.; Sun, M.; Bambacus, M.

    2011-12-01

    Climate change deeply impacts human wellbeing. Significant amounts of resources have been invested in building super-computers that are capable of running advanced climate models, which help scientists understand climate change mechanisms, and predict its trend. Although climate change influences all human beings, the general public is largely excluded from the research. On the other hand, scientists are eagerly seeking communication mediums for effectively enlightening the public on climate change and its consequences. The Climate@Home project is devoted to connect the two ends with an innovative solution: crowdsourcing climate computing to the general public by harvesting volunteered computing resources from the participants. A distributed web-based computing platform will be built to support climate computing, and the general public can 'plug in' their personal computers to participate in the research. People contribute the spare computing power of their computers to run a computer model, which is used by scientists to predict climate change. Traditionally, only super-computers could handle such a large computing processing load. By orchestrating massive numbers of personal computers to perform atomized data processing tasks, investments in new super-computers, energy consumed by super-computers, and carbon release from super-computers are reduced. Meanwhile, the platform forms a social network of climate researchers and the general public, which may be leveraged to raise climate awareness among the participants. A portal is to be built as the gateway to the Climate@Home project. Three types of roles and the corresponding functionalities are designed and supported. The end users include the citizen participants, climate scientists, and project managers. Citizen participants connect their computing resources to the platform by downloading and installing a computing engine on their personal computers. Computer climate models are defined at the server side. Climate scientists configure computer model parameters through the portal user interface. After model configuration, scientists then launch the computing task. Next, data is atomized and distributed to computing engines that are running on citizen participants' computers. Scientists will receive notifications on the completion of computing tasks, and examine modeling results via visualization modules of the portal. Computing tasks, computing resources, and participants are managed by project managers via portal tools. A portal prototype has been built for proof of concept. Three forums have been set up for different groups of users to share information on science aspect, technology aspect, and educational outreach aspect. A Facebook account has been set up to distribute messages via the most popular social networking platform. New threads are synchronized from the forums to Facebook. A mapping tool displays geographic locations of the participants and the status of tasks on each client node. A group of users have been invited to test functions such as forums, blogs, and computing resource monitoring.
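    The atomize-and-distribute step described in this record can be sketched in a few lines. This is a minimal illustration, not Climate@Home's actual code; all names here (WorkUnit, atomize_run, assign) are hypothetical. A parameterized model run is split into independent work units that volunteer computing engines fetch, compute, and return.

```python
# Hypothetical sketch of atomizing a climate-model run into work units
# for distribution to volunteer machines. Names are illustrative only.
from dataclasses import dataclass


@dataclass
class WorkUnit:
    run_id: str
    params: dict
    year_start: int
    year_end: int
    status: str = "pending"   # pending -> assigned -> done


def atomize_run(run_id, params, years, chunk=5):
    """Split a multi-year model run into fixed-size, independent units."""
    start, end = years
    return [
        WorkUnit(run_id, params, y, min(y + chunk, end))
        for y in range(start, end, chunk)
    ]


def assign(units, volunteer_id):
    """Hand the next pending unit to a volunteer's computing engine."""
    for u in units:
        if u.status == "pending":
            u.status = "assigned"
            return u
    return None  # nothing left to distribute


units = atomize_run("exp-01", {"co2_ppm": 420}, years=(2000, 2020), chunk=5)
print(len(units))                 # 4 work units of 5 model-years each
u = assign(units, "volunteer-7")
print(u.year_start, u.year_end)   # 2000 2005
```

    A real platform would add result validation and redundancy (the same unit sent to several volunteers), which this sketch omits.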

  19. Basic instincts

    NASA Astrophysics Data System (ADS)

    Hutson, Matthew

    2018-05-01

    In their adaptability, young children demonstrate common sense, a kind of intelligence that, so far, computer scientists have struggled to reproduce. Gary Marcus, a developmental cognitive scientist at New York University in New York City, believes the field of artificial intelligence (AI) would do well to learn lessons from young thinkers. Researchers in machine learning argue that computers trained on mountains of data can learn just about anything—including common sense—with few, if any, programmed rules. But Marcus says computer scientists are ignoring decades of work in the cognitive sciences and developmental psychology showing that humans have innate abilities—programmed instincts that appear at birth or in early childhood—that help us think abstractly and flexibly. He believes AI researchers ought to include such instincts in their programs. Yet many computer scientists, riding high on the successes of machine learning, are eagerly exploring the limits of what a naïve AI can do. Computer scientists appreciate simplicity and have an aversion to debugging complex code. Furthermore, big companies such as Facebook and Google are pushing AI in this direction. These companies are most interested in narrowly defined, near-term problems, such as web search and facial recognition, in which blank-slate AI systems can be trained on vast data sets and work remarkably well. But in the longer term, computer scientists expect AIs to take on much tougher tasks that require flexibility and common sense. They want to create chatbots that explain the news, autonomous taxis that can handle chaotic city traffic, and robots that nurse the elderly. Some computer scientists are already trying. Such efforts, researchers hope, will result in AIs that sit somewhere between pure machine learning and pure instinct. They will boot up following some embedded rules, but will also learn as they go.

  20. Facilities | Computational Science | NREL

    Science.gov Websites

    Accelerates technology innovation by providing scientists and engineers the ability to tackle energy challenges, and enables scientists and engineers to take full advantage of advanced computing hardware and software resources.

  1. An Analysis of Computer-Mediated Communication between Middle School Students and Scientist Role Models: A Pilot Study.

    ERIC Educational Resources Information Center

    Murfin, Brian

    1994-01-01

    Reports on a study of the effectiveness of computer-mediated communication (CMC) in providing African American and female middle school students with scientist role models. Quantitative and qualitative data gathered by analyzing messages students and scientists posted on a shared electronic bulletin board showed that CMC could be an effective…

  2. Scout: high-performance heterogeneous computing made simple

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jablin, James; Mc Cormick, Patrick; Herlihy, Maurice

    2011-01-26

    Researchers must often write their own simulation and analysis software. During this process they simultaneously confront both computational and scientific problems. Current strategies for aiding the generation of performance-oriented programs do not abstract the software development from the science. Furthermore, the problem is becoming increasingly complex and pressing with the continued development of many-core and heterogeneous (CPU-GPU) architectures. To achieve high performance, scientists must expertly navigate both software and hardware. Co-design between computer scientists and research scientists can alleviate but not solve this problem. The science community requires better tools for developing, optimizing, and future-proofing codes, allowing scientists to focus on their research while still achieving high computational performance. Scout is a parallel programming language and extensible compiler framework targeting heterogeneous architectures. It provides the abstraction required to buffer scientists from the constantly-shifting details of hardware while still realizing high performance by encapsulating software and hardware optimization within a compiler framework.

  3. From Both Sides, Now: Librarians Team up with Computer Scientist to Deliver Virtual Computer-Information Literacy Instruction

    ERIC Educational Resources Information Center

    Loesch, Martha Fallahay

    2011-01-01

    Two members of the library faculty at Seton Hall University teamed up with a respected professor of mathematics and computer science, in order to create an online course that introduces information literacy from the perspectives of both the computer scientist and the instruction librarian. This collaboration is unique in that it addresses the…

  4. A whole ecosystem approach to studying climate change in interior Alaska

    USGS Publications Warehouse

    Riggins, Susan; Striegl, Robert G.; McHale, Michael

    2011-01-01

    Yukon River Basin Principal Investigators Workshop; Portland, Oregon, 18-20 January 2011; High latitudes are known to be particularly susceptible to climate warming, leading to an emphasis of field and modeling research on arctic regions. Subarctic and boreal regions such as the Yukon River Basin (YRB) of interior Alaska and western Canada are less well studied, although they encompass large areas that are vulnerable to changes in forest composition, permafrost distribution, and hydrology. There is an urgent need to understand the resiliency and vulnerability of these complex ecosystems as well as their feedbacks to the global climate system. Consequently, U.S. Geological Survey scientists, with other federal agency, university, and private industry partners, are focusing subarctic interdisciplinary studies on the Beaver Creek Wild and Scenic River watershed (http://www.blm.gov/pgdata/content/ak/en/prog/nlcs/beavercrk_nwsr.html) and Yukon Flats National Wildlife Refuge (http://yukonflats.fws.gov/) in the YRB, south and west of Fort Yukon, Alaska. These areas are national treasures of wetlands, lakes, and uplands that support large populations of wildlife and waterfowl and are home to vibrant native Alaskan communities that depend on the area for a subsistence lifestyle.

  5. Enabling drug discovery project decisions with integrated computational chemistry and informatics

    NASA Astrophysics Data System (ADS)

    Tsui, Vickie; Ortwine, Daniel F.; Blaney, Jeffrey M.

    2017-03-01

    Computational chemistry/informatics scientists and software engineers in Genentech Small Molecule Drug Discovery collaborate with experimental scientists in a therapeutic project-centric environment. Our mission is to enable and improve pre-clinical drug discovery design and decisions. Our goal is to deliver timely data, analysis, and modeling to our therapeutic project teams using best-in-class software tools. We describe our strategy, the organization of our group, and our approaches to reach this goal. We conclude with a summary of the interdisciplinary skills required for computational scientists and recommendations for their training.

  6. Provenance-Powered Automatic Workflow Generation and Composition

    NASA Astrophysics Data System (ADS)

    Zhang, J.; Lee, S.; Pan, L.; Lee, T. J.

    2015-12-01

    In recent years, scientists have learned how to codify tools into reusable software modules that can be chained into multi-step executable workflows. Existing scientific workflow tools, created by computer scientists, require domain scientists to meticulously design their multi-step experiments before analyzing data. However, this is oftentimes contradictory to a domain scientist's daily routine of conducting research and exploration. We hope to resolve this dispute. Imagine this: An Earth scientist starts her day applying NASA Jet Propulsion Laboratory (JPL) published climate data processing algorithms over ARGO deep ocean temperature and AMSRE sea surface temperature datasets. Throughout the day, she tunes the algorithm parameters to study various aspects of the data. Suddenly, she notices some interesting results. She then turns to a computer scientist and asks, "can you reproduce my results?" By tracking and reverse engineering her activities, the computer scientist creates a workflow. The Earth scientist can now rerun the workflow to validate her findings, modify the workflow to discover further variations, or publish the workflow to share the knowledge. In this way, we aim to revolutionize computer-supported Earth science. We have developed a prototyping system to realize the aforementioned vision, in the context of service-oriented science. We have studied how Earth scientists conduct service-oriented data analytics research in their daily work, developed a provenance model to record their activities, and developed a technology to automatically generate workflows from recorded user behavior, supporting the adaptability and reuse of these workflows for replicating and improving scientific studies. A data-centric repository infrastructure is established to catch richer provenance to further facilitate collaboration in the science community. We have also established a Petri nets-based verification instrument for provenance-based automatic workflow generation and recommendation.
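    The core idea of this record, replaying a recorded activity log as a re-runnable workflow, can be sketched simply. This is a hypothetical illustration, not the authors' system; Event, build_workflow, and the tool names are assumptions. Each provenance event names a tool, its inputs, and its output, and ordering the events so every input is produced before it is consumed yields an executable workflow.

```python
# Hypothetical sketch: derive an executable ordering from provenance
# events by linking each step's inputs to earlier steps' outputs.
# Inputs prefixed "raw:" are source datasets with no producing step.
from collections import namedtuple

Event = namedtuple("Event", "tool inputs output params")


def build_workflow(events):
    """Order events so every step's inputs exist before the step runs."""
    produced, ordered, pending = set(), [], list(events)
    while pending:
        progress = False
        for e in list(pending):
            if all(i in produced or i.startswith("raw:") for i in e.inputs):
                ordered.append(e)
                produced.add(e.output)
                pending.remove(e)
                progress = True
        if not progress:
            raise ValueError("cyclic or incomplete provenance log")
    return ordered


# An activity log recorded out of execution order, as a user might
# generate it while exploring.
log = [
    Event("merge", ["sst_anom", "argo_temp"], "merged", {}),
    Event("anomaly", ["raw:amsre_sst"], "sst_anom", {"baseline": "1994-2004"}),
    Event("regrid", ["raw:argo"], "argo_temp", {"res": 1.0}),
]
wf = build_workflow(log)
print([e.tool for e in wf])   # ['anomaly', 'regrid', 'merge']
```

    The record's Petri nets-based verification would sit on top of such an ordering step, checking that the recovered workflow has no unreachable or conflicting states before it is recommended for reuse.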

  7. Center for computation and visualization of geometric structures. Final report, 1992 - 1995

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1995-11-01

    This report describes the overall goals and the accomplishments of the Geometry Center of the University of Minnesota, whose mission is to develop, support, and promote computational tools for visualizing geometric structures, for facilitating communication among mathematical and computer scientists and between these scientists and the public at large, and for stimulating research in geometry.

  8. Sediment and erosion control laboratory facility expansion.

    DOT National Transportation Integrated Search

    2016-08-01

    The Sediment and Erosion Control Laboratory (SEC Lab), formerly the Hydraulics, Sedimentation, and : Erosion Control Laboratory, is operated by the Texas A&M Transportation Institute's Environment and : Planning Program. Performance evaluation prog...

  9. What do computer scientists tweet? Analyzing the link-sharing practice on Twitter.

    PubMed

    Schmitt, Marco; Jäschke, Robert

    2017-01-01

    Twitter communication has permeated every sphere of society. Highlighting and sharing small pieces of information with possibly vast audiences or small circles of the interested has some value in almost any aspect of social life. But what is the value exactly for a scientific field? We perform a comprehensive study of computer scientists using Twitter and their tweeting behavior concerning the sharing of web links. Discerning the domains, hosts and individual web pages being tweeted, and the differences between computer scientists and a Twitter sample, enables us to look in depth at the Twitter-based information sharing practices of a scientific community. Additionally, we aim at providing a deeper understanding of the role and impact of altmetrics in computer science and give a glance at the publications mentioned on Twitter that are most relevant for the computer science community. Our results show a link sharing culture that concentrates more heavily on public and professional quality information than the Twitter sample does. The results also show a broad variety in linked sources, and especially in linked publications, with some publications clearly related to community-specific interests of computer scientists, while others have a strong relation to attention mechanisms in social media. This reflects the observation that Twitter is a hybrid form of social media, between an information service and a social network service. Overall, the computer scientists' style of usage leans toward the information-oriented side and, to some degree, toward professional usage. Therefore, altmetrics are of considerable use in analyzing computer science.
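The first analysis step the abstract describes, discerning the hosts behind tweeted links, reduces to parsing each URL and counting hosts. A minimal sketch with the standard library (the function name `host_distribution` and the sample URLs are hypothetical):

```python
from collections import Counter
from urllib.parse import urlparse

def host_distribution(tweeted_urls):
    """Count how often each host appears among links shared in tweets."""
    return Counter(urlparse(u).netloc.lower() for u in tweeted_urls)

urls = [
    "https://arxiv.org/abs/1403.1234",
    "https://dl.acm.org/doi/10.1145/1234567",
    "https://arxiv.org/abs/1502.9876",
    "http://bit.ly/xyz",  # shortened links would need resolving first
]
print(host_distribution(urls).most_common(1))  # [('arxiv.org', 2)]
```

In a real study the shortened links (t.co, bit.ly) would have to be resolved to their final targets before counting, which is where most of the engineering effort lies.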

  10. New Frontiers in Analyzing Dynamic Group Interactions: Bridging Social and Computer Science

    PubMed Central

    Lehmann-Willenbrock, Nale; Hung, Hayley; Keyton, Joann

    2017-01-01

    This special issue on advancing interdisciplinary collaboration between computer scientists and social scientists documents the joint results of the international Lorentz workshop, “Interdisciplinary Insights into Group and Team Dynamics,” which took place in Leiden, The Netherlands, July 2016. An equal number of scholars from social and computer science participated in the workshop and contributed to the papers included in this special issue. In this introduction, we first identify interaction dynamics as the core of group and team models and review how scholars in social and computer science have typically approached behavioral interactions in groups and teams. Next, we identify key challenges for interdisciplinary collaboration between social and computer scientists, and we provide an overview of the different articles in this special issue aimed at addressing these challenges. PMID:29249891

  11. Human computers: the first pioneers of the information age.

    PubMed

    Grier, D A

    2001-03-01

    Before computers were machines, they were people. They were men and women, young and old, well educated and common. They were the workers who convinced scientists that large-scale calculation had value. Long before Presper Eckert and John Mauchly built the ENIAC at the Moore School of Electrical Engineering in Philadelphia, or Maurice Wilkes designed the EDSAC at the University of Cambridge, human computers had created the discipline of computation. They developed numerical methodologies and proved them on practical problems. These human computers were not savants or calculating geniuses. Some knew little more than basic arithmetic. A few were near equals of the scientists they served and, in a different time or place, might have become practicing scientists had they not been barred from a scientific career by their class, education, gender or ethnicity.

  12. New Frontiers in Analyzing Dynamic Group Interactions: Bridging Social and Computer Science.

    PubMed

    Lehmann-Willenbrock, Nale; Hung, Hayley; Keyton, Joann

    2017-10-01

    This special issue on advancing interdisciplinary collaboration between computer scientists and social scientists documents the joint results of the international Lorentz workshop, "Interdisciplinary Insights into Group and Team Dynamics," which took place in Leiden, The Netherlands, July 2016. An equal number of scholars from social and computer science participated in the workshop and contributed to the papers included in this special issue. In this introduction, we first identify interaction dynamics as the core of group and team models and review how scholars in social and computer science have typically approached behavioral interactions in groups and teams. Next, we identify key challenges for interdisciplinary collaboration between social and computer scientists, and we provide an overview of the different articles in this special issue aimed at addressing these challenges.

  13. Child restraint device loaner programs

    DOT National Transportation Integrated Search

    1981-06-01

    The child restraint device (CRD) loaner programs in Tennessee were evaluated. Interviews were conducted with loaner program clients in Memphis, Chattanooga, and Knoxville. Administrators of programs in all three sites also were interviewed. The prog...

  14. Building place-based collaborations to develop high school students' groundwater systems knowledge and decision-making capacity

    NASA Astrophysics Data System (ADS)

    Podrasky, A.; Covitt, B. A.; Woessner, W.

    2017-12-01

    The availability of clean water to support human uses and ecological integrity has become an urgent interest for many scientists, decision makers and citizens. Likewise, as computational capabilities increasingly revolutionize and become integral to the practice of science, technology, engineering and math (STEM) disciplines, the STEM+Computing (STEM+C) Partnerships program seeks to integrate the use of computational approaches in K-12 STEM teaching and learning. The Comp Hydro project, funded by a STEM+C grant from the National Science Foundation, brings together a diverse team of scientists, educators, professionals and citizens at sites in Arizona, Colorado, Maryland and Montana to foster water literacy, as well as computational science literacy, by integrating authentic, place- and data-based learning using physical, mathematical, computational and conceptual models. This multi-state project is currently engaging four teams of six teachers who work during two academic years with educators and scientists at each site. Teams work to develop instructional units specific to their region that integrate hydrologic science and computational modeling. The units, currently being piloted in high school earth and environmental science classes, provide a classroom context to investigate student understanding of how computation is used in Earth systems science. To develop effective science instruction that is rich in place- and data-based learning, effective collaborations between researchers, educators, scientists, professionals and citizens are crucial. In this poster, we focus on project implementation in Montana, where an instructional unit has been developed and is being tested through collaboration among university scientists, researchers and educators, high school teachers, and agency and industry scientists and engineers. In particular, we discuss three characteristics of effective collaborative science education design for developing and implementing place- and data-based science education to support students in developing socio-scientific and computational literacy sufficient for making decisions about real-world issues such as groundwater contamination. These characteristics include that science education experiences are real, responsive/accessible and rigorous.

  15. Scientific Computing Paradigm

    NASA Technical Reports Server (NTRS)

    VanZandt, John

    1994-01-01

    The usage model of supercomputers for scientific applications, such as computational fluid dynamics (CFD), has changed over the years. Scientific visualization has moved scientists away from looking at numbers to looking at three-dimensional images, which capture the meaning of the data. This change has impacted the system models for computing. This report details the model which is used by scientists at NASA's research centers.

  16. Message from the ISCB: 2015 ISCB Accomplishment by a Senior Scientist Award: Cyrus Chothia.

    PubMed

    Fogg, Christiana N; Kovats, Diane E

    2015-07-01

    The International Society for Computational Biology (ISCB; http://www.iscb.org) honors a senior scientist annually for his or her outstanding achievements with the ISCB Accomplishment by a Senior Scientist Award. This award recognizes a leader in the field of computational biology for his or her significant contributions to the community through research, service and education. Cyrus Chothia, an emeritus scientist at the Medical Research Council Laboratory of Molecular Biology and emeritus fellow of Wolfson College at Cambridge University, England, is the 2015 ISCB Accomplishment by a Senior Scientist Award winner. Chothia was selected by the Awards Committee, which is chaired by Dr Bonnie Berger of the Massachusetts Institute of Technology. He will receive his award and deliver a keynote presentation at the 2015 Intelligent Systems for Molecular Biology/European Conference on Computational Biology in Dublin, Ireland, in July 2015.

  17. Enabling Earth Science: The Facilities and People of the NCCS

    NASA Technical Reports Server (NTRS)

    2002-01-01

    The NCCS's mass data storage system allows scientists to store and manage the vast amounts of data generated by these computations, and its high-speed network connections allow the data to be accessed quickly from the NCCS archives. Some NCCS users perform studies that are directly related to their ability to run computationally expensive and data-intensive simulations. Because the number and type of questions scientists research often are limited by computing power, the NCCS continually pursues the latest technologies in computing, mass storage, and networking. Just as important as the processors, tapes, and routers of the NCCS are the personnel who administer this hardware, create and manage accounts, maintain security, and assist the scientists, often working one on one with them.

  18. Big Data: An Opportunity for Collaboration with Computer Scientists on Data-Driven Science

    NASA Astrophysics Data System (ADS)

    Baru, C.

    2014-12-01

    Big data technologies are evolving rapidly, driven by the need to manage ever increasing amounts of historical data; process relentless streams of human- and machine-generated data; and integrate data of heterogeneous structure from extremely heterogeneous sources of information. Big data is inherently an application-driven problem. Developing the right technologies requires an understanding of the application domain. However, an intriguing aspect of this phenomenon is that the availability of the data itself enables new applications not previously conceived of! In this talk, we will discuss how the big data phenomenon creates an imperative for collaboration among domain scientists (in this case, geoscientists) and computer scientists. Domain scientists provide the application requirements as well as insights about the data involved, while computer scientists help assess whether problems can be solved with currently available technologies or require adaptation of existing technologies and/or development of new technologies. The synergy can create vibrant collaborations potentially leading to new science insights as well as development of new data technologies and systems. The area of interface between geosciences and computer science, also referred to as geoinformatics, is, we believe, a fertile area for interdisciplinary research.

  19. Statistical evaluation of blood alcohol measurements

    DOT National Transportation Integrated Search

    1981-10-01

    The U.S. Department of Transportation, National Highway Traffic Safety Administration (NHTSA) has instituted a voluntary program to evaluate the proficiency of laboratories measuring the amount of alcohol in blood. In this report, data from that prog...

  20. CREASE 6.0 Catalog of Resources for Education in Ada and Software Engineering

    DTIC Science & Technology

    1992-02-01

    Programming, Software Engineering, Strong Typing, Tasking. Audience: Computer Scientists. Textbook(s): Barnes, J. Programming in Ada, 3rd ed. Addison-Wesley... Ada. Concepts: Abstract Data Types, Management Overview, Package, Real-Time Programming, Tasking. Audience: Computer Scientists. Textbook(s): Barnes, J

  1. Parallel computing in genomic research: advances and applications

    PubMed Central

    Ocaña, Kary; de Oliveira, Daniel

    2015-01-01

    Today’s genomic experiments have to process the so-called “biological big data” that is now reaching the size of terabytes and petabytes. To process this huge amount of data, scientists may require weeks or months if they use their own workstations. Parallelism techniques and high-performance computing (HPC) environments can be applied to reduce the total processing time and to ease the management, treatment, and analyses of this data. However, running bioinformatics experiments in HPC environments such as clouds, grids, clusters, and graphics processing units requires the expertise of scientists to integrate computational, biological, and mathematical techniques and technologies. Several solutions have already been proposed to allow scientists to process their genomic experiments using HPC capabilities and parallelism techniques. This article presents a systematic review of the literature surveying the most recently published research involving genomics and parallel computing. Our objective is to gather the main characteristics, benefits, and challenges that can be considered by scientists when running their genomic experiments to benefit from parallelism techniques and HPC capabilities. PMID:26604801

  2. Parallel computing in genomic research: advances and applications.

    PubMed

    Ocaña, Kary; de Oliveira, Daniel

    2015-01-01

    Today's genomic experiments have to process the so-called "biological big data" that is now reaching the size of terabytes and petabytes. To process this huge amount of data, scientists may require weeks or months if they use their own workstations. Parallelism techniques and high-performance computing (HPC) environments can be applied to reduce the total processing time and to ease the management, treatment, and analyses of this data. However, running bioinformatics experiments in HPC environments such as clouds, grids, clusters, and graphics processing units requires the expertise of scientists to integrate computational, biological, and mathematical techniques and technologies. Several solutions have already been proposed to allow scientists to process their genomic experiments using HPC capabilities and parallelism techniques. This article presents a systematic review of the literature surveying the most recently published research involving genomics and parallel computing. Our objective is to gather the main characteristics, benefits, and challenges that can be considered by scientists when running their genomic experiments to benefit from parallelism techniques and HPC capabilities.
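The parallelism the review surveys is often embarrassingly parallel at the per-record level: the same computation applied independently to each sequence. A minimal sketch using Python's standard `multiprocessing` module (GC content is used here only as a stand-in for a real per-sequence analysis):

```python
from multiprocessing import Pool

def gc_content(seq):
    """Fraction of G/C bases in one sequence -- an independent,
    per-record computation typical of genomic pipelines."""
    return (seq.count("G") + seq.count("C")) / len(seq)

if __name__ == "__main__":
    sequences = ["ATGCGC", "AATT", "GGGCCC", "ATGC"]
    # Worker count would scale to the cores of a cluster or cloud node.
    with Pool(processes=2) as pool:
        results = pool.map(gc_content, sequences)
    print(results)  # [0.666..., 0.0, 1.0, 0.5]
```

The same map-over-records shape is what cluster schedulers and cloud batch services parallelize at larger scale; the scientist's job is mostly to make each record's computation independent.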

  3. An economic and financial exploratory

    NASA Astrophysics Data System (ADS)

    Cincotti, S.; Sornette, D.; Treleaven, P.; Battiston, S.; Caldarelli, G.; Hommes, C.; Kirman, A.

    2012-11-01

    This paper describes the vision of a European Exploratory for economics and finance using an interdisciplinary consortium of economists, natural scientists, computer scientists and engineers, who will combine their expertise to address the enormous challenges of the 21st century. This academic public facility is intended for economic modelling, investigating all aspects of risk and stability, improving financial technology, and evaluating proposed regulatory and taxation changes. The European Exploratory for economics and finance will be constituted as a network of infrastructure, observatories, data repositories, services and facilities, and will foster the creation of a new cross-disciplinary research community of social scientists, complexity scientists and computing (ICT) scientists to collaborate in investigating major issues in economics and finance. It is also considered a cradle for training and collaboration with the private sector to spur spin-offs and job creation in Europe in the finance and economic sectors. The Exploratory will allow social scientists and regulators, as well as policy makers and the private sector, to conduct realistic investigations with real economic, financial and social data. The Exploratory will (i) continuously monitor and evaluate the status of the economies of countries in their various components, (ii) use, extend and develop a large variety of methods, including data mining, process mining, computational and artificial intelligence, and other computer science and complexity science techniques, coupled with economic theory and econometrics, and (iii) provide the framework and infrastructure to perform what-if analysis, scenario evaluations and computational, laboratory, field and web experiments to inform decision makers and help develop innovative policy, market and regulation designs.

  4. Award-Winning Animation Helps Scientists See Nature at Work | News | NREL

    Science.gov Websites

    August 8, 2008. A computer-aided image combines a photo of a man with a three-dimensional, computer-generated image. "It is very difficult to parallelize the process to run even on a huge computer,"

  5. From Years of Work in Psychology and Computer Science, Scientists Build Theories of Thinking and Learning.

    ERIC Educational Resources Information Center

    Wheeler, David L.

    1988-01-01

    Scientists feel that progress in artificial intelligence and the availability of thousands of experimental results make this the right time to build and test theories on how people think and learn, using the computer to model minds. (MSE)

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCaskey, Alexander J.

    Hybrid programming models for beyond-CMOS technologies will prove critical for integrating new computing technologies alongside our existing infrastructure. Unfortunately, the software infrastructure required to enable this is largely lacking. XACC is a programming framework for extreme-scale, post-exascale accelerator architectures that integrates alongside existing conventional applications. It is a pluggable framework for programming languages developed for next-generation computing hardware architectures such as quantum and neuromorphic computing. It lets computational scientists efficiently offload classically intractable work to attached accelerators through user-friendly Kernel definitions. XACC makes post-exascale hybrid programming approachable for domain computational scientists.

  7. Evaluation of the Oregon DMV driver improvement program.

    DOT National Transportation Integrated Search

    2007-04-01

    This report provides an evaluation of the Oregon Department of Transportation Driver and Motor Vehicle (DMV) Services Driver Improvement Program (DIP), which was substantially changed in 2002. Prior to 2002, the DIP was organized around four prog...

  8. Effectiveness of Oregon's teen licensing program.

    DOT National Transportation Integrated Search

    2008-06-01

    Significant changes in Oregon's teen licensing laws went into effect on March 1, 2000. The new laws expanded the provisional driving license program which had been in effect since October 1989 and established a graduated driver licensing (GDL) prog...

  9. MODELING TREE LEVEL PROCESSES

    EPA Science Inventory

    An overview of three main types of simulation approach (explanatory, abstraction, and estimation) is presented, along with a discussion of their capabilities, limitations, and the steps required for their validation. A process model being developed through the Forest Response Prog...

  10. Iran Sanctions

    DTIC Science & Technology

    2016-05-18

    Manufacturing Group (Iran, missile prog.) Aerospace Industries Organization (AIO) (Iran) September 2007 Korea Mining and Development Corp. (N. Korea... Vitaly Sokolenko (general manager of Ferland) April 29, 2014 (for connections to deceptive oil dealings for Iran) Saeed Al Aqili (co-owner of Al

  11. NCORP Gets Underway

    Cancer.gov

    NCI has awarded 53 new 5-year grants to researchers across the country to conduct multi-site cancer clinical trials and cancer care delivery research studies in their communities. The grants are being awarded under the NCI Community Oncology Research Prog

  12. Accuracy Of LTPP Traffic Loading Estimates

    DOT National Transportation Integrated Search

    1998-07-01

    The accuracy and reliability of traffic load estimates are key to determining a pavement's life expectancy. To better understand the variability of traffic loading rates and its effect on the accuracy of the Long Term Pavement Performance (LTPP) prog...

  13. "Ask Argonne" - Charlie Catlett, Computer Scientist, Part 2

    ScienceCinema

    Catlett, Charlie

    2018-02-14

    A few weeks back, computer scientist Charlie Catlett talked a bit about the work he does and invited questions from the public during Part 1 of his "Ask Argonne" video set (http://bit.ly/1joBtzk). In Part 2, he answers some of the questions that were submitted. Enjoy!

  14. "Ask Argonne" - Charlie Catlett, Computer Scientist, Part 2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Catlett, Charlie

    2014-06-17

    A few weeks back, computer scientist Charlie Catlett talked a bit about the work he does and invited questions from the public during Part 1 of his "Ask Argonne" video set (http://bit.ly/1joBtzk). In Part 2, he answers some of the questions that were submitted. Enjoy!

  15. From Lived Experiences to Game Creation: How Scaffolding Supports Elementary School Students Learning Computer Science Principles in an After School Setting

    ERIC Educational Resources Information Center

    Her Many Horses, Ian

    2016-01-01

    The world, and especially our own country, is in dire need of a larger and more diverse population of computer scientists. While many organizations have approached this problem of too few computer scientists in various ways, a promising, and I believe necessary, path is to expose elementary students to authentic practices of the discipline.…

  16. Effect of Sex Differences on Brain Mitochondrial Function and Its Suppression by Ovariectomy and in Aged Mice.

    PubMed

    Gaignard, Pauline; Savouroux, Stéphane; Liere, Philippe; Pianos, Antoine; Thérond, Patrice; Schumacher, Michael; Slama, Abdelhamid; Guennoun, Rachida

    2015-08-01

    Sex steroids regulate brain function in both normal and pathological states. Mitochondria are an essential target of steroids, as demonstrated by the experimental administration of 17β-estradiol or progesterone (PROG) to ovariectomized female rodents, but the influence of endogenous sex steroids remains understudied. To address this issue, mitochondrial oxidative stress, the oxidative phosphorylation system, and brain steroid levels were analyzed under 3 different experimental sets of endocrine conditions. The first set was designed to study steroid-mediated sex differences in young male and female mice, intact and after gonadectomy. The second set concerned young female mice at 3 time points of the estrous cycle in order to analyze the influence of transient variations in steroid levels. The third set involved the evaluation of the effects of a permanent decrease in gonadal steroids in aged male and female mice. Our results show that young adult females have lower oxidative stress and a higher reduced nicotinamide adenine dinucleotide (NADH)-linked respiration rate, which is related to a higher pyruvate dehydrogenase complex activity as compared with young adult males. This sex difference did not depend on phases of the estrous cycle, was suppressed by ovariectomy but not by orchidectomy, and no longer existed in aged mice. Concomitant analysis of brain steroids showed that pregnenolone and PROG brain levels were higher in females during the reproductive period than in males and decreased with aging in females. These findings suggest that the major male/female differences in brain pregnenolone and PROG levels may contribute to the sex differences observed in brain mitochondrial function.

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ayer, Vidya M.; Miguez, Sheila; Toby, Brian H.

    Scientists have been central to the historical development of the computer industry, but the importance of software only continues to grow for all areas of scientific research and in particular for powder diffraction. Knowing how to program a computer is a basic and useful skill for scientists. The article introduces the three types of programming languages and why scripting languages are now preferred for scientists. Of them, the authors assert Python is the most useful and easiest to learn. Python is introduced. Also presented is an overview of a few of the many add-on packages available to extend the capabilities of Python, for example, for numerical computations, scientific graphics and graphical user interface programming.
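As a flavor of why such add-on packages suit scientific work, a few lines of Python with NumPy replace an explicit loop over data points. The signal below is simulated, not from the article: a single Gaussian peak on a sloping background, loosely in the style of a powder-diffraction pattern.

```python
import numpy as np

# Simulated diffraction-style signal: a Gaussian peak plus a linear background.
two_theta = np.linspace(10.0, 80.0, 701)            # scattering angle, degrees
peak = 100.0 * np.exp(-((two_theta - 44.7) / 0.3) ** 2)
background = 5.0 + 0.02 * two_theta
intensity = peak + background

# Vectorized analysis -- no explicit loop over the 701 points.
peak_position = two_theta[np.argmax(intensity)]
print(f"peak at {peak_position:.1f} degrees")  # peak at 44.7 degrees
```

The whole-array operations are what make scripting languages productive for scientists: the code reads like the mathematics, and the loops run in compiled library code.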

  18. Effectiveness of Oregon's teen licensing program : final report.

    DOT National Transportation Integrated Search

    2008-06-01

    Significant changes in Oregon's teen licensing laws went into effect on March 1, 2000. The new laws expanded the provisional driving license program which had been in effect since October 1989 and established a graduated driver licensing (GDL) prog...

  19. Siletz River nutrients: Effects of biosolids application

    EPA Science Inventory

    Stream water nutrients were measured in the Siletz River, Oregon, with the goal of comparing dissolved nutrient concentrations, primarily the nitrogenous nutrients nitrate and ammonium, with previously collected data for the Yaquina and Alsea Rivers for the nutrient criteria prog...

  20. SITE-SPECIFIC DIAGNOSTIC TOOLS

    EPA Science Inventory

    US EPA's Office of Water is proposing Combined Assessment and Listing Methods (CALM) to meet reporting requirements under both Sections 305(b) and 303(d) for chemical and nonchemical stressors in the nation's waterbodies. Current Environmental Monitoring and Assessment Prog...

  1. Satellite Remote Sensing for Monitoring and Assessment

    EPA Science Inventory

    Remote sensing technology has the potential to enhance the engagement of communities and managers in the implementation and performance of best management practices. This presentation will use examples from U.S. numeric criteria development and state water quality monitoring prog...

  2. Optical Limiting Materials Based on Gold Nanoparticles

    DTIC Science & Technology

    2014-04-30

    of the electromagnetic spectrum. 2. Functionalization of the surface of the gold nanoparticles with selected organic and inorganic materials, with...F. A Review of Optical Limiting Mechanisms and Devices Using Organics, Fullerenes, Semiconductors and Other Materials. Prog. Quant. Electr. 1993

  3. Long live the Data Scientist, but can he/she persist?

    NASA Astrophysics Data System (ADS)

    Wyborn, L. A.

    2011-12-01

    In recent years the fourth paradigm of data-intensive science has slowly taken hold, as the increased capacity of instruments and an increasing number of instruments (in particular sensor networks) have changed how fundamental research is undertaken. Most modern scientific research involves capturing digital data directly from instruments, processing it by computers, storing the results on computers, and publishing only a small fraction of the data in hard-copy publications. At the same time, the rapid increase in the capacity of supercomputers, particularly at petascale, means that far larger data sets can be analysed, and to greater resolution, than previously possible. The new cloud computing paradigm, which allows distributed data, software and compute resources to be linked by seamless workflows, is creating new opportunities in the processing of high volumes of data for an increasingly large number of researchers. However, to take full advantage of these compute resources, data sets for analysis have to be aggregated from multiple sources to create high performance data sets. These new technology developments require that scientists become more skilled in data management and/or have a higher degree of computer literacy. In almost every science discipline there is now an X-informatics branch and a computational X branch (e.g., Geoinformatics and Computational Geoscience): both require a new breed of researcher with skills in both the science fundamentals and some ICT aspects (computer programming, database design and development, data curation, software engineering). People who can operate in both science and ICT are increasingly known as 'data scientists'. 
Data scientists are a critical element of many large scale earth and space science informatics projects, particularly those that are tackling current grand challenges at an international level on issues such as climate change, hazard prediction and sustainable development of our natural resources. These projects by their very nature require the integration of multiple digital data sets from multiple sources. Often the preparation of the data for computational analysis can take months and requires painstaking attention to detail to ensure that anomalies identified are real and not just artefacts of the data preparation and/or the computational analysis. Although data scientists are increasingly vital to successful data-intensive earth and space science projects, unless they are recognised for their capabilities in both the science and the computational domains, they are likely to migrate to either a science role or an ICT role as their careers advance. Most reward and recognition systems do not recognise those with skills in both; hence, getting trained data scientists to persist beyond one or two projects can be a challenge. Those data scientists who persist in the profession are characteristically committed and enthusiastic people who have the support of their organisations to take on this role. They also tend to be people who share developments and are critical to the success of the open source software movement. However, the fact remains that the survival of the data scientist as a species is threatened unless something is done to recognise their invaluable contributions to the new fourth paradigm of science.

  4. Quasiperiodicity route to chaos in cardiac conduction model

    NASA Astrophysics Data System (ADS)

    Quiroz-Juárez, M. A.; Vázquez-Medina, R.; Ryzhii, E.; Ryzhii, M.; Aragón, J. L.

    2017-01-01

    It has been suggested that cardiac arrhythmias are instances of chaos; in particular, that ventricular fibrillation is a form of spatio-temporal chaos that arises from normal rhythm through a quasi-periodicity or Ruelle-Takens-Newhouse route to chaos. In this work, we modify the heterogeneous oscillator model of the cardiac conduction system proposed in Ref. [Ryzhii E, Ryzhii M. A heterogeneous coupled oscillator model for simulation of ECG signals. Comput Meth Prog Bio 2014;117(1):40-49. doi:10.1016/j.cmpb.2014.04.009.] by including an ectopic pacemaker that stimulates the ventricular muscle, in order to model arrhythmias. With this modification, the transition from normal rhythm to ventricular fibrillation is controlled by a single parameter. We show that this transition follows the torus (quasi-periodicity) route to chaos, as verified using numerical tools such as the power spectrum and the largest Lyapunov exponent.
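The power-spectrum diagnostic mentioned in the abstract can be illustrated on a synthetic signal: on a torus (quasi-periodicity), the spectrum shows sharp lines at two incommensurate frequencies, whereas chaos would smear it into broadband noise. This sketch is illustrative only and is not the authors' cardiac model; the frequencies are arbitrary.

```python
import numpy as np

fs = 1000.0                              # sampling rate, Hz
t = np.arange(0, 8.0, 1.0 / fs)
# Two incommensurate frequencies (ratio 1 : sqrt(2)), as on a 2-torus.
f1, f2 = 50.0, 50.0 * np.sqrt(2.0)
x = np.sin(2 * np.pi * f1 * t) + 0.7 * np.sin(2 * np.pi * f2 * t)

# Power spectrum via the real FFT; sharp lines indicate quasi-periodicity.
power = np.abs(np.fft.rfft(x)) ** 2
freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)

# The two dominant spectral lines recover f1 and f2 (to bin resolution).
top = freqs[np.argsort(power)[-2:]]
print(sorted(float(f) for f in top))
```

For a chaotic regime the same spectrum would lose its discrete lines, which is why the power spectrum is used alongside the largest Lyapunov exponent as a route-to-chaos check.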

  5. How to Cloud for Earth Scientists: An Introduction

    NASA Technical Reports Server (NTRS)

    Lynnes, Chris

    2018-01-01

    This presentation is a tutorial on getting started with cloud computing for working with Earth Observation datasets. We first discuss some of the main advantages that cloud computing can provide to the Earth scientist: copious processing power, immense and affordable data storage, and rapid startup time. We also discuss some of the challenges of getting the most out of cloud computing: re-organizing the way data are analyzed, handling node failures, and attending…

  6. us9805_dni

    Science.gov Websites


    Monthly and annual average solar resource potential: Solar Resource, Direct Normal.

  7. MTBE REMOVAL FROM DRINKING WATER - PHASE I

    EPA Science Inventory

    The 1990 Federal Clean Air Act mandated the incorporation of oxygenates into gasoline in ozone and carbon monoxide nonattainment areas. Methyl tertiary butyl ether (MTBE) is the oxygenate of choice due to economic and supply considerations. Despite federal and state prog...

  8. Transportation improvement program : Richland, Ohio : fiscal year 1997-2000

    DOT National Transportation Integrated Search

    1996-06-01

    As part of the Urban Transportation Planning Process, under the Federal Planning regulations (Title 23 U.S.C. and Title 49 U.S.C.), the Metropolitan Planning Organization (MPO) is required to develop and keep current a Transportation Improvement Prog...

  9. Comparative Analysis Of River Conservation In The United States And South Africa

    EPA Science Inventory

    Both the United States and South Africa are recognized for their strong and innovative approaches to the conservation of river ecosystems. These national programs possess similar driving legislation and ecoregional classification schemes supported by comprehensive monitoring prog...

  10. LASP-01: Distribution of Mouse Embryonic Stem Cells Expressing MicroRNAs | Frederick National Laboratory for Cancer Research

    Cancer.gov

    The Laboratory Animal Sciences Program manages the expansion, processing, and distribution of1,501 genetically engineered mouse embryonic stem cell (mESC) linesharboring conditional microRNA transgenes. The Laboratory Animal Sciences Prog

  11. Minnesota urban partnership agreement national evaluation : surveys, interviews, and focus groups test plan.

    DOT National Transportation Integrated Search

    2009-11-17

    This report presents the test plan for developing, conducting, and analyzing surveys, interviews, and focus groups for evaluating the Minnesota Urban Partnership Agreement (UPA) under the United States Department of Transportation (U.S. DOT) UPA Prog...

  12. The IT in Secondary Science Book. A Compendium of Ideas for Using Computers and Teaching Science.

    ERIC Educational Resources Information Center

    Frost, Roger

    Scientists need to measure and communicate, to handle information, and model ideas. In essence, they need to process information. Young scientists have the same needs. Computers have become a tremendously important addition to the processing of information through database use, graphing and modeling and also in the collection of information…

  13. 75 FR 64996 - Takes of Marine Mammals Incidental to Specified Activities; Marine Geophysical Survey in the...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-21

    ... cruises. A laptop computer is located on the observer platform for ease of data entry. The computer is... lines, the receiving systems will receive the returning acoustic signals. The study (e.g., equipment...-board assistance by the scientists who have proposed the study. The Chief Scientist is Dr. Franco...

  14. What do computer scientists tweet? Analyzing the link-sharing practice on Twitter

    PubMed Central

    Schmitt, Marco

    2017-01-01

    Twitter communication has permeated every sphere of society. Highlighting and sharing small pieces of information, whether with possibly vast audiences or with small circles of the interested, has value in almost any aspect of social life. But what exactly is that value for a scientific field? We perform a comprehensive study of computer scientists using Twitter and of their tweeting behavior concerning the sharing of web links. Discerning the domains, hosts, and individual web pages being tweeted, and the differences between computer scientists and a Twitter sample, enables us to look in depth at the Twitter-based information-sharing practices of a scientific community. Additionally, we aim to provide a deeper understanding of the role and impact of altmetrics in computer science and offer a glance at the publications mentioned on Twitter that are most relevant for the computer science community. Our results show a link-sharing culture that concentrates more heavily on public and professional quality information than the Twitter sample does. The results also show a broad variety in linked sources, and especially in linked publications: some publications are clearly related to community-specific interests of computer scientists, while others relate strongly to attention mechanisms in social media. This reflects the observation that Twitter is a hybrid form of social media, somewhere between an information service and a social network service. Overall, computer scientists' style of usage leans toward the information-oriented side and, to some degree, toward professional usage. Altmetrics are therefore of considerable use in analyzing computer science. PMID:28636619

  15. Experimental use of geogrids as an alternative to gravel placement : fourth year interim and final report, October 2006.

    DOT National Transportation Integrated Search

    2006-10-01

    With the ongoing demand for improved infrastructure, the Maine Department of Transportation : (MaineDOT) continues to identify and evaluate new and innovative construction methods and materials. : The Departments Collector Highway Improvement Prog...

  16. GLOBAL TRANSITION TO SUSTAINABLE DEVELOPMENT

    EPA Science Inventory

    Global transition to sustainable development is possible but many obstacles lie in the way and it will require acts of political will on the part of both the developed and developing nations to become a reality. In this paper, sustainable development is defined as continuous prog...

  17. An Open Letter to the Cancer Community Regarding Community Clinical Trials

    Cancer.gov

    The National Cancer Institute (NCI) is in the process of combining its two community-based research networks to create a single network that builds on the strengths of the Community Clinical Oncology Program/Minority-Based Community Clinical Oncology Prog

  18. SMALL DRINKING WATER SYSTEM PEER REVIEW PROGRAM

    EPA Science Inventory

    The United South and Eastern Tribes, Inc., which is made up of twenty-four (24) tribes, ranging in location, geographically, from Maine to Texas, AND three (3) states, Mississippi, Kentucky, and Georgia, participated in a program, "The Small Drinking Water System Peer Review Prog...

  19. Results of Special Accident Study Teams/ASAP Coordination Conference

    DOT National Transportation Integrated Search

    1974-07-01

    The second Special Accident Study Teams / ASAP Coordination Conference was held in Washington, D.C. on June 12-13, 1974 to continue coordination of activities and to report recent findings. The objectives of the conference were: (1) To report on prog...

  20. Analysis of severe storm data

    NASA Technical Reports Server (NTRS)

    Hickey, J. S.

    1983-01-01

    The Mesoscale Analysis and Space Sensor (MASS) Data Management and Analysis System, developed by Atsuko Computing International (ACI) on the MASS HP-1000 Computer System within the Systems Dynamics Laboratory of the Marshall Space Flight Center, is described. The MASS Data Management and Analysis System was successfully implemented and utilized daily by atmospheric scientists to graphically display and analyze large volumes of conventional and satellite-derived meteorological data. The scientists can interactively process various atmospheric data types (Sounding, Single Level, Grid, and Image) using the MASS (AVE80) software, whose modules share common data and user inputs, thereby reducing overhead, optimizing execution time, and enhancing the flexibility, usability, and understandability of the total system/software capabilities. In addition, ACI installed eight APPLE III graphics/imaging computer terminals in individual scientists' offices and integrated them into the MASS HP-1000 Computer System, significantly enhancing the overall research environment.

  1. NASA/DOD Aerospace Knowledge Diffusion Research Project. Paper 19: Computer and information technology and aerospace knowledge diffusion

    NASA Technical Reports Server (NTRS)

    Pinelli, Thomas E.; Kennedy, John M.; Barclay, Rebecca O.; Bishop, Ann P.

    1992-01-01

    To remain a world leader in aerospace, the US must improve and maintain the professional competency of its engineers and scientists, increase the research and development (R&D) knowledge base, improve productivity, and maximize the integration of recent technological developments into the R&D process. How well these objectives are met, and at what cost, depends on a variety of factors, but largely on the ability of US aerospace engineers and scientists to acquire and process the results of federally funded R&D. The Federal Government's commitment to high speed computing and networking systems presupposes that computer and information technology will play a major role in the aerospace knowledge diffusion process. However, we know little about information technology needs, uses, and problems within the aerospace knowledge diffusion process. The use of computer and information technology by US aerospace engineers and scientists in academia, government, and industry is reported.

  2. Introduction to the Space Physics Analysis Network (SPAN)

    NASA Technical Reports Server (NTRS)

    Green, J. L. (Editor); Peters, D. J. (Editor)

    1985-01-01

    The Space Physics Analysis Network (SPAN) is emerging as a viable method for solving an immediate communication problem for the space scientist. SPAN provides low-rate communication capability with co-investigators and colleagues, and access to space science data bases and computational facilities. SPAN utilizes up-to-date hardware and software for computer-to-computer communications, allowing binary file transfer and remote log-on capability to over 25 nationwide space science computer systems. SPAN is neither discipline nor mission dependent, with participation from scientists in such fields as magnetospheric, ionospheric, planetary, and solar physics. Basic information on the network and its use is provided. It is anticipated that SPAN will grow rapidly over the next few years, not only in the number of network nodes: as scientists become more proficient in the use of telescience, more capability will be needed to satisfy their demands.

  3. Anticipation and consumption of food each increase the concentration of neuroactive steroids in rat brain and plasma.

    PubMed

    Pisu, Maria Giuseppina; Floris, Ivan; Maciocco, Elisabetta; Serra, Mariangela; Biggio, Giovanni

    2006-09-01

    Stressful stimuli and anxiogenic drugs increase the plasma and brain concentrations of neuroactive steroids. Moreover, in rats trained to consume their daily meal during a fixed period, the anticipation of food is associated with changes in the function of various neurotransmitter systems. We have now evaluated the effects of anticipation and consumption of food in such trained rats on the plasma and brain concentrations of 3alpha-hydroxy-5alpha-pregnan-20-one (3alpha,5alpha-TH PROG) and 3alpha,21-dihydroxy-5alpha-pregnan-20-one (3alpha,5alpha-TH DOC), two potent endogenous positive modulators of type A receptors for gamma-aminobutyric acid (GABA). The abundance of these neuroactive steroids was increased in both the cerebral cortex and plasma of the rats during both food anticipation and consumption. In contrast, the concentration of their precursor, progesterone, was increased in the brain only during food consumption, whereas it was increased in plasma only during food anticipation. Intraperitoneal administration of the selective agonist abecarnil (0.1 mg/kg) 40 min before food presentation prevented the increase in the brain levels of 3alpha,5alpha-TH PROG and 3alpha,5alpha-TH DOC during food anticipation but not that associated with consumption. The change in emotional state associated with food anticipation may thus result in an increase in the plasma and brain levels of 3alpha,5alpha-TH PROG and 3alpha,5alpha-TH DOC in a manner sensitive to the activation of GABA(A) receptor-mediated neurotransmission. A different mechanism, insensitive to activation of such transmission, may underlie the changes in the concentrations of these neuroactive steroids during food consumption.

  4. Dopaminergic modulation of effort-related choice behavior as assessed by a progressive ratio chow feeding choice task: pharmacological studies and the role of individual differences.

    PubMed

    Randall, Patrick A; Pardo, Marta; Nunes, Eric J; López Cruz, Laura; Vemuri, V Kiran; Makriyannis, Alex; Baqi, Younis; Müller, Christa E; Correa, Mercè; Salamone, John D

    2012-01-01

    Mesolimbic dopamine (DA) is involved in behavioral activation and effort-related processes. Rats with impaired DA transmission reallocate their instrumental behavior away from food-reinforced tasks with high response requirements, and instead select less effortful food-seeking behaviors. In the present study, the effects of several drug treatments were assessed using a progressive ratio (PROG)/chow feeding concurrent choice task. With this task, rats can lever press on a PROG schedule reinforced by a preferred high-carbohydrate food pellet, or alternatively approach and consume the less-preferred but concurrently available laboratory chow. Rats pass through each ratio level 15 times, after which the ratio requirement is incremented by one additional response. The DA D(2) antagonist haloperidol (0.025-0.1 mg/kg) reduced the number of lever presses and the highest ratio achieved but did not reduce chow intake. In contrast, the adenosine A(2A) antagonist MSX-3 increased lever presses and the highest ratio achieved, but decreased chow consumption. The cannabinoid CB1 inverse agonist and putative appetite suppressant AM251 decreased lever presses, the highest ratio achieved, and chow intake; this effect was similar to that produced by pre-feeding. Furthermore, DA-related signal transduction activity (pDARPP-32(Thr34) expression) was greater in the nucleus accumbens core of high responders (rats with high lever-pressing output) than in low responders. Thus, the effects of DA antagonism differed greatly from those produced by pre-feeding or reduced CB1 transmission, and it appears unlikely that haloperidol reduces PROG responding because of a general reduction in primary food motivation or in the unconditioned reinforcing properties of food. Furthermore, accumbens core signal transduction activity is related to individual differences in work output.
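    The progressive ratio bookkeeping described above (each ratio level repeated 15 times before the requirement increments by one response) makes the cumulative workload easy to compute. The helper below is a hypothetical sketch, assuming the schedule starts at a requirement of one press; the abstract does not state the starting ratio.

```python
def cumulative_presses(highest_ratio, reps_per_level=15, start_ratio=1):
    """Total lever presses needed to complete every ratio level up to and
    including `highest_ratio`, with each level repeated `reps_per_level`
    times before the requirement grows by one response.
    (start_ratio=1 is an assumption, not stated in the abstract.)
    """
    return sum(r * reps_per_level
               for r in range(start_ratio, highest_ratio + 1))

# Completing levels 1 through 4 costs 15*(1+2+3+4) = 150 presses.
print(cumulative_presses(4))
```

    This quadratic growth in cumulative effort is what makes "highest ratio achieved" a sensitive measure of the animal's willingness to work.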

  5. A two-model hydrologic ensemble prediction of hydrograph: case study from the upper Nysa Klodzka river basin (SW Poland)

    NASA Astrophysics Data System (ADS)

    Niedzielski, Tomasz; Mizinski, Bartlomiej

    2016-04-01

    The HydroProg system was developed within research project no. 2011/01/D/ST10/04171 of the National Science Centre of Poland and steadily produces multimodel ensemble predictions of hydrograph in real time. Although six ensemble members are available at present, the longest record of predictions and their statistics exists for two data-based models (uni- and multivariate autoregressive models). We therefore consider 3-hour predictions of water levels, with lead times ranging from 15 to 180 minutes, computed every 15 minutes since August 2013 for the Nysa Klodzka basin (SW Poland) using the two approaches and their two-model ensemble. Since the launch of the HydroProg system there have been 12 high-flow episodes, and the objective of this work is to present the performance of the two-model ensemble in forecasting these events. For the sake of brevity, we limit our investigation to a single gauge on the Nysa Klodzka river in the town of Klodzko, centrally located in the studied basin. We identified certain regular scenarios of how the models perform in predicting the high flows in Klodzko. At the initial phase of a high flow, well before the rising limb of the hydrograph, the two-model ensemble is found to provide the most skilful prognoses of water levels. While forecasting the rising limb itself, however, either the two-model solution or the vector autoregressive model offers the best predictive performance. In addition, it is hypothesized that as the rising limb develops, vector autoregression becomes the most skilful of the scrutinized approaches. Our simple two-model exercise confirms that multimodel hydrologic ensemble predictions cannot be treated as universal solutions suitable for forecasting an entire high-flow event; their superior performance may hold only for certain phases of a high flow.
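    A minimal sketch of the kind of two-model combination described above: fit a univariate AR(1) model by least squares, obtain a second model's forecast over the same horizon, and average the two paths with equal weights. This is an illustration only; HydroProg's actual models, weighting, and lead-time handling are not specified here, and all names are hypothetical.

```python
import numpy as np

def ar1_forecast(series, horizon):
    """Least-squares fit of x[t+1] = c + phi * x[t], iterated `horizon` steps."""
    s = np.asarray(series, dtype=float)
    x, y = s[:-1], s[1:]
    phi = np.dot(x - x.mean(), y - y.mean()) / np.dot(x - x.mean(), x - x.mean())
    c = y.mean() - phi * x.mean()
    out, last = [], s[-1]
    for _ in range(horizon):
        last = c + phi * last
        out.append(last)
    return np.array(out)

def two_model_ensemble(series, horizon, second_model):
    """Equal-weight ensemble of the AR(1) path and another model's path."""
    return 0.5 * (ar1_forecast(series, horizon) + second_model(series, horizon))

# Noise-free AR(1) water levels: the fit recovers c=50, phi=0.5 exactly,
# so the forecast simply continues the recursion toward its fixed point 100.
levels = [120.0]
for _ in range(30):
    levels.append(50.0 + 0.5 * levels[-1])
print(two_model_ensemble(levels, 3, ar1_forecast))
```

    With equal weights the ensemble can only average out complementary errors; the abstract's finding that different members dominate different hydrograph phases suggests phase-dependent weighting as a natural refinement.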

  6. Most Social Scientists Shun Free Use of Supercomputers.

    ERIC Educational Resources Information Center

    Kiernan, Vincent

    1998-01-01

    Social scientists, who frequently complain that the federal government spends too little on them, are passing up what scholars in the physical and natural sciences see as the government's best give-aways: free access to supercomputers. Some social scientists say the supercomputers are difficult to use; others find desktop computers provide…

  7. The Unified English Braille Code: Examination by Science, Mathematics, and Computer Science Technical Expert Braille Readers

    ERIC Educational Resources Information Center

    Holbrook, M. Cay; MacCuspie, P. Ann

    2010-01-01

    Braille-reading mathematicians, scientists, and computer scientists were asked to examine the usability of the Unified English Braille Code (UEB) for technical materials. They had little knowledge of the code prior to the study. The research included two reading tasks, a short tutorial about UEB, and a focus group. The results indicated that the…

  8. Hormone Metabolism During Potato Tuber Dormancy

    USDA-ARS?s Scientific Manuscript database

    At harvest and for an indeterminate period thereafter potato tubers will not sprout and are physiologically dormant. The length of tuber dormancy is dependent on cultivar and pre- and postharvest environmental conditions. Plant hormones have been shown to be involved in all phases of dormancy prog...

  9. IMPACT OF LEAD ACID BATTERIES AND CADMIUM STABILIZERS ON INCINERATOR EMISSIONS

    EPA Science Inventory

    The Waste Analysis Sampling, Testing and Evaluation (WASTE) Program is a multi-year, multi-disciplinary program designed to elicit the source and fate of environmentally significant trace materials as a solid waste progresses through management processes. s part of the WASTE Prog...

  10. Assessing genomic selection prediction accuracy in a dynamic barley breeding

    USDA-ARS?s Scientific Manuscript database

    Genomic selection is a method to improve quantitative traits in crops and livestock by estimating breeding values of selection candidates using phenotype and genome-wide marker data sets. Prediction accuracy has been evaluated through simulation and cross-validation, however validation based on prog...

  11. Meet EPA Scientist Valerie Zartarian, Ph.D.

    EPA Pesticide Factsheets

    Senior exposure scientist and research environmental engineer Valerie Zartarian, Ph.D. helps build computer models and other tools that advance our understanding of how people interact with chemicals.

  12. Hot, Hot, Hot Computer Careers.

    ERIC Educational Resources Information Center

    Basta, Nicholas

    1988-01-01

    Discusses the increasing need for electrical, electronic, and computer engineers and scientists. Provides the current status of the computer industry and average salaries. Considers computer chip manufacture and the current chip shortage. (MVL)

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shankar, Arjun

    Computer scientist Arjun Shankar is director of the Compute and Data Environment for Science (CADES), ORNL’s multidisciplinary big data computing center. CADES offers computing, networking and data analytics to facilitate workflows for both ORNL and external research projects.

  14. Jungle Computing: Distributed Supercomputing Beyond Clusters, Grids, and Clouds

    NASA Astrophysics Data System (ADS)

    Seinstra, Frank J.; Maassen, Jason; van Nieuwpoort, Rob V.; Drost, Niels; van Kessel, Timo; van Werkhoven, Ben; Urbani, Jacopo; Jacobs, Ceriel; Kielmann, Thilo; Bal, Henri E.

    In recent years, the application of high-performance and distributed computing in scientific practice has become increasingly widespread. Among the platforms most widely available to scientists are clusters, grids, and cloud systems. Such infrastructures are currently undergoing revolutionary change due to the integration of many-core technologies, providing orders-of-magnitude speed improvements for selected compute kernels. With high-performance and distributed computing systems thus becoming more heterogeneous and hierarchical, programming complexity is vastly increased. Further complexities arise because the urgent desire for scalability, together with issues including data distribution, software heterogeneity, and ad hoc hardware availability, commonly forces scientists into simultaneous use of multiple platforms (e.g., clusters, grids, and clouds used concurrently). A true computing jungle.

  15. Serum microRNA biomarker identification in a residential cohort with elevated polychlorinated biphenyl exposures.

    EPA Science Inventory

    Toxicant-associated steatohepatitis (TASH) is a form of liver disease associated with both industrial [1] and environmental [2] chemical exposures. Like other forms of Non-alcoholic Fatty Liver Disease (NAFLD), TASH can contribute to systemic metabolic disease states and may prog...

  16. us9805_latilt

    Science.gov Websites

    Alaska Solar Resource: Flat Plate Collector, Facing…

  17. EVALUATION OF USFILTER CORPORATION'S RETEC® MODEL SCP-6 SEPARATED CELL PURIFICATION SYSTEM FOR CHROMIC ACID ANODIZE BATH SOLUTION

    EPA Science Inventory

    The USEPA has created the Environmental Technology Verification (ETV) Program to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. The ETV P2 Metal Finishing Technologies (ETV-MF) Prog...

  18. HD gas purification for polarized HDice targets production at Jefferson Lab

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Whisnant, Charles; D'Angelo, Annalisa; Colaneri, Luca

    2014-06-01

    Solid, frozen-spin targets of molecular HD were first developed for nuclear physics by a collaboration between Syracuse University and Brookhaven National Lab. They have been successfully used in measurements with photon beams, first at the Laser-Electron-Gamma-Source [1] and most recently at Jefferson Lab during the running of the E06-101 (g14) experiment [2]. Preparations are underway to utilize the targets in future electron experiments after the completion of the 12 GeV JLab upgrade [3]. HD is an attractive target since all of the material is polarizable, of low Z, and requires only modest holding fields. At the same time, the small contributions from the target cell can be subtracted from direct measurements. Reaching the frozen-spin state with both high polarization and a significant spin relaxation time requires careful control of H2 and D2 impurities. Commercially available HD contains 0.5-2% concentrations of H2 and D2. Low-temperature distillation is required to reduce these concentrations to the 10^-4 level to enable useful target production. This distillation is done using a column filled with heli-pack C [4] to give good separation efficiency. Approximately 12 moles of commercial HD is condensed into the mechanically refrigerated system at the base temperature of 11 K. The system is then isolated and the temperature stabilized at 18 K, producing liquid HD, which is boiled by a resistive heater. The circulation established by the boil-off condensing throughout the column and then filtering back down produces a steady-state isotopic separation, permitting the extraction of HD gas with very low H2 and D2 content. A residual gas analyzer initially monitors distillation. Once the H2 concentration falls below its useful operating range, samples are periodically collected for analysis using gas chromatography [5] and Raman scattering. Where the measurement techniques overlap, good agreement is obtained.
    The operation of the distillery and results of gas analysis will be discussed. References: [1] Phys. Rev. Lett. 101 (2009) 172002. [2] www.jlab.org/exp_prog/proposals/06/PR-06-101.pdf [3] www.jlab.org/exp_prog/proposals/12/PR12-12-009.pdf, www.jlab.org/exp_prog/proposals/12/PR12-12-010.pdf, and www.jlab.org/exp_prog/proposals/11/PR12-11-111.pdf [4] Nucl. Inst. Meth. 664 (2012) 347, www.wilmad-labglass.com/Products/LG-6730-104/ [5] Rev. Sci. Instrum. 82, 024101 (2011).

  19. Developing a strategy for computational lab skills training through Software and Data Carpentry: Experiences from the ELIXIR Pilot action

    PubMed Central

    Pawlik, Aleksandra; van Gelder, Celia W.G.; Nenadic, Aleksandra; Palagi, Patricia M.; Korpelainen, Eija; Lijnzaad, Philip; Marek, Diana; Sansone, Susanna-Assunta; Hancock, John; Goble, Carole

    2017-01-01

    Quality training in computational skills for life scientists is essential to allow them to deliver robust, reproducible and cutting-edge research. A pan-European bioinformatics programme, ELIXIR, has adopted a well-established and progressive programme of computational lab and data skills training from Software and Data Carpentry, aimed at increasing the number of skilled life scientists and building a sustainable training community in this field. This article describes the Pilot action, which introduced the Carpentry training model to the ELIXIR community. PMID:28781745

  20. Developing a strategy for computational lab skills training through Software and Data Carpentry: Experiences from the ELIXIR Pilot action.

    PubMed

    Pawlik, Aleksandra; van Gelder, Celia W G; Nenadic, Aleksandra; Palagi, Patricia M; Korpelainen, Eija; Lijnzaad, Philip; Marek, Diana; Sansone, Susanna-Assunta; Hancock, John; Goble, Carole

    2017-01-01

    Quality training in computational skills for life scientists is essential to allow them to deliver robust, reproducible and cutting-edge research. A pan-European bioinformatics programme, ELIXIR, has adopted a well-established and progressive programme of computational lab and data skills training from Software and Data Carpentry, aimed at increasing the number of skilled life scientists and building a sustainable training community in this field. This article describes the Pilot action, which introduced the Carpentry training model to the ELIXIR community.

  1. Project Work Assignment | Sustainable Stormwater Funding Project

    EPA Pesticide Factsheets

    2010-11-30


  2. CARRY-OVER EFFECTS OF OZONE ON ROOT GROWTH AND CARBOHYDRATE CONCENTRATIONS OF PONDEROSA PINE SEEDLINGS

    EPA Science Inventory

    Ozone exposure decreases belowground carbon allocation and root growth of plants;however,the extent to which these effects persist and the cumulative impact of ozone stress on plant growth are poorly understood.To evaluate the potential for plant compensation,we followed the prog...

  3. The Physiology of Growth Hormone-Releasing Hormone (GHRH) in Breast Cancer

    DTIC Science & Technology

    2003-06-01

    …production of growth hormone-releasing factor by carcinoid and pancreatic islet tumors associated with acromegaly. Prog Clin Biol Res 1981; 74:259-271. (16) … promotion of apoptosis. These results indicate that disruption of endog… … cause of acromegaly. More recently, expression has been demonstrated in tumors…

  4. Highway concrete pavement technology development and testing : volume III, field evaluation of Strategic Highway Research Program (SHRP) C-205 test sites (high-performance concrete).

    DOT National Transportation Integrated Search

    2006-05-01

    This research study, sponsored by the Federal Highway Administration, summarizes the field performance of eight high-early-strength (HES) : concrete patches between 1994 and 1998. The patches were constructed under the Strategic Highway Research Prog...

  5. THE WORKSHOP ON THE SOURCE APPORTIONMENT OF PM HEALTH EFFECTS: INTER-COMPARISON OF RESULTS AND IMPLICATIONS

    EPA Science Inventory

    While the association between exposure to ambient fine particulate matter mass (PM2.5) and human mortality is well established, the most responsible particle types/sources are not yet certain. In May 2003, the U.S. Environmental Protection Agency's Particulate Matter Centers Prog...

  6. Performance Characteristics of Automotive Engines in the United States : Third Series - Report No. 14 - 1978 Buick 196 CID (3.2L)

    DOT National Transportation Integrated Search

    1981-02-01

    Experimental data were obtained in dynamometer tests of a 1978 Buick 231 CID turbocharged engine to determine fuel consumption and emissions (hydrocarbon, carbon monoxide, oxides of nitrogen) at steady-state engine operating modes. The objective of the prog...

  7. NUCLEAR ESPIONAGE: Report Details Spying on Touring Scientists.

    PubMed

    Malakoff, D

    2000-06-30

    A congressional report released this week details dozens of sometimes clumsy attempts by foreign agents to obtain nuclear secrets from U.S. nuclear scientists traveling abroad, ranging from offering scientists prostitutes to prying off the backs of their laptop computers. The report highlights the need to better prepare traveling researchers to safeguard secrets and resist such temptations, say the two lawmakers who requested the report and officials at the Department of Energy, which employs the scientists.

  8. Keeping an Eye on the Prize

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hazi, A U

    2007-02-06

    Setting performance goals is part of the business plan for almost every company. The same is true in the world of supercomputers. Ten years ago, the Department of Energy (DOE) launched the Accelerated Strategic Computing Initiative (ASCI) to help ensure the safety and reliability of the nation's nuclear weapons stockpile without nuclear testing. ASCI, which is now called the Advanced Simulation and Computing (ASC) Program and is managed by DOE's National Nuclear Security Administration (NNSA), set an initial 10-year goal to obtain computers that could process up to 100 trillion floating-point operations per second (teraflops). Many computer experts thought the goal was overly ambitious, but the program's results have proved them wrong. Last November, a Livermore-IBM team received the 2005 Gordon Bell Prize for achieving more than 100 teraflops while modeling the pressure-induced solidification of molten metal. The prestigious prize, which is named for a founding father of supercomputing, is awarded each year at the Supercomputing Conference to innovators who advance high-performance computing. Recipients for the 2005 prize included six Livermore scientists--physicists Fred Streitz, James Glosli, and Mehul Patel and computer scientists Bor Chan, Robert Yates, and Bronis de Supinski--as well as IBM researchers James Sexton and John Gunnels. This team produced the first atomic-scale model of metal solidification from the liquid phase with results that were independent of system size. The record-setting calculation used Livermore's domain decomposition molecular-dynamics (ddcMD) code running on BlueGene/L, a supercomputer developed by IBM in partnership with the ASC Program. BlueGene/L reached 280.6 teraflops on the Linpack benchmark, the industry standard used to measure computing speed. As a result, it ranks first on the list of Top500 Supercomputer Sites released in November 2005.
To evaluate the performance of nuclear weapons systems, scientists must understand how materials behave under extreme conditions. Because experiments at high pressures and temperatures are often difficult or impossible to conduct, scientists rely on computer models that have been validated with obtainable data. Of particular interest to weapons scientists is the solidification of metals. "To predict the performance of aging nuclear weapons, we need detailed information on a material's phase transitions," says Streitz, who leads the Livermore-IBM team. For example, scientists want to know what happens to a metal as it changes from molten liquid to a solid and how that transition affects the material's characteristics, such as its strength.
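    The Linpack figure quoted above can be related to problem size and solve time by simple arithmetic: factoring a dense n-by-n system costs roughly (2/3)n^3 floating-point operations. A minimal sketch (the problem size and timing below are illustrative, not taken from the report):

```python
def linpack_teraflops(n, seconds):
    """Estimate sustained teraflops from a dense LU solve:
    roughly (2/3)*n**3 floating-point operations divided by wall-clock time."""
    flops = (2.0 / 3.0) * n ** 3
    return flops / seconds / 1e12

# Illustrative only: a 1,000,000-unknown dense solve in about 2,376 s
# would correspond to roughly 280.6 teraflops, the BlueGene/L figure
# cited above (actual benchmark parameters were not given here).
print(round(linpack_teraflops(1_000_000, 2376), 1))
```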

  9. Final Report. Center for Scalable Application Development Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mellor-Crummey, John

    2014-10-26

    The Center for Scalable Application Development Software (CScADS) was established as a partnership between Rice University, Argonne National Laboratory, University of California Berkeley, University of Tennessee – Knoxville, and University of Wisconsin – Madison. CScADS pursued an integrated set of activities with the aim of increasing the productivity of DOE computational scientists by catalyzing the development of systems software, libraries, compilers, and tools for leadership computing platforms. Principal Center activities were workshops to engage the research community in the challenges of leadership computing, research and development of open-source software, and work with computational scientists to help them develop codes for leadership computing platforms. This final report summarizes CScADS activities at Rice University in these areas.

  10. Characterization of real-time computers

    NASA Technical Reports Server (NTRS)

    Shin, K. G.; Krishna, C. M.

    1984-01-01

    A real-time system consists of a computer controller and controlled processes. Despite the synergistic relationship between these two components, they have been traditionally designed and analyzed independently of and separately from each other; namely, computer controllers by computer scientists/engineers and controlled processes by control scientists. As a remedy for this problem, in this report real-time computers are characterized by performance measures based on computer controller response time that are: (1) congruent to the real-time applications, (2) able to offer an objective comparison of rival computer systems, and (3) experimentally measurable/determinable. These measures, unlike others, provide the real-time computer controller with a natural link to controlled processes. In order to demonstrate their utility and power, these measures are first determined for example controlled processes on the basis of control performance functionals. They are then used for two important real-time multiprocessor design applications - the number-power tradeoff and fault-masking and synchronization.

  11. Assessment of chemistry models for compressible reacting flows

    NASA Astrophysics Data System (ADS)

    Lapointe, Simon; Blanquart, Guillaume

    2014-11-01

    Recent technological advances in propulsion and power devices and renewed interest in the development of next generation supersonic and hypersonic vehicles have increased the need for detailed understanding of turbulence-combustion interactions in compressible reacting flows. In numerical simulations of such flows, accurate modeling of the fuel chemistry is a critical component of capturing the relevant physics. Various chemical models are currently being used in reacting flow simulations. However, the differences between these models and their impacts on the fluid dynamics in the context of compressible flows are not well understood. In the present work, a numerical code is developed to solve the fully coupled compressible conservation equations for reacting flows. The finite volume code is based on the theoretical and numerical framework developed by Oefelein (Prog. Aero. Sci. 42 (2006) 2-37) and employs an all-Mach-number formulation with dual time-stepping and preconditioning. The numerical approach is tested on turbulent premixed flames at high Karlovitz numbers. Different chemical models of varying complexity and computational cost are used and their effects are compared.

  12. The Fermilab Connection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fermilab

    More than 4,000 scientists in 53 countries use Fermilab and its particle accelerators, detectors and computers for their research. That includes about 2,500 scientists from 223 U.S. institutions in 42 states, plus the District of Columbia and Puerto Rico.

  13. EarthCube: A Community-Driven Cyberinfrastructure for the Geosciences

    NASA Astrophysics Data System (ADS)

    Koskela, Rebecca; Ramamurthy, Mohan; Pearlman, Jay; Lehnert, Kerstin; Ahern, Tim; Fredericks, Janet; Goring, Simon; Peckham, Scott; Powers, Lindsay; Kamalabdi, Farzad; Rubin, Ken; Yarmey, Lynn

    2017-04-01

    EarthCube is creating a dynamic, System of Systems (SoS) infrastructure and data tools to collect, access, analyze, share, and visualize all forms of geoscience data and resources, using advanced collaboration, technological, and computational capabilities. EarthCube, as a joint effort between the U.S. National Science Foundation Directorate for Geosciences and the Division of Advanced Cyberinfrastructure, is a quickly growing community of scientists across all geoscience domains, as well as geoinformatics researchers and data scientists. EarthCube has attracted an evolving, dynamic virtual community of more than 2,500 contributors, including earth, ocean, polar, planetary, atmospheric, geospace, computer and social scientists, educators, and data and information professionals. During 2017, EarthCube will transition to the implementation phase. The implementation will balance "innovation" and "production" to advance cross-disciplinary science goals as well as the development of future data scientists. This presentation will describe the current architecture design for the EarthCube cyberinfrastructure and implementation plan.

  14. Volunteer Clouds and Citizen Cyberscience for LHC Physics

    NASA Astrophysics Data System (ADS)

    Aguado Sanchez, Carlos; Blomer, Jakob; Buncic, Predrag; Chen, Gang; Ellis, John; Garcia Quintas, David; Harutyunyan, Artem; Grey, Francois; Lombrana Gonzalez, Daniel; Marquina, Miguel; Mato, Pere; Rantala, Jarno; Schulz, Holger; Segal, Ben; Sharma, Archana; Skands, Peter; Weir, David; Wu, Jie; Wu, Wenjing; Yadav, Rohit

    2011-12-01

    Computing for the LHC, and for HEP more generally, is traditionally viewed as requiring specialized infrastructure and software environments, and therefore not compatible with the recent trend in "volunteer computing", where volunteers supply free processing time on ordinary PCs and laptops via standard Internet connections. In this paper, we demonstrate that with the use of virtual machine technology, at least some standard LHC computing tasks can be tackled with volunteer computing resources. Specifically, by presenting volunteer computing resources to HEP scientists as a "volunteer cloud", essentially identical to a Grid or dedicated cluster from a job submission perspective, LHC simulations can be processed effectively. This article outlines both the technical steps required for such a solution and the implications for LHC computing as well as for LHC public outreach and for participation by scientists from developing regions in LHC research.

  15. Computer-based communication in support of scientific and technical work. [conferences on management information systems used by scientists of NASA programs

    NASA Technical Reports Server (NTRS)

    Vallee, J.; Wilson, T.

    1976-01-01

    Results are reported of the first experiments for a computer conference management information system at the National Aeronautics and Space Administration. Between August 1975 and March 1976, two NASA projects with geographically separated participants (NASA scientists) used the PLANET computer conferencing system for portions of their work. The first project was a technology assessment of future transportation systems. The second project involved experiments with the Communication Technology Satellite. As part of this project, pre- and postlaunch operations were discussed in a computer conference. These conferences also provided the context for an analysis of the cost of computer conferencing. In particular, six cost components were identified: (1) terminal equipment, (2) communication with a network port, (3) network connection, (4) computer utilization, (5) data storage and (6) administrative overhead.
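    The six cost components identified above form a simple additive model. As an illustrative sketch only (the function name and dollar figures are hypothetical, not from the NASA experiments):

```python
# Hypothetical additive cost model for a computer conference, using the
# six components identified in the NASA/PLANET study. All dollar figures
# below are illustrative placeholders.

def conference_cost(terminal, network_port_comm, network_connection,
                    computer_utilization, data_storage, admin_overhead):
    """Sum the six cost components for one conferencing period."""
    return (terminal + network_port_comm + network_connection +
            computer_utilization + data_storage + admin_overhead)

# Example: monthly cost in dollars (placeholder values).
total = conference_cost(terminal=120.0, network_port_comm=45.0,
                        network_connection=60.0, computer_utilization=300.0,
                        data_storage=25.0, admin_overhead=90.0)
print(total)  # 640.0
```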

  16. Know Your Discipline: Teaching the Philosophy of Computer Science

    ERIC Educational Resources Information Center

    Tedre, Matti

    2007-01-01

    The diversity and interdisciplinarity of computer science and the multiplicity of its uses in other sciences make it hard to define computer science and to prescribe how computer science should be carried out. The diversity of computer science also causes friction between computer scientists from different branches. Computer science curricula, as…

  17. Patterns of Genetic Variation among Populations of the Asian Longhorned Beetle in China and Korea

    USDA-ARS?s Scientific Manuscript database

    Central to the study of invasive species is identifying source populations in their native ranges. Source populations of invasive species can provide important information about species life cycles, host use and species-specific predators and parasites which could be deployed in a pest control prog...

  18. Dirk Jordan | NREL

    Science.gov Websites

    Research focus: quantifying photovoltaic (PV) module degradation rates, including statistical analysis of reported degradation rates of PV modules. Publications include: Jordan D.C. et al., Prog. in PV 24(7), 2016, DOI: 10.1002/pip.2744; Jordan D.C., Silverman T.J., et al., Prog. in PV, 2017, DOI: 10.1002/pip.2866; Jordan D.C., Silverman T.J., Sekulic B., Kurtz S.R., "PV

  19. hawaii_9805_latilt

    Science.gov Websites


  20. An Assessment of Decision-Making Processes: Evaluation of Where Land Protection Planning Can Incorporate Climate Change Information (Final Report)

    EPA Science Inventory

    EPA announced the availability of the final report, An Assessment of Decision-Making Processes: Evaluation of Where Land Protection Planning Can Incorporate Climate Change Information. This report is a review of decision-making processes of selected land protection prog...

  1. Statistical basis and outputs of stable isotope mixing models: Comment on Fry (2013)

    EPA Science Inventory

    A recent article by Fry (2013; Mar Ecol Prog Ser 472:1−13) reviewed approaches to solving underdetermined stable isotope mixing systems, and presented a new graphical approach and set of summary statistics for the analysis of such systems. In his review, Fry (2013) mis-characteri...

  2. Juvenile Survival in Common Loons Gavia Immer: Effects of Natal Lake Size and pH

    EPA Science Inventory

    Survival is a vexing parameter to measure in many young birds because of dispersal and delayed impacts of natal rearing conditions on fitness. Drawing upon marking and resighting records from an 18-year study of territorial behavior, we used Cormack-Jolly-Seber analysis with Prog...

  3. Risk of Pore Water Hydrogen Sulfide Toxicity in Dredged Material Bioassays

    DTIC Science & Technology

    1995-11-01

    Prog. Ser. 101, 147-155. Moore, D. W., and Dillon, T. M. (1993). “Chronic sublethal effects of San Francisco Bay sediments on Nereis (Neanthes...metabolism of Arctica islandica L. (Bivalvia),” J. Exp. Mar. Biol. Ecol. 170, 213-226. Ortiz, J. A., Rueda, A., Carbonell, G., Camargo, J. A., Nieto, F

  4. Performance Characteristics of Automotive Engines in the United States, Third Series, Report No. 8, 1978 Buick, 231 CID (3.8 Liters), 4V, Turbocharge

    DOT National Transportation Integrated Search

    1979-02-01

    Experimental data were obtained in dynamometer tests of a 1978 Buick 231 CID turbocharged to determine fuel consumption and emissions (hydrocarbon, carbon monoxide, oxides of nitrogen) at steady-state engine operating modes. The objective of the prog...

  5. Hampton Roads, Virginia eight-hour ozone maintenance area transportation conformity analysis : 2030 long range transportation plan and FY 09-12 transportation improvement program, draft executive summary.

    DOT National Transportation Integrated Search

    2010-05-01

    This report presents the regional conformity analysis and recommendation for a finding of conformity for the Hampton Roads 2030 Long Range Transportation Plan (LRTP, or "Plan") and associated Fiscal Year (FY) 2009-2012 Transportation Improvement Prog...

  6. My life in the watershed: then, now, and beyond

    USDA-ARS?s Scientific Manuscript database

    "My Life in the Watershed" tells the first-hand account of a young girl growing up in southwestern Oklahoma, the impact growing up in a watershed had on her life, and the vision she sees for her children and her children's children, so they will continue to benefit from the USDA Small Watershed Prog...

  7. A Review of Water Mist Technology for Fire Suppression

    DTIC Science & Technology

    1994-09-30

    Smith, D.P., and Ball, D.N. (1993), "New Applications of Aqueous Agents for Fire Suppression," Halon Alternatives Technical Working Conference...Flows," Prog. Energy, Combust. Sci., 14, 1988, pp. 171-194. Dabros, T., and Van de Ven, T.G.M. (1992), "Hydrodynamic Interactions between Two Spheres Near

  8. Quantifying long-term risks to sea otters from the 1989 'Exxon Valdez' oil spill: reply to Harwell & Gentile (2013)

    USGS Publications Warehouse

    Ballachey, Brenda E.; Bodkin, James L.; Monson, Daniel H.

    2013-01-01

    Recovery of sea otter populations in Prince William Sound (PWS), Alaska, has been delayed for more than 2 decades following the 1989 ‘Exxon Valdez’ oil spill. Harwell & Gentile (2013; Mar Ecol Prog Ser 488:291–296) question our conclusions in Bodkin et al. (2012; Mar Ecol Prog Ser 447:273-287) regarding adverse effects that oil lingering in the environment may have on sea otters. They agree that exposure may continue, but disagree that it constitutes a significant risk to sea otters. In Bodkin et al. (2012), we suggested that subtle effects of chronic exposure were the most reasonable explanation for delayed recovery of the sea otter population in areas of western PWS, where shorelines were most heavily oiled. Here, we provide additional information on the ecology of sea otters that clarifies why the toxicological effects of oral ingestion of oil do not reflect all effects of chronic exposure. The full range of energetic, behavioral, and toxicological concerns must be considered to appraise how chronic exposure to residual oil may constrain recovery of sea otter populations.

  9. Antioxidant and ACE-inhibitory activities of hemp (Cannabis sativa L.) protein hydrolysates produced by the proteases AFP, HT, Pro-G, actinidin and zingibain.

    PubMed

    Teh, Sue-Siang; Bekhit, Alaa El-Din A; Carne, Alan; Birch, John

    2016-07-15

    Hemp protein isolates (HPIs) were hydrolysed by proteases (AFP, HT, ProG, actinidin and zingibain). The enzymatic hydrolysis of HPIs was evaluated through the degree of hydrolysis and SDS-PAGE profiles. The bioactive properties of the resultant hydrolysates (HPHs) were assessed through ORAC, DPPH scavenging and ACE-inhibitory activities. The physical properties of the resultant HPHs were evaluated for their particle sizes, zeta potential and surface hydrophobicity. HT had the highest rate of caseinolytic activity at the lowest concentration (0.1 mg mL(-1)) compared to other proteases that required a concentration of 100 mg mL(-1) to achieve their maximum rate of caseinolytic activity. This led to the highest degree of hydrolysis of HPIs by HT in the SDS-PAGE profiles. Among all proteases and substrates, HT resulted in the highest bioactivities (ORAC, DPPH scavenging and ACE-inhibitory activities) generated from alkali extracted HPI in the shortest time (2 h) compared to the other protease preparations. Copyright © 2016 Elsevier Ltd. All rights reserved.

  10. Benefits of Exchange Between Computer Scientists and Perceptual Scientists: A Panel Discussion

    NASA Technical Reports Server (NTRS)

    Kaiser, Mary K.; Null, Cynthia H. (Technical Monitor)

    1995-01-01

    We have established several major goals for this panel: 1) Introduce the computer graphics community to some specific leaders in the use of perceptual psychology relating to computer graphics; 2) Enumerate the major results that are known, and provide a set of resources for finding others; 3) Identify research areas where knowledge of perceptual psychology can help computer system designers improve their systems; and 4) Provide advice to researchers on how they can establish collaborations in their own research programs. We believe this will be a very important panel. In addition to generating lively discussion, we hope to point out some of the fundamental issues that occur at the boundary between computer science and perception, and possibly help researchers avoid some of the common pitfalls.

  11. Boom. Bust. Build.

    ERIC Educational Resources Information Center

    Kite, Vance; Park, Soonhye

    2018-01-01

    In 2006 Jeanette Wing, a professor of computer science at Carnegie Mellon University, proposed computational thinking (CT) as a literacy just as important as reading, writing, and mathematics. Wing defined CT as a set of skills and strategies computer scientists use to solve complex, computational problems (Wing 2006). The computer science and…

  12. Workforce Retention Study in Support of the U.S. Army Aberdeen Test Center Human Capital Management Strategy

    DTIC Science & Technology

    2016-09-01

    Sciences Group 6% 1550s Computer Scientists Group 5% Other 1500s ORSAa, Mathematics, & Statistics Group 3% 1600s Equipment & Facilities Group 4...Employee removal based on misconduct, delinquency, suitability, unsatisfactory performance, or failure to qualify for conversion to a career appointment...average of 10.4% in many areas, but over double the average for the 1550s (Computer Scientists) and other 1500s (ORSA, Mathematics, and Statistics). Also

  13. Research Institute for Advanced Computer Science: Annual Report October 1998 through September 1999

    NASA Technical Reports Server (NTRS)

    Leiner, Barry M.; Gross, Anthony R. (Technical Monitor)

    1999-01-01

    The Research Institute for Advanced Computer Science (RIACS) carries out basic research and technology development in computer science, in support of the National Aeronautics and Space Administration's missions. RIACS is located at the NASA Ames Research Center (ARC). It currently operates under a multiple year grant/cooperative agreement that began on October 1, 1997 and is up for renewal in the year 2002. ARC has been designated NASA's Center of Excellence in Information Technology. In this capacity, ARC is charged with the responsibility to build an Information Technology Research Program that is preeminent within NASA. RIACS serves as a bridge between NASA ARC and the academic community, and RIACS scientists and visitors work in close collaboration with NASA scientists. RIACS has the additional goal of broadening the base of researchers in these areas of importance to the nation's space and aeronautics enterprises. RIACS research focuses on the three cornerstones of information technology research necessary to meet the future challenges of NASA missions: (1) Automated Reasoning for Autonomous Systems. Techniques are being developed enabling spacecraft that will be self-guiding and self-correcting to the extent that they will require little or no human intervention. Such craft will be equipped to independently solve problems as they arise, and fulfill their missions with minimum direction from Earth. (2) Human-Centered Computing. Many NASA missions require synergy between humans and computers, with sophisticated computational aids amplifying human cognitive and perceptual abilities; (3) High Performance Computing and Networking. Advances in the performance of computing and networking continue to have major impact on a variety of NASA endeavors, ranging from modeling and simulation to data analysis of large datasets to collaborative engineering, planning and execution.
In addition, RIACS collaborates with NASA scientists to apply information technology research to a variety of NASA application domains. RIACS also engages in other activities, such as workshops, seminars, and visiting scientist programs, designed to encourage and facilitate collaboration between the university and NASA information technology research communities.

  14. Research Institute for Advanced Computer Science

    NASA Technical Reports Server (NTRS)

    Gross, Anthony R. (Technical Monitor); Leiner, Barry M.

    2000-01-01

    The Research Institute for Advanced Computer Science (RIACS) carries out basic research and technology development in computer science, in support of the National Aeronautics and Space Administration's missions. RIACS is located at the NASA Ames Research Center. It currently operates under a multiple year grant/cooperative agreement that began on October 1, 1997 and is up for renewal in the year 2002. Ames has been designated NASA's Center of Excellence in Information Technology. In this capacity, Ames is charged with the responsibility to build an Information Technology Research Program that is preeminent within NASA. RIACS serves as a bridge between NASA Ames and the academic community, and RIACS scientists and visitors work in close collaboration with NASA scientists. RIACS has the additional goal of broadening the base of researchers in these areas of importance to the nation's space and aeronautics enterprises. RIACS research focuses on the three cornerstones of information technology research necessary to meet the future challenges of NASA missions: (1) Automated Reasoning for Autonomous Systems. Techniques are being developed enabling spacecraft that will be self-guiding and self-correcting to the extent that they will require little or no human intervention. Such craft will be equipped to independently solve problems as they arise, and fulfill their missions with minimum direction from Earth; (2) Human-Centered Computing. Many NASA missions require synergy between humans and computers, with sophisticated computational aids amplifying human cognitive and perceptual abilities; (3) High Performance Computing and Networking. Advances in the performance of computing and networking continue to have major impact on a variety of NASA endeavors, ranging from modeling and simulation to data analysis of large datasets to collaborative engineering, planning and execution. 
In addition, RIACS collaborates with NASA scientists to apply information technology research to a variety of NASA application domains. RIACS also engages in other activities, such as workshops, seminars, and visiting scientist programs, designed to encourage and facilitate collaboration between the university and NASA information technology research communities.

  15. Detailed genetic characteristics of an international large cohort of patients with Stargardt disease: ProgStar study report 8.

    PubMed

    Fujinami, Kaoru; Strauss, Rupert W; Chiang, John Pei-Wen; Audo, Isabelle S; Bernstein, Paul S; Birch, David G; Bomotti, Samantha M; Cideciyan, Artur V; Ervin, Ann-Margret; Marino, Meghan J; Sahel, José-Alain; Mohand-Said, Saddek; Sunness, Janet S; Traboulsi, Elias I; West, Sheila; Wojciechowski, Robert; Zrenner, Eberhart; Michaelides, Michel; Scholl, Hendrik P N

    2018-06-20

    To describe the genetic characteristics of the cohort enrolled in the international multicentre progression of Stargardt disease 1 (STGD1) studies (ProgStar) and to determine geographic differences based on the allele frequency. 345 participants with a clinical diagnosis of STGD1 and harbouring at least one disease-causing ABCA4 variant were enrolled from 9 centres in the USA and Europe. All variants were reviewed and in silico analysis was performed including allele frequency in public databases and pathogenicity predictions. Participants with multiple likely pathogenic variants were classified into four national subgroups (USA, UK, France, Germany), with subsequent comparison analysis of the allele frequency for each prevalent allele. 211 likely pathogenic variants were identified in the total cohort, including missense (63%), splice site alteration (18%), stop (9%) and others. 50 variants were novel. Exclusively missense variants were detected in 139 (50%) of 279 patients with multiple pathogenic variants. The three most prevalent variants of these patients with multiple pathogenic variants were p.G1961E (15%), p.G863A (7%) and c.5461-10 T>C (5%). Subgroup analysis revealed a statistically significant difference between the four recruiting nations in the allele frequency of nine variants. There is a large spectrum of ABCA4 sequence variants, including 50 novel variants, in a well-characterised cohort thereby further adding to the unique allelic heterogeneity in STGD1. Approximately half of the cohort harbours missense variants only, indicating a relatively mild phenotype of the ProgStar cohort. There are significant differences in allele frequencies between nations, although the three most prevalent variants are shared as frequent variants. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
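    The allele-frequency comparison described in this report can be sketched with standard-library Python alone; the variant labels and counts below are invented for illustration and are not ProgStar data:

```python
from collections import Counter

# Hypothetical per-nation lists of observed ABCA4 disease alleles
# (labels and counts are illustrative placeholders, not ProgStar results).
alleles_by_nation = {
    "USA":     ["p.G1961E"] * 30 + ["p.G863A"] * 10 + ["other"] * 160,
    "Germany": ["p.G1961E"] * 8  + ["p.G863A"] * 12 + ["other"] * 80,
}

def allele_frequencies(alleles):
    """Fraction of observed disease alleles attributable to each variant."""
    counts = Counter(alleles)
    total = sum(counts.values())
    return {variant: n / total for variant, n in counts.items()}

# Compare the frequency of one prevalent variant across subgroups.
for nation, alleles in alleles_by_nation.items():
    freqs = allele_frequencies(alleles)
    print(nation, round(freqs["p.G1961E"], 3))
```

A formal comparison between subgroups would add a statistical test (e.g. a chi-square test on the allele counts), which is omitted here to keep the sketch standard-library only.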

  16. Impact of thyroid function abnormalities on reproductive hormones during menstrual cycle in premenopausal HIV infected females at NAUTH, Nnewi, Nigeria

    PubMed Central

    Emelumadu, Obiageli Fidelia; Igwegbe, Anthony Osita; Monago, Ifeoma Nwamaka; Ilika, Amobi Linus

    2017-01-01

    Background This was a prospective study designed to evaluate the impact of thyroid function abnormalities on reproductive hormones during menstrual cycle in HIV infected females at Nnamdi Azikiwe University Teaching Hospital Nnewi, South-East Nigeria. Methods The study randomly recruited 35 Symptomatic HIV infected females and 35 Symptomatic HIV infected females on antiretroviral therapy (HAART) for not less than six weeks from an HIV clinic and 40 apparently healthy control females among the hospital staff of NAUTH Nnewi. They were all premenopausal females with regular menstrual cycle and aged between 15 and 45 years. Blood samples were collected at follicular and luteal phases of their menstrual cycle for assay of Thyroid indices (FT3, FT4 and TSH) and Reproductive indices (FSH, LH, Estrogen, Progesterone, Prolactin and Testosterone) using ELISA method. Results The result showed significantly higher FSH and LH but significantly lower progesterone (prog) and estrogen (E2) in the test females compared to control females at both phases of menstrual cycle (P<0.05). There was significantly lower FT3 but significantly higher TSH value in Symptomatic HIV females (P<0.05). FSH, LH and TSH values were significantly lowered while prog and FT3 were significantly higher in Symptomatic HIV on ART compared to Symptomatic HIV females (P<0.05). FT3, FT4, Prog and E2 were inversely correlated while FSH and LH were positively correlated with duration of HIV infection in HIV females (P<0.05 respectively). There was a direct correlation between CD4+ count and FT3 while inverse correlation was found between CD4+ count and TSH levels (P<0.05). Discussion The present study demonstrated hypothyroidism with a significant degree of primary hypogonadism in Symptomatic HIV infected females at both follicular and luteal phases of menstrual cycle which tends to normalize on treatments. PMID:28723963

  17. Recent Advances and Issues in Computers. Oryx Frontiers of Science Series.

    ERIC Educational Resources Information Center

    Gay, Martin K.

    Discussing recent issues in computer science, this book contains 11 chapters covering: (1) developments that have the potential for changing the way computers operate, including microprocessors, mass storage systems, and computing environments; (2) the national computational grid for high-bandwidth, high-speed collaboration among scientists, and…

  18. Electronic Ecosystem.

    ERIC Educational Resources Information Center

    Travis, John

    1991-01-01

    A discipline in which scientists seek to simulate and synthesize lifelike behaviors within computers, chemical mixtures, and other media is discussed. A computer program with self-replicating digital "organisms" that evolve as they compete for computer time and memory is described. (KR)

  19. Chemistry Research

    NASA Technical Reports Server (NTRS)

    1987-01-01

    Philip Morris research center scientists use a computer program called CECTRP, for Chemical Equilibrium Composition and Transport Properties, to gain insight into the behavior of atoms as they progress along the reaction pathway. Use of the program lets the scientist accurately predict the behavior of a given molecule or group of molecules. Computer generated data must be checked by laboratory experiment, but the use of CECTRP saves the researchers hundreds of hours of laboratory time since experiments must run only to validate the computer's prediction. Philip Morris estimates that had CECTRP not been available, at least two man years would have been required to develop a program to perform similar free energy calculations.

  20. Developing an online programme in computational biology.

    PubMed

    Vincent, Heather M; Page, Christopher

    2013-11-01

    Much has been written about the need for continuing education and training to enable life scientists and computer scientists to manage and exploit the different types of biological data now becoming available. Here we describe the development of an online programme that combines short training courses, so that those who require an educational programme can progress to complete a formal qualification. Although this flexible approach fits the needs of course participants, it does not fit easily within the organizational structures of a campus-based university.

  1. System biology of gene regulation.

    PubMed

    Baitaluk, Michael

    2009-01-01

    A famous joke illustrates the traditionally awkward alliance between theory and experiment, and the differences between experimental biologists and theoretical modelers: a university sends a biologist, a mathematician, a physicist, and a computer scientist on a walking trip in an attempt to stimulate interdisciplinary research. During a break, they watch a cow in a field nearby and the leader of the group asks, "I wonder how one could decide on the size of a cow?" Since a cow is a biological object, the biologist responded first: "I have seen many cows in this area and know it is a big cow." The mathematician argued, "The true volume is determined by integrating the mathematical function that describes the outer surface of the cow's body." The physicist suggested: "Let's assume the cow is a sphere...." Finally the computer scientist became nervous and said that he didn't bring his computer because there is no Internet connection up there on the hill. In this humorous but instructive story, the suggestions proposed by the theorists can be taken to reflect the view of many experimental biologists that computer scientists and theorists are too far removed from biological reality and therefore their theories and approaches are not of much immediate usefulness. Conversely, the statement of the biologist mirrors the view of many traditional theoretical and computational scientists that biological experiments are for the most part simply descriptive, lack rigor, and that much of the resulting biological data are of questionable functional relevance. One of the goals of current biology as a multidisciplinary science is to bring people from different scientific areas together on the same "hill" and teach them to speak the same "language."
In fact, of course, when presenting their data, most experimentalist biologists do provide an interpretation and explanation for the results, and many theorists/computer scientists aim to answer (or at least to fully describe) questions of biological relevance. Thus systems biology could be treated as such a socioscientific phenomenon and a new approach to both experiments and theory that is defined by the strategy of pursuing integration of complex data about the interactions in biological systems from diverse experimental sources using interdisciplinary tools and personnel.

  2. 78 FR 76319 - Notice of Invitation-Coal Exploration License Application MTM 106757, Montana

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-17

    ...] Notice of Invitation--Coal Exploration License Application MTM 106757, Montana AGENCY: Bureau of Land... Signal Peak Energy, LLC on a pro rata cost sharing basis in a program for the exploration of coal... Office coal Web site at http://www.blm.gov/mt/st/en/prog/energy/coal.html . A written notice to...

  3. Photodynamic Molecular Beacons: An Image-Guided Therapeutic Approach to Breast Cancer Vertebral Metastases

    DTIC Science & Technology

    2012-03-01

    uptake. Prog Clin Biol Res. 1984; 170: 629-36. 14. Wilson BC, Firnau G, Jeeves WP, Brown KL, Burns-McCormick DM. Chromatographic analysis and...Photosensitizer-conjugated human serum albumin nanoparticles for effective photodynamic therapy. Theranostics. 2011; 1: 230-9. 24. Ferreira CL, Yapp DT, Crisp

  4. 78 FR 34403 - Notice of Availability of the Record of Decision for the Quartzsite Solar Energy Project, AZ

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-07

    ...: http://www.blm.gov/az/st/en/prog/energy/solar/quartzsite_solar_energy.html . FOR FURTHER INFORMATION... . Persons who use a telecommunications device for the deaf (TDD) may call the Federal Information Relay.... You will receive a reply during normal business hours. SUPPLEMENTARY INFORMATION: Quartzsite Solar...

  5. Effects of Synthetic Versus Natural Colloid Resuscitation on Inducing Dilutional Coagulopathy and Increasing Hemorrhage in Rabbits

    DTIC Science & Technology

    2008-05-01

    hemostasis, and plasma expanders: a quarter century enigma. Fed Proc. 1975;34:1429–1440. 23. Bergqvist D. Dextran and haemostasis. A review. Acta Chir ...eds. Blood Substitutes and Plasma Expanders. Prog Clin Biol Res. 1978;19:293–298. 57. Kovalik SG, Ledgewood AM, Lucas CE, Higgins RF. The cardiac

  6. Towards Direct Simulations of Counterflow Flames with Consistent Numerical Differential-Algebraic Boundary Conditions

    DTIC Science & Technology

    2015-05-18

    First, the governing equations of the problem are presented. A detailed discussion on the construction of the initial profile of the flow follows...time from the DoD HPCMP Open Research Systems and JPL/NASA is gratefully acknowledged. References [1] H. Tsuji, Prog. Energ. Combust. 8(2) (1982) 93-119

  7. Prevention of Mycobacterium avium subsp. paratuberculosis (MAP) infection in Balb/c mice by feeding probiotic Lactobacillus acidophilus NP-51

    USDA-ARS?s Scientific Manuscript database

    The objective of this study was to examine effects of feeding Lactobacillus acidophilus strain NP51 to mice challenged with Mycobacterium avium subspecies paratuberculosis (MAP), the causative agent of Johne’s disease. We hypothesized that feeding NP51 would increase Th-1 responses and decrease prog...

  9. Numerical Sensitivity of Trajectories Across Conformational Energy Hypersurfaces from Geometry Optimized Molecular Orbital Calculations: AM1, MNDO, and MINDO/3

    DTIC Science & Technology

    1988-01-01

    relationship studies, it became commonly believed that compounds with higher... surfaces are being traversed, the molecule can go along different paths on...1975). AMPAC, and consanguineous programs should be done with the tightest available...15. W. Thiel, Quantum Chem. Prog. Exchange Catalog, 11, 353

  10. Report: State of Utah Drinking Water State Revolving Fund Financial Statements with Independent Auditor’s Report, June 30, 2002

    EPA Pesticide Factsheets

    Report #2003-1-00110, June 3, 2003.Audit of the net assets statement of the Utah Dept of Env Quality Drinking Water State Revolving Fund Prog as of June 30, 2002, and the statements of revenues, expenses and changes in fund net assets, and 2002 cash flows.

  11. Argonne Out Loud: Computation, Big Data, and the Future of Cities

    ScienceCinema

    Catlett, Charlie

    2018-01-16

    Charlie Catlett, a Senior Computer Scientist at Argonne and Director of the Urban Center for Computation and Data at the Computation Institute of the University of Chicago and Argonne, talks about how he and his colleagues are using high-performance computing, data analytics, and embedded systems to better understand and design cities.

  12. RIACS FY2002 Annual Report

    NASA Technical Reports Server (NTRS)

    Leiner, Barry M.; Gross, Anthony R. (Technical Monitor)

    2002-01-01

    The Research Institute for Advanced Computer Science (RIACS) carries out basic research and technology development in computer science, in support of the National Aeronautics and Space Administration's missions. Operated by the Universities Space Research Association (a non-profit university consortium), RIACS is located at the NASA Ames Research Center, Moffett Field, California. It currently operates under a multiple year grant/cooperative agreement that began on October 1, 1997 and is up for renewal in September 2003. Ames has been designated NASA's Center of Excellence in Information Technology. In this capacity, Ames is charged with the responsibility to build an Information Technology (IT) Research Program that is preeminent within NASA. RIACS serves as a bridge between NASA Ames and the academic community, and RIACS scientists and visitors work in close collaboration with NASA scientists. RIACS has the additional goal of broadening the base of researchers in these areas of importance to the nation's space and aeronautics enterprises. RIACS research focuses on the three cornerstones of IT research necessary to meet the future challenges of NASA missions: 1) Automated Reasoning for Autonomous Systems; 2) Human-Centered Computing; and 3) High Performance Computing and Networking. In addition, RIACS collaborates with NASA scientists to apply IT research to a variety of NASA application domains including aerospace technology, earth science, life sciences, and astrobiology. RIACS also engages in other activities, such as workshops, seminars, visiting scientist programs and student summer programs, designed to encourage and facilitate collaboration between the university and NASA IT research communities.

  13. Exascale computing and what it means for shock physics

    NASA Astrophysics Data System (ADS)

    Germann, Timothy

    2015-06-01

    The U.S. Department of Energy is preparing to launch an Exascale Computing Initiative, to address the myriad challenges required to deploy and effectively utilize an exascale-class supercomputer (i.e., one capable of performing 10^18 operations per second) in the 2023 timeframe. Since physical (power dissipation) requirements limit clock rates to at most a few GHz, this will necessitate the coordination of on the order of a billion concurrent operations, requiring sophisticated system and application software, and underlying mathematical algorithms, that may differ radically from traditional approaches. Even at the smaller workstation or cluster level of computation, the massive concurrency and heterogeneity within each processor will impact computational scientists. Through the multi-institutional, multi-disciplinary Exascale Co-design Center for Materials in Extreme Environments (ExMatEx), we have initiated an early and deep collaboration between domain (computational materials) scientists, applied mathematicians, computer scientists, and hardware architects, in order to establish the relationships between algorithms, software stacks, and architectures needed to enable exascale-ready materials science application codes within the next decade. In my talk, I will discuss these challenges, and what it will mean for exascale-era electronic structure, molecular dynamics, and engineering-scale simulations of shock-compressed condensed matter. In particular, we anticipate that the emerging hierarchical, heterogeneous architectures can be exploited to achieve higher physical fidelity simulations using adaptive physics refinement. This work is supported by the U.S. Department of Energy, Office of Science, Office of Advanced Scientific Computing Research.
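    The concurrency figure in this abstract follows from simple arithmetic: at a clock rate of a few GHz, sustaining 10^18 operations per second requires on the order of 10^9 operations in flight at once. A back-of-the-envelope sketch (the specific clock rates below are illustrative assumptions, not figures from the abstract):

```python
# Rough estimate of the concurrency an exascale machine implies.
EXASCALE_OPS_PER_SECOND = 1e18  # definition of exascale

def required_concurrency(clock_hz: float) -> float:
    """Operations that must be in flight simultaneously, assuming each
    hardware lane retires one operation per clock cycle."""
    return EXASCALE_OPS_PER_SECOND / clock_hz

# Illustrative clock rates (assumed for the example).
for ghz in (1.0, 2.0, 4.0):
    print(f"{ghz} GHz -> {required_concurrency(ghz * 1e9):.1e} concurrent ops")
```

    At 1 GHz this gives 10^9 concurrent operations, consistent with the "on the order of a billion" estimate above.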

  14. CESDIS

    NASA Technical Reports Server (NTRS)

    1994-01-01

    CESDIS, the Center of Excellence in Space Data and Information Sciences was developed jointly by NASA, Universities Space Research Association (USRA), and the University of Maryland in 1988 to focus on the design of advanced computing techniques and data systems to support NASA Earth and space science research programs. CESDIS is operated by USRA under contract to NASA. The Director, Associate Director, Staff Scientists, and administrative staff are located on-site at NASA's Goddard Space Flight Center in Greenbelt, Maryland. The primary CESDIS mission is to increase the connection between computer science and engineering research programs at colleges and universities and NASA groups working with computer applications in Earth and space science. Research areas of primary interest at CESDIS include: 1) High performance computing, especially software design and performance evaluation for massively parallel machines; 2) Parallel input/output and data storage systems for high performance parallel computers; 3) Data base and intelligent data management systems for parallel computers; 4) Image processing; 5) Digital libraries; and 6) Data compression. CESDIS funds multiyear projects at U. S. universities and colleges. Proposals are accepted in response to calls for proposals and are selected on the basis of peer reviews. Funds are provided to support faculty and graduate students working at their home institutions. Project personnel visit Goddard during academic recess periods to attend workshops, present seminars, and collaborate with NASA scientists on research projects. Additionally, CESDIS takes on specific research tasks of shorter duration for computer science research requested by NASA Goddard scientists.

  15. Computational Thinking: A Digital Age Skill for Everyone

    ERIC Educational Resources Information Center

    Barr, David; Harrison, John; Conery, Leslie

    2011-01-01

    In a seminal article published in 2006, Jeanette Wing described computational thinking (CT) as a way of "solving problems, designing systems, and understanding human behavior by drawing on the concepts fundamental to computer science." Wing's article gave rise to an often controversial discussion and debate among computer scientists,…

  16. Application of advanced computing techniques to the analysis and display of space science measurements

    NASA Technical Reports Server (NTRS)

    Klumpar, D. M.; Lapolla, M. V.; Horblit, B.

    1995-01-01

    A prototype system has been developed to aid the experimental space scientist in the display and analysis of spaceborne data acquired from direct-measurement sensors in orbit. We explored the implementation of a rule-based environment for semi-automatic generation of visualizations that assist the domain scientist in exploring his or her data. The goal has been to enable rapid generation of visualizations that enhance the scientist's ability to thoroughly mine the data. Transferring the task of visualization generation from the human programmer to the computer produced a rapid prototyping environment for visualizations. The visualization and analysis environment has been tested against a set of data obtained from the Hot Plasma Composition Experiment on the AMPTE/CCE satellite, creating new visualizations that provided new insight into the data.

  17. Information processing, computation, and cognition.

    PubMed

    Piccinini, Gualtiero; Scarantino, Andrea

    2011-01-01

    Computation and information processing are among the most fundamental notions in cognitive science. They are also among the most imprecisely discussed. Many cognitive scientists take it for granted that cognition involves computation, information processing, or both - although others disagree vehemently. Yet different cognitive scientists use 'computation' and 'information processing' to mean different things, sometimes without realizing that they do. In addition, computation and information processing are surrounded by several myths; first and foremost, that they are the same thing. In this paper, we address this unsatisfactory state of affairs by presenting a general and theory-neutral account of computation and information processing. We also apply our framework by analyzing the relations between computation and information processing on one hand and classicism, connectionism, and computational neuroscience on the other. We defend the relevance to cognitive science of both computation, at least in a generic sense, and information processing, in three important senses of the term. Our account advances several foundational debates in cognitive science by untangling some of their conceptual knots in a theory-neutral way. By leveling the playing field, we pave the way for the future resolution of the debates' empirical aspects.

  18. Scientists at Work. Final Report.

    ERIC Educational Resources Information Center

    Education Turnkey Systems, Inc., Falls Church, VA.

    This report summarizes activities related to the development, field testing, evaluation, and marketing of the "Scientists at Work" program which combines computer assisted instruction with database tools to aid cognitively impaired middle and early high school children in learning and applying thinking skills to science. The brief report reviews…

  19. Bridging Social and Semantic Computing - Design and Evaluation of User Interfaces for Hybrid Systems

    ERIC Educational Resources Information Center

    Bostandjiev, Svetlin Alex I.

    2012-01-01

    The evolution of the Web brought new and interesting problems to computer scientists, which we loosely classify into the fields of social and semantic computing. Social computing is related to two major paradigms: computations carried out by a large number of people in a collective-intelligence fashion (i.e. wikis), and performing computations on social…

  20. Profugus

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Evans, Thomas; Hamilton, Steven; Slattery, Stuart

    Profugus is an open-source mini-application (mini-app) for radiation transport and reactor applications. It contains the fundamental computational kernels used in the Exnihilo code suite from Oak Ridge National Laboratory. However, Exnihilo is production code with a substantial user base. Furthermore, Exnihilo is export controlled. This makes collaboration with computer scientists and computer engineers difficult. Profugus is designed to bridge that gap. By encapsulating the core numerical algorithms in an abbreviated code base that is open-source, computer scientists can analyze the algorithms and easily make code-architectural changes to test performance without compromising the production code values of Exnihilo. Profugus is not meant to be production software with respect to problem analysis. The computational kernels in Profugus are designed to analyze performance, not correctness. Nonetheless, users of Profugus can set up and run problems with enough real-world features to be useful as proof-of-concept for actual production work.

  1. Combinatorial Algorithms to Enable Computational Science and Engineering: Work from the CSCAPES Institute

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boman, Erik G.; Catalyurek, Umit V.; Chevalier, Cedric

    2015-01-16

    This final progress report summarizes the work accomplished at the Combinatorial Scientific Computing and Petascale Simulations Institute. We developed Zoltan, a parallel mesh partitioning library that made use of accurate hypergraph models to provide load balancing in mesh-based computations. We developed several graph coloring algorithms for computing Jacobian and Hessian matrices and organized them into a software package called ColPack. We developed parallel algorithms for graph coloring and graph matching problems, and also designed multi-scale graph algorithms. Three PhD students graduated, six more are continuing their PhD studies, and four postdoctoral scholars were advised. Six of these students and Fellows have joined DOE Labs (Sandia, Berkeley) as staff scientists or as postdoctoral scientists. We also organized the SIAM Workshop on Combinatorial Scientific Computing (CSC) in 2007, 2009, and 2011 to continue to foster the CSC community.
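    Graph coloring as used for Jacobian and Hessian computation assigns colors so that structurally dependent matrix columns (adjacent vertices in a column-intersection graph) get different colors; columns sharing a color can then be evaluated together in a single directional derivative. A minimal greedy-coloring sketch, not ColPack's actual implementation, with a made-up adjacency structure:

```python
# Greedy graph coloring: adjacent vertices receive distinct colors.
# Columns of a sparse Jacobian that share no nonzero row can share a
# color and be recovered from one matrix-vector product.

def greedy_coloring(adjacency: dict[int, set[int]]) -> dict[int, int]:
    colors: dict[int, int] = {}
    for v in sorted(adjacency):            # fixed vertex ordering
        used = {colors[u] for u in adjacency[v] if u in colors}
        c = 0
        while c in used:                   # smallest unused color
            c += 1
        colors[v] = c
    return colors

# Hypothetical column-intersection graph of a small sparse Jacobian.
adj = {0: {1}, 1: {0, 2}, 2: {1}, 3: set()}
print(greedy_coloring(adj))  # columns 0, 2, and 3 share color 0
```

    Greedy coloring is not optimal in general, but with good vertex orderings it is the workhorse heuristic for this compression problem.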

  2. Perspectives on an education in computational biology and medicine.

    PubMed

    Rubinstein, Jill C

    2012-09-01

    The mainstream application of massively parallel, high-throughput assays in biomedical research has created a demand for scientists educated in Computational Biology and Bioinformatics (CBB). In response, formalized graduate programs have rapidly evolved over the past decade. Concurrently, there is increasing need for clinicians trained to oversee the responsible translation of CBB research into clinical tools. Physician-scientists with dedicated CBB training can facilitate such translation, positioning themselves at the intersection between computational biomedical research and medicine. This perspective explores key elements of the educational path to such a position, specifically addressing: 1) evolving perceptions of the role of the computational biologist and the impact on training and career opportunities; 2) challenges in and strategies for obtaining the core skill set required of a biomedical researcher in a computational world; and 3) how the combination of CBB with medical training provides a logical foundation for a career in academic medicine and/or biomedical research.

  3. Integrated Circuits/Segregated Labor: Women in Three Computer-Related Occupations. Project Report No. 84-A27.

    ERIC Educational Resources Information Center

    Strober, Myra H.; Arnold, Carolyn L.

    This discussion of the impact of new computer occupations on women's employment patterns is divided into four major sections. The first section describes the six computer-related occupations to be analyzed: (1) engineers; (2) computer scientists and systems analysts; (3) programmers; (4) electronic technicians; (5) computer operators; and (6) data…

  4. Enduring Influence of Stereotypical Computer Science Role Models on Women's Academic Aspirations

    ERIC Educational Resources Information Center

    Cheryan, Sapna; Drury, Benjamin J.; Vichayapai, Marissa

    2013-01-01

    The current work examines whether a brief exposure to a computer science role model who fits stereotypes of computer scientists has a lasting influence on women's interest in the field. One-hundred undergraduate women who were not computer science majors met a female or male peer role model who embodied computer science stereotypes in appearance…

  5. Collective Computation of Neural Network

    DTIC Science & Technology

    1990-03-15

    Sciences, Beijing ABSTRACT Computational neuroscience is a new branch of neuroscience originating from current research on the theory of computer...scientists working in artificial intelligence engineering and neuroscience. The paper introduces the collective computational properties of model neural...vision research. On this basis, the authors analyzed the significance of the Hopfield model. Key phrases: Computational Neuroscience, Neural Network, Model

  6. 75 FR 43551 - Notice of Intent To Prepare an Environmental Impact Statement for the Proposed Mohave County Wind...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-26

    ... Dam. The project will consist of up to 335 wind turbine generators (WTGs). Construction may consist of... County Wind Farm Project, Mohave County, AZ AGENCY: Bureau of Land Management, Interior. ACTION: Notice....gov/az/st/en/prog/energy/wind/mohave.html . In order to be included in the Draft EIS, all comments...

  7. 78 FR 57877 - Notice of Intent To Prepare an Environmental Impact Statement for the Proposed Maricopa Solar...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-20

    ... Park Project, a photovoltaic (PV) solar power plant with a planned generating capacity of up to 300...; AZA35927] Notice of Intent To Prepare an Environmental Impact Statement for the Proposed Maricopa Solar... site at http://www.blm.gov/az/st/en/prog/energy/solar/maricopa-solar.html . In order to be included in...

  8. INDUCTION OF 6-THIOGUANINE RESISTANCE IN SYNTHRONIZED HUMAN FIBROBLAST CELLS TREATED WITH METHYL METHANESULFONATE, N-ACETOXY-2-ACETHYLAMINOFLUORENE AND N-METHYL-N'-NITRO-N-NITROSOGUANIDINE

    EPA Science Inventory

    Chemical induction of 6-thioguanine resistance was studied in synchronized human fibroblast cells. Cells initially grown in a medium lacking arginine and glutamine for 24 h ceased DNA synthesis and failed to enter the S phase. After introduction of complete medium, the cells prog...

  9. 76 FR 66747 - Notice of Availability of the Northern Arizona Proposed Withdrawal Final Environmental Impact...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-27

    ... on the Internet at http://www.blm.gov/az/st/en/prog/mining/timeout.html . FOR FURTHER INFORMATION... million acres of Federal locatable minerals in northern Arizona from location and entry under the Mining Law of 1872, (30 U.S.C. 22-54) (Mining Law), subject to valid existing rights, by the Secretary of the...

  10. A Theory of Electromagnetic Shielding with Applications to MIL-STD-285, IEEE-299, and EMP Simulation

    DTIC Science & Technology

    1985-02-01

    in a building sized enclosure slot-like discontinuities may not all be small compared to all wavelengths in the incident field, and slot resonan ...OFFICE OF RESEARCH/ NPP US AIR FORCE SPACE COMMAND ATTN STATE & LOCAL PROG SUPPORT O ATTN KKO 500 C STREET, SW ATTN KRQ WASHINGTON, DC 20472 ATTN XPOW

  11. 77 FR 2317 - Notice of Availability of Record of Decision for the Northern Arizona Proposed Withdrawal

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-01-17

    ... Plan for the Arizona Strip Field Office and Forest plans for the Kaibab National Forest would be... Internet at http://www.blm.gov/az/st/en/prog/mining/timeout/rod.html . FOR FURTHER INFORMATION CONTACT... General Mining Law 1,006,545 acres of Federal land and interests in land in the vicinity of the Grand...

  12. USAF/SCEEE Summer Faculty Research Program (1979). Volume 2

    DTIC Science & Technology

    1979-12-01

    Summer Faculty Research Program participants. The program is designed to stimulate scientific and engineering interaction between university faculty...Prog., Dept. of Industrial Engineering Facility design and location theory University of Oklahoma and routing and distribution systems 202 W. Boyd...Theory & Assistant Professor of Management Administration, 1975 University of Akron Specialty: Organization Design Akron, OH 44325 Assigned: AFBRMC

  13. Effect of feeding flax or linseed meal on progesterone clearance rate in ovariectomized ewes

    USDA-ARS?s Scientific Manuscript database

    Ovariectomized ewes (n = 22; 68.76 ± 2.34 kg initial body weight; 2.9 ± 0.1 initial body condition score) were individually fed one of three diets: 1) Control (phytoestrogen-free; n = 7), 2) Flax containing diet (n = 8), or 3) linseed meal (LSM) containing diet (n = 7) to investigate the rate of prog...

  14. Site-Specific Dynamics of β-Sheet Peptides with (D)Pro-Gly Turns Probed by Laser-Excited Temperature-Jump Infrared Spectroscopy.

    PubMed

    Popp, Alexander; Scheerer, David; Chi, Heng; Keiderling, Timothy A; Hauser, Karin

    2016-05-04

    Turn residues and side-chain interactions play an important role for the folding of β-sheets. We investigated the conformational dynamics of a three-stranded β-sheet peptide ((D)P(D)P) and a two-stranded β-hairpin (WVYY-(D)P) by time-resolved temperature-jump (T-jump) infrared spectroscopy. Both peptide sequences contain (D)Pro-Gly residues that favor a tight β-turn. The three-stranded β-sheet (Ac-VFITS(D)PGKTYTEV(D)PGOKILQ-NH2) is stabilized by the turn sequences, whereas the β-hairpin (SWTVE(D)PGKYTYK-NH2) folding is assisted by both the turn sequence and hydrophobic cross-strand interactions. Relaxation times after the T-jump were monitored as a function of temperature and occur on a sub-microsecond time scale, (D)P(D)P being faster than WVYY-(D)P. The Xxx-(D)Pro tertiary amide provides a detectable IR band, allowing us to probe the dynamics site-specifically. The relative importance of the turn versus the intrastrand stability in β-sheet formation is discussed. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  15. Comparative analysis of operational forecasts versus actual weather conditions in airline flight planning, volume 4

    NASA Technical Reports Server (NTRS)

    Keitz, J. F.

    1982-01-01

    The impact of more timely and accurate weather data on airline flight planning, with the emphasis on fuel savings, is studied. This volume of the report discusses the results of Task 4 of the four major tasks included in the study. Task 4 uses flight plan segment wind and temperature differences as indicators of dates and geographic areas for which significant forecast errors may have occurred. An in-depth analysis is then conducted for the days identified. The analysis shows that significant errors occurred in the operational forecast on 15 of the 33 arbitrarily selected days included in the study. Wind speeds in an area of maximum winds were underestimated by at least 20 to 25 kts. on 14 of these days. The analysis also shows that there is a tendency to repeat the same forecast errors from prog to prog. Also, some perceived forecast errors from the flight plan comparisons could not be verified by visual inspection of the corresponding National Meteorological Center forecast and analysis charts, and it is likely that they are the result of weather-data interpolation techniques or some other data processing procedure in the airlines' flight planning systems.
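    The screening step described in this abstract, using segment-level wind differences to flag days with large forecast errors, can be sketched roughly as follows. The 20 kt threshold echoes the abstract; the data layout and function name are invented for illustration:

```python
# Flag flight-plan segments whose forecast wind error exceeds a threshold.
# Data layout is hypothetical; the 20 kt threshold follows the abstract.

def flag_forecast_errors(segments, threshold_kts=20.0):
    """segments: iterable of (date, forecast_wind_kts, actual_wind_kts).
    Returns the set of dates with at least one large wind-speed error."""
    flagged = set()
    for date, forecast, actual in segments:
        if abs(actual - forecast) >= threshold_kts:
            flagged.add(date)
    return flagged

obs = [
    ("1980-01-03", 95.0, 120.0),  # 25 kt underestimate -> flagged
    ("1980-01-04", 80.0, 85.0),   # within tolerance
]
print(flag_forecast_errors(obs))  # {'1980-01-03'}
```

    The flagged dates would then be the candidates for the in-depth chart-by-chart analysis the study describes.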

  16. [Effect of progesterone on the expression of GLUT in the brain following hypoxic-ischemia in newborn rats].

    PubMed

    Li, Dong-Liang; Han, Hua

    2008-08-01

    To investigate the expression of GLUT1 and GLUT3 in the hippocampus after cerebral hypoxic-ischemia (HI) in newborn rats and the effect of progesterone (PROG) on them, forty newborn SD rats were randomly divided into four groups: normal group, sham-operated group, hypoxic-ischemia group, and progesterone group. A model of hypoxic-ischemic encephalopathy (HIE) was established in the 7-day-old newborn SD rats. An immunohistochemical method was applied to detect the expression of GLUT1 and GLUT3 in the hippocampus. GLUT1 and GLUT3 were slightly expressed in the normal and sham-operated groups, with no obvious difference between the two groups (P > 0.05). The expression of GLUT1 and GLUT3 in the hypoxic-ischemia group was higher than that in the sham-operated group (P < 0.05). The expression of GLUT in the progesterone group was significantly higher than that in both the sham-operated group (P < 0.01) and the hypoxic-ischemia group (P < 0.05). PROG could increase the tolerance of neurons to hypoxic-ischemia by up-regulating GLUT expression, thereby maintaining the energy supply in the brain.

  17. Scientific Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fermilab

    2017-09-01

    Scientists, engineers and programmers at Fermilab are tackling today’s most challenging computational problems. Their solutions, motivated by the needs of worldwide research in particle physics and accelerators, help America stay at the forefront of innovation.

  18. An Analysis of Cloud Computing with Amazon Web Services for the Atmospheric Science Data Center

    NASA Astrophysics Data System (ADS)

    Gleason, J. L.; Little, M. M.

    2013-12-01

    NASA science and engineering efforts rely heavily on compute and data handling systems. The nature of NASA science data is such that it is not restricted to NASA users; instead, it is widely shared across a globally distributed user community including scientists, educators, policy decision makers, and the public. NASA science computing is therefore a candidate use case for cloud computing, where compute resources are outsourced to an external vendor. Amazon Web Services (AWS) is a commercial cloud computing service developed to use excess computing capacity at Amazon, and potentially provides an alternative to costly and potentially underutilized dedicated acquisitions whenever NASA scientists or engineers require additional data processing. AWS desires to provide a simplified avenue for NASA scientists and researchers to share large, complex data sets with external partners and the public. AWS has been extensively used by JPL for a wide range of computing needs and was previously tested on a NASA Agency basis during the Nebula testing program. Its ability to support the needs of the Langley Science Directorate must be evaluated by integrating it with real-world operational needs across NASA, along with the associated maturity that would come with that. The strengths and weaknesses of this architecture and its ability to support general science and engineering applications were demonstrated during the previous testing. The Langley Office of the Chief Information Officer, in partnership with the Atmospheric Sciences Data Center (ASDC), has established a pilot business interface to utilize AWS cloud computing resources on an organization- and project-level pay-per-use model. This poster discusses an effort to evaluate the feasibility of the pilot business interface from a project-level perspective by specifically using a processing scenario involving the Clouds and Earth's Radiant Energy System (CERES) project.

  19. OptFuels: Fuel treatment optimization

    Treesearch

    Greg Jones

    2011-01-01

    Scientists at the USDA Forest Service, Rocky Mountain Research Station, in Missoula, MT, in collaboration with scientists at the University of Montana, are developing a tool to help forest managers prioritize forest fuel reduction treatments. Although several computer models analyze fuels and fire behavior, stand-level effects of fuel treatments, and priority planning...

  20. Four Argonne National Laboratory scientists receive Early Career Research

    Science.gov Websites

    Four Argonne National Laboratory scientists receive Early Career Research Program ... economic impact of cascading shortages. He will also seek to enable scaling on high-performance computing

  1. Air Force Laboratory’s 2005 Technology Milestones

    DTIC Science & Technology

    2006-01-01

    Computational materials science methods can benefit the design and property prediction of complex real-world materials. With these models, scientists and...High-Frequency Acoustic System Payoff: Scientists created the High-Frequency Acoustic Suppression Technology (HiFAST) airflow control

  2. A toolbox and record for scientific models

    NASA Technical Reports Server (NTRS)

    Ellman, Thomas

    1994-01-01

    Computational science presents a host of challenges for the field of knowledge-based software design. Scientific computation models are difficult to construct. Models constructed by one scientist are easily misapplied by other scientists to problems for which they are not well-suited. Finally, models constructed by one scientist are difficult for others to modify or extend to handle new types of problems. Construction of scientific models actually involves much more than the mechanics of building a single computational model. In the course of developing a model, a scientist will often test a candidate model against experimental data or against a priori expectations. Test results often lead to revisions of the model and a consequent need for additional testing. During a single model development session, a scientist typically examines a whole series of alternative models, each using different simplifying assumptions or modeling techniques. A useful scientific software design tool must support these aspects of the model development process as well. In particular, it should propose and carry out tests of candidate models. It should analyze test results and identify models and parts of models that must be changed. It should determine what types of changes can potentially cure a given negative test result. It should organize candidate models, test data, and test results into a coherent record of the development process. Finally, it should exploit the development record for two purposes: (1) automatically determining the applicability of a scientific model to a given problem; (2) supporting revision of a scientific model to handle a new type of problem. Existing knowledge-based software design tools must be extended in order to provide these facilities.
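    The development-record facilities described above can be illustrated with a small sketch. The class and method names below are hypothetical stand-ins, not part of Ellman's actual system; the sketch only shows how a record of candidate models, test results, and revisions supports the applicability and revision checks the abstract calls for.

    ```python
    # Illustrative sketch of a model-development record: it tracks candidate
    # models, the tests run against each, and the revisions that produced new
    # models, so applicability checks can consult the development history.
    # All names here are hypothetical, not from the paper.

    class DevelopmentRecord:
        def __init__(self):
            self.models = {}        # name -> dict of simplifying assumptions
            self.tests = []         # (model_name, test_name, passed)
            self.revisions = []     # (old_model, new_model, reason)

        def add_model(self, name, assumptions):
            self.models[name] = assumptions

        def record_test(self, model, test, passed):
            self.tests.append((model, test, passed))

        def record_revision(self, old, new, reason):
            self.revisions.append((old, new, reason))

        def failing_models(self):
            """Models with at least one negative test result: revision candidates."""
            return {m for m, _, ok in self.tests if not ok}

        def applicable(self, model, problem):
            """A model applies to a problem only if the problem satisfies every
            simplifying assumption the model was built under."""
            return all(problem.get(k) == v
                       for k, v in self.models[model].items())

    rec = DevelopmentRecord()
    rec.add_model("inviscid", {"viscosity": "ignored"})
    rec.add_model("viscous", {"viscosity": "modeled"})
    rec.record_test("inviscid", "boundary-layer data", passed=False)
    rec.record_revision("inviscid", "viscous", "negative boundary-layer test")
    ```

    In this toy session, the negative test result marks the "inviscid" model for revision, and the record later shows why the "viscous" model exists and when each model is applicable.
    
    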

  3. Computer Series, 98. Electronics for Scientists: A Computer-Intensive Approach.

    ERIC Educational Resources Information Center

    Scheeline, Alexander; Mork, Brian J.

    1988-01-01

    Reports the design for a principles-before-details presentation of electronics for an instrumental analysis class. Uses computers for data collection and simulations. Requires one semester with two 2.5-hour periods and two lectures per week. Includes lab and lecture syllabi. (MVL)

  4. Big data computing: Building a vision for ARS information management

    USDA-ARS?s Scientific Manuscript database

    Improvements are needed within the ARS to increase scientific capacity and keep pace with new developments in computer technologies that support data acquisition and analysis. Enhancements in computing power and IT infrastructure are needed to provide scientists better access to high performance com...

  5. Science-Driven Computing: NERSC's Plan for 2006-2010

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Simon, Horst D.; Kramer, William T.C.; Bailey, David H.

    NERSC has developed a five-year strategic plan focusing on three components: Science-Driven Systems, Science-Driven Services, and Science-Driven Analytics. (1) Science-Driven Systems: Balanced introduction of the best new technologies for complete computational systems--computing, storage, networking, visualization and analysis--coupled with the activities necessary to engage vendors in addressing the DOE computational science requirements in their future roadmaps. (2) Science-Driven Services: The entire range of support activities, from high-quality operations and user services to direct scientific support, that enable a broad range of scientists to effectively use NERSC systems in their research. NERSC will concentrate on resources needed to realize the promise of the new highly scalable architectures for scientific discovery in multidisciplinary computational science projects. (3) Science-Driven Analytics: The architectural and systems enhancements and services required to integrate NERSC's powerful computational and storage resources to provide scientists with new tools to effectively manipulate, visualize, and analyze the huge data sets derived from simulations and experiments.

  6. High-Performance Computing Unlocks Innovation at NREL

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    Need to fly around a wind farm? Or step inside a molecule? NREL scientists use a super powerful (and highly energy-efficient) computer to visualize and solve big problems in renewable energy research.

  7. Mathematical computer programs: A compilation

    NASA Technical Reports Server (NTRS)

    1972-01-01

    Computer programs, routines, and subroutines for aiding engineers, scientists, and mathematicians in direct problem solving are presented. Also included is a group of items that affords the same users greater flexibility in the use of software.

  8. A Modular Environment for Geophysical Inversion and Run-time Autotuning using Heterogeneous Computing Systems

    NASA Astrophysics Data System (ADS)

    Myre, Joseph M.

    Heterogeneous computing systems have recently come to the forefront of the High-Performance Computing (HPC) community's interest. HPC computer systems that incorporate special purpose accelerators, such as Graphics Processing Units (GPUs), are said to be heterogeneous. Large scale heterogeneous computing systems have consistently ranked highly on the Top500 list since the beginning of the heterogeneous computing trend. By using heterogeneous computing systems that consist of both general purpose processors and special-purpose accelerators, the speed and problem size of many simulations could be dramatically increased. Ultimately this results in enhanced simulation capabilities that allow, in some cases for the first time, the execution of parameter space and uncertainty analyses, model optimizations, and other inverse modeling techniques that are critical for scientific discovery and engineering analysis. However, simplifying the usage and optimization of codes for heterogeneous computing systems remains a challenge. This is particularly true for scientists and engineers for whom understanding HPC architectures and undertaking performance analysis may not be primary research objectives. To enable scientists and engineers to remain focused on their primary research objectives, a modular environment for geophysical inversion and run-time autotuning on heterogeneous computing systems is presented. This environment is composed of three major components: 1) CUSH---a framework for reducing the complexity of programming heterogeneous computer systems, 2) geophysical inversion routines which can be used to characterize physical systems, and 3) run-time autotuning routines designed to determine configurations of heterogeneous computing systems in an attempt to maximize the performance of scientific and engineering codes.
Using three case studies, a lattice-Boltzmann method, a non-negative least squares inversion, and a finite-difference fluid flow method, it is shown that this environment provides scientists and engineers with means to reduce the programmatic complexity of their applications, to perform geophysical inversions for characterizing physical systems, and to determine high-performing run-time configurations of heterogeneous computing systems using a run-time autotuner.
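    The run-time autotuning component can be illustrated with a minimal sketch: time a kernel under each candidate configuration and keep the fastest. The kernel and configuration space below are made-up stand-ins, not CUSH's actual interface; a real autotuner for GPU codes would search launch parameters such as block sizes.

    ```python
    import time

    def timed(kernel, data, cfg):
        # Wall-clock one kernel invocation under a given configuration.
        start = time.perf_counter()
        kernel(data, cfg)
        return time.perf_counter() - start

    def autotune(kernel, configs, data, repeats=3):
        """Run the kernel under each candidate configuration, take the best
        of a few timed repeats, and return the fastest configuration."""
        best_cfg, best_t = None, float("inf")
        for cfg in configs:
            t = min(timed(kernel, data, cfg) for _ in range(repeats))
            if t < best_t:
                best_cfg, best_t = cfg, t
        return best_cfg, best_t

    # Stand-in "kernel": sum a list in chunks of a configurable size.
    def chunked_sum(data, chunk):
        total = 0
        for i in range(0, len(data), chunk):
            total += sum(data[i:i + chunk])
        return total

    data = list(range(100_000))
    cfg, t = autotune(chunked_sum, configs=[64, 512, 4096], data=data)
    ```

    The returned configuration depends on the machine, which is the point of doing the search at run time rather than fixing a configuration in the code.
    
    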

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nikolic, R J

    This month's issue has the following articles: (1) Dawn of a New Era of Scientific Discovery - Commentary by Edward I. Moses; (2) At the Frontiers of Fundamental Science Research - Collaborators from national laboratories, universities, and international organizations are using the National Ignition Facility to probe key fundamental science questions; (3) Livermore Responds to Crisis in Post-Earthquake Japan - More than 70 Laboratory scientists provided round-the-clock expertise in radionuclide analysis and atmospheric dispersion modeling as part of the nation's support to Japan following the March 2011 earthquake and nuclear accident; (4) A Comprehensive Resource for Modeling, Simulation, and Experiments - A new Web-based resource called MIDAS is a central repository for material properties, experimental data, and computer models; and (5) Finding Data Needles in Gigabit Haystacks - Livermore computer scientists have developed a novel computer architecture based on 'persistent' memory to ease data-intensive computations.

  10. Information Power Grid: Distributed High-Performance Computing and Large-Scale Data Management for Science and Engineering

    NASA Technical Reports Server (NTRS)

    Johnston, William E.; Gannon, Dennis; Nitzberg, Bill; Feiereisen, William (Technical Monitor)

    2000-01-01

    The term "Grid" refers to distributed, high performance computing and data handling infrastructure that incorporates geographically and organizationally dispersed, heterogeneous resources that are persistent and supported. The vision for NASA's Information Power Grid - a computing and data Grid - is that it will provide significant new capabilities to scientists and engineers by facilitating routine construction of information based problem solving environments / frameworks that will knit together widely distributed computing, data, instrument, and human resources into just-in-time systems that can address complex and large-scale computing and data analysis problems. IPG development and deployment is addressing requirements obtained by analyzing a number of different application areas, in particular from the NASA Aero-Space Technology Enterprise. This analysis has focused primarily on two types of users. The first is the scientist / design engineer whose primary interest is problem solving (e.g., determining wing aerodynamic characteristics in many different operating environments), and whose primary interface to IPG will be through various sorts of problem solving frameworks. The second type of user is the tool designer: the computational scientist who converts physics and mathematics into code that can simulate the physical world. These are the two primary users of IPG, and they have rather different requirements. This paper describes the current state of IPG (the operational testbed), the set of capabilities being put into place for the operational prototype IPG, as well as some of the longer term R&D tasks.

  11. The Effects of a Robot Game Environment on Computer Programming Education for Elementary School Students

    ERIC Educational Resources Information Center

    Shim, Jaekwoun; Kwon, Daiyoung; Lee, Wongyu

    2017-01-01

    In the past, computer programming was perceived as a task only carried out by computer scientists; in the 21st century, however, computer programming is viewed as a critical and necessary skill that everyone should learn. In order to improve teaching of problem-solving abilities in a computing environment, extensive research is being done on…

  12. Application of iterative robust model-based optimal experimental design for the calibration of biocatalytic models.

    PubMed

    Van Daele, Timothy; Gernaey, Krist V; Ringborg, Rolf H; Börner, Tim; Heintz, Søren; Van Hauwermeiren, Daan; Grey, Carl; Krühne, Ulrich; Adlercreutz, Patrick; Nopens, Ingmar

    2017-09-01

    The aim of model calibration is to estimate unique parameter values from available experimental data, here applied to a biocatalytic process. The traditional approach of first gathering data followed by performing a model calibration is inefficient, since the information gathered during experimentation is not actively used to optimize the experimental design. By applying an iterative robust model-based optimal experimental design, the limited amount of data collected is used to design additional informative experiments. The algorithm is used here to calibrate the initial reaction rate of an ω-transaminase catalyzed reaction in a more accurate way. The parameter confidence region estimated from the Fisher Information Matrix is compared with the likelihood confidence region, which is not only more accurate but also a computationally more expensive method. As a result, an important deviation between both approaches is found, confirming that linearization methods should be applied with care for nonlinear models. © 2017 American Institute of Chemical Engineers Biotechnol. Prog., 33:1278-1293, 2017. © 2017 American Institute of Chemical Engineers.
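    The Fisher-Information-based confidence estimate discussed above can be sketched for a simple initial-rate model. The Michaelis-Menten form, parameter values, and design points below are illustrative stand-ins for the paper's ω-transaminase kinetics; for a nonlinear model the resulting covariance is only a linearized approximation, which is exactly the caveat the authors raise against the (more expensive) likelihood region.

    ```python
    import math

    # Illustrative initial-rate model v = Vmax * S / (Km + S).
    # Parameter values, substrate levels, and noise level are made up.
    Vmax, Km, sigma = 2.0, 0.5, 0.05
    S_levels = [0.1, 0.25, 0.5, 1.0, 2.0, 5.0]

    def sensitivities(S):
        # Analytic sensitivities dv/dVmax and dv/dKm at one design point.
        dV = S / (Km + S)
        dK = -Vmax * S / (Km + S) ** 2
        return dV, dK

    # Fisher Information Matrix for i.i.d. Gaussian noise: F = (1/sigma^2) J^T J.
    F = [[0.0, 0.0], [0.0, 0.0]]
    for S in S_levels:
        dV, dK = sensitivities(S)
        F[0][0] += dV * dV / sigma**2
        F[0][1] += dV * dK / sigma**2
        F[1][0] += dK * dV / sigma**2
        F[1][1] += dK * dK / sigma**2

    # The linearized parameter covariance is the FIM inverse (2x2 closed form);
    # its diagonal gives approximate standard errors for Vmax and Km.
    det = F[0][0] * F[1][1] - F[0][1] * F[1][0]
    cov = [[F[1][1] / det, -F[0][1] / det],
           [-F[1][0] / det, F[0][0] / det]]
    se_Vmax, se_Km = math.sqrt(cov[0][0]), math.sqrt(cov[1][1])
    ```

    An optimal experimental design step would choose the next substrate level S to maximize a scalar of F (e.g. its determinant), which is how the iterative procedure makes each new experiment maximally informative.
    
    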

  13. RIACS

    NASA Technical Reports Server (NTRS)

    Moore, Robert C.

    1998-01-01

    The Research Institute for Advanced Computer Science (RIACS) was established by the Universities Space Research Association (USRA) at the NASA Ames Research Center (ARC) on June 6, 1983. RIACS is privately operated by USRA, a consortium of universities that serves as a bridge between NASA and the academic community. Under a five-year cooperative agreement with NASA, research at RIACS is focused on areas that are strategically enabling to the Ames Research Center's role as NASA's Center of Excellence for Information Technology. Research is carried out by a staff of full-time scientists, augmented by visitors, students, postdoctoral candidates, and visiting university faculty. RIACS is chartered to carry out research and development in computer science, devoted mainly to tasks that are strategically enabling with respect to NASA's bold mission in space exploration and aeronautics. There are three foci for this work: Automated Reasoning, Human-Centered Computing, and High Performance Computing and Networking. RIACS has the additional goal of broadening the base of researchers in these areas of importance to the nation's space and aeronautics enterprises. Through its visiting scientist program, RIACS facilitates the participation of university-based researchers, including both faculty and students, in the research activities of NASA and RIACS. RIACS researchers work in close collaboration with NASA computer scientists on projects such as the Remote Agent Experiment on the Deep Space One mission and Super-Resolution Surface Modeling.

  14. Computational chemistry at Janssen

    NASA Astrophysics Data System (ADS)

    van Vlijmen, Herman; Desjarlais, Renee L.; Mirzadegan, Tara

    2017-03-01

    Computer-aided drug discovery activities at Janssen are carried out by scientists in the Computational Chemistry group of the Discovery Sciences organization. This perspective gives an overview of the organizational and operational structure, the science, internal and external collaborations, and the impact of the group on Drug Discovery at Janssen.

  15. Computing Life

    ERIC Educational Resources Information Center

    National Institute of General Medical Sciences (NIGMS), 2009

    2009-01-01

    Computer advances now let researchers quickly search through DNA sequences to find gene variations that could lead to disease, simulate how flu might spread through one's school, and design three-dimensional animations of molecules that rival any video game. By teaming computers and biology, scientists can answer new and old questions that could…

  16. The Study Team for Early Life Asthma Research (STELAR) consortium ‘Asthma e-lab’: team science bringing data, methods and investigators together

    PubMed Central

    Custovic, Adnan; Ainsworth, John; Arshad, Hasan; Bishop, Christopher; Buchan, Iain; Cullinan, Paul; Devereux, Graham; Henderson, John; Holloway, John; Roberts, Graham; Turner, Steve; Woodcock, Ashley; Simpson, Angela

    2015-01-01

    We created Asthma e-Lab, a secure web-based research environment to support consistent recording, description and sharing of data, computational/statistical methods and emerging findings across the five UK birth cohorts. The e-Lab serves as a data repository for our unified dataset and provides the computational resources and a scientific social network to support collaborative research. All activities are transparent, and emerging findings are shared via the e-Lab, linked to explanations of analytical methods, thus enabling knowledge transfer. The e-Lab facilitates the iterative interdisciplinary dialogue between clinicians, statisticians, computer scientists, mathematicians, geneticists and basic scientists, capturing collective thought behind the interpretations of findings. PMID:25805205

  17. Identifying the Factors Leading to Success: How an Innovative Science Curriculum Cultivates Student Motivation

    ERIC Educational Resources Information Center

    Scogin, Stephen C.

    2016-01-01

    "PlantingScience" is an award-winning program recognized for its innovation and use of computer-supported scientist mentoring. Science learners work on inquiry-based experiments in their classrooms and communicate asynchronously with practicing plant scientist-mentors about the projects. The purpose of this study was to identify specific…

  18. Tessera: Open source software for accelerated data science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sego, Landon H.; Hafen, Ryan P.; Director, Hannah M.

    2014-06-30

    Extracting useful, actionable information from data can be a formidable challenge for the safeguards, nonproliferation, and arms control verification communities. Data scientists are often on the “front-lines” of making sense of complex and large datasets. They require flexible tools that make it easy to rapidly reformat large datasets, interactively explore and visualize data, develop statistical algorithms, and validate their approaches—and they need to perform these activities with minimal lines of code. Existing commercial software solutions often lack extensibility and the flexibility required to address the nuances of the demanding and dynamic environments where data scientists work. To address this need, Pacific Northwest National Laboratory developed Tessera, an open source software suite designed to enable data scientists to interactively perform their craft at the terabyte scale. Tessera automatically manages the complicated tasks of distributed storage and computation, empowering data scientists to do what they do best: tackling critical research and mission objectives by deriving insight from data. We illustrate the use of Tessera with an example analysis of computer network data.
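    Tessera's underlying approach is divide-and-recombine: divide the data into subsets, apply an analytic method to each subset independently, then recombine the per-subset results. Tessera itself is an R-based suite, so the Python sketch below only illustrates the pattern; in a real deployment the subsets would be distributed across a cluster rather than held in one list.

    ```python
    # Minimal divide-and-recombine sketch (illustrative, not Tessera's API).

    def divide(data, n_subsets):
        # Partition the data into n_subsets strided subsets.
        return [data[i::n_subsets] for i in range(n_subsets)]

    def apply_method(subset):
        # Per-subset statistic, computable independently (and in parallel):
        # here, the subset's size and mean.
        return len(subset), sum(subset) / len(subset)

    def recombine(results):
        # Count-weighted recombination; for the mean this recovers the
        # exact global value. Other statistics recombine only approximately.
        total = sum(n for n, _ in results)
        return sum(n * m for n, m in results) / total

    data = [float(x) for x in range(1, 101)]     # global mean is 50.5
    subsets = divide(data, 4)
    overall = recombine([apply_method(s) for s in subsets])
    ```

    The appeal of the pattern is that `apply_method` never sees the whole dataset, so the same analyst code scales from a laptop to terabyte-scale distributed storage.
    
    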

  19. Cross Domain Deterrence: Livermore Technical Report, 2014-2016

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barnes, Peter D.; Bahney, Ben; Matarazzo, Celeste

    2016-08-03

    Lawrence Livermore National Laboratory (LLNL) is an original collaborator on the project titled “Deterring Complex Threats: The Effects of Asymmetry, Interdependence, and Multi-polarity on International Strategy” (CDD Project), led by the UC Institute on Global Conflict and Cooperation at UCSD under PIs Jon Lindsay and Erik Gartzke, and funded through the DoD Minerva Research Initiative. In addition to participating in workshops and facilitating interaction among UC social scientists, LLNL is leading the computational modeling effort and assisting with empirical case studies to probe the viability of analytic, modeling, and data analysis concepts. This report summarizes LLNL work on the CDD Project to date, primarily in Project Years 1-2, corresponding to Federal fiscal year 2015. LLNL brings two unique domains of expertise to bear on this Project: (1) access to scientific expertise on the technical dimensions of emerging threat technology, and (2) high performance computing (HPC) expertise, required for analyzing the complexity of bargaining interactions in the envisioned threat models. In addition, we have a small group of researchers trained as social scientists who are intimately familiar with International Relations research. We find that pairing simulation scientists, who are typically trained in computer science, with domain experts, social scientists in this case, is the most effective route to developing powerful new simulation tools capable of representing domain concepts accurately and answering challenging questions in the field.

  20. Towards Robot Scientists for autonomous scientific discovery

    PubMed Central

    2010-01-01

    We review the main components of autonomous scientific discovery, and how they lead to the concept of a Robot Scientist. This is a system which uses techniques from artificial intelligence to automate all aspects of the scientific discovery process: it generates hypotheses from a computer model of the domain, designs experiments to test these hypotheses, runs the physical experiments using robotic systems, analyses and interprets the resulting data, and repeats the cycle. We describe our two prototype Robot Scientists: Adam and Eve. Adam has recently proven the potential of such systems by identifying twelve genes responsible for catalysing specific reactions in the metabolic pathways of the yeast Saccharomyces cerevisiae. This work has been formally recorded in great detail using logic. We argue that the reporting of science needs to become fully formalised and that Robot Scientists can help achieve this. This will make scientific information more reproducible and reusable, and promote the integration of computers in scientific reasoning. We believe the greater automation of both the physical and intellectual aspects of scientific investigations to be essential to the future of science. Greater automation improves the accuracy and reliability of experiments, increases the pace of discovery and, in common with conventional laboratory automation, removes tedious and repetitive tasks from the human scientist. PMID:20119518

  1. Towards Robot Scientists for autonomous scientific discovery.

    PubMed

    Sparkes, Andrew; Aubrey, Wayne; Byrne, Emma; Clare, Amanda; Khan, Muhammed N; Liakata, Maria; Markham, Magdalena; Rowland, Jem; Soldatova, Larisa N; Whelan, Kenneth E; Young, Michael; King, Ross D

    2010-01-04

    We review the main components of autonomous scientific discovery, and how they lead to the concept of a Robot Scientist. This is a system which uses techniques from artificial intelligence to automate all aspects of the scientific discovery process: it generates hypotheses from a computer model of the domain, designs experiments to test these hypotheses, runs the physical experiments using robotic systems, analyses and interprets the resulting data, and repeats the cycle. We describe our two prototype Robot Scientists: Adam and Eve. Adam has recently proven the potential of such systems by identifying twelve genes responsible for catalysing specific reactions in the metabolic pathways of the yeast Saccharomyces cerevisiae. This work has been formally recorded in great detail using logic. We argue that the reporting of science needs to become fully formalised and that Robot Scientists can help achieve this. This will make scientific information more reproducible and reusable, and promote the integration of computers in scientific reasoning. We believe the greater automation of both the physical and intellectual aspects of scientific investigations to be essential to the future of science. Greater automation improves the accuracy and reliability of experiments, increases the pace of discovery and, in common with conventional laboratory automation, removes tedious and repetitive tasks from the human scientist.
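    The closed loop described above can be sketched in a few lines: generate hypotheses, pick an experiment that discriminates among the live ones, run it, interpret the result, and repeat. The toy domain below, identifying which single "gene" catalyses a reaction via simulated knockout assays, is entirely hypothetical and only stands in for the kind of yeast-metabolism reasoning Adam performs.

    ```python
    # Illustrative robot-scientist loop (hypothetical domain, not Adam's code).

    def robot_scientist(genes, run_assay):
        hypotheses = set(genes)          # one hypothesis per candidate gene
        experiments = 0
        while len(hypotheses) > 1:
            gene = min(hypotheses)       # design: test one live hypothesis
            grows = run_assay(gene)      # "physical" experiment (simulated)
            experiments += 1
            if grows:                    # knockout still grows: gene not required
                hypotheses.discard(gene)
            else:                        # no growth: this gene is the catalyst
                hypotheses = {gene}
        return hypotheses.pop(), experiments

    genes = [f"YGR{i:03d}" for i in range(8)]
    true_gene = "YGR005"                 # ground truth hidden from the "robot"
    result, n = robot_scientist(genes, run_assay=lambda g: g != true_gene)
    ```

    Each cycle eliminates at least one hypothesis, so the loop is guaranteed to terminate; the interesting research questions, which this sketch sidesteps, are how to generate good hypotheses and how to choose the cheapest, most discriminating experiment.
    
    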

  2. Implementations of the CC'01 Human-Computer Interaction Guidelines Using Bloom's Taxonomy

    ERIC Educational Resources Information Center

    Manaris, Bill; Wainer, Michael; Kirkpatrick, Arthur E.; Stalvey, RoxAnn H.; Shannon, Christine; Leventhal, Laura; Barnes, Julie; Wright, John; Schafer, J. Ben; Sanders, Dean

    2007-01-01

    In today's technology-laden society human-computer interaction (HCI) is an important knowledge area for computer scientists and software engineers. This paper surveys existing approaches to incorporate HCI into computer science (CS) and such related issues as the perceived gap between the interests of the HCI community and the needs of CS…

  3. Eckert, Wallace John (1902-71)

    NASA Astrophysics Data System (ADS)

    Murdin, P.

    2000-11-01

    Computer scientist and astronomer. Born in Pittsburgh, PA, Eckert was a pioneer of the use of IBM punched card equipment for astronomical calculations. As director of the US Nautical Almanac Office he introduced computer methods to calculate and print tables instead of relying on human `computers'. When, later, he became director of the Watson Scientific Computing Laboratory at Columbia Universit...

  4. "I'm Good, but Not That Good": Digitally-Skilled Young People's Identity in Computing

    ERIC Educational Resources Information Center

    Wong, Billy

    2017-01-01

    Computers and information technology are fast becoming a part of young people's everyday life. However, there remains a difference between the majority who can use computers and the minority who are computer scientists or professionals. Drawing on 32 semi-structured interviews with digitally skilled young people (aged 13-19), we explore their…

  5. Computers in Education: Realizing the Potential. Chairmen's Report of a Research Conference, Pittsburgh, Pennsylvania, November 20-24, 1982.

    ERIC Educational Resources Information Center

    Lesgold, Alan; Reif, Frederick

    The future of computers in education and the research needed to realize the computer's potential are discussed in this report, which presents a summary and the conclusions from an invitational conference involving 40 computer scientists, psychologists, educational researchers, teachers, school administrators, and parents. The summary stresses the…

  6. 2005 White Paper on Institutional Capability Computing Requirements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carnes, B; McCoy, M; Seager, M

    This paper documents the need for a significant increase in the computing infrastructure provided to scientists working in the unclassified domains at Lawrence Livermore National Laboratory (LLNL). This need could be viewed as the next step in a broad strategy outlined in the January 2002 White Paper (UCRL-ID-147449) that bears essentially the same name as this document. Therein we wrote: 'This proposed increase could be viewed as a step in a broader strategy linking hardware evolution to applications development that would take LLNL unclassified computational science to a position of distinction if not preeminence by 2006.' This position of distinction has certainly been achieved. This paper provides a strategy for sustaining this success but will diverge from its 2002 predecessor in that it will: (1) Amplify the scientific and external success LLNL has enjoyed because of the investments made in 2002 (MCR, 11 TF) and 2004 (Thunder, 23 TF). (2) Describe in detail the nature of additional investments that are important to meet both the institutional objectives of advanced capability for breakthrough science and the scientists' clearly stated request for adequate capacity and more rapid access to moderate-sized resources. (3) Put these requirements in the context of an overall strategy for simulation science and external collaboration. While our strategy for Multiprogrammatic and Institutional Computing (M&IC) has worked well, three challenges must be addressed to assure and enhance our position. The first is that while we now have over 50 important classified and unclassified simulation codes available for use by our computational scientists, we find ourselves coping with high demand for access and long queue wait times. This point was driven home in the 2005 Institutional Computing Executive Group (ICEG) 'Report Card' to the Deputy Director for Science and Technology (DDST) Office and Computation Directorate management.
The second challenge is related to the balance that should be maintained in the simulation environment. With the advent of Thunder, the institution directed a change in course from past practice. Instead of making Thunder available to the large body of scientists, as was MCR, and effectively using it as a capacity system, the intent was to make it available to perhaps ten projects so that these teams could run very aggressive problems for breakthrough science. This usage model established Thunder as a capability system. The challenge this strategy raises is that the majority of scientists have not seen an improvement in capacity computing resources since MCR, thus creating significant tension in the system. The question then is: 'How do we address the institution's desire to maintain the potential for breakthrough science and also meet the legitimate requests from the ICEG to achieve balance?' Both the capability and the capacity environments must be addressed through this one procurement. The third challenge is to reach out more aggressively to the national science community to encourage access to LLNL resources as part of a strategy for sharpening our science through collaboration. Related to this, LLNL has been unable in the past to provide access for sensitive foreign nationals (SFNs) to the Livermore Computing (LC) unclassified 'yellow' network. Identifying some mechanism for data sharing between LLNL computational scientists and SFNs would be a first practical step in fostering cooperative, collaborative relationships with an important and growing sector of the American science community.

  7. Short-Term Evaluation of Intraoral Soft Splints

    DTIC Science & Technology

    1994-06-01

    typical orofacial pain problems (McGlynn and Cassisi, 1985; Fricton, 1991b). The objective outcome measure should both assess the degree of muscle pain ...Second Edition. Chicago, Year Book Medical Publishers, pp 218-21. Bell, W.E. (1989) Orofacial Pains Classification, Diagnosis, Management. Fourth...Fricton, J.R. (1990) Musculoskeletal measures of orofacial pain. Anesth Prog 37:136-43. Fricton, J.R. (1991a) Recent advances in temporomandibular

  8. Floating Ocean Platform

    DTIC Science & Technology

    2003-08-15

    floating structures create novel habitats for subtidal epibiota?, MARINE ECOLOGY -PROGRESS SERIES, 43-52 Mar. Ecol.- Prog. Ser., 2002 Vegueria, SFJ Godoy... ECOLOGICAL APPLICATIONS, 350-366 Ecol. Appl., 2000 Niedzwecki, JM van de Lindt, JW Gage, JH Teigen, PS, Design estimates of surface wave interaction with...The ecological effects beyond the offshore platform, Coastal Zone: Proceedings of the Symposium on Coastal and Ocean Management, v 2, n pt2, 1989, p

  9. A Comparison of Prostate Cancer Incidence Between U.S. Air Force Enlisted Aircrew

    DTIC Science & Technology

    2011-06-30

    COL DAVID B. RHODES, RAM Prog Dir; COL ROBERT E. CARROLL, Chair. Author: Joseph A. Lopez. Performing organization: USAF School of Aerospace Medicine, Aerospace Medicine Education/FEEG, 2510 Fifth St., Wright-Patterson AFB, OH 45433-7913

  10. Chalk-Ex: Transport of Optically Active Particles from the Surface Mixed Layer

    DTIC Science & Technology

    2001-09-30

    Atlantic Ocean. Mar. Ecol. Prog. Ser. 97: 271-285. Harris, R. P. 1994. Zooplankton grazing on the coccolithophore Emiliania huxleyi and its role...Balch, and K. A. Kilpatrick. 1998. Scattering and attenuation properties of Emiliania huxleyi cells and their detached coccoliths. Limnol. Oceanogr...coccolithophore Emiliania huxleyi under steady-state light-limited growth. Marine Ecology Progress Series. 142: 87-97. Bidigare, R. R. , M

  11. 75 FR 60820 - United States v. Adobe Systems, Inc., et al.; Proposed Final Judgment and Competitive Impact...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-01

    ... compete for high tech employees, and in particular specialized computer science and engineering talent on the basis of salaries, benefits, and career opportunities. In recent years, talented computer... Venue 4. Each Defendant hires specialized computer engineers and scientists throughout the United States...

  12. Advanced Biomedical Computing Center (ABCC) | DSITP

    Cancer.gov

    The Advanced Biomedical Computing Center (ABCC), located in Frederick, Maryland (MD), provides HPC resources for both NIH/NCI intramural scientists and the extramural biomedical research community. Its mission is to provide HPC support, to engage in collaborative research, and to conduct in-house research in various areas of computational biology and biomedical research.

  13. Computer Art--A New Tool in Advertising Graphics.

    ERIC Educational Resources Information Center

    Wassmuth, Birgit L.

    Using computers to produce art began with scientists, mathematicians, and individuals with strong technical backgrounds who used the graphic material as visualizations of data in technical fields. People are using computer art in advertising, as well as in painting; sculpture; music; textile, product, industrial, and interior design; architecture;…

  14. Integrating Computational Science Tools into a Thermodynamics Course

    ERIC Educational Resources Information Center

    Vieira, Camilo; Magana, Alejandra J.; García, R. Edwin; Jana, Aniruddha; Krafcik, Matthew

    2018-01-01

    Computational tools and methods have permeated multiple science and engineering disciplines, because they enable scientists and engineers to process large amounts of data, represent abstract phenomena, and to model and simulate complex concepts. In order to prepare future engineers with the ability to use computational tools in the context of…

  15. 1001 Ways to run AutoDock Vina for virtual screening

    NASA Astrophysics Data System (ADS)

    Jaghoori, Mohammad Mahdi; Bleijlevens, Boris; Olabarriaga, Silvia D.

    2016-03-01

    Large-scale computing technologies have enabled high-throughput virtual screening involving thousands to millions of drug candidates. It is not trivial, however, for biochemical scientists to evaluate the technical alternatives and their implications for running such large experiments. Besides experience with the molecular docking tool itself, the scientist needs to learn how to run it on high-performance computing (HPC) infrastructures, and understand the impact of the choices made. Here, we review such considerations for a specific tool, AutoDock Vina, and use experimental data to illustrate the following points: (1) an additional level of parallelization increases virtual screening throughput on a multi-core machine; (2) capturing of the random seed is not enough (though necessary) for reproducibility on heterogeneous distributed computing systems; (3) the overall time spent on the screening of a ligand library can be improved by analysis of factors affecting execution time per ligand, including number of active torsions, heavy atoms and exhaustiveness. We also illustrate differences among four common HPC infrastructures: grid, Hadoop, small cluster and multi-core (virtual machine on the cloud). Our analysis shows that these platforms are suitable for screening experiments of different sizes. These considerations can guide scientists when choosing the best computing platform and set-up for their future large virtual screening experiments.
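
    The first point above, adding a task-level layer of parallelism on top of Vina's own internal threading, can be sketched as follows. This is an illustrative sketch rather than the authors' code: the receptor and ligand file names are placeholders, and pinning `--seed` is the necessary-but-not-sufficient reproducibility step the abstract mentions.

```python
from concurrent.futures import ThreadPoolExecutor

def vina_command(ligand, receptor="receptor.pdbqt", exhaustiveness=8, seed=42):
    """Build one AutoDock Vina invocation for a single ligand.

    Fixing --seed is necessary (though, as the paper notes, not
    sufficient on heterogeneous hardware) for reproducible poses.
    """
    out = ligand.replace(".pdbqt", "_out.pdbqt")
    return (f"vina --receptor {receptor} --ligand {ligand} "
            f"--exhaustiveness {exhaustiveness} --seed {seed} --out {out}")

def screen(ligands, workers=4):
    """Task-level parallelism: one Vina process per ligand, launched from
    a small thread pool (the heavy lifting happens in the external process,
    e.g. via subprocess.run(cmd.split()) inside the mapped function)."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(vina_command, ligands))
```

    In practice the mapped function would launch the command and collect the docking score; the thread pool merely keeps `workers` Vina processes in flight at once, which is the extra level of parallelization measured in point (1).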

  16. 1001 Ways to run AutoDock Vina for virtual screening.

    PubMed

    Jaghoori, Mohammad Mahdi; Bleijlevens, Boris; Olabarriaga, Silvia D

    2016-03-01

    Large-scale computing technologies have enabled high-throughput virtual screening involving thousands to millions of drug candidates. It is not trivial, however, for biochemical scientists to evaluate the technical alternatives and their implications for running such large experiments. Besides experience with the molecular docking tool itself, the scientist needs to learn how to run it on high-performance computing (HPC) infrastructures, and understand the impact of the choices made. Here, we review such considerations for a specific tool, AutoDock Vina, and use experimental data to illustrate the following points: (1) an additional level of parallelization increases virtual screening throughput on a multi-core machine; (2) capturing of the random seed is not enough (though necessary) for reproducibility on heterogeneous distributed computing systems; (3) the overall time spent on the screening of a ligand library can be improved by analysis of factors affecting execution time per ligand, including number of active torsions, heavy atoms and exhaustiveness. We also illustrate differences among four common HPC infrastructures: grid, Hadoop, small cluster and multi-core (virtual machine on the cloud). Our analysis shows that these platforms are suitable for screening experiments of different sizes. These considerations can guide scientists when choosing the best computing platform and set-up for their future large virtual screening experiments.

  17. hackseq: Catalyzing collaboration between biological and computational scientists via hackathon.

    PubMed

    2017-01-01

    hackseq (http://www.hackseq.com) was a genomics hackathon with the aim of bringing together a diverse set of biological and computational scientists to work on collaborative bioinformatics projects. In October 2016, 66 participants from nine nations came together for three days for hackseq and collaborated on nine projects ranging from data visualization to algorithm development. The response from participants was overwhelmingly positive with 100% (n = 54) of survey respondents saying they would like to participate in future hackathons. We detail key steps for others interested in organizing a successful hackathon and report excerpts from each project.

  18. hackseq: Catalyzing collaboration between biological and computational scientists via hackathon

    PubMed Central

    2017-01-01

    hackseq (http://www.hackseq.com) was a genomics hackathon with the aim of bringing together a diverse set of biological and computational scientists to work on collaborative bioinformatics projects. In October 2016, 66 participants from nine nations came together for three days for hackseq and collaborated on nine projects ranging from data visualization to algorithm development. The response from participants was overwhelmingly positive with 100% (n = 54) of survey respondents saying they would like to participate in future hackathons. We detail key steps for others interested in organizing a successful hackathon and report excerpts from each project. PMID:28417000

  19. What Physicists Should Know About High Performance Computing - Circa 2002

    NASA Astrophysics Data System (ADS)

    Frederick, Donald

    2002-08-01

    High Performance Computing (HPC) is a dynamic, cross-disciplinary field that traditionally has involved applied mathematicians, computer scientists, and others primarily from the various disciplines that have been major users of HPC resources - physics, chemistry, engineering, with increasing use by those in the life sciences. There is a technological dynamic that is powered by economic as well as by technical innovations and developments. This talk will discuss practical ideas to be considered when developing numerical applications for research purposes. Even with the rapid pace of development in the field, the author believes that these concepts will not become obsolete for a while, and will be of use to scientists who either are considering, or who have already started down the HPC path. These principles will be applied in particular to current parallel HPC systems, but there will also be references of value to desktop users. The talk will cover such topics as: computing hardware basics, single-CPU optimization, compilers, timing, numerical libraries, debugging and profiling tools, and the emergence of Computational Grids.

  20. Culture and Workplace Communications: A Comparison of the Technical Communications Practices of Japanese and U.S. Aerospace Engineers and Scientists.

    ERIC Educational Resources Information Center

    Pinelli, Thomas E.; Sato, Yuko; Barclay, Rebecca O.; Kennedy, John M.

    1997-01-01

    Japanese (n=94) and U.S. (n=340) aerospace scientists/engineers described time spent communicating information, collaborative writing, importance of technical communication courses, and the use of libraries, computer networks, and technical reports. Japanese respondents had greater language fluency; U.S. respondents spent more time with…

  1. MeDICi Software Superglue for Data Analysis Pipelines

    ScienceCinema

    Ian Gorton

    2017-12-09

    The Middleware for Data-Intensive Computing (MeDICi) Integration Framework is an integrated middleware platform developed to meet the data analysis and processing needs of scientists across many domains. MeDICi is scalable, easily modified, and robust across multiple languages, protocols, and hardware platforms, and is in use today by PNNL scientists for bioinformatics, power grid failure analysis, and text analysis.

  2. The Draw a Scientist Test: A Different Population and a Somewhat Different Story

    ERIC Educational Resources Information Center

    Thomas, Mark D.; Henley, Tracy B.; Snell, Catherine M.

    2006-01-01

    This study examined Draw-a-Scientist-Test (DAST) images solicited from 212 undergraduate students for the presence of traditional gender stereotypes. Participants were 100 males and 112 females enrolled in psychology or computer science courses with a mean age of 21.02 years. A standard multiple regression generated a model that accounts for the…

  3. Multiscale computing.

    PubMed

    Kobayashi, M; Irino, T; Sweldens, W

    2001-10-23

    Multiscale computing (MSC) involves the computation, manipulation, and analysis of information at different resolution levels. Widespread use of MSC algorithms and the discovery of important relationships between different approaches to implementation were catalyzed, in part, by the recent interest in wavelets. We present two examples that demonstrate how MSC can help scientists understand complex data. The first is from acoustical signal processing and the second is from computer graphics.
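
    As a toy illustration of the idea (not taken from the paper), a one-dimensional Haar wavelet decomposition produces exactly this kind of multiresolution view: coarse averages at each level, plus the detail coefficients needed to recover each finer level exactly.

```python
def haar_step(signal):
    """One level of the (unnormalized) Haar transform: pairwise averages
    give the coarse approximation, pairwise half-differences the detail."""
    avg = [(signal[i] + signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    det = [(signal[i] - signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    return avg, det

def decompose(signal, levels):
    """Multiresolution pyramid: repeatedly coarsen, keeping the detail
    coefficients so that no information is lost between levels."""
    details = []
    for _ in range(levels):
        signal, det = haar_step(signal)
        details.append(det)
    return signal, details
```

    Analysis at a coarse level then means working with the short approximation vector, while the stored details allow drilling back down to full resolution, the core MSC pattern behind both the signal-processing and graphics examples.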

  4. A characterization of workflow management systems for extreme-scale applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ferreira da Silva, Rafael; Filgueira, Rosa; Pietri, Ilia

    The automation of the execution of computational tasks is at the heart of improving scientific productivity. Over the last few years, scientific workflows have been established as an important abstraction that captures the data processing and computation of large and complex scientific applications. By allowing scientists to model and express entire data processing steps and their dependencies, workflow management systems relieve scientists from the details of an application and manage its execution on a computational infrastructure. As the resource requirements of today’s computational and data science applications that process vast amounts of data keep increasing, there is a compelling case for a new generation of advances in high-performance computing, commonly termed extreme-scale computing, which will bring forth multiple challenges for the design of workflow applications and management systems. This paper presents a novel characterization of workflow management systems using features commonly associated with extreme-scale computing applications. We classify 15 popular workflow management systems in terms of workflow execution models, heterogeneous computing environments, and data access methods. Finally, the paper also surveys workflow applications and identifies gaps for future research on the road to extreme-scale workflows and management systems.

  5. A characterization of workflow management systems for extreme-scale applications

    DOE PAGES

    Ferreira da Silva, Rafael; Filgueira, Rosa; Pietri, Ilia; ...

    2017-02-16

    The automation of the execution of computational tasks is at the heart of improving scientific productivity. Over the last few years, scientific workflows have been established as an important abstraction that captures the data processing and computation of large and complex scientific applications. By allowing scientists to model and express entire data processing steps and their dependencies, workflow management systems relieve scientists from the details of an application and manage its execution on a computational infrastructure. As the resource requirements of today’s computational and data science applications that process vast amounts of data keep increasing, there is a compelling case for a new generation of advances in high-performance computing, commonly termed extreme-scale computing, which will bring forth multiple challenges for the design of workflow applications and management systems. This paper presents a novel characterization of workflow management systems using features commonly associated with extreme-scale computing applications. We classify 15 popular workflow management systems in terms of workflow execution models, heterogeneous computing environments, and data access methods. Finally, the paper also surveys workflow applications and identifies gaps for future research on the road to extreme-scale workflows and management systems.

  6. Information processing, computation, and cognition

    PubMed Central

    Scarantino, Andrea

    2010-01-01

    Computation and information processing are among the most fundamental notions in cognitive science. They are also among the most imprecisely discussed. Many cognitive scientists take it for granted that cognition involves computation, information processing, or both – although others disagree vehemently. Yet different cognitive scientists use ‘computation’ and ‘information processing’ to mean different things, sometimes without realizing that they do. In addition, computation and information processing are surrounded by several myths; first and foremost, that they are the same thing. In this paper, we address this unsatisfactory state of affairs by presenting a general and theory-neutral account of computation and information processing. We also apply our framework by analyzing the relations between computation and information processing on one hand and classicism, connectionism, and computational neuroscience on the other. We defend the relevance to cognitive science of both computation, at least in a generic sense, and information processing, in three important senses of the term. Our account advances several foundational debates in cognitive science by untangling some of their conceptual knots in a theory-neutral way. By leveling the playing field, we pave the way for the future resolution of the debates’ empirical aspects. PMID:22210958

  7. Mobile Devices and GPU Parallelism in Ionospheric Data Processing

    NASA Astrophysics Data System (ADS)

    Mascharka, D.; Pankratius, V.

    2015-12-01

    Scientific data acquisition in the field is often constrained by data transfer backchannels to analysis environments. Geoscientists are therefore facing practical bottlenecks with increasing sensor density and variety. Mobile devices, such as smartphones and tablets, offer promising solutions to key problems in scientific data acquisition, pre-processing, and validation by providing advanced capabilities in the field. This is due to affordable network connectivity options and the increasing mobile computational power. This contribution exemplifies a scenario faced by scientists in the field and presents the "Mahali TEC Processing App" developed in the context of the NSF-funded Mahali project. Aimed at atmospheric science and the study of ionospheric Total Electron Content (TEC), this app is able to gather data from various dual-frequency GPS receivers. It demonstrates parsing of full-day RINEX files on mobile devices and on-the-fly computation of vertical TEC values based on satellite ephemeris models that are obtained from NASA. Our experiments show how parallel computing on the mobile device GPU enables fast processing and visualization of up to 2 million datapoints in real-time using OpenGL. GPS receiver bias is estimated through minimum TEC approximations that can be interactively adjusted by scientists in the graphical user interface. Scientists can also perform approximate computations for "quickviews" to reduce CPU processing time and memory consumption. In the final stage of our mobile processing pipeline, scientists can upload data to the cloud for further processing. Acknowledgements: The Mahali project (http://mahali.mit.edu) is funded by the NSF INSPIRE grant no. AGS-1343967 (PI: V. Pankratius). 
We would like to acknowledge our collaborators at Boston College, Virginia Tech, Johns Hopkins University, Colorado State University, as well as the support of UNAVCO for loans of dual-frequency GPS receivers for use in this project, and Intel for loans of smartphones.
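
    For context, the slant-to-vertical TEC conversion described above is commonly done with a single-layer (thin-shell) ionosphere model. The sketch below is a generic version of that standard mapping function, not code from the Mahali app; the 350 km shell height is an illustrative assumption.

```python
import math

R_E = 6371.0      # mean Earth radius, km
H_SHELL = 350.0   # assumed thin-shell ionosphere height, km (illustrative)

def vertical_tec(slant_tec, elevation_deg, shell_height=H_SHELL):
    """Map slant TEC (in TECU) to vertical TEC with the standard
    single-layer mapping function: VTEC = STEC * cos(z'), where z' is
    the zenith angle at the ionospheric pierce point."""
    z = math.radians(90.0 - elevation_deg)               # zenith angle at receiver
    sin_zp = R_E / (R_E + shell_height) * math.sin(z)    # projected to the shell
    return slant_tec * math.sqrt(1.0 - sin_zp ** 2)
```

    At 90° elevation the ray is vertical and the factor is 1; at low elevations the slant path crosses more of the ionosphere, so the vertical equivalent is proportionally smaller.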

  8. Space Station Simulation Computer System (SCS) study for NASA/MSFC. Volume 2: Baseline architecture report

    NASA Technical Reports Server (NTRS)

    1990-01-01

    NASA's Space Station Freedom Program (SSFP) planning efforts have identified a need for a payload training simulator system to serve as both a training facility and as a demonstrator to validate operational concepts. The envisioned MSFC Payload Training Complex (PTC) required to meet this need will train the Space Station payload scientists, station scientists, and ground controllers to operate the wide variety of experiments that will be onboard the Space Station Freedom. The Simulation Computer System (SCS) is the computer hardware, software, and workstations that will support the Payload Training Complex at MSFC. The purpose of this SCS Study is to investigate issues related to the SCS, alternative requirements, simulator approaches, and state-of-the-art technologies to develop candidate concepts and designs.

  9. Space Station Simulation Computer System (SCS) study for NASA/MSFC. Phased development plan

    NASA Technical Reports Server (NTRS)

    1990-01-01

    NASA's Space Station Freedom Program (SSFP) planning efforts have identified a need for a payload training simulator system to serve as both a training facility and as a demonstrator to validate operational concepts. The envisioned MSFC Payload Training Complex (PTC) required to meet this need will train the Space Station payload scientists, station scientists and ground controllers to operate the wide variety of experiments that will be onboard the Space Station Freedom. The Simulation Computer System (SCS) is made up of computer hardware, software, and workstations that will support the Payload Training Complex at MSFC. The purpose of this SCS Study is to investigate issues related to the SCS, alternative requirements, simulator approaches, and state-of-the-art technologies to develop candidate concepts and designs.

  10. BioImg.org: A Catalog of Virtual Machine Images for the Life Sciences

    PubMed Central

    Dahlö, Martin; Haziza, Frédéric; Kallio, Aleksi; Korpelainen, Eija; Bongcam-Rudloff, Erik; Spjuth, Ola

    2015-01-01

    Virtualization is becoming increasingly important in bioscience, enabling assembly and provisioning of complete computer setups, including operating system, data, software, and services packaged as virtual machine images (VMIs). We present an open catalog of VMIs for the life sciences, where scientists can share information about images and optionally upload them to a server equipped with a large file system and fast Internet connection. Other scientists can then search for and download images that can be run on the local computer or in a cloud computing environment, providing easy access to bioinformatics environments. We also describe applications where VMIs aid life science research, including distributing tools and data, supporting reproducible analysis, and facilitating education. BioImg.org is freely available at: https://bioimg.org. PMID:26401099

  11. Space Station Simulation Computer System (SCS) study for NASA/MSFC. Volume 1: Baseline architecture report

    NASA Technical Reports Server (NTRS)

    1990-01-01

    NASA's Space Station Freedom Program (SSFP) planning efforts have identified a need for a payload training simulator system to serve as both a training facility and as a demonstrator to validate operational concepts. The envisioned MSFC Payload Training Complex (PTC) required to meet this need will train the Space Station payload scientists, station scientists, and ground controllers to operate the wide variety of experiments that will be onboard the Space Station Freedom. The Simulation Computer System (SCS) is made up of the computer hardware, software, and workstations that will support the Payload Training Complex at MSFC. The purpose of this SCS Study is to investigate issues related to the SCS, alternative requirements, simulator approaches, and state-of-the-art technologies to develop candidate concepts and designs.

  12. BioImg.org: A Catalog of Virtual Machine Images for the Life Sciences.

    PubMed

    Dahlö, Martin; Haziza, Frédéric; Kallio, Aleksi; Korpelainen, Eija; Bongcam-Rudloff, Erik; Spjuth, Ola

    2015-01-01

    Virtualization is becoming increasingly important in bioscience, enabling assembly and provisioning of complete computer setups, including operating system, data, software, and services packaged as virtual machine images (VMIs). We present an open catalog of VMIs for the life sciences, where scientists can share information about images and optionally upload them to a server equipped with a large file system and fast Internet connection. Other scientists can then search for and download images that can be run on the local computer or in a cloud computing environment, providing easy access to bioinformatics environments. We also describe applications where VMIs aid life science research, including distributing tools and data, supporting reproducible analysis, and facilitating education. BioImg.org is freely available at: https://bioimg.org.

  13. Space Station Simulation Computer System (SCS) study for NASA/MSFC. Operations concept report

    NASA Technical Reports Server (NTRS)

    1990-01-01

    NASA's Space Station Freedom Program (SSFP) planning efforts have identified a need for a payload training simulator system to serve as both a training facility and as a demonstrator to validate operational concepts. The envisioned MSFC Payload Training Complex (PTC) required to meet this need will train the Space Station payload scientists, station scientists, and ground controllers to operate the wide variety of experiments that will be onboard the Space Station Freedom. The Simulation Computer System (SCS) is made up of computer hardware, software, and workstations that will support the Payload Training Complex at MSFC. The purpose of this SCS Study is to investigate issues related to the SCS, alternative requirements, simulator approaches, and state-of-the-art technologies to develop candidate concepts and designs.

  14. Alliance for Computational Science Collaboration HBCU Partnership at Fisk University. Final Report 2001

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Collins, W. E.

    2004-08-16

    Computational science plays a major role in research and development in mathematics, science, engineering, and biomedical disciplines. The Alliance for Computational Science Collaboration (ACSC) has the goal of training African-American and other minority scientists in the computational science field for eventual employment with the Department of Energy (DOE). The involvement of Historically Black Colleges and Universities (HBCU) in the Alliance provides avenues for producing future DOE African-American scientists. Fisk University has been participating in this program through grants from the DOE. The DOE grant supported computational science activities at Fisk University. The research areas included energy-related projects, distributed computing, visualization of scientific systems, and biomedical computing. Students' involvement in computational science research included undergraduate summer research at Oak Ridge National Lab, on-campus research involving the participation of undergraduates, participation of undergraduates and faculty members in workshops, and mentoring of students. These activities enhanced research and education in computational science, thereby adding to Fisk University's spectrum of research and educational capabilities. Among the successes of the computational science activities is the acceptance of three undergraduate students to graduate schools with full scholarships beginning fall 2002 (one in a master's degree program and two in doctoral degree programs).

  15. Staff | Computational Science | NREL

    Science.gov Websites

    develops and leads laboratory-wide efforts in high-performance computing and energy-efficient data centers. IT Professional IV, High-Performance Computing, Jim.Albin@nrel.gov, 303-275-4069. Ananthan, Shreyas, Senior Scientist, High-Performance Algorithms and Modeling, Shreyas.Ananthan@nrel.gov, 303-275-4807. Bendl, Kurt, IT Professional IV, High

  16. Parallel computer vision

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Uhr, L.

    1987-01-01

    This book is written by research scientists involved in the development of massively parallel, but hierarchically structured, algorithms, architectures, and programs for image processing, pattern recognition, and computer vision. The book gives an integrated picture of the programs and algorithms that are being developed, and also of the multi-computer hardware architectures for which these systems are designed.

  17. How to Teach Residue Number System to Computer Scientists and Engineers

    ERIC Educational Resources Information Center

    Navi, K.; Molahosseini, A. S.; Esmaeildoust, M.

    2011-01-01

    The residue number system (RNS) has been an important research field in computer arithmetic for many decades, mainly because of its carry-free nature, which can provide high-performance computing architectures with superior delay specifications. Recently, research on RNS has found new directions that have resulted in the introduction of efficient…
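
    The carry-free property mentioned above is easy to demonstrate: each residue channel is added independently, so no carry ever propagates between channels, and the Chinese Remainder Theorem recovers the conventional value. A minimal sketch with an illustrative (3, 5, 7) modulus set:

```python
from math import prod

MODULI = (3, 5, 7)  # pairwise coprime; dynamic range M = 3 * 5 * 7 = 105

def to_rns(x, moduli=MODULI):
    """Represent x by its residues in each channel."""
    return tuple(x % m for m in moduli)

def rns_add(a, b, moduli=MODULI):
    """Carry-free addition: every channel adds independently,
    which is what enables high-speed parallel arithmetic units."""
    return tuple((x + y) % m for x, y, m in zip(a, b, moduli))

def from_rns(residues, moduli=MODULI):
    """Recover the conventional value via the Chinese Remainder Theorem."""
    M = prod(moduli)
    total = 0
    for r, m in zip(residues, moduli):
        Mi = M // m
        total += r * Mi * pow(Mi, -1, m)  # pow(.., -1, m) is the modular inverse
    return total % M
```

    Since the channels never interact during addition, each can be realized as a small independent adder, which is the delay advantage the abstract refers to.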

  18. A Research and Development Strategy for High Performance Computing.

    ERIC Educational Resources Information Center

    Office of Science and Technology Policy, Washington, DC.

    This report is the result of a systematic review of the status and directions of high performance computing and its relationship to federal research and development. Conducted by the Federal Coordinating Council for Science, Engineering, and Technology (FCCSET), the review involved a series of workshops attended by numerous computer scientists and…

  19. Relevancy in Problem Solving: A Computational Framework

    ERIC Educational Resources Information Center

    Kwisthout, Johan

    2012-01-01

    When computer scientists discuss the computational complexity of, for example, finding the shortest path from building A to building B in some town or city, their starting point typically is a formal description of the problem at hand, e.g., a graph with weights on every edge where buildings correspond to vertices, routes between buildings to…

  20. Cultivating Critique: A (Humanoid) Response to the Online Teaching of Critical Thinking

    ERIC Educational Resources Information Center

    Waggoner, Matt

    2013-01-01

    The Turing era, defined by British mathematician and computer science pioneer Alan Turing's question about whether or not computers can think, is not over. Philosophers and scientists will continue to haggle over whether thought necessitates intentionality, and whether computation can rise to that level. Meanwhile, another frontier is emerging in…

  1. Knowledge Discovery from Climate Data using Graph-Based Methods

    NASA Astrophysics Data System (ADS)

    Steinhaeuser, K.

    2012-04-01

    Climate and Earth sciences have recently experienced a rapid transformation from a historically data-poor to a data-rich environment, thus bringing them into the realm of the Fourth Paradigm of scientific discovery - a term coined by the late Jim Gray (Hey et al. 2009), the other three paradigms being theory, experimentation, and computer simulation. In particular, climate-related observations from remote sensors on satellites and weather radars, in situ sensors and sensor networks, as well as outputs of climate or Earth system models from large-scale simulations, provide terabytes of spatio-temporal data. These massive and information-rich datasets offer a significant opportunity for advancing climate science and our understanding of the global climate system, yet current analysis techniques are not able to fully realize their potential benefits. We describe a class of computational approaches, specifically from the data mining and machine learning domains, which may be novel to the climate science domain and can assist in the analysis process. Computer scientists have developed spatial and spatio-temporal analysis techniques for a number of years now, and many of them may be applicable and/or adaptable to problems in climate science. We describe a large-scale, NSF-funded project aimed at addressing climate science questions using computational analysis methods; team members include computer scientists, statisticians, and climate scientists from various backgrounds. One of the major thrusts is the development of graph-based methods, and several illustrative examples of recent work in this area will be presented.
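
    As one concrete, deliberately simplified example of a graph-based method in this spirit (illustrative only, not taken from the project described), a "climate network" can be built by treating grid points as nodes and connecting pairs whose time series are strongly correlated:

```python
import numpy as np

def correlation_network(series, threshold=0.9):
    """Build a simple climate network: nodes are grid points, and an
    undirected edge (i, j) is added when the absolute Pearson correlation
    of the two points' time series meets the threshold.

    series: array of shape (n_points, n_timesteps).
    Returns a set of edges (i, j) with i < j.
    """
    corr = np.corrcoef(series)
    n = corr.shape[0]
    return {(i, j)
            for i in range(n)
            for j in range(i + 1, n)
            if abs(corr[i, j]) >= threshold}
```

    Graph statistics on the resulting network (degree, community structure, betweenness) can then reveal teleconnection-like patterns that are hard to see in the raw spatio-temporal fields.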

  2. Inherited Retinal Degenerative Clinical Trial Network. Addendum

    DTIC Science & Technology

    2013-10-01

    visual impairment usually ending in blindness. In the United States, the total number of individuals affected by retinitis pigmentosa (RP) and other...linical trial in the NEER network for autosomal dominant retinitis pigmentosa, and the ProgSTAR studies for Stargardt disease). As new interventions b... retinitis pigmentosa continues at six sites, the CTEC site at the University of Utah and five additional recruitment sites, the Retina Foundation of the

  3. Self-Controlled Synthesis of Hyperbranched Poly(etherketone)s from A2 + B3 Approach in Poly(phosphoric acid)

    DTIC Science & Technology

    2009-01-01

    aromatic keto-band arising from carboxylic acids, which could be part of the terminal groups of HPEKs, ranged from 1708 to 1719 cm⁻¹. The carbonyl bands from...1999, 143, 1–34; (d) Inoue, K. Prog Polym Sci 2000, 25, 453–571; (e) Voit, B. J Polym Sci Part A: Polym Chem 2000, 36, 2505–2525; (f) Hult, A

  4. The Use of Modafinil in Operational Settings: Individual Difference Implications

    DTIC Science & Technology

    2000-03-01

    them "skylark". On another treatment of narcolepsy and idiopathic hypersomnia. hand, some people are eveningness type, they have the The wakening...showed that modafinil acts as idiopathic hypersomnia and narcolepsy with modafinil. an agonist of α1-adrenergic post-synaptic receptors (4). Prog...of During sustained operations (SUSOPS) it is impossible narcolepsy and idiopathic hypersomnia. It is synthesized for the soldier to sleep, sometimes

  5. Serum lipid levels and steroidal hormones in women runners with irregular menses.

    PubMed

    Thompson, D L; Snead, D B; Seip, R L; Weltman, J Y; Rogol, A D; Weltman, A

    1997-02-01

    This study compared the lipid profile of women runners with menstrual cycle irregularities with that of their normally menstruating counterparts. Relationships among selected steroid hormones and serum lipid levels in 10 eumenorrheic (EU) and 8 oligo-/amenorrheic (O/A) women runners and 6 eumenorrheic controls (CON) were examined. Serum 17 beta-estradiol (E2), progesterone (Prog), and dehydroepiandrosterone-sulfate (DHEAS) concentrations were determined in daily blood samples for 21 days, and integrated concentrations were calculated. Fasting blood samples were analyzed for total cholesterol (TC), low-density lipoprotein cholesterol (LDL-C), high-density lipoprotein cholesterol (HDL-C), HDL2, HDL3, triglycerides (Trig), and apolipoproteins A-I, A-II, and B. The O/A group had significantly lower E2 and Prog than the EU or CON groups. Women in the CON group had lower HDL-C and HDL3 than the runners. With all women grouped together, E2 was not significantly correlated with any measured blood lipid parameters. On the other hand, DHEAS was significantly correlated with HDL-C, HDL2, and apolipoprotein A-I. These data demonstrate that women runners, regardless of menstrual cycle status, exhibit higher HDL-C concentrations than CON and support previous research reporting a positive association between DHEAS and HDL-C.
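    The "integrated concentrations" computed from the 21 daily samples amount to an area under the concentration-time curve; a hedged sketch with synthetic values (the units and the hormone profile below are illustrative, not the study's data):

```python
import numpy as np

# 21 daily serum measurements (synthetic: baseline plus a luteal-phase peak)
days = np.arange(21)                                        # day 0 .. day 20
prog = 2.0 + 10.0 * np.exp(-((days - 14.0) ** 2) / 8.0)     # ng/mL, illustrative

# Integrated concentration = area under the concentration-time curve,
# here via the trapezoidal rule over the daily sampling grid.
integrated = float(np.sum((prog[1:] + prog[:-1]) / 2.0 * np.diff(days)))
mean_daily = integrated / float(days[-1] - days[0])
```

    The trapezoidal rule is written out explicitly so the sketch does not depend on any one NumPy version's integration helper.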

  6. Production of xylitol by a Coniochaeta ligniaria strain tolerant of inhibitors and defective in growth on xylose.

    PubMed

    Nichols, Nancy N; Saha, Badal C

    2016-05-01

    In conversion of biomass to fuels or chemicals, inhibitory compounds arising from physical-chemical pretreatment of the feedstock can interfere with fermentation of the sugars to product. Fungal strain Coniochaeta ligniaria NRRL30616 metabolizes the furan aldehydes furfural and 5-hydroxymethylfurfural, as well as a number of aromatic and aliphatic acids and aldehydes. Use of NRRL30616 to condition biomass sugars by metabolizing the inhibitors improves their fermentability. Wild-type C. ligniaria has the ability to grow on xylose as sole source of carbon and energy, with no accumulation of xylitol. Mutants of C. ligniaria unable to grow on xylose were constructed. Xylose reductase and xylitol dehydrogenase activities were reduced by approximately two thirds in mutant C8100. The mutant retained ability to metabolize inhibitors in biomass hydrolysates. Although C. ligniaria C8100 did not grow on xylose, the strain converted a portion of xylose to xylitol, producing 0.59 g xylitol/g xylose in rich medium and 0.48 g xylitol/g xylose in corn stover dilute acid hydrolysate. © 2016 American Institute of Chemical Engineers. Biotechnol. Prog., 32:606-612, 2016.

  7. VizieR Online Data Catalog: The YSO population of LDN 1340 in infrared (Kun+, 2016)

    NASA Astrophysics Data System (ADS)

    Kun, M.; Wolf-Chase, G.; Moor, A.; Apai, D.; Balog, Z.; O'Linger-Luscusk, J.; Moriarty-Schieven, G. H.

    2016-07-01

    L1340 was observed by the Spitzer Space Telescope using Spitzer's Infrared Array Camera (IRAC) on 2009 March 16 and the Multiband Imaging Photometer for Spitzer (MIPS) on 2008 November 26 (Prog. ID: 50691, PI: G. Fazio). The IRAC observations covered ~1 deg² in all four bands. Moreover, a small part of the cloud, centered on RNO 7, was observed in the four IRAC bands on 2006 September 24 (Prog. ID: 30734, PI: D. Figer). We selected candidate YSOs from the Spitzer Enhanced Imaging Products (SEIP) Source List, containing 19745 point sources in the target field. High angular resolution near-infrared images of two small regions of L1340 were obtained on 2002 October 24 in the JHK bands, using the near-infrared camera Omega-Cass, mounted on the 3.5m telescope at the Calar Alto Observatory, Spain. The results for IRAS 02224+7227 have been shown in Kun et al. (2014, J/ApJ/795/L26). Here we present the results for RNO 7. To classify the evolutionary status of the color-selected candidate YSOs and obtain as complete a picture of the SFR and its YSO population as possible, we supplemented the Spitzer data with photometric data available in public databases. See section 2.3 for further details. (13 data files).

  8. Associations between Bisphenol A Exposure and Reproductive Hormones among Female Workers

    PubMed Central

    Miao, Maohua; Yuan, Wei; Yang, Fen; Liang, Hong; Zhou, Zhijun; Li, Runsheng; Gao, Ersheng; Li, De-Kun

    2015-01-01

    The associations between Bisphenol-A (BPA) exposure and reproductive hormone levels among women are unclear. A cross-sectional study was conducted among female workers from BPA-exposed and unexposed factories in China. Women’s blood samples were collected for assay of follicle-stimulating hormone (FSH), luteinizing hormone (LH), 17β-Estradiol (E2), prolactin (PRL), and progesterone (PROG). Their urine samples were collected for BPA measurement. In the exposed group, time-weighted average exposure to BPA for an 8-h shift (TWA8), a measure incorporating historic exposure level, was generated based on personal air sampling. Multiple linear regression analyses were used to examine linear associations between urine BPA concentration and reproductive hormones after controlling for potential confounders. A total of 106 exposed and 250 unexposed female workers were included in this study. Significant positive associations between increased urine BPA concentration and higher PRL and PROG levels were observed. Similar associations were observed after the analysis was carried out separately among the exposed and unexposed workers. In addition, a positive association between urine BPA and E2 was observed among exposed workers with borderline significance, while a statistically significant inverse association between urine BPA and FSH was observed among the unexposed group. The results suggest that BPA exposure may lead to alterations in female reproductive hormone levels. PMID:26506366
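    The TWA8 metric is a standard time-weighted average (sum of concentration times duration, divided by the 8-h shift), and the confounder-adjusted association is an ordinary least-squares fit; both can be sketched as follows. All numbers, variable names, and the covariate choice are synthetic and illustrative, not the study's:

```python
import numpy as np

def twa8(concentrations, hours):
    """Time-weighted average exposure over an 8-h shift:
    sum(c_i * t_i) / 8, from personal air-sampling intervals."""
    return float(np.dot(concentrations, hours) / 8.0)

# e.g. three sampling intervals covering a full 8-h shift (mg/m^3, hours)
assumed_twa = twa8([0.12, 0.30, 0.05], [3.0, 2.0, 3.0])

# Confounder-adjusted association: regress a hormone level on log urine BPA
# plus a covariate (age here), via ordinary least squares.
rng = np.random.default_rng(1)
n = 200
log_bpa = rng.normal(size=n)
age = rng.normal(30.0, 5.0, size=n)
prl = 10.0 + 1.5 * log_bpa + 0.1 * age + rng.normal(size=n)

X = np.column_stack([np.ones(n), log_bpa, age])   # intercept, exposure, covariate
beta, *_ = np.linalg.lstsq(X, prl, rcond=None)
bpa_coef = beta[1]                                 # adjusted BPA association
```

    The slope on `log_bpa` recovers the simulated exposure effect once the covariate is in the design matrix, which is the sense in which "controlling for potential confounders" is done in a linear model.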

  9. Associations between Bisphenol A Exposure and Reproductive Hormones among Female Workers.

    PubMed

    Miao, Maohua; Yuan, Wei; Yang, Fen; Liang, Hong; Zhou, Zhijun; Li, Runsheng; Gao, Ersheng; Li, De-Kun

    2015-10-22

    The associations between Bisphenol-A (BPA) exposure and reproductive hormone levels among women are unclear. A cross-sectional study was conducted among female workers from BPA-exposed and unexposed factories in China. Women's blood samples were collected for assay of follicle-stimulating hormone (FSH), luteinizing hormone (LH), 17β-Estradiol (E2), prolactin (PRL), and progesterone (PROG). Their urine samples were collected for BPA measurement. In the exposed group, time-weighted average exposure to BPA for an 8-h shift (TWA8), a measure incorporating historic exposure level, was generated based on personal air sampling. Multiple linear regression analyses were used to examine linear associations between urine BPA concentration and reproductive hormones after controlling for potential confounders. A total of 106 exposed and 250 unexposed female workers were included in this study. Significant positive associations between increased urine BPA concentration and higher PRL and PROG levels were observed. Similar associations were observed after the analysis was carried out separately among the exposed and unexposed workers. In addition, a positive association between urine BPA and E2 was observed among exposed workers with borderline significance, while a statistically significant inverse association between urine BPA and FSH was observed among the unexposed group. The results suggest that BPA exposure may lead to alterations in female reproductive hormone levels.

  10. The house of the future

    ScienceCinema

    None

    2017-12-09

    Learn what it will take to create tomorrow's net-zero energy home as scientists reveal the secrets of cool roofs, smart windows, and computer-driven energy control systems. The net-zero energy home: Scientists are working to make tomorrow's homes more than just energy efficient -- they want them to be zero energy. Iain Walker, a scientist in the Lab's Energy Performance of Buildings Group, will discuss what it takes to develop net-zero energy houses that generate as much energy as they use through highly aggressive energy efficiency and on-site renewable energy generation. Talking back to the grid: Imagine programming your house to use less energy if the electricity grid is full or prices are high. Mary Ann Piette, deputy director of Berkeley Lab's building technology department and director of the Lab's Demand Response Research Center, will discuss how new technologies are enabling buildings to listen to the grid and automatically change their thermostat settings or lighting loads, among other demands, in response to fluctuating electricity prices. The networked (and energy efficient) house: In the future, your home's lights, climate control devices, computers, windows, and appliances could be controlled via a sophisticated digital network. If it's plugged in, it'll be connected. Bruce Nordman, an energy scientist in Berkeley Lab's Energy End-Use Forecasting group, will discuss how he and other scientists are working to ensure these networks help homeowners save energy.

  11. The house of the future

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    Learn what it will take to create tomorrow's net-zero energy home as scientists reveal the secrets of cool roofs, smart windows, and computer-driven energy control systems. The net-zero energy home: Scientists are working to make tomorrow's homes more than just energy efficient -- they want them to be zero energy. Iain Walker, a scientist in the Lab's Energy Performance of Buildings Group, will discuss what it takes to develop net-zero energy houses that generate as much energy as they use through highly aggressive energy efficiency and on-site renewable energy generation. Talking back to the grid: Imagine programming your house to use less energy if the electricity grid is full or prices are high. Mary Ann Piette, deputy director of Berkeley Lab's building technology department and director of the Lab's Demand Response Research Center, will discuss how new technologies are enabling buildings to listen to the grid and automatically change their thermostat settings or lighting loads, among other demands, in response to fluctuating electricity prices. The networked (and energy efficient) house: In the future, your home's lights, climate control devices, computers, windows, and appliances could be controlled via a sophisticated digital network. If it's plugged in, it'll be connected. Bruce Nordman, an energy scientist in Berkeley Lab's Energy End-Use Forecasting group, will discuss how he and other scientists are working to ensure these networks help homeowners save energy.

  12. The making of the Women in Biology forum (WiB) at Bioclues.

    PubMed

    Singhania, Reeta Rani; Madduru, Dhatri; Pappu, Pranathi; Panchangam, Sameera; Suravajhala, Renuka; Chandrasekharan, Mohanalatha

    2014-01-01

    The Women in Biology forum (WiB) of Bioclues (India) began in 2009 to promote and support women pursuing careers in bioinformatics and computational biology. WiB was formed in order to help women scientists deprived of basic research, boost the prominence of women scientists particularly from developing countries, and bridge the gender gap to innovation. WiB has also served as a platform to highlight the work of established female scientists in these fields. Several award-winning women researchers have shared their experiences and provided valuable suggestions to WiB. Headed by Mohanalatha Chandrasekharan and supported by Dr. Reeta Rani Singhania and Renuka Suravajhala, WiB has seen major progress in the last couple of years, particularly in Mentoring and Research, two of the four avenues in Bioclues: Mentoring, Outreach, Research and Entrepreneurship (MORE). In line with the Bioclues vision for bioinformatics in India, the WiB Journal Club (JoC) recognizes women scientists working on functional genomics and bioinformatics, and provides scientific mentorship and support for project design and hypothesis formulation. As a part of Bioclues, WiB members practice the group's open-desk policy and its belief that all members are free to express their own thoughts and opinions. The WiB forum appreciates suggestions and welcomes scientists from around the world to be a part of their mission to encourage women to pursue computational biology and bioinformatics.

  13. Synergies and Distinctions between Computational Disciplines in Biomedical Research: Perspective from the Clinical and Translational Science Award Programs

    PubMed Central

    Bernstam, Elmer V.; Hersh, William R.; Johnson, Stephen B.; Chute, Christopher G.; Nguyen, Hien; Sim, Ida; Nahm, Meredith; Weiner, Mark; Miller, Perry; DiLaura, Robert P.; Overcash, Marc; Lehmann, Harold P.; Eichmann, David; Athey, Brian D.; Scheuermann, Richard H.; Anderson, Nick; Starren, Justin B.; Harris, Paul A.; Smith, Jack W.; Barbour, Ed; Silverstein, Jonathan C.; Krusch, David A.; Nagarajan, Rakesh; Becich, Michael J.

    2010-01-01

    Clinical and translational research increasingly requires computation. Projects may involve multiple computationally-oriented groups including information technology (IT) professionals, computer scientists and biomedical informaticians. However, many biomedical researchers are not aware of the distinctions among these complementary groups, leading to confusion, delays and sub-optimal results. Although written from the perspective of clinical and translational science award (CTSA) programs within academic medical centers, the paper addresses issues that extend beyond clinical and translational research. The authors describe the complementary but distinct roles of operational IT, research IT, computer science and biomedical informatics using a clinical data warehouse as a running example. In general, IT professionals focus on technology. The authors distinguish between two types of IT groups within academic medical centers: central or administrative IT (supporting the administrative computing needs of large organizations) and research IT (supporting the computing needs of researchers). Computer scientists focus on general issues of computation such as designing faster computers or more efficient algorithms, rather than specific applications. In contrast, informaticians are concerned with data, information and knowledge. Biomedical informaticians draw on a variety of tools, including but not limited to computers, to solve information problems in health care and biomedicine. The paper concludes with recommendations regarding administrative structures that can help to maximize the benefit of computation to biomedical research within academic health centers. PMID:19550198

  14. CGAT: a model for immersive personalized training in computational genomics

    PubMed Central

    Sims, David; Ponting, Chris P.

    2016-01-01

    How should the next generation of genomics scientists be trained while simultaneously pursuing high quality and diverse research? CGAT, the Computational Genomics Analysis and Training programme, was set up in 2010 by the UK Medical Research Council to complement its investment in next-generation sequencing capacity. CGAT was conceived around the twin goals of training future leaders in genome biology and medicine, and providing much needed capacity to UK science for analysing genome scale data sets. Here we outline the training programme employed by CGAT and describe how it dovetails with collaborative research projects to launch scientists on the road towards independent research careers in genomics. PMID:25981124

  15. Research Projects, Technical Reports and Publications

    NASA Technical Reports Server (NTRS)

    Oliger, Joseph

    1996-01-01

    The Research Institute for Advanced Computer Science (RIACS) was established by the Universities Space Research Association (USRA) at the NASA Ames Research Center (ARC) on June 6, 1983. RIACS is privately operated by USRA, a consortium of universities with research programs in the aerospace sciences, under contract with NASA. The primary mission of RIACS is to provide research and expertise in computer science and scientific computing to support the scientific missions of NASA ARC. The research carried out at RIACS must change its emphasis from year to year in response to NASA ARC's changing needs and technological opportunities. A flexible scientific staff is provided through a university faculty visitor program, a post doctoral program, and a student visitor program. Not only does this provide appropriate expertise but it also introduces scientists outside of NASA to NASA problems. A small group of core RIACS staff provides continuity and interacts with an ARC technical monitor and scientific advisory group to determine the RIACS mission. RIACS activities are reviewed and monitored by a USRA advisory council and ARC technical monitor. Research at RIACS is currently being done in the following areas: Advanced Methods for Scientific Computing, and High Performance Networks. During this report period Professor Antony Jameson of Princeton University, Professor Wei-Pai Tang of the University of Waterloo, Professor Marsha Berger of New York University, Professor Tony Chan of UCLA, Associate Professor David Zingg of the University of Toronto, Canada, and Assistant Professor Andrew Sohn of the New Jersey Institute of Technology have been visiting RIACS. From January 1, 1996 through September 30, 1996, RIACS had three staff scientists, four visiting scientists, one post-doctoral scientist, three consultants, two research associates and one research assistant. RIACS held a joint workshop with Code 1 on 29-30 July 1996. The workshop was held to discuss needs and opportunities in basic research in computer science in and for NASA applications. There were 14 talks given by NASA, industry and university scientists and three open discussion sessions. There were approximately fifty participants. A proceedings is being prepared. It is planned to have similar workshops on an annual basis. RIACS technical reports are usually preprints of manuscripts that have been submitted to research journals or conference proceedings. A list of these reports for the period January 1, 1996 through September 30, 1996 is in the Reports and Abstracts section of this report.

  16. Determination of Perceptions of the Teacher Candidates Studying in the Computer and Instructional Technology Department towards Human-Computer Interaction and Related Basic Concepts

    ERIC Educational Resources Information Center

    Kiyici, Mubin

    2011-01-01

    HCI is a field whose popularity is increasing with the spread of computers and the internet, and which gradually contributes to the production of user-friendlier software and hardware through the contributions of scientists from different disciplines. Teacher candidates studying at the computer and instructional technologies department…

  17. Students as Virtual Scientists: An Exploration of Students' and Teachers' Perceived Realness of a Remote Electron Microscopy Investigation

    ERIC Educational Resources Information Center

    Childers, Gina; Jones, M. Gail

    2015-01-01

    Remote access technologies enable students to investigate science by utilizing scientific tools and communicating in real-time with scientists and researchers with only a computer and an Internet connection. Very little is known about student perceptions of how real remote investigations are and how immersed the students are in the experience.…

  18. Resident research associateships, postdoctoral research awards 1989: opportunities for research at the U.S. Geological Survey, U.S. Department of the Interior

    USGS Publications Warehouse

    ,; ,

    1989-01-01

    The scientists of the U.S. Geological Survey are engaged in a wide range of geologic, geophysical, geochemical, hydrologic, and cartographic programs, including the application of computer science to them. These programs offer exciting possibilities for scientific achievement and professional growth to young scientists through participation as Research Associates.

  19. Biography Today: Profiles of People of Interest to Young Readers. Scientists & Inventors Series, Volume 5.

    ERIC Educational Resources Information Center

    Abbey, Cherie D., Ed.

    This book, a special volume focusing on computer-related scientists and inventors, provides 12 biographical profiles of interest to readers ages 9 and above. The Biography Today series was created to appeal to young readers in a format they can enjoy reading and readily understand. Each entry provides at least one picture of the individual…

  20. A survey of big data research

    PubMed Central

    Fang, Hua; Zhang, Zhaoyang; Wang, Chanpaul Jin; Daneshmand, Mahmoud; Wang, Chonggang; Wang, Honggang

    2015-01-01

    Big data create values for business and research, but pose significant challenges in terms of networking, storage, management, analytics and ethics. Multidisciplinary collaborations from engineers, computer scientists, statisticians and social scientists are needed to tackle, discover and understand big data. This survey presents an overview of big data initiatives, technologies and research in industries and academia, and discusses challenges and potential solutions. PMID:26504265

  1. Integrating citizen-science data with movement models to estimate the size of a migratory golden eagle population

    Treesearch

    Andrew J. Dennhardt; Adam E. Duerr; David Brandes; Todd E. Katzner

    2015-01-01

    Estimating population size is fundamental to conservation and management. Population size is typically estimated using survey data, computer models, or both. Some of the most extensive and often least expensive survey data are those collected by citizen-scientists. A challenge to citizen-scientists is that the vagility of many organisms can complicate data collection....

  2. Welcome to the NASA High Performance Computing and Communications Computational Aerosciences (CAS) Workshop 2000

    NASA Technical Reports Server (NTRS)

    Schulbach, Catherine H. (Editor)

    2000-01-01

    The purpose of the CAS workshop is to bring together NASA's scientists and engineers and their counterparts in industry, other government agencies, and academia working in the Computational Aerosciences and related fields. This workshop is part of the technology transfer plan of the NASA High Performance Computing and Communications (HPCC) Program. Specific objectives of the CAS workshop are to: (1) communicate the goals and objectives of HPCC and CAS, (2) promote and disseminate CAS technology within the appropriate technical communities, including NASA, industry, academia, and other government labs, (3) help promote synergy among CAS and other HPCC scientists, and (4) permit feedback from peer researchers on issues facing High Performance Computing in general and the CAS project in particular. This year we had a number of exciting presentations in the traditional aeronautics, aerospace sciences, and high-end computing areas and in the less familiar (to many of us affiliated with CAS) earth science, space science, and revolutionary computing areas. Presentations of more than 40 high quality papers were organized into ten sessions and presented over the three-day workshop. The proceedings are organized here for easy access: by author, title and topic.

  3. Scalable data management, analysis and visualization (SDAV) Institute. Final Scientific/Technical Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Geveci, Berk

    The purpose of the SDAV institute is to provide tools and expertise in scientific data management, analysis, and visualization to DOE’s application scientists. Our goal is to actively work with application teams to assist them in achieving breakthrough science, and to provide technical solutions in the data management, analysis, and visualization regimes that are broadly used by the computational science community. Over the last 5 years members of our institute worked directly with application scientists and DOE leadership-class facilities to assist them by applying the best tools and technologies at our disposal. We also enhanced our tools based on input from scientists on their needs. Many of the applications we have been working with are based on connections with scientists established in previous years. However, we contacted additional scientists through our outreach activities, as well as engaging application teams running on leading DOE computing systems. Our approach is to employ an evolutionary development and deployment process: first considering the application of existing tools, followed by the customization necessary for each particular application, and then the deployment in real frameworks and infrastructures. The institute is organized into three areas, each with area leaders, who keep track of progress, engagement of application scientists, and results. The areas are: (1) Data Management, (2) Data Analysis, and (3) Visualization. Kitware has been involved in the Visualization area. This report covers Kitware’s contributions over the last 5 years (February 2012 – February 2017). For details on the work performed by the SDAV institute as a whole, please see the SDAV final report.

  4. Moon Search Algorithms for NASA's Dawn Mission to Asteroid Vesta

    NASA Technical Reports Server (NTRS)

    Memarsadeghi, Nargess; Mcfadden, Lucy A.; Skillman, David R.; McLean, Brian; Mutchler, Max; Carsenty, Uri; Palmer, Eric E.

    2012-01-01

    A moon or natural satellite is a celestial body that orbits a planetary body such as a planet, dwarf planet, or an asteroid. Scientists seek to understand the origin and evolution of our solar system by studying the moons of these bodies. Additionally, searches for satellites of planetary bodies can be important to protect the safety of a spacecraft as it approaches or orbits a planetary body. If a satellite of a celestial body is found, the mass of that body can also be calculated once its orbit is determined. Ensuring the Dawn spacecraft's safety on its mission to the asteroid Vesta primarily motivated the work of Dawn's Satellite Working Group (SWG) in the summer of 2011. Dawn mission scientists and engineers utilized various computational tools and techniques for Vesta's satellite search. The objectives of this paper are to 1) introduce the natural satellite search problem, 2) present the computational challenges, approaches, and tools used when addressing this problem, and 3) describe applications of various image processing and computational algorithms for performing satellite searches to the electronic imaging and computer science community. Furthermore, we hope that this communication will enable Dawn mission scientists to improve their satellite search algorithms and tools and be better prepared for performing the same investigation in 2015, when the spacecraft is scheduled to approach and orbit the dwarf planet Ceres.
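    The abstract does not reproduce the SWG's specific algorithms; a common baseline for satellite searches, though, is differencing registered frames so the static starfield cancels and only a source that moved between exposures survives. A synthetic sketch (the frame size, sigma threshold, and source positions are illustrative assumptions):

```python
import numpy as np

def detect_mover(frame_a, frame_b, k=5.0):
    """Difference two registered frames; pixels far above the noise in
    the difference image flag a source that moved between exposures."""
    diff = frame_b.astype(float) - frame_a.astype(float)
    thresh = k * diff.std()                  # k-sigma detection threshold
    ys, xs = np.nonzero(diff > thresh)
    return list(zip(ys.tolist(), xs.tolist()))

rng = np.random.default_rng(2)
sky = rng.normal(100.0, 1.0, size=(64, 64))  # static background and starfield

frame_a = sky.copy()
frame_a[10, 10] += 50.0                      # candidate satellite at (10, 10)
frame_b = sky.copy()
frame_b[10, 14] += 50.0                      # ...moved to (10, 14) in frame B

hits = detect_mover(frame_a, frame_b)
```

    Because the static sky cancels exactly in the difference, only the candidate's new position exceeds the threshold; with real data, per-frame noise and registration error make the threshold choice the hard part.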

  5. A short course on measure and probability theories

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pebay, Philippe Pierre

    2004-02-01

    This brief Introduction to Measure Theory, and its applications to Probabilities, corresponds to the lecture notes of a seminar series given at Sandia National Laboratories in Livermore during the spring of 2003. The goal of these seminars was to provide a minimal background to Computational Combustion scientists interested in using more advanced stochastic concepts and methods, e.g., in the context of uncertainty quantification. Indeed, most mechanical engineering curricula do not provide students with formal training in the field of probability, and even less in measure theory. However, stochastic methods have been used more and more extensively in the past decade, and have provided more successful computational tools. Scientists at the Combustion Research Facility of Sandia National Laboratories have been using computational stochastic methods for years. Addressing more and more complex applications, and facing difficult problems that arose in applications, showed the need for a better understanding of theoretical foundations. This is why the seminar series was launched, and these notes summarize most of the concepts which have been discussed. The goal of the seminars was to bring a group of mechanical engineers and computational combustion scientists to a full understanding of N. Wiener's polynomial chaos theory. Therefore, these lecture notes are built along those lines, and are not intended to be exhaustive. In particular, the author welcomes any comments or criticisms.
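    The seminars' target, Wiener's polynomial chaos, expands a function of a standard Gaussian variable in probabilists' Hermite polynomials He_n, with coefficients a_n = E[f(xi) He_n(xi)] / n!. A minimal one-dimensional sketch via Gauss-Hermite quadrature (the helper name and the example function are illustrative; the quadrature routines are NumPy's):

```python
import math
import numpy as np
from numpy.polynomial.hermite_e import hermegauss, hermeval

def chaos_coeffs(f, order, quad_deg=40):
    """Coefficients a_n in f(xi) ~ sum_n a_n He_n(xi) for xi ~ N(0, 1),
    where He_n are probabilists' Hermite polynomials and E[He_n^2] = n!."""
    x, w = hermegauss(quad_deg)       # nodes/weights for weight exp(-x^2/2)
    w = w / np.sqrt(2.0 * np.pi)      # renormalize to the Gaussian measure
    coeffs = []
    for n in range(order + 1):
        basis = np.zeros(n + 1)
        basis[n] = 1.0                # coefficient vector selecting He_n
        he_n = hermeval(x, basis)
        coeffs.append(float(np.sum(w * f(x) * he_n) / math.factorial(n)))
    return coeffs

# Example: xi^2 = He_0(xi) + He_2(xi), since He_2(x) = x^2 - 1
a = chaos_coeffs(lambda t: t ** 2, order=3)
```

    For a polynomial input the quadrature is exact, so the computed coefficients match the hand expansion; for general f the quadrature degree controls the projection error.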

  6. Experiences with Efficient Methodologies for Teaching Computer Programming to Geoscientists

    ERIC Educational Resources Information Center

    Jacobs, Christian T.; Gorman, Gerard J.; Rees, Huw E.; Craig, Lorraine E.

    2016-01-01

    Computer programming was once thought of as a skill required only by professional software developers. But today, given the ubiquitous nature of computation and data science it is quickly becoming necessary for all scientists and engineers to have at least a basic knowledge of how to program. Teaching how to program, particularly to those students…

  7. Creating a Pipeline for African American Computing Science Faculty: An Innovative Faculty/Research Mentoring Program Model

    ERIC Educational Resources Information Center

    Charleston, LaVar J.; Gilbert, Juan E.; Escobar, Barbara; Jackson, Jerlando F. L.

    2014-01-01

    African Americans represent 1.3% of all computing sciences faculty in PhD-granting departments, underscoring the severe underrepresentation of Black/African American tenure-track faculty in computing (CRA, 2012). The Future Faculty/Research Scientist Mentoring (FFRM) program, funded by the National Science Foundation, was found to be an effective…

  8. ICASE

    NASA Technical Reports Server (NTRS)

    1994-01-01

    This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in the areas of (1) applied and numerical mathematics, including numerical analysis and algorithm development; (2) theoretical and computational research in fluid mechanics in selected areas of interest, including acoustics and combustion; (3) experimental research in transition and turbulence and aerodynamics involving Langley facilities and scientists; and (4) computer science.

  9. Is there a glass ceiling for highly cited scientists at the top of research universities?

    PubMed

    Ioannidis, John P A

    2010-12-01

    University leaders aim to protect, shape, and promote the missions of their institutions. I evaluated whether top highly cited scientists are likely to occupy these positions. Of the current leaders of 96 U.S. high research activity universities, only 6 presidents or chancellors were found among the 4009 U.S. scientists listed in the ISIHighlyCited.com database. Of the current leaders of 77 UK universities, only 2 vice-chancellors were found among the 483 UK scientists listed in the same database. In a sample of 100 top-cited clinical medicine scientists and 100 top-cited biology and biochemistry scientists, only 1 and 1, respectively, had served at any time as president of a university. Among the leaders of 25 U.S. universities with the highest citation volumes, only 12 had doctoral degrees in life, natural, physical or computer sciences, and 5 of these 12 had a Hirsch citation index m < 1.0. The participation of highly cited scientists in the top leadership of universities is limited. This could have consequences for the research and overall mission of universities.

  10. New computer system simplifies programming of mathematical equations

    NASA Technical Reports Server (NTRS)

    Reinfelds, J.; Seitz, R. N.; Wood, L. H.

    1966-01-01

    Automatic Mathematical Translator /AMSTRAN/ permits scientists or engineers to enter mathematical equations in their natural mathematical format and to obtain an immediate graphical display of the solution. This automatic-programming, on-line, multiterminal computer system allows experienced programmers to solve nonroutine problems.

  11. NASA/DOD Aerospace Knowledge Diffusion Research Project. Paper 14: An analysis of the technical communications practices reported by Israeli and US aerospace engineers and scientists

    NASA Technical Reports Server (NTRS)

    Barclay, Rebecca O.; Pinelli, Thomas E.; Elazar, David; Kennedy, John M.

    1991-01-01

    As part of Phase 4 of the NASA/DoD Aerospace Knowledge Diffusion Research Project, two pilot studies were conducted that investigated the technical communications practices of Israeli and U.S. aerospace engineers and scientists. Both studies had the same five objectives: first, to solicit the opinions of aerospace engineers and scientists regarding the importance of technical communications to their profession; second, to determine the use and production of technical communications by aerospace engineers and scientists; third, to seek their view about the appropriate content of an undergraduate course in technical communications; fourth, to determine aerospace engineers' and scientists' use of libraries, technical information centers, and on-line databases; and fifth, to determine the use and importance of computer and information technology to them. A self-administered questionnaire was mailed to randomly selected U.S. aerospace engineers and scientists who are working in cryogenics, adaptive walls, and magnetic suspension. A slightly modified version was sent to Israeli aerospace engineers and scientists working at Israel Aircraft Industries, LTD. Responses of the Israeli and U.S. aerospace engineers and scientists to selected questions are presented in this paper.

  12. Science & Technology Review, June 2003

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McMahon, D

    This month's issue has the following articles: (1) Livermore's Three-Pronged Strategy for High-Performance Computing, Commentary by Dona Crawford; (2) Riding the Waves of Supercomputing Technology--Livermore's Computation Directorate is exploiting multiple technologies to ensure high-performance, cost-effective computing; (3) Chromosome 19 and Lawrence Livermore Form a Long-Lasting Bond--Lawrence Livermore biomedical scientists have played an important role in the Human Genome Project through their long-term research on chromosome 19; (4) A New Way to Measure the Mass of Stars--For the first time, scientists have determined the mass of a star in isolation from other celestial bodies; and (5) Flexibly Fueled Storage Tank Brings Hydrogen-Powered Cars Closer to Reality--Livermore's cryogenic hydrogen fuel storage tank for passenger cars of the future can accommodate three forms of hydrogen fuel separately or in combination.

  13. Ab initio Quantum Chemical and Experimental Reaction Kinetics Studies in the Combustion of Bipropellants

    DTIC Science & Technology

    2017-03-24

    Briefing charts covering 01 March 2017 - 31 March 2017, presented 24 March 2017 by the Air Force Research Laboratory (AFRL/RQRS, 1 Ara Road, Edwards AFB, CA 93524; email: ghanshyam.vaghjiani@us.af.mil). The charts motivate quantum chemical reaction kinetics studies of bipropellant combustion, citing Zador et al., Prog. Energ. Combust. Sci. 37, 371 (2011). DISTRIBUTION A: Approved for public release (Clearance 17161).

  14. Acoustic Directivity Patterns for Army Weapons

    DTIC Science & Technology

    1979-01-01

    This work was performed by the Environmental Division (EN), U.S. Army Construction Engineering Research Laboratory (CERL); Dr. R. K. Jain is Chief of EN, and the authors are P. D. Schomer and L. M. Little. The work was conducted under the program "Environmental Quality for Construction and Operation of Military Facilities," Task 03, "Pollution Control Technology," Work Unit 001, "Prediction of

  15. Optical Limiting in Photonic Crystal Fibers

    DTIC Science & Technology

    2004-12-01

    Optical Limiting in Photonic Crystal Fibers. Mark Bloemer, Michael Scalora, Wayne Davenport, and Evgeni Poliakov (NRC Postdoc), RDECOM, Aviation... Shcherbakov, E. Wintner, M. Scalora, and A. M. Zheltikov, Appl. Opt., in press. 21. C. M. de Sterke and J. E. Sipe, Prog. Opt. 33, 203 (1994)... M. Scalora, J. P. Dowling, C. M. Bowden, and M. J. Bloemer, Phys. Rev. Lett. 73, 1368 (1994). 26. M. D. Tocci, M. J. Bloemer, M. Scalora, J. P. Dowling

  16. Principles of Work Sample Testing. 4. Generalizability

    DTIC Science & Technology

    1979-04-01

    ARI Technical Report TR-79-A11, "Principles of Work Sample Testing: IV. Generalizability," by Robert M. Guion and Gail H. Ironson, Bowling Green State University, Bowling Green, Ohio 43403, April 1979. Prepared for the U.S. Army Research Institute under Contract DAHC 19-77-C-0007.

  17. Department of Defense In-House RDT and E Activities

    DTIC Science & Technology

    1978-10-30

    Describes chemical (CML) and chemical-biological (CML-BIO) defense and smoke/obscurant test programs. The activity conducts the R&D and laboratory investigations necessary to support its mission; conducts joint operational CML and CML-BIO defense tests and studies for the CINCs and Services; and conducts programs supporting Army pollution abatement, hazard evaluation, and demilitarization operations, as well as ecological and epidemiological... Program data are reported by fiscal year (millions of dollars) for 1978 (actual) and 1979 (actual + estimate).

  18. Integrated Strike Avionics Study. Volume 1

    DTIC Science & Technology

    1980-10-01

    Excerpted program elements include: MMW systems targeting studies and performance measures; a CO2 laser radar for the Army obstacle detection program, with a concept demonstration of a mobile system; fabrication and test of the FLIR Field of View and Classification Study (FLIR FACS), spanning definition, development, and test; and the applicability of current programs (FY80-85), including LANTIRN, an imaging sensor autoprocessor, and a forward-looking active classifier.

  19. Transitioning DARPA Technology

    DTIC Science & Technology

    2001-05-01

    logo suggests, the Institute's work reflects the summation of technology's effects on business and government. With a reputation for fierce objectivity... effective for "customer-pull" strategies. b. Products moved along the DIS path 30 percent of the time. This path was particularly successful for small... must often be "waited out." But DARPA has few effective mechanisms for continuing to "market" its products after the program is over - particularly

  20. Triangle Computer Science Distinguished Lecture Series

    DTIC Science & Technology

    2018-01-30

    scientific inquiry - the cell, the brain, the market - as well as in the models developed by scientists over the centuries for studying them. Human... in principle, secure system operation can be achieved. Massive-Scale Streaming Analytics: David Bader, Georgia Institute of Technology (telecast from

  1. Amplify scientific discovery with artificial intelligence

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gil, Yolanda; Greaves, Mark T.; Hendler, James

    Computing innovations have fundamentally changed many aspects of scientific inquiry. For example, advances in robotics, high-end computing, networking, and databases now underlie much of what we do in science such as gene sequencing, general number crunching, sharing information between scientists, and analyzing large amounts of data. As computing has evolved at a rapid pace, so too has its impact in science, with the most recent computing innovations repeatedly being brought to bear to facilitate new forms of inquiry. Recently, advances in Artificial Intelligence (AI) have deeply penetrated many consumer sectors, including for example Apple's Siri™ speech recognition system, real-time automated language translation services, and a new generation of self-driving cars and self-navigating drones. However, AI has yet to achieve comparable levels of penetration in scientific inquiry, despite its tremendous potential in aiding computers to help scientists tackle tasks that require scientific reasoning. We contend that advances in AI will transform the practice of science as we are increasingly able to effectively and jointly harness human and machine intelligence in the pursuit of major scientific challenges.

  2. The Computer Simulation of Liquids by Molecular Dynamics.

    ERIC Educational Resources Information Center

    Smith, W.

    1987-01-01

    Proposes a mathematical computer model for the behavior of liquids using the classical dynamic principles of Sir Isaac Newton and the molecular dynamics method invented by other scientists. Concludes that other applications will be successful using supercomputers to go beyond simple Newtonian physics. (CW)
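    The method the abstract describes can be shown in a few lines; the following toy integrator (a hypothetical illustration, not the article's code) propagates Newton's equations for a pair of Lennard-Jones particles with the velocity Verlet scheme standard in molecular dynamics:

    ```python
    def lj_force(r, eps=1.0, sigma=1.0):
        """Lennard-Jones force between two particles at separation r > 0."""
        s6 = (sigma / r) ** 6
        return 24 * eps * (2 * s6 * s6 - s6) / r  # -dV/dr

    def velocity_verlet(r, v, dt=0.005, steps=100, mu=0.5):
        """Integrate the relative coordinate of a two-particle system
        (reduced mass mu = 0.5 for two unit-mass particles)."""
        f = lj_force(r)
        for _ in range(steps):
            v += 0.5 * dt * f / mu   # half kick
            r += dt * v              # drift
            f = lj_force(r)
            v += 0.5 * dt * f / mu   # second half kick
        return r, v

    # Released from rest just outside the potential minimum at r = 2**(1/6),
    # the pair oscillates about it; a liquid simulation repeats this force
    # evaluation over thousands of particles and pairs.
    r, v = velocity_verlet(1.2, 0.0)
    ```

    A real liquid simulation adds periodic boundary conditions, neighbor lists, and thermostats, which is where the supercomputers mentioned above enter the picture.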

  3. Carbon Smackdown: Visualizing Clean Energy (LBNL Summer Lecture Series)

    ScienceCinema

    Meza, Juan [LBNL Computational Research Division

    2017-12-09

    The final Carbon Smackdown match took place Aug. 9, 2010. Juan Meza of the Computational Research Division revealed how scientists use computer visualizations to accelerate climate research and discuss the development of next-generation clean energy technologies such as wind turbines and solar cells.

  4. Interfacing the Experimenter to the Computer: Languages for Psychologists

    ERIC Educational Resources Information Center

    Wood, Ronald W.; And Others

    1975-01-01

    An examination and comparison of the computer languages which behavioral scientists are most likely to use: SCAT, INTERACT, SKED, OS/8 Fortran IV, RT11/Fortran, RSX-11M, Data General's Real-Time; Disk Operating System and its Fortran, and interpretative Languages. (EH)

  5. Programming Digital Stories and How-to Animations

    ERIC Educational Resources Information Center

    Hansen, Alexandria Killian; Iveland, Ashley; Harlow, Danielle Boyd; Dwyer, Hilary; Franklin, Diana

    2015-01-01

    As science teachers continue preparing for implementation of the "Next Generation Science Standards," one recommendation is to use computer programming as a promising context to efficiently integrate science and engineering. In this article, an interdisciplinary team of educational researchers and computer scientists describes how to use…

  6. EASI: An electronic assistant for scientific investigation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schur, A.; Feller, D.; DeVaney, M.

    1991-09-01

    Although many automated tools support the productivity of professionals (engineers, managers, architects, secretaries, etc.), none specifically address the needs of the scientific researcher. The scientist's needs are complex and the primary activities are cognitive rather than physical. The individual scientist collects and manipulates large data sets, integrates, synthesizes, generates, and records information. The means to access and manipulate information are a critical determinant of the performance of the system as a whole. One hindrance in this process is the scientist's computer environment, which has changed little in the last two decades. Extensive time and effort is demanded from the scientist to learn to use the computer system. This paper describes how chemists' activities and interactions with information were abstracted into a common paradigm that meets the critical requirement of facilitating information access and retrieval. This paradigm was embodied in EASI, a working prototype that increased the productivity of the individual scientific researcher. 4 refs., 2 figs., 1 tab.

  7. Educational NASA Computational and Scientific Studies (enCOMPASS)

    NASA Technical Reports Server (NTRS)

    Memarsadeghi, Nargess

    2013-01-01

    Educational NASA Computational and Scientific Studies (enCOMPASS) is an educational project of NASA Goddard Space Flight Center aimed at bridging the gap between computational objectives and needs of NASA's scientific research, missions, and projects, and academia's latest advances in applied mathematics and computer science. enCOMPASS achieves this goal via bidirectional collaboration and communication between NASA and academia. Using developed NASA Computational Case Studies in university computer science/engineering and applied mathematics classes is a way of addressing NASA's goals of contributing to the Science, Technology, Education, and Math (STEM) National Objective. The enCOMPASS Web site at http://encompass.gsfc.nasa.gov provides additional information. There are currently nine enCOMPASS case studies developed in areas of earth sciences, planetary sciences, and astrophysics. Some of these case studies have been published in AIP and IEEE's Computing in Science and Engineering magazines. A few university professors have used enCOMPASS case studies in their computational classes and contributed their findings to NASA scientists. In these case studies, after introducing the science area, the specific problem, and related NASA missions, students are first asked to solve a known problem using NASA data and past approaches used and often published in a scientific/research paper. Then, after learning about the NASA application and related computational tools and approaches for solving the proposed problem, students are given a harder problem as a challenge for them to research and develop solutions for. This project provides a model for NASA scientists and engineers on one side, and university students, faculty, and researchers in computer science and applied mathematics on the other side, to learn from each other's areas of work, computational needs and solutions, and the latest advances in research and development. 
This innovation takes NASA science and engineering applications to computer science and applied mathematics university classes, and makes NASA objectives part of the university curricula. There is great potential for growth and return on investment of this program to the point where every major university in the U.S. would use at least one of these case studies in one of their computational courses, and where every NASA scientist and engineer facing a computational challenge (without having resources or expertise to solve it) would use enCOMPASS to formulate the problem as a case study, provide it to a university, and get back their solutions and ideas.

  8. A History of the Liberal Arts Computer Science Consortium and Its Model Curricula

    ERIC Educational Resources Information Center

    Bruce, Kim B.; Cupper, Robert D.; Scot Drysdale, Robert L.

    2010-01-01

    With the support of a grant from the Sloan Foundation, nine computer scientists from liberal arts colleges came together in October, 1984 to form the Liberal Arts Computer Science Consortium (LACS) and to create a model curriculum appropriate for liberal arts colleges. Over the years the membership has grown and changed, but the focus has remained…

  9. Computers in Education: Realizing the Potential. Report of a Research Conference, Pittsburgh, Pennsylvania, November 20-24, 1982.

    ERIC Educational Resources Information Center

    Lesgold, Alan M., Ed.; Reif, Frederick, Ed.

    The full proceedings are provided here of a conference of 40 teachers, educational researchers, and scientists from both the public and private sectors that centered on the future of computers in education and the research required to realize the computer's educational potential. A summary of the research issues considered and suggested means for…

  10. Simulation Modeling of Lakes in Undergraduate and Graduate Classrooms Increases Comprehension of Climate Change Concepts and Experience with Computational Tools

    ERIC Educational Resources Information Center

    Carey, Cayelan C.; Gougis, Rebekka Darner

    2017-01-01

    Ecosystem modeling is a critically important tool for environmental scientists, yet is rarely taught in undergraduate and graduate classrooms. To address this gap, we developed a teaching module that exposes students to a suite of modeling skills and tools (including computer programming, numerical simulation modeling, and distributed computing)…

  11. Sustaining and Extending the Open Science Grid: Science Innovation on a PetaScale Nationwide Facility (DE-FC02-06ER41436) SciDAC-2 Closeout Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Livny, Miron; Shank, James; Ernst, Michael

    Under this SciDAC-2 grant the project's goal was to stimulate new discoveries by providing scientists with effective and dependable access to an unprecedented national distributed computational facility: the Open Science Grid (OSG). We proposed to achieve this through the work of the Open Science Grid Consortium: a unique hands-on multi-disciplinary collaboration of scientists, software developers and providers of computing resources. Together the stakeholders in this consortium sustain and use a shared distributed computing environment that transforms simulation and experimental science in the US. The OSG consortium is an open collaboration that actively engages new research communities. We operate an open facility that brings together a broad spectrum of compute, storage, and networking resources and interfaces to other cyberinfrastructures, including the US XSEDE (previously TeraGrid) and the Enabling Grids for E-sciencE (EGEE), as well as campus and regional grids. We leverage middleware provided by computer science groups, facility IT support organizations, and computing programs of application communities for the benefit of consortium members and the US national CI.

  12. Visually impaired researchers get their hands on quantum chemistry: application to a computational study on the isomerization of a sterol.

    PubMed

    Lounnas, Valère; Wedler, Henry B; Newman, Timothy; Schaftenaar, Gijs; Harrison, Jason G; Nepomuceno, Gabriella; Pemberton, Ryan; Tantillo, Dean J; Vriend, Gert

    2014-11-01

    In molecular sciences, articles tend to revolve around 2D representations of 3D molecules, and sighted scientists often resort to 3D virtual reality software to study these molecules in detail. Blind and visually impaired (BVI) molecular scientists have access to a series of audio devices that can help them read the text in articles and work with computers. Reading articles published in this journal, though, is nearly impossible for them because they need to generate mental 3D images of molecules, but the article-reading software cannot do that for them. We have previously designed AsteriX, a web server that fully automatically decomposes articles, detects 2D plots of low molecular weight molecules, removes meta data and annotations from these plots, and converts them into 3D atomic coordinates. AsteriX-BVI goes one step further and converts the 3D representation into a 3D printable, haptic-enhanced format that includes Braille annotations. These Braille-annotated physical 3D models allow BVI scientists to generate a complete mental model of the molecule. AsteriX-BVI uses Molden to convert the meta data of quantum chemistry experiments into BVI friendly formats so that the entire line of scientific information that sighted people take for granted-from published articles, via printed results of computational chemistry experiments, to 3D models-is now available to BVI scientists too. The possibilities offered by AsteriX-BVI are illustrated by a project on the isomerization of a sterol, executed by the blind co-author of this article (HBW).

  13. Visually impaired researchers get their hands on quantum chemistry: application to a computational study on the isomerization of a sterol

    NASA Astrophysics Data System (ADS)

    Lounnas, Valère; Wedler, Henry B.; Newman, Timothy; Schaftenaar, Gijs; Harrison, Jason G.; Nepomuceno, Gabriella; Pemberton, Ryan; Tantillo, Dean J.; Vriend, Gert

    2014-11-01

    In molecular sciences, articles tend to revolve around 2D representations of 3D molecules, and sighted scientists often resort to 3D virtual reality software to study these molecules in detail. Blind and visually impaired (BVI) molecular scientists have access to a series of audio devices that can help them read the text in articles and work with computers. Reading articles published in this journal, though, is nearly impossible for them because they need to generate mental 3D images of molecules, but the article-reading software cannot do that for them. We have previously designed AsteriX, a web server that fully automatically decomposes articles, detects 2D plots of low molecular weight molecules, removes meta data and annotations from these plots, and converts them into 3D atomic coordinates. AsteriX-BVI goes one step further and converts the 3D representation into a 3D printable, haptic-enhanced format that includes Braille annotations. These Braille-annotated physical 3D models allow BVI scientists to generate a complete mental model of the molecule. AsteriX-BVI uses Molden to convert the meta data of quantum chemistry experiments into BVI friendly formats so that the entire line of scientific information that sighted people take for granted—from published articles, via printed results of computational chemistry experiments, to 3D models—is now available to BVI scientists too. The possibilities offered by AsteriX-BVI are illustrated by a project on the isomerization of a sterol, executed by the blind co-author of this article (HBW).

  14. CGAT: a model for immersive personalized training in computational genomics.

    PubMed

    Sims, David; Ponting, Chris P; Heger, Andreas

    2016-01-01

    How should the next generation of genomics scientists be trained while simultaneously pursuing high quality and diverse research? CGAT, the Computational Genomics Analysis and Training programme, was set up in 2010 by the UK Medical Research Council to complement its investment in next-generation sequencing capacity. CGAT was conceived around the twin goals of training future leaders in genome biology and medicine, and providing much needed capacity to UK science for analysing genome scale data sets. Here we outline the training programme employed by CGAT and describe how it dovetails with collaborative research projects to launch scientists on the road towards independent research careers in genomics. © The Author 2015. Published by Oxford University Press.

  15. The Man computer Interactive Data Access System: 25 Years of Interactive Processing.

    NASA Astrophysics Data System (ADS)

    Lazzara, Matthew A.; Benson, John M.; Fox, Robert J.; Laitsch, Denise J.; Rueden, Joseph P.; Santek, David A.; Wade, Delores M.; Whittaker, Thomas M.; Young, J. T.

    1999-02-01

    12 October 1998 marked the 25th anniversary of the Man computer Interactive Data Access System (McIDAS): on that date in 1973, McIDAS was first used operationally by scientists as a tool for data analysis. Over the last 25 years, McIDAS has undergone numerous architectural changes in an effort to keep pace with changing technology. In its early years, significant technological breakthroughs were required to achieve the functionality needed by atmospheric scientists. Today McIDAS is challenged by new Internet-based approaches to data access and data display. The history and impact of McIDAS, along with some of the lessons learned, are presented here.

  16. Applications of genetic programming in cancer research.

    PubMed

    Worzel, William P; Yu, Jianjun; Almal, Arpit A; Chinnaiyan, Arul M

    2009-02-01

    The theory of Darwinian evolution is a fundamental keystone of modern biology. Late in the last century, computer scientists began adapting its principles, in particular natural selection, to complex computational challenges, leading to the emergence of evolutionary algorithms. The conceptual model of selective pressure and recombination in evolutionary algorithms allows scientists to efficiently search high-dimensional spaces for solutions to complex problems. In the last decade, genetic programming has been developed and extensively applied for analysis of molecular data to classify cancer subtypes and characterize the mechanisms of cancer pathogenesis and development. This article reviews current successes using genetic programming and discusses its potential impact in cancer research and treatment in the near future.
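    The selection-recombination-mutation loop the review refers to can be sketched with a minimal genetic algorithm (a toy bitstring objective, not the genetic-programming systems used in the cancer studies):

    ```python
    import random

    random.seed(42)

    def fitness(bits):
        return sum(bits)  # toy objective ("OneMax"): maximize the number of 1s

    def tournament(pop, k=3):
        """Selective pressure: the fittest of k randomly drawn individuals wins."""
        return max(random.sample(pop, k), key=fitness)

    def crossover(a, b):
        """Recombination: splice two parents at a random cut point."""
        cut = random.randrange(1, len(a))
        return a[:cut] + b[cut:]

    def mutate(bits, rate=0.01):
        return [1 - b if random.random() < rate else b for b in bits]

    def evolve(pop_size=60, length=40, generations=80):
        pop = [[random.randint(0, 1) for _ in range(length)]
               for _ in range(pop_size)]
        for _ in range(generations):
            pop = [mutate(crossover(tournament(pop), tournament(pop)))
                   for _ in range(pop_size)]
        return max(pop, key=fitness)

    best = evolve()
    ```

    Genetic programming replaces the fixed-length bitstring with an executable expression tree, so crossover swaps subtrees, but the selection loop is the same.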

  17. Enabling Wide-Scale Computer Science Education through Improved Automated Assessment Tools

    NASA Astrophysics Data System (ADS)

    Boe, Bryce A.

    There is a proliferating demand for newly trained computer scientists as the number of computer science related jobs continues to increase. University programs will only be able to train enough new computer scientists to meet this demand when two things happen: when there are more primary and secondary school students interested in computer science, and when university departments have the resources to handle the resulting increase in enrollment. To meet these goals, significant effort is being made to both incorporate computational thinking into existing primary school education, and to support larger university computer science class sizes. We contribute to this effort through the creation and use of improved automated assessment tools. To enable wide-scale computer science education we do two things. First, we create a framework called Hairball to support the static analysis of Scratch programs targeted for fourth, fifth, and sixth grade students. Scratch is a popular building-block language utilized to pique interest in and teach the basics of computer science. We observe that Hairball allows for rapid curriculum alterations and thus contributes to wide-scale deployment of computer science curriculum. Second, we create a real-time feedback and assessment system utilized in university computer science classes to provide better feedback to students while reducing assessment time. Insights from our analysis of student submission data show that modifications to the system configuration support the way students learn and progress through course material, making it possible for instructors to tailor assignments to optimize learning in growing computer science classes.
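    One kind of check in the spirit of Hairball's static analyses is broadcast/receive matching; a simplified sketch, where the tuple-based block representation is hypothetical rather than Hairball's actual plugin API:

    ```python
    def blocks(scripts):
        """Iterate over every (opcode, *args) block in a list of scripts."""
        for script in scripts:
            yield from script

    def orphan_broadcasts(scripts):
        """Report messages that are broadcast but that no script listens for,
        a common bug in student Scratch projects."""
        sent = {args[0] for op, *args in blocks(scripts) if op == "broadcast"}
        heard = {args[0] for op, *args in blocks(scripts)
                 if op == "when_i_receive"}
        return sorted(sent - heard)

    demo = [
        [("when_green_flag",), ("broadcast", "start"), ("broadcast", "finish")],
        [("when_i_receive", "start"), ("move", 10)],
    ]
    orphans = orphan_broadcasts(demo)  # "finish" is never received
    ```

    Because such checks run on the program text rather than its execution, they can grade thousands of student submissions quickly, which is what makes curriculum-scale deployment feasible.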

  18. Introducing biomimetic shear and ion gradients to microfluidic spinning improves silk fiber strength.

    PubMed

    Li, David; Jacobsen, Matthew M; Gyune Rim, Nae; Backman, Daniel; Kaplan, David L; Wong, Joyce Y

    2017-05-31

    Silkworm silk is an attractive biopolymer for biomedical applications due to its high mechanical strength and biocompatibility; as a result, there is increasing interest in scalable devices to spin silk and recombinant silk so as to improve and customize their properties for diverse biomedical purposes (Vepari and Kaplan 2007 Prog. Polym. Sci. 32). While artificial spinning of regenerated silk fibroins adds tunability to properties such as degradation rate and surface functionalization, the resulting fibers do not yet approach the mechanical strength of native silkworm silk. These drawbacks reduce the applicability and attractiveness of artificial silk (Kinahan et al 2011 Biomacromolecules 12). Here, we used computational fluid dynamic simulations to incorporate shear in tandem with biomimetic ion gradients by coupling a modular novel glass microfluidic device to our previous co-axial flow device. Fibers spun with this combined apparatus demonstrated a significant increase in mechanical strength compared to fibers spun with the basic apparatus alone, with a three-fold increase in Young's modulus and extensibility and a twelve-fold increase in toughness. These results thus demonstrate the critical importance of ionic milieu and shear stress in spinning strong fibers from solubilized silk fibroin.

  19. Signatures of van der Waals binding: A coupling-constant scaling analysis

    NASA Astrophysics Data System (ADS)

    Jiao, Yang; Schröder, Elsebeth; Hyldgaard, Per

    2018-02-01

    The van der Waals (vdW) density functional (vdW-DF) method [Rep. Prog. Phys. 78, 066501 (2015), 10.1088/0034-4885/78/6/066501] describes dispersion or vdW binding by tracking the effects of an electrodynamic coupling among pairs of electrons and their associated exchange-correlation holes. This is done in a nonlocal-correlation energy term E_c^nl, which permits density functional theory calculation in the Kohn-Sham scheme. However, to map the nature of vdW forces in a fully interacting materials system, it is necessary to also account for associated kinetic-correlation energy effects. Here, we present a coupling-constant scaling analysis, which permits us to compute the kinetic-correlation energy T_c^nl that is specific to the vdW-DF account of nonlocal correlations. We thus provide a more complete spatially resolved analysis of the electrodynamical-coupling nature of nonlocal-correlation binding, including vdW attraction, in both covalently and noncovalently bonded systems. We find that kinetic-correlation energy effects play a significant role in the account of vdW or dispersion interactions among molecules. Furthermore, our mapping shows that the total nonlocal-correlation binding is concentrated to pockets in the sparse electron distribution located between the material fragments.
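    The split between potential and kinetic correlation that this analysis relies on follows the standard coupling-constant (adiabatic-connection) relations of density functional theory; in conventional notation (which may differ from the paper's own):

    ```latex
    % Correlation energy as an integral over the coupling constant \lambda,
    % with W_{c,\lambda}[n] the correlation part of the electron-electron
    % coupling at strength \lambda:
    E_c[n] = \int_0^1 d\lambda \, W_{c,\lambda}[n], \qquad
    U_c[n] = W_{c,\lambda=1}[n]
    % The kinetic-correlation energy is what remains of E_c once the
    % full-coupling potential term is removed:
    T_c[n] = E_c[n] - U_c[n]
    ```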

  20. Methods for Creating and Animating a Computer Model Depicting the Structure and Function of the Sarcoplasmic Reticulum Calcium ATPase Enzyme.

    ERIC Educational Resources Information Center

    Chen, Alice Y.; McKee, Nancy

    1999-01-01

    Describes the developmental process used to visualize the calcium ATPase enzyme of the sarcoplasmic reticulum which involves evaluating scientific information, consulting scientists, model making, storyboarding, and creating and editing in a computer medium. (Author/CCM)

  1. Inquiring Minds

    Science.gov Websites

    Virtual Ask-a-Scientist: read transcripts from past online chat sessions held to help decipher the language of high-energy physics. Fermi National Accelerator Laboratory.

  2. Computing Logarithms by Hand

    ERIC Educational Resources Information Center

    Reed, Cameron

    2016-01-01

    How can old-fashioned tables of logarithms be computed without technology? Today, of course, no practicing mathematician, scientist, or engineer would actually use logarithms to carry out a calculation, let alone worry about deriving them from scratch. But high school students may be curious about the process. This article develops a…
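    The classic hand method extracts decimal digits of a logarithm one at a time: raising x to the 10th power shifts log10(x) one decimal place left, and the integer part that appears is the next digit. A short sketch, with floating point standing in for the by-hand arithmetic:

    ```python
    def log10_digits(x, ndigits=5):
        """Decimal digits of log10(x) for 1 <= x < 10, one digit per round:
        x -> x**10 multiplies the log by 10; the number of divisions by 10
        needed to bring x back into [1, 10) is the next digit."""
        out = []
        for _ in range(ndigits):
            x = x ** 10
            d = 0
            while x >= 10:
                x /= 10
                d += 1
            out.append(d)
        return "0." + "".join(map(str, out))

    digits = log10_digits(2.0)  # log10(2) = 0.30102999...
    ```

    By hand, x**10 is three squarings plus one multiplication, and precision of the intermediate values is the limiting factor in both settings.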

  3. Participatory Design of Human-Centered Cyberinfrastructure (Invited)

    NASA Astrophysics Data System (ADS)

    Pennington, D. D.; Gates, A. Q.

    2010-12-01

    Cyberinfrastructure, by definition, is about people sharing resources to achieve outcomes that cannot be reached independently. CI depends not just on creating discoverable resources, or tools that allow those resources to be processed, integrated, and visualized -- but on human activation of flows of information across those resources. CI must be centered on human activities. Yet for those CI projects that are directed towards observational science, there are few models for organizing collaborative research in ways that align individual research interests into a collective vision of CI-enabled science. Given that the emerging technologies are themselves expected to change the way science is conducted, it is not simply a matter of conducting requirements analysis on how scientists currently work, or building consensus among the scientists on what is needed. Developing effective CI depends on generating a new, creative vision of problem solving within a community based on computational concepts that are, in some cases, still very abstract and theoretical. The computer science theory may (or may not) be well formalized, but the potential for impact on any particular domain is typically ill-defined. In this presentation we will describe approaches being developed and tested at the CyberShARE Center of Excellence at the University of Texas at El Paso for ill-structured problem solving within cross-disciplinary teams of scientists and computer scientists working on data-intensive environmental and geoscience problems. These approaches deal with the challenges associated with sharing and integrating knowledge across disciplines; the challenges of developing effective teamwork skills in a culture that favors independent effort; and the challenges of evolving shared, focused research goals from ill-structured, vague starting points - all issues that must be confronted by every interdisciplinary CI project. 
We will introduce visual and semantic-based tools that can enable the collaborative research design process and illustrate their application in designing and developing useful end-to-end data solutions for scientists. Lastly, we will outline areas of future investigation within CyberShARE that we believe have the potential for high impact.

  4. Computer measurement of particle sizes in electron microscope images

    NASA Technical Reports Server (NTRS)

    Hall, E. L.; Thompson, W. B.; Varsi, G.; Gauldin, R.

    1976-01-01

    Computer image processing techniques have been applied to particle counting and sizing in electron microscope images. Distributions of particle sizes were computed for several images and compared to manually computed distributions. The results of these experiments indicate that automatic particle counting within a reasonable error and computer processing time is feasible. The significance of the results is that the tedious task of manually counting a large number of particles can be eliminated while still providing the scientist with accurate results.
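The 1976 paper's exact algorithm is not given in the abstract; a minimal present-day sketch of the underlying idea -- threshold the image, then count and size connected foreground regions -- might look like the following, where the grid, threshold, and 4-connectivity are illustrative assumptions rather than details from the paper:

```python
from collections import deque

def count_particles(image, threshold=1):
    """Count connected foreground regions (particles) in a 2D intensity
    grid and return their pixel sizes, using 4-connected flood fill."""
    rows, cols = len(image), len(image[0])
    seen = [[False] * cols for _ in range(rows)]
    sizes = []
    for r in range(rows):
        for c in range(cols):
            if image[r][c] >= threshold and not seen[r][c]:
                # Breadth-first flood fill of one particle.
                size, queue = 0, deque([(r, c)])
                seen[r][c] = True
                while queue:
                    y, x = queue.popleft()
                    size += 1
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and image[ny][nx] >= threshold
                                and not seen[ny][nx]):
                            seen[ny][nx] = True
                            queue.append((ny, nx))
                sizes.append(size)
    return sizes

# Toy binary image with two particles: one of 3 pixels, one of 1 pixel.
grid = [
    [0, 1, 1, 0],
    [0, 1, 0, 0],
    [0, 0, 0, 1],
]
print(sorted(count_particles(grid)))  # [1, 3]
```

The size distribution the paper compares against manual counts would then just be a histogram of the returned `sizes` list.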

  5. Proteomics, lipidomics, metabolomics: a mass spectrometry tutorial from a computer scientist's point of view.

    PubMed

    Smith, Rob; Mathis, Andrew D; Ventura, Dan; Prince, John T

    2014-01-01

    For decades, mass spectrometry data has been analyzed to investigate a wide array of research interests, including disease diagnostics, biological and chemical theory, genomics, and drug development. Progress towards solving any of these disparate problems depends upon overcoming the common challenge of interpreting the large data sets generated. Despite interim successes, many data interpretation problems in mass spectrometry are still challenging. Further, though these challenges are inherently interdisciplinary in nature, the significant domain-specific knowledge gap between disciplines makes interdisciplinary contributions difficult. This paper provides an introduction to the burgeoning field of computational mass spectrometry. We illustrate key concepts, vocabulary, and open problems in MS-omics, as well as provide invaluable resources such as open data sets and key search terms and references. This paper will facilitate contributions from mathematicians, computer scientists, and statisticians to MS-omics that will fundamentally improve results over existing approaches and inform novel algorithmic solutions to open problems.

  6. Nature apps: Waiting for the revolution.

    PubMed

    Jepson, Paul; Ladle, Richard J

    2015-12-01

Apps are small task-oriented programs with the potential to integrate the computational and sensing capacities of smartphones with the power of cloud computing, social networking, and crowdsourcing. They have the potential to transform how humans interact with nature, cause a step change in the quantity and resolution of biodiversity data, democratize access to environmental knowledge, and reinvigorate ways of enjoying nature. To assess the extent to which this potential is being exploited in relation to nature, we conducted an automated search of the Google Play Store using 96 nature-related terms. This returned data on ~36,304 apps, of which ~6,301 were nature-themed. We found that few of these exploit the full range of capabilities inherent in the technology and/or have successfully captured the public imagination. Such breakthroughs will only be achieved by increasing the frequency and quality of collaboration between environmental scientists, information engineers, computer scientists, and interested publics.

  7. The APECS Virtual Poster Session: a virtual platform for science communication and discussion

    NASA Astrophysics Data System (ADS)

    Renner, A.; Jochum, K.; Jullion, L.; Pavlov, A.; Liggett, D.; Fugmann, G.; Baeseman, J. L.; Apecs Virtual Poster Session Working Group, T.

    2011-12-01

The Virtual Poster Session (VPS) of the Association of Polar Early Career Scientists (APECS) was developed by early career scientists as an online tool for communicating and discussing science and research beyond the four walls of a conference venue. Poster sessions are often the backbone of a conference, where early career scientists in particular get a chance to communicate their research and discuss ideas, data, and scientific problems with their peers and senior scientists. There, they can hone their 'elevator pitch' and their discussion and presentation skills. APECS has taken the poster session one step further and created the VPS -- the same idea but independent of conferences, travel, and location. All that is needed is a computer with internet access. Instead of letting their posters collect dust on the computer's hard drive, scientists can now upload them to the APECS website. There, others have the continuous opportunity to comment, give feedback, and discuss the work. Currently, about 200 posters, contributed by authors and co-authors from 34 countries, are accessible. Since January 2010, researchers can discuss their posters with a broad international audience including fellow researchers, community members, potential colleagues and collaborators, policy makers, and educators during monthly conference calls via an internet platform. Recordings of the calls are available online afterwards. Calls so far have included topical sessions on e.g. marine biology, glaciology, or social sciences, and interdisciplinary calls on Arctic sciences or polar research activities in a specific country, e.g. India or Romania. They have attracted audiences of scientists at all career stages and from all continents, with on average about 15 persons participating per call.
Online tools like the VPS open up new ways for creating collaborations and new research ideas and sharing different methodologies for future projects, pushing aside the boundaries of countries and nations, conferences, offices, and disciplines, and provide early career scientists with easily accessible training opportunities for their communication and outreach skills, independent of their location and funding situation.

  8. Achieving Operational Adaptability: Capacity Building Needs to Become a Warfighting Function

    DTIC Science & Technology

    2010-04-26

    platypus effect as described by David Green in The Serendipity Machine: A Voyage of Discovery Through the Unexpected World of Computers. Early in...the 18th century, the discovery of the platypus challenged the categories of animal life recognized and utilized by scientists in Europe. Scientists...resisted changing their categories for years. At first, they believed the platypus was a fabrication. Later, they resisted change since they were

  9. The dynamics of Brazilian protozoology over the past century.

    PubMed

    Elias, M Carolina; Floeter-Winter, Lucile M; Mena-Chalco, Jesus P

    2016-01-01

    Brazilian scientists have been contributing to the protozoology field for more than 100 years with important discoveries of new species such as Trypanosoma cruzi and Leishmania spp. In this work, we used a Brazilian thesis database (Coordination for the Improvement of Higher Education Personnel) covering the period from 1987-2011 to identify researchers who contributed substantially to protozoology. We selected 248 advisors by filtering to obtain researchers who supervised at least 10 theses. Based on a computational analysis of the thesis databases, we found students who were supervised by these scientists. A computational procedure was developed to determine the advisors' scientific ancestors using the Lattes Platform. These analyses provided a list of 1,997 researchers who were inspected through Lattes CV examination and allowed the identification of the pioneers of Brazilian protozoology. Moreover, we investigated the areas in which researchers who earned PhDs in protozoology are now working. We found that 68.4% of them are still in protozoology, while 16.7% have migrated to other fields. We observed that support for protozoology by national or international agencies is clearly correlated with the increase of scientists in the field. Finally, we described the academic genealogy of Brazilian protozoology by formalising the "forest" of Brazilian scientists involved in the study of protozoa and their vectors over the past century.

  10. The dynamics of Brazilian protozoology over the past century

    PubMed Central

    Elias, M Carolina; Floeter-Winter, Lucile M; Mena-Chalco, Jesus P

    2016-01-01

Brazilian scientists have been contributing to the protozoology field for more than 100 years with important discoveries of new species such as Trypanosoma cruzi and Leishmania spp. In this work, we used a Brazilian thesis database (Coordination for the Improvement of Higher Education Personnel) covering the period from 1987-2011 to identify researchers who contributed substantially to protozoology. We selected 248 advisors by filtering to obtain researchers who supervised at least 10 theses. Based on a computational analysis of the thesis databases, we found students who were supervised by these scientists. A computational procedure was developed to determine the advisors’ scientific ancestors using the Lattes Platform. These analyses provided a list of 1,997 researchers who were inspected through Lattes CV examination and allowed the identification of the pioneers of Brazilian protozoology. Moreover, we investigated the areas in which researchers who earned PhDs in protozoology are now working. We found that 68.4% of them are still in protozoology, while 16.7% have migrated to other fields. We observed that support for protozoology by national or international agencies is clearly correlated with the increase of scientists in the field. Finally, we described the academic genealogy of Brazilian protozoology by formalising the “forest” of Brazilian scientists involved in the study of protozoa and their vectors over the past century. PMID:26814646

  11. Human Exploration Ethnography of the Haughton-Mars Project, 1998-1999

    NASA Technical Reports Server (NTRS)

    Clancey, William J.; Swanson, Keith (Technical Monitor)

    1999-01-01

During the past two field seasons, July 1998 and July 1999, we have conducted research about the field practices of scientists and engineers at Haughton Crater on Devon Island in the Canadian Arctic, with the objective of determining how people will live and work on Mars. This broad investigation of field life and work practice, part of the Haughton-Mars Project led by Pascal Lee, spans social and cognitive anthropology, psychology, and computer science. Our approach involves systematic observation and description of activities, places, and concepts, constituting an ethnography of field science at Haughton. Our focus is on human behaviors -- what people do, where, when, with whom, and why. By locating behavior in time and place -- in contrast with a purely functional or "task oriented" description of work -- we find patterns constituting the choreography of interaction between people, their habitat, and their tools. As such, we view the exploration process in terms of a total system comprising a social organization, facilities, terrain/climate, personal identities, artifacts, and computer tools. Because we are computer scientists seeking to develop new kinds of tools for living and working on Mars, we focus on the existing representational tools (such as documents and measuring devices), learning and improvisation (such as use of the internet or informal assistance), and prototype computational systems brought to the field. Our research is based on partnership, by which field scientists and engineers actively contribute to our findings, just as we participate in their work and life.

  12. Dynamic Collaboration Infrastructure for Hydrologic Science

    NASA Astrophysics Data System (ADS)

    Tarboton, D. G.; Idaszak, R.; Castillo, C.; Yi, H.; Jiang, F.; Jones, N.; Goodall, J. L.

    2016-12-01

Data and modeling infrastructure is becoming increasingly accessible to water scientists. HydroShare is a collaborative environment that currently offers water scientists the ability to access modeling and data infrastructure in support of data intensive modeling and analysis. It supports the sharing of and collaboration around "resources" which are social objects defined to include both data and models in a structured standardized format. Users collaborate around these objects via comments, ratings, and groups. HydroShare also supports web services and cloud based computation for the execution of hydrologic models and analysis and visualization of hydrologic data. However, the quantity and variety of data and modeling infrastructure available that can be accessed from environments like HydroShare is increasing. Storage infrastructure can range from one's local PC to campus or organizational storage to storage in the cloud. Modeling or computing infrastructure can range from one's desktop to departmental clusters to national HPC resources to grid and cloud computing resources. How does one orchestrate this vast array of data and computing infrastructure without having to learn each new system in turn? A common limitation across these systems is the lack of efficient integration between data transport mechanisms and the corresponding high-level services to support large distributed data and compute operations. A scientist running a hydrology model from their desktop may require processing a large collection of files across the aforementioned storage and compute resources and various national databases. To address these community challenges a proof-of-concept prototype was created integrating HydroShare with RADII (Resource Aware Data-centric collaboration Infrastructure) to provide software infrastructure to enable the comprehensive and rapid dynamic deployment of what we refer to as "collaborative infrastructure."
In this presentation we discuss the results of this proof-of-concept prototype which enabled HydroShare users to readily instantiate virtual infrastructure marshaling arbitrary combinations, varieties, and quantities of distributed data and computing infrastructure in addressing big problems in hydrology.

  13. Isolation, Expression Analysis, and Functional Characterization of the First Antidiuretic Hormone Receptor in Insects

    DTIC Science & Technology

    2010-06-01

fruitfly Drosophila melanogaster and the honey bee Apis mellifera . Prog Neurobiol 80:1–19. 28. Larkin MA, et al. (2007) Clustal W and Clustal X version...receptors predicted or annotated in the Acyrthosiphon pisum, Pediculus humanus corporis, and Apis mellifera genomes. The CAPA-related peptides in insects...prolixus CAPA Receptor Gene. CAPA receptor protein sequences identified or predicted in D. melanogaster (AAS65092) (13, 14), A. mellifera (NP_001091702

  14. Rapid Response Concentration-Controlled Desorption of Activated Carbon to Dampen Concentration Fluctuations

    DTIC Science & Technology

    2007-01-01

    Behavior of trickle - bed air biofilter for toluene removal: Effect of non-use periods. Environ. Prog. 2005, 24, 155-161. (3) Martin, F. J.; Loehr, R. C...dampen the fluctuation in acetone concentration at high concentrations. The effect of inlet concentration and empty bed contact time (EBCT) on dampening...oxidizer. The MSA-SST system is a fixed- bed system that rapidly controls the power that heats the adsorbent/adsorbate, resulting in controlled

  15. Design Evolution of a Fighter Training Scheduling Decision Support System.

    DTIC Science & Technology

    1987-03-01

SYSTEM THESIS Paul E. Trapp Jeffrey W. Grechanik Captain, USAF Captain, USAF AFIT/GST/ENS/87M-8 MAY 191987 " Approved for public release; distribution...E. Trapp, B.S., M.A. Jeffrey W. Grechanik, B.S. Captain, USAF Captain, USAF March 87 Approved for public release; distribution unlimited This work...DNIF, TDY, and other disruptions. Therefore, cyclical scheduling will not be used (3:1-18). Programming. Arthur and Ravindran proposed a goal

  16. Enzootic Plague Reduces Black-Footed Ferret (Mustela nigripes) Survival in Montana

    DTIC Science & Technology

    2010-01-01

    al. Design and testing for a non-tagged F1-V fusion protein as vaccine antigen against bubonic and pneumonic plague . Biotechnol Prog 2005; 21:1490–1510...Enzootic Plague Reduces Black-Footed Ferret (Mustela nigripes) Survival in Montana Marc R. Matchett,1 Dean E. Biggins,2 Valerie Carlson,3,* Bradford...and prey. Epizootic plague kills both prairie dogs and ferrets and is a major factor limiting recovery of the highly endangered ferret. In addition to

  17. The Role of Limited Proteolysis of Thyrotropin-Releasing Hormone in Thermoregulation.

    DTIC Science & Technology

    1982-01-01

    exogenously. The limited proteolysis of TRH by pyroglutamate aminopeptidase from CNS results into formation of a new cyclic dipeptide, cyclo (His-Pro...amino acids (L-histidine and L-proline), and two analogues of cyclo (His-Pro), cyclo (Pro-Gly) and cyclo . (Ala-Gly). Cyclo (His-Pro) cross-reacted only...cyclo (His-Pro). Figure 3 shows the chromato- graphic profile obtained when a neutralized perchloric acid extract of rat brain was passed through DEAE

  18. Electrochemical and Photochemical Treatment of Aqueous Waste Streams

    DTIC Science & Technology

    1996-01-01

    TREATMENT OF AQUEOUS WASTE STREAMS Joseph C. Farmer, Richard W. Pekala, Francis T. Wang, David V. Fix, Alan M. Volpe, Daniel D. Dietrich, William H...STREAMS Joseph C. Farmer, Richard W. Pekala, Francis T. Wang, David V. Fix, Alan M. Volpe, Daniel D. Dietrich, William H. Siegel and James F. Carley...1992). Wilbourne , C. M. Wong, , W. S. Gillam, S. Johnson, R. H. Horowitz, "Electrosorb Process for Desalting Water," Res. Dev. Prog. Rept. No. 516, 16. J

  19. Clinical Investigation Program Report Control Symbol MED 300.

    DTIC Science & Technology

    1982-10-01

Preeclampsia as an Aid to Further Management. (C) (PR) 41 1961 Use of C-Reactive Protein in Prediction of ARD Prognosis, (C) (PR) 42 1981 The Assessment...37 Status: Completed * Title: Routine Use of Serum Uric Acid Levels at 36 Weeks Gestation as Screening Test for Preeclampsia as an Aid to Further...Investigators: Family Practice. CPT Ellis M. Knight, MC Key Words: Serum Uric Acid Preeclampsia Accumulative MZDCASI Eat Accumulative Periodic Mar 82 Cost: ]A

  20. Reactivity of Metal Nitrates.

    DTIC Science & Technology

    1982-07-20

    02NOCuOH Any mechanism suggested for the nitration of aromatic systems by titanium(IV) nitrate must take into account the observed similarity, in...occurs. -26- References 1. For recent reviews see (a) R. B. Moodie and K. Schofield, Accounts Chem. Res., 1976, 9, 287; (b) G. A. Olah and S. J. Kuhn...Ithaca, N.Y., 1969, Chapter VI; L. M. Stock, Prog. Phys. Org. Chem., 1976, 12, 21; J. G. Hoggett , R. B. Moodie, J. R. Penton, and K. Schofield

  1. Integrating High-Throughput Parallel Processing Framework and Storage Area Network Concepts Into a Prototype Interactive Scientific Visualization Environment for Hyperspectral Data

    NASA Astrophysics Data System (ADS)

    Smuga-Otto, M. J.; Garcia, R. K.; Knuteson, R. O.; Martin, G. D.; Flynn, B. M.; Hackel, D.

    2006-12-01

The University of Wisconsin-Madison Space Science and Engineering Center (UW-SSEC) is developing tools to help scientists realize the potential of high spectral resolution instruments for atmospheric science. Upcoming satellite spectrometers like the Cross-track Infrared Sounder (CrIS), experimental instruments like the Geosynchronous Imaging Fourier Transform Spectrometer (GIFTS) and proposed instruments like the Hyperspectral Environmental Suite (HES) within the GOES-R project will present a challenge in the form of the overwhelmingly large amounts of continuously generated data. Current and near-future workstations will have neither the storage space nor computational capacity to cope with raw spectral data spanning more than a few minutes of observations from these instruments. Schemes exist for processing raw data from hyperspectral instruments currently in testing that involve distributed computation across clusters. Data, which for an instrument like GIFTS can amount to over 1.5 Terabytes per day, is carefully managed on Storage Area Networks (SANs), with attention paid to proper maintenance of associated metadata. The UW-SSEC is preparing a demonstration integrating these back-end capabilities as part of a larger visualization framework, to assist scientists in developing new products from high spectral resolution data, sourcing data volumes they could not otherwise manage. This demonstration focuses on managing storage so that only the data specifically needed for the desired product are pulled from the SAN, and on running computationally expensive intermediate processing on a back-end cluster, with the final product being sent to a visualization system on the scientist's workstation. Where possible, existing software and solutions are used to reduce cost of development.
The heart of the computing component is the GIFTS Information Processing System (GIPS), developed at the UW-SSEC to allow distribution of processing tasks such as conversion of raw GIFTS interferograms into calibrated radiance spectra, and retrieving temperature and water vapor content atmospheric profiles from these spectra. The hope is that by demonstrating the capabilities afforded by a composite system like the one described here, scientists can be convinced to contribute further algorithms in support of this model of computing and visualization.

  2. Update on the Culicoides sonorensis transcriptome project: a peek into the molecular biology of the midge

    USDA-ARS?s Scientific Manuscript database

    Next Generation Sequencing is transforming the way scientists collect and measure an organism’s genetic background and gene dynamics, while bioinformatics and super-computing are merging to facilitate parallel sample computation and interpretation at unprecedented speeds. Analyzing the complete gene...

  3. Collaborative Learning: Cognitive and Computational Approaches. Advances in Learning and Instruction Series.

    ERIC Educational Resources Information Center

    Dillenbourg, Pierre, Ed.

    Intended to illustrate the benefits of collaboration between scientists from psychology and computer science, namely machine learning, this book contains the following chapters, most of which are co-authored by scholars from both sides: (1) "Introduction: What Do You Mean by 'Collaborative Learning'?" (Pierre Dillenbourg); (2)…

  4. RESEARCH STRATEGIES FOR THE APPLICATION OF THE TECHNIQUES OF COMPUTATIONAL BIOLOGICAL CHEMISTRY TO ENVIRONMENTAL PROBLEMS

    EPA Science Inventory

    On October 25 and 26, 1984, the U.S. EPA sponsored a workshop to consider the potential applications of the techniques of computational biological chemistry to problems in environmental health. Eleven extramural scientists from the various related disciplines and a similar number...

  5. Debugging Geographers: Teaching Programming to Non-Computer Scientists

    ERIC Educational Resources Information Center

    Muller, Catherine L.; Kidd, Chris

    2014-01-01

    The steep learning curve associated with computer programming can be a daunting prospect, particularly for those not well aligned with this way of logical thinking. However, programming is a skill that is becoming increasingly important. Geography graduates entering careers in atmospheric science are one example of a particularly diverse group who…

  6. Using Computers for Research into Social Relations.

    ERIC Educational Resources Information Center

    Holden, George W.

    1988-01-01

    Discusses computer-presented social situations (CPSS), i.e., microcomputer-based simulations developed to provide a new methodological tool for social scientists interested in the study of social relations. Two CPSSs are described: DaySim, used to help identify types of parenting; and DateSim, used to study interpersonal attraction. (21…

  7. Brains--Computers--Machines: Neural Engineering in Science Classrooms

    ERIC Educational Resources Information Center

    Chudler, Eric H.; Bergsman, Kristen Clapper

    2016-01-01

    Neural engineering is an emerging field of high relevance to students, teachers, and the general public. This feature presents online resources that educators and scientists can use to introduce students to neural engineering and to integrate core ideas from the life sciences, physical sciences, social sciences, computer science, and engineering…

  8. Computer Science Professionals and Greek Library Science

    ERIC Educational Resources Information Center

    Dendrinos, Markos N.

    2008-01-01

    This paper attempts to present the current state of computer science penetration into librarianship in terms of both workplace and education issues. The shift from material libraries into digital libraries is mirrored in the corresponding shift from librarians into information scientists. New library data and metadata, as well as new automated…

  9. Describing the What and Why of Students' Difficulties in Boolean Logic

    ERIC Educational Resources Information Center

    Herman, Geoffrey L.; Loui, Michael C.; Kaczmarczyk, Lisa; Zilles, Craig

    2012-01-01

    The ability to reason with formal logic is a foundational skill for computer scientists and computer engineers that scaffolds the abilities to design, debug, and optimize. By interviewing students about their understanding of propositional logic and their ability to translate from English specifications to Boolean expressions, we characterized…
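The study's interview materials are not reproduced here, but the kind of difficulty it describes is easy to demonstrate: a literal, word-order translation of an English specification can yield the wrong Boolean expression, and the two candidates can be compared exhaustively over their truth tables. The specification and predicate names below are invented for illustration:

```python
from itertools import product

def equivalent(f, g, n_vars):
    """Exhaustively compare two Boolean functions over all 2**n assignments."""
    return all(f(*vals) == g(*vals)
               for vals in product([False, True], repeat=n_vars))

# Specification: "the alarm sounds when the door or the window is open,
# and the system is armed."
intended = lambda door, window, armed: (door or window) and armed

# A translation that follows the English word order too literally:
literal = lambda door, window, armed: door or (window and armed)

print(equivalent(intended, literal, 3))  # False
# Counterexample: door open but system disarmed.
print(intended(True, False, False), literal(True, False, False))  # False True
```

Because the operand count is small, the brute-force truth-table check is both a grading aid and a way for students to find their own counterexamples.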

  10. Novel 3-D Computer Model Can Help Predict Pathogens’ Roles in Cancer | Poster

    Cancer.gov

    To understand how bacterial and viral infections contribute to human cancers, four NCI at Frederick scientists turned not to the lab bench, but to a computer. The team has created the world’s first—and currently, only—3-D computational approach for studying interactions between pathogen proteins and human proteins based on a molecular adaptation known as interface mimicry.

  11. Social and Personal Factors in Semantic Infusion Projects

    NASA Astrophysics Data System (ADS)

    West, P.; Fox, P. A.; McGuinness, D. L.

    2009-12-01

As part of our semantic data framework activities across multiple, diverse disciplines we required the involvement of domain scientists, computer scientists, software engineers, data managers, and often, social scientists. This involvement from a cross-section of disciplines turns out to be a social exercise as much as it is a technical and methodological activity. Each member of the team is used to different modes of working, expectations, vocabularies, levels of participation, and incentive and reward systems. We will examine the part that both roles and personal responsibilities play in the development of semantic infusion projects, and how an iterative development cycle can contribute to the successful completion of such a project.

  12. [Activities of Research Institute for Advanced Computer Science

    NASA Technical Reports Server (NTRS)

    Gross, Anthony R. (Technical Monitor); Leiner, Barry M.

    2001-01-01

The Research Institute for Advanced Computer Science (RIACS) carries out basic research and technology development in computer science, in support of the National Aeronautics and Space Administration's missions. RIACS is located at the NASA Ames Research Center, Moffett Field, California. RIACS research focuses on the three cornerstones of IT research necessary to meet the future challenges of NASA missions: 1. Automated Reasoning for Autonomous Systems: Techniques are being developed enabling spacecraft that will be self-guiding and self-correcting to the extent that they will require little or no human intervention. Such craft will be equipped to independently solve problems as they arise, and fulfill their missions with minimum direction from Earth. 2. Human-Centered Computing: Many NASA missions require synergy between humans and computers, with sophisticated computational aids amplifying human cognitive and perceptual abilities. 3. High Performance Computing and Networking: Advances in the performance of computing and networking continue to have major impact on a variety of NASA endeavors, ranging from modeling and simulation to analysis of large scientific datasets to collaborative engineering, planning and execution. In addition, RIACS collaborates with NASA scientists to apply IT research to a variety of NASA application domains. RIACS also engages in other activities, such as workshops, seminars, visiting scientist programs and student summer programs, designed to encourage and facilitate collaboration between the university and NASA IT research communities.

  13. Decision tree and ensemble learning algorithms with their applications in bioinformatics.

    PubMed

    Che, Dongsheng; Liu, Qi; Rasheed, Khaled; Tao, Xiuping

    2011-01-01

Machine learning approaches have wide applications in bioinformatics, and decision trees are among the successful approaches applied in this field. In this chapter, we briefly review decision trees and related ensemble algorithms and show the successful applications of such approaches to solving biological problems. We hope that by learning the algorithms of decision trees and ensemble classifiers, biologists can get the basic ideas of how machine learning algorithms work. On the other hand, by being exposed to the applications of decision trees and ensemble algorithms in bioinformatics, computer scientists can get better ideas of which bioinformatics topics they may work on in their future research. We aim to provide a platform to bridge the gap between biologists and computer scientists.
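The chapter's algorithms are not reproduced here, but the core idea behind ensemble classifiers -- several weak learners combined by majority vote are more robust than any single one -- can be sketched in plain Python. The decision stump, the toy data set, and the leave-one-out committee below are illustrative stand-ins (a deterministic simplification of bagging), not material from the chapter:

```python
def stump_predict(stump, x):
    """Apply a one-split decision stump: (feature, threshold, left, right)."""
    feat, thresh, left, right = stump
    return left if x[feat] <= thresh else right

def train_stump(data):
    """Exhaustively pick the single split with the fewest training errors."""
    best, best_err = None, float("inf")
    for feat in range(len(data[0][0])):
        for thresh in sorted({x[feat] for x, _ in data}):
            for leaves in ((0, 1), (1, 0)):
                stump = (feat, thresh) + leaves
                err = sum(stump_predict(stump, x) != y for x, y in data)
                if err < best_err:
                    best, best_err = stump, err
    return best

def committee(data):
    """Train one stump per leave-one-out fold of the data: a minimal,
    deterministic stand-in for a bagged tree ensemble."""
    return [train_stump(data[:i] + data[i + 1:]) for i in range(len(data))]

def vote(ensemble, x):
    """Classify by majority vote across the ensemble's predictions."""
    preds = [stump_predict(s, x) for s in ensemble]
    return max(set(preds), key=preds.count)

# Toy two-feature data set: the label is 1 when the first feature is high.
data = [((0.1, 0.9), 0), ((0.2, 0.1), 0), ((0.8, 0.3), 1), ((0.9, 0.8), 1)]
model = committee(data)
print(vote(model, (0.85, 0.5)))  # 1
```

Individual stumps trained on different folds can disagree near the decision boundary; the majority vote smooths those disagreements out, which is the property that real bagged and boosted tree ensembles exploit at scale.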

  14. Supercomputing Sheds Light on the Dark Universe

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Habib, Salman; Heitmann, Katrin

    2012-11-15

At Argonne National Laboratory, scientists are using supercomputers to shed light on one of the great mysteries in science today, the Dark Universe. With Mira, a petascale supercomputer at the Argonne Leadership Computing Facility, a team led by physicists Salman Habib and Katrin Heitmann will run the largest, most complex simulation of the universe ever attempted. By contrasting the results from Mira with state-of-the-art telescope surveys, the scientists hope to gain new insights into the distribution of matter in the universe, advancing future investigations of dark energy and dark matter into a new realm. The team's research was named a finalist for the 2012 Gordon Bell Prize, an award recognizing outstanding achievement in high-performance computing.

  15. International Conferences and Young Scientists Schools on Computational Information Technologies for Environmental Sciences (CITES) as a professional growth instrument

    NASA Astrophysics Data System (ADS)

    Gordov, E. P.; Lykosov, V. N.; Genina, E. Yu; Gordova, Yu E.

    2017-11-01

The paper describes CITES, a regular event series consisting of a young scientists' school and an international conference, as a tool for training and professional growth. The events address the most pressing issues in the application of information-computational technologies in environmental sciences and in young scientists' training, narrowing the gap between university graduates' skills and current research challenges. The viability of the approach to the CITES organization is demonstrated by the fact that a single event organized in 2001 has grown into a series, quite a few young participants have successfully defended their PhD theses, and a number of researchers became Doctors of Science during these years. Young researchers from Russia and foreign countries continue to show undiminished interest in these events.

  16. Multiuser Collaboration with Networked Mobile Devices

    NASA Technical Reports Server (NTRS)

    Tso, Kam S.; Tai, Ann T.; Deng, Yong M.; Becks, Paul G.

    2006-01-01

    In this paper we describe a multiuser collaboration infrastructure that enables multiple mission scientists to remotely and collaboratively interact with visualization and planning software, using wireless networked personal digital assistants(PDAs) and other mobile devices. During ground operations of planetary rover and lander missions, scientists need to meet daily to review downlinked data and plan science activities. For example, scientists use the Science Activity Planner (SAP) in the Mars Exploration Rover (MER) mission to visualize downlinked data and plan rover activities during the science meetings [1]. Computer displays are projected onto large screens in the meeting room to enable the scientists to view and discuss downlinked images and data displayed by SAP and other software applications. However, only one person can interact with the software applications because input to the computer is limited to a single mouse and keyboard. As a result, the scientists have to verbally express their intentions, such as selecting a target at a particular location on the Mars terrain image, to that person in order to interact with the applications. This constrains communication and limits the returns of science planning. Furthermore, ground operations for Mars missions are fundamentally constrained by the short turnaround time for science and engineering teams to process and analyze data, plan the next uplink, generate command sequences, and transmit the uplink to the vehicle [2]. Therefore, improving ground operations is crucial to the success of Mars missions. The multiuser collaboration infrastructure enables users to control software applications remotely and collaboratively using mobile devices. 
The infrastructure includes (1) human-computer interaction techniques to provide natural, fast, and accurate inputs, (2) a communications protocol to ensure reliable and efficient coordination of the input devices and host computers, (3) an application-independent middleware that maintains the states, sessions, and interactions of individual users of the software applications, and (4) an application programming interface to enable tight integration of applications and the middleware. The infrastructure is able to support any software application running under Windows or Unix. The resulting technologies are applicable not only to NASA mission operations but also to other situations such as design reviews, brainstorming sessions, and business meetings, which can benefit from having the participants concurrently interact with the software applications (e.g., presentation applications and CAD design tools) to illustrate their ideas and provide inputs.

  17. Software Carpentry and the Hydrological Sciences

    NASA Astrophysics Data System (ADS)

    Ahmadia, A. J.; Kees, C. E.; Farthing, M. W.

    2013-12-01

Scientists are spending an increasing amount of time building and using hydrology software. However, most scientists are never taught how to do this efficiently. As a result, many are unaware of tools and practices that would allow them to write more reliable and maintainable code with less effort. As hydrology models increase in capability and enter use by a growing number of scientists and their communities, it is important that scientific software development practices scale up to meet the challenges posed by increasing software complexity, lengthening software lifecycles, a growing number of stakeholders and contributors, and a broadened developer base that extends from application domains to high-performance computing centers. Many of these challenges in complexity, lifecycles, and developer base have been successfully met by the open source community, and there are many lessons to be learned from their experiences and practices. Additionally, there is much wisdom to be found in the results of research studies conducted on software engineering itself. Software Carpentry aims to bridge the gap between the current state of software development and these known best practices for scientific software development, with a focus on hands-on exercises and practical advice based on the following principles: 1. Write programs for people, not computers. 2. Automate repetitive tasks. 3. Use the computer to record history. 4. Make incremental changes. 5. Use version control. 6. Don't repeat yourself (or others). 7. Plan for mistakes. 8. Optimize software only after it works. 9. Document design and purpose, not mechanics. 10. Collaborate. We discuss how these best practices, arising from solid foundations in research and experience, have been shown to help improve scientists' productivity and the reliability of their software.

  18. CompNanoTox2015: novel perspectives from a European conference on computational nanotoxicology on predictive nanotoxicology.

    PubMed

    Bañares, Miguel A; Haase, Andrea; Tran, Lang; Lobaskin, Vladimir; Oberdörster, Günter; Rallo, Robert; Leszczynski, Jerzy; Hoet, Peter; Korenstein, Rafi; Hardy, Barry; Puzyn, Tomasz

    2017-09-01

A first European Conference on Computational Nanotoxicology, CompNanoTox, was held in November 2015 in Benahavís, Spain with the objectives to disseminate and integrate results from the European modeling and database projects (NanoPUZZLES, ModENPTox, PreNanoTox, MembraneNanoPart, MODERN, eNanoMapper and EU COST TD1204 MODENA) as well as to create synergies within the European NanoSafety Cluster. This conference was supported by the COST Action TD1204 MODENA on developing computational methods for toxicological risk assessment of engineered nanoparticles and provided a unique opportunity for cross-fertilization among complementary disciplines. The efforts to develop and validate computational models crucially depend on high-quality experimental data and relevant assays, which will be the basis for identifying relevant descriptors. The ambitious overarching goal of this conference was to promote predictive nanotoxicology, which can only be achieved by a close collaboration between the computational scientists (e.g. database experts, modeling experts for structure, (eco)toxicological effects, performance and interaction of nanomaterials) and experimentalists from different areas (in particular toxicologists, biologists, chemists and material scientists, among others). The main outcome and new perspectives of this conference are summarized here.

  19. Tools and techniques for computational reproducibility.

    PubMed

    Piccolo, Stephen R; Frampton, Michael B

    2016-07-11

    When reporting research findings, scientists document the steps they followed so that others can verify and build upon the research. When those steps have been described in sufficient detail that others can retrace the steps and obtain similar results, the research is said to be reproducible. Computers play a vital role in many research disciplines and present both opportunities and challenges for reproducibility. Computers can be programmed to execute analysis tasks, and those programs can be repeated and shared with others. The deterministic nature of most computer programs means that the same analysis tasks, applied to the same data, will often produce the same outputs. However, in practice, computational findings often cannot be reproduced because of complexities in how software is packaged, installed, and executed-and because of limitations associated with how scientists document analysis steps. Many tools and techniques are available to help overcome these challenges; here we describe seven such strategies. With a broad scientific audience in mind, we describe the strengths and limitations of each approach, as well as the circumstances under which each might be applied. No single strategy is sufficient for every scenario; thus we emphasize that it is often useful to combine approaches.

  20. CompNanoTox2015: novel perspectives from a European conference on computational nanotoxicology on predictive nanotoxicology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bañares, Miguel A.; Haase, Andrea; Tran, Lang

A first European Conference on Computational Nanotoxicology, CompNanoTox, was held in November 2015 in Benahavís, Spain with the objectives to disseminate and integrate results from the European modeling and database projects (NanoPUZZLES, ModENPTox, PreNanoTox, MembraneNanoPart, MODERN, eNanoMapper and EU COST TD1204 MODENA) as well as to create synergies within the European NanoSafety Cluster. This conference was supported by the COST Action TD1204 MODENA on developing computational methods for toxicological risk assessment of engineered nanoparticles and provided a unique opportunity for cross-fertilization among complementary disciplines. The efforts to develop and validate computational models crucially depend on high-quality experimental data and relevant assays, which will be the basis for identifying relevant descriptors. The ambitious overarching goal of this conference was to promote predictive nanotoxicology, which can only be achieved by a close collaboration between the computational scientists (e.g. database experts, modeling experts for structure, (eco)toxicological effects, performance and interaction of nanomaterials) and experimentalists from different areas (in particular toxicologists, biologists, chemists and material scientists, among others). The main outcome and new perspectives of this conference are summarized here.

  1. Stereotyping in Relation to the Gender Gap in Participation in Computing.

    ERIC Educational Resources Information Center

    Siann, Gerda; And Others

    1988-01-01

    A questionnaire completed by 928 postsecondary students asked subjects to rate one of two computer scientists on 16 personal attributes. Aside from gender of the ratee, questionnaires were identical. Results indicate that on eight attributes the female was rated significantly more positively than the male. Implications are discussed. (Author/CH)

  2. Constructing Contracts: Making Discrete Mathematics Relevant to Beginning Programmers

    ERIC Educational Resources Information Center

    Gegg-Harrison, Timothy S.

    2005-01-01

    Although computer scientists understand the importance of discrete mathematics to the foundations of their field, computer science (CS) students do not always see the relevance. Thus, it is important to find a way to show students its relevance. The concept of program correctness is generally taught as an activity independent of the programming…

  3. Communication for Scientists and Engineers: A "Computer Model" in the Basic Course.

    ERIC Educational Resources Information Center

    Haynes, W. Lance

    Successful speech should rest not on prepared notes and outlines but on genuine oral discourse based on "data" fed into the "software" in the computer which already exists within each person. Writing cannot speak for itself, nor can it continually adjust itself to accommodate diverse response. Moreover, no matter how skillfully…

  4. Identification of Factors That Affect Software Complexity.

    ERIC Educational Resources Information Center

    Kaiser, Javaid

    A survey of computer scientists was conducted to identify factors that affect software complexity. A total of 160 items were selected from the literature to include in a questionnaire sent to 425 individuals who were employees of computer-related businesses in Lawrence and Kansas City. The items were grouped into nine categories called system…

  5. Synthetic Biology: Knowledge Accessed by Everyone (Open Sources)

    ERIC Educational Resources Information Center

    Sánchez Reyes, Patricia Margarita

    2016-01-01

Using the principles of biology, along with engineering and with the help of computers, scientists manage to copy DNA sequences from nature and use them to create new organisms. DNA is created through engineering and computer science, managing to create life inside a laboratory. We cannot dismiss the role that synthetic biology could lead in…

  6. The Multiple Pendulum Problem via Maple[R]

    ERIC Educational Resources Information Center

    Salisbury, K. L.; Knight, D. G.

    2002-01-01

    The way in which computer algebra systems, such as Maple, have made the study of physical problems of some considerable complexity accessible to mathematicians and scientists with modest computational skills is illustrated by solving the multiple pendulum problem. A solution is obtained for four pendulums with no restriction on the size of the…

  7. Computers and the Future of Skill Demand. Educational Research and Innovation Series

    ERIC Educational Resources Information Center

    Elliott, Stuart W.

    2017-01-01

    Computer scientists are working on reproducing all human skills using artificial intelligence, machine learning and robotics. Unsurprisingly then, many people worry that these advances will dramatically change work skills in the years ahead and perhaps leave many workers unemployable. This report develops a new approach to understanding these…

  8. Symposium on Parallel Computational Methods for Large-scale Structural Analysis and Design, 2nd, Norfolk, VA, US

    NASA Technical Reports Server (NTRS)

    Storaasli, Olaf O. (Editor); Housner, Jerrold M. (Editor)

    1993-01-01

    Computing speed is leaping forward by several orders of magnitude each decade. Engineers and scientists gathered at a NASA Langley symposium to discuss these exciting trends as they apply to parallel computational methods for large-scale structural analysis and design. Among the topics discussed were: large-scale static analysis; dynamic, transient, and thermal analysis; domain decomposition (substructuring); and nonlinear and numerical methods.

  9. Use of Emerging Grid Computing Technologies for the Analysis of LIGO Data

    NASA Astrophysics Data System (ADS)

    Koranda, Scott

    2004-03-01

    The LIGO Scientific Collaboration (LSC) today faces the challenge of enabling analysis of terabytes of LIGO data by hundreds of scientists from institutions all around the world. To meet this challenge the LSC is developing tools, infrastructure, applications, and expertise leveraging Grid Computing technologies available today, and making available to LSC scientists compute resources at sites across the United States and Europe. We use digital credentials for strong and secure authentication and authorization to compute resources and data. Building on top of products from the Globus project for high-speed data transfer and information discovery we have created the Lightweight Data Replicator (LDR) to securely and robustly replicate data to resource sites. We have deployed at our computing sites the Virtual Data Toolkit (VDT) Server and Client packages, developed in collaboration with our partners in the GriPhyN and iVDGL projects, providing uniform access to distributed resources for users and their applications. Taken together these Grid Computing technologies and infrastructure have formed the LSC DataGrid--a coherent and uniform environment across two continents for the analysis of gravitational-wave detector data. Much work, however, remains in order to scale current analyses and recent lessons learned need to be integrated into the next generation of Grid middleware.

  10. A cyber-linked undergraduate research experience in computational biomolecular structure prediction and design

    PubMed Central

    Alford, Rebecca F.; Dolan, Erin L.

    2017-01-01

    Computational biology is an interdisciplinary field, and many computational biology research projects involve distributed teams of scientists. To accomplish their work, these teams must overcome both disciplinary and geographic barriers. Introducing new training paradigms is one way to facilitate research progress in computational biology. Here, we describe a new undergraduate program in biomolecular structure prediction and design in which students conduct research at labs located at geographically-distributed institutions while remaining connected through an online community. This 10-week summer program begins with one week of training on computational biology methods development, transitions to eight weeks of research, and culminates in one week at the Rosetta annual conference. To date, two cohorts of students have participated, tackling research topics including vaccine design, enzyme design, protein-based materials, glycoprotein modeling, crowd-sourced science, RNA processing, hydrogen bond networks, and amyloid formation. Students in the program report outcomes comparable to students who participate in similar in-person programs. These outcomes include the development of a sense of community and increases in their scientific self-efficacy, scientific identity, and science values, all predictors of continuing in a science research career. Furthermore, the program attracted students from diverse backgrounds, which demonstrates the potential of this approach to broaden the participation of young scientists from backgrounds traditionally underrepresented in computational biology. PMID:29216185

  11. A cyber-linked undergraduate research experience in computational biomolecular structure prediction and design.

    PubMed

    Alford, Rebecca F; Leaver-Fay, Andrew; Gonzales, Lynda; Dolan, Erin L; Gray, Jeffrey J

    2017-12-01

    Computational biology is an interdisciplinary field, and many computational biology research projects involve distributed teams of scientists. To accomplish their work, these teams must overcome both disciplinary and geographic barriers. Introducing new training paradigms is one way to facilitate research progress in computational biology. Here, we describe a new undergraduate program in biomolecular structure prediction and design in which students conduct research at labs located at geographically-distributed institutions while remaining connected through an online community. This 10-week summer program begins with one week of training on computational biology methods development, transitions to eight weeks of research, and culminates in one week at the Rosetta annual conference. To date, two cohorts of students have participated, tackling research topics including vaccine design, enzyme design, protein-based materials, glycoprotein modeling, crowd-sourced science, RNA processing, hydrogen bond networks, and amyloid formation. Students in the program report outcomes comparable to students who participate in similar in-person programs. These outcomes include the development of a sense of community and increases in their scientific self-efficacy, scientific identity, and science values, all predictors of continuing in a science research career. Furthermore, the program attracted students from diverse backgrounds, which demonstrates the potential of this approach to broaden the participation of young scientists from backgrounds traditionally underrepresented in computational biology.

  12. Computer Model Predicts the Movement of Dust

    NASA Technical Reports Server (NTRS)

    2002-01-01

A new computer model of the atmosphere can now actually pinpoint where global dust events come from, and can project where they're going. The model may help scientists better evaluate the impact of dust on human health, climate, ocean carbon cycles, ecosystems, and atmospheric chemistry. Also, by seeing where dust originates and where it blows, people with respiratory problems can get advance warning of approaching dust clouds. 'The model is physically more realistic than previous ones,' said Mian Chin, a co-author of the study and an Earth and atmospheric scientist at Georgia Tech and the Goddard Space Flight Center (GSFC) in Greenbelt, Md. 'It is able to reproduce the short term day-to-day variations and long term inter-annual variations of dust concentrations and distributions that are measured from field experiments and observed from satellites.' The above images show both aerosols measured from space (left) and the movement of aerosols predicted by computer model for the same date (right). For more information, read New Computer Model Tracks and Predicts Paths Of Earth's Dust. Images courtesy Paul Giroux, Georgia Tech/NASA Goddard Space Flight Center.

  13. The United States Air Force and the Culture of Innovation, 1945-1965

    DTIC Science & Technology

    2002-01-01

US Dept of Transportation; typically they hover between 85 and 95 percent. 16. Kent C. Redmond and Thomas M. Smith, Project Whirlwind: A Case Histo...Washington, D.C.: AF Hist and Museums Prog, 1994). 14. Thomas A. Sturm, The USAF SAB: Its First Twenty Years 1944–1964 (Washington, D.C.: USAF...allegations at Ramo-Wooldridge and the Air Force's approach. Schriever answered them in a letter to Lt. Gen. Thomas Power, the commander of ARDC, in

  14. Role of Human Polyomavirus Bkv in Prostate Cancer

    DTIC Science & Technology

    2007-12-01

D. L. Walker. 1976. New human papovaviruses. Prog Med Virol. 22:1-35. 59. Palapattu, G. S., S. Sutcliffe, P. J. Bastian, E. A. Platz, A. M. De Marzo...J. Imperiale. 2004. Detection and expression of human BK virus sequences in neoplastic prostate tissues. Oncogene 23:7031-7046. 4. De Marzo, A. M...virus sequences in neoplastic prostate tissues. Oncogene 23:7031-7046. 41 18. De Marzo, A. M., T. L. DeWeese, E. A. Platz, A. K. Meeker, M. Nakayama

  15. Correlation between average melting temperature and glass transition temperature in metallic glasses

    NASA Astrophysics Data System (ADS)

    Lu, Zhibin; Li, Jiangong

    2009-02-01

    The correlation between average melting temperature (⟨Tm⟩) and glass transition temperature (Tg) in metallic glasses (MGs) is analyzed. A linear relationship, Tg=0.385⟨Tm⟩, is observed. This correlation agrees with Egami's suggestion [Rep. Prog. Phys. 47, 1601 (1984)]. The prediction of Tg from ⟨Tm⟩ through the relationship Tg=0.385⟨Tm⟩ has been tested using experimental data obtained on a large number of MGs. This relationship can be used to predict and design MGs with a desired Tg.
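The linear relation reported in this abstract, Tg = 0.385⟨Tm⟩, can be sketched in a few lines of Python. This is a minimal illustration, not the authors' code; the composition-weighted definition of ⟨Tm⟩ and the example alloy fractions and melting points below are assumptions chosen only to demonstrate the arithmetic.

```python
def average_melting_temp(fractions, melting_points):
    """Composition-weighted average melting temperature: <Tm> = sum(x_i * Tm_i).
    `fractions` are atomic fractions summing to 1; `melting_points` in kelvin."""
    return sum(x * tm for x, tm in zip(fractions, melting_points))

def predict_tg(avg_melting_temp_k):
    """Predicted glass transition temperature (K) via Tg = 0.385 * <Tm>."""
    return 0.385 * avg_melting_temp_k

# Hypothetical binary alloy: 60% of a component melting at 1358 K,
# 40% of a component melting at 693 K (illustrative values only).
tm_avg = average_melting_temp([0.6, 0.4], [1358.0, 693.0])
print(predict_tg(tm_avg))
```

Under these assumed inputs ⟨Tm⟩ is about 1092 K, giving a predicted Tg of roughly 420 K; the abstract's point is that this single proportionality constant was tested against experimental data for a large number of metallic glasses.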

  16. Spontaneous Transitional Cell Carcinoma in the Urinary Bladder of a Strain 13 Guinea Pig.

    DTIC Science & Technology

    1985-05-01

less than 1% of all canine neoplasms. In the feline, the extremely low incidence of bladder tumors observed may well be due to a difference in...factor of age. Prog Exp Tumor Res 1967;9:261-85. 12. Ediger R D, Rabstein M M. Spontaneous leukemia in a Hartley strain guinea pig...Intracisternal virus-like particles in two guinea pig mammary adenocarcinomas. Lab Anim Sci 1976;26:607-9. 17. Yoshida A, Iqbal Z M, Epstein. S S

  17. Development of the Wake Behind a Circular Cylinder Impulsively Started into Rotatory and Rectilinear Motion: Intermediate Rotation Rates

    DTIC Science & Technology

    1991-01-01

cylindre fixe ou en rotation. Effet Magnus. J. Mec. 14, 109-134. Taneda, S. 1977 Visual study of unsteady separated flows around bodies. Prog. Aero...enhancement schemes employing the Magnus effect (Swanson 1961). Rotating all or part of a body may also have applications in active or feedback control of...and yt into the governing equations in the generalized coordinate system. In this study, the body-fitted grid is simply one of cylindrical polar

  18. Adapting Biofilter Processes to Treat Spray Painting Exhausts: Concentration and Leveling of Vapor Delivery Rates, and Enhancement of Destruction by Exhaust Recirculation

    DTIC Science & Technology

    2001-12-20

1992. "Consider biofiltration for decontaminating gases," Chem. Eng. Prog. 88(4):34–40. Brunner, W., D. Staub, and T. Leisinger. 1980...Biotreatment processes, such as biofiltration, are environmentally friendly, and produce only non-hazardous by-products such as water, inorganic salts, and...biological air treatment system is biofiltration. Biofiltration is a process that utilizes microorganisms immobilized in the form of a biofilm layer on

  19. Biostable Agonists that Match or Exceed Activity of Native Insect Kinins on Recombinant Arthropod GPCRs

    DTIC Science & Technology

    2009-01-01

    with a modified FastMoc 0.25 procedure using an Fmoc-strategy starting from Rink amide resin (Novabiochem, San Diego, CA, 0.5 mM/g). The Fmoc protecting...In vitro release of amylase by culekinins in two insects: Opsinia arenosella (Lepidoptera) and Rhynchophorus ferrugineus (Coleoptera). Trends Life...Drosophila melanogaster and the honey bee Apis mellifera. Prog. Neurobiol. 80, 1–19. Holman, G.M., Nachman, R.J., Wright, M.S., 1990. Insect

  20. CG-MS/MS Analyses of Biological Samples in Support of Developmental Toxic Effects on Whole-Body Exposure of Rats to GB

    DTIC Science & Technology

    2015-03-01

Sensitivity to Organophosphorous Anticholinesterase Compounds. Prog. Neurobiol. 1987, 28, 97–129. Shih, T-M.; Penetar, D.M.; McDonough, J.H. Jr.; Romano...J.A.; King, J.M. Age-related Differences in Soman Toxicity and in Blood and Brain Regional Cholinesterase Activity. Brain Res. Bull. 1990, 24...Organophosphates. Toxicol. Appl. Pharmacol. 2004, 198, 132–151. Sterri, S.H.; Berge, G.; Fonnum, F. Esterase Activities and Soman Toxicity in

  1. Salmonella typhimurium gyrA mutations associated with fluoroquinolone resistance.

    PubMed Central

    Reyna, F; Huesca, M; González, V; Fuchs, L Y

    1995-01-01

    Spontaneous quinolone-resistant mutants obtained from Salmonella typhimurium Su694 were screened for mutations by direct DNA sequencing of an amplified PCR gyrA fragment. Substitutions Ser-83-->Phe (Ser83Phe), Ser83Tyr, Asp87Tyr, and Asp87Asn and double mutation Ala67Pro-Gly81Ser, which resulted in decreased sensitivities to ciprofloxacin, enoxacin, pefloxacin, norfloxacin, ofloxacin, and nalidixic acid, were found. The levels of resistance to quinolones for each mutant were determined. PMID:7492118

  2. Automating CapCom: Pragmatic Operations and Technology Research for Human Exploration of Mars

    NASA Technical Reports Server (NTRS)

    Clancey, William J.

    2003-01-01

During the Apollo program, NASA and the scientific community used terrestrial analog sites for understanding planetary features and for training astronauts to be scientists. More recently, computer scientists and human factors specialists have followed geologists and biologists into the field, learning how science is actually done on expeditions in extreme environments. Research stations have been constructed by the Mars Society in the Arctic and American southwest, providing facilities for hundreds of researchers to investigate how small crews might live and work on Mars. Combining these interests-science, operations, and technology-in Mars analog field expeditions provides tremendous synergy and authenticity to speculations about Mars missions. By relating historical analyses of Apollo and field science, engineers are creating experimental prototypes that provide significant new capabilities, such as a computer system that automates some of the functions of Apollo's CapCom. Thus, analog studies have created a community of practice-a new collaboration between scientists and engineers-so that technology begins with real human needs and works incrementally towards the challenges of the human exploration of Mars.

  3. A systematic identification and analysis of scientists on Twitter.

    PubMed

    Ke, Qing; Ahn, Yong-Yeol; Sugimoto, Cassidy R

    2017-01-01

    Metrics derived from Twitter and other social media-often referred to as altmetrics-are increasingly used to estimate the broader social impacts of scholarship. Such efforts, however, may produce highly misleading results, as the entities that participate in conversations about science on these platforms are largely unknown. For instance, if altmetric activities are generated mainly by scientists, does it really capture broader social impacts of science? Here we present a systematic approach to identifying and analyzing scientists on Twitter. Our method can identify scientists across many disciplines, without relying on external bibliographic data, and be easily adapted to identify other stakeholder groups in science. We investigate the demographics, sharing behaviors, and interconnectivity of the identified scientists. We find that Twitter has been employed by scholars across the disciplinary spectrum, with an over-representation of social and computer and information scientists; under-representation of mathematical, physical, and life scientists; and a better representation of women compared to scholarly publishing. Analysis of the sharing of URLs reveals a distinct imprint of scholarly sites, yet only a small fraction of shared URLs are science-related. We find an assortative mixing with respect to disciplines in the networks between scientists, suggesting the maintenance of disciplinary walls in social media. Our work contributes to the literature both methodologically and conceptually-we provide new methods for disambiguating and identifying particular actors on social media and describing the behaviors of scientists, thus providing foundational information for the construction and use of indicators on the basis of social media metrics.

  4. A systematic identification and analysis of scientists on Twitter

    PubMed Central

    Ke, Qing; Ahn, Yong-Yeol; Sugimoto, Cassidy R.

    2017-01-01

    Metrics derived from Twitter and other social media—often referred to as altmetrics—are increasingly used to estimate the broader social impacts of scholarship. Such efforts, however, may produce highly misleading results, as the entities that participate in conversations about science on these platforms are largely unknown. For instance, if altmetric activities are generated mainly by scientists, does it really capture broader social impacts of science? Here we present a systematic approach to identifying and analyzing scientists on Twitter. Our method can identify scientists across many disciplines, without relying on external bibliographic data, and be easily adapted to identify other stakeholder groups in science. We investigate the demographics, sharing behaviors, and interconnectivity of the identified scientists. We find that Twitter has been employed by scholars across the disciplinary spectrum, with an over-representation of social and computer and information scientists; under-representation of mathematical, physical, and life scientists; and a better representation of women compared to scholarly publishing. Analysis of the sharing of URLs reveals a distinct imprint of scholarly sites, yet only a small fraction of shared URLs are science-related. We find an assortative mixing with respect to disciplines in the networks between scientists, suggesting the maintenance of disciplinary walls in social media. Our work contributes to the literature both methodologically and conceptually—we provide new methods for disambiguating and identifying particular actors on social media and describing the behaviors of scientists, thus providing foundational information for the construction and use of indicators on the basis of social media metrics. PMID:28399145

  5. Understanding the Performance and Potential of Cloud Computing for Scientific Applications

    DOE PAGES

    Sadooghi, Iman; Martin, Jesus Hernandez; Li, Tonglin; ...

    2015-02-19

Commercial clouds bring a great opportunity to the scientific computing area. Scientific applications usually require significant resources, yet not all scientists have access to sufficiently high-end computing systems, many of which can be found in the Top500 list. Cloud computing has gained the attention of scientists as a competitive resource to run HPC applications at a potentially lower cost. But as a different infrastructure, it is unclear whether clouds are capable of running scientific applications with a reasonable performance per money spent. This work studies the performance of public clouds and places this performance in context to price. We evaluate the raw performance of different services of the AWS cloud in terms of the basic resources, such as compute, memory, network and I/O. We also evaluate the performance of scientific applications running in the cloud. This paper aims to assess the ability of the cloud to perform well, as well as to evaluate the cost of running scientific applications in the cloud. We developed a full set of metrics and conducted a comprehensive performance evaluation over the Amazon cloud. We evaluated EC2, S3, EBS and DynamoDB among the many Amazon AWS services. We evaluated the memory sub-system performance with CacheBench, the network performance with iperf, processor and network performance with the HPL benchmark application, and shared storage with NFS and PVFS in addition to S3. We also evaluated a real scientific computing application through the Swift parallel scripting system at scale. Armed with both detailed benchmarks to gauge expected performance and a detailed monetary cost analysis, we expect this paper will be a recipe cookbook for scientists to help them decide where to deploy and run their scientific applications between public clouds, private clouds, or hybrid clouds.

  6. Understanding the Performance and Potential of Cloud Computing for Scientific Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sadooghi, Iman; Martin, Jesus Hernandez; Li, Tonglin

Commercial clouds present a great opportunity for the scientific computing area. Scientific applications usually require significant resources; however, not all scientists have access to sufficient high-end computing systems, many of which can be found in the Top500 list. Cloud computing has gained the attention of scientists as a competitive resource to run HPC applications at a potentially lower cost. But as a different infrastructure, it is unclear whether clouds are capable of running scientific applications with reasonable performance per money spent. This work studies the performance of public clouds and places this performance in context to price. We evaluate the raw performance of different services of the AWS cloud in terms of the basic resources, such as compute, memory, network, and I/O. We also evaluate the performance of scientific applications running in the cloud. This paper aims to assess the ability of the cloud to perform well, as well as to evaluate the cost of running scientific applications in the cloud. We developed a full set of metrics and conducted a comprehensive performance evaluation over the Amazon cloud. We evaluated EC2, S3, EBS, and DynamoDB among the many Amazon AWS services. We evaluated the memory sub-system performance with CacheBench, the network performance with iperf, processor and network performance with the HPL benchmark application, and shared storage with NFS and PVFS in addition to S3. We also evaluated a real scientific computing application through the Swift parallel scripting system at scale. Armed with both detailed benchmarks to gauge expected performance and a detailed monetary cost analysis, we expect this paper will be a recipe cookbook for scientists to help them decide where to deploy and run their scientific applications: public clouds, private clouds, or hybrid clouds.

  7. Construction of an advanced software tool for planetary atmospheric modeling

    NASA Technical Reports Server (NTRS)

    Friedland, Peter; Keller, Richard M.; Mckay, Christopher P.; Sims, Michael H.; Thompson, David E.

    1993-01-01

    Scientific model-building can be a time intensive and painstaking process, often involving the development of large complex computer programs. Despite the effort involved, scientific models cannot be distributed easily and shared with other scientists. In general, implemented scientific models are complicated, idiosyncratic, and difficult for anyone but the original scientist/programmer to understand. We propose to construct a scientific modeling software tool that serves as an aid to the scientist in developing, using and sharing models. The proposed tool will include an interactive intelligent graphical interface and a high-level domain-specific modeling language. As a testbed for this research, we propose to develop a software prototype in the domain of planetary atmospheric modeling.

  8. HERCULES: A Pattern Driven Code Transformation System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kartsaklis, Christos; Hernandez, Oscar R; Hsu, Chung-Hsing

    2012-01-01

New parallel computers are emerging, but developing efficient scientific code for them remains difficult. A scientist must manage not only the science-domain complexity but also the performance-optimization complexity. HERCULES is a code transformation system designed to help the scientist separate the two concerns, which improves code maintenance and facilitates performance optimization. The system combines three technologies, code patterns, transformation scripts, and compiler plugins, to provide the scientist with an environment to quickly implement code transformations that suit their needs. Unlike existing code optimization tools, HERCULES is unique in its focus on user-level accessibility. In this paper we discuss the design, implementation, and an initial evaluation of HERCULES.
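    The pattern-plus-script idea the abstract describes can be illustrated in miniature: a "pattern" matches a source construct and a "script" rewrites it. HERCULES itself operates through compiler plugins; the regex-based toy below only mirrors the concept, and the loop/`memcpy` rewrite is an invented example:

    ```python
    # Toy illustration of pattern-driven code transformation: a pattern matches
    # a source construct, and a rewrite script replaces it. This is a conceptual
    # sketch, not HERCULES's actual compiler-level mechanism.
    import re

    def transform(source: str, pattern: str, script: str) -> str:
        """Apply one rewrite rule (pattern -> script) to the source text."""
        return re.sub(pattern, script, source)

    code = "for (i = 0; i < n; i++) a[i] = b[i];"
    optimized = transform(
        code,
        r"for \(i = 0; i < n; i\+\+\) (\w+)\[i\] = (\w+)\[i\];",  # copy-loop pattern
        r"memcpy(\1, \2, n * sizeof(*\1));",                      # rewrite script
    )
    print(optimized)  # → memcpy(a, b, n * sizeof(*a));
    ```

    The separation of concerns is the point: the science code stays readable, while optimization knowledge lives in the reusable pattern/script pair.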

  9. Construction of an advanced software tool for planetary atmospheric modeling

    NASA Technical Reports Server (NTRS)

    Friedland, Peter; Keller, Richard M.; Mckay, Christopher P.; Sims, Michael H.; Thompson, David E.

    1992-01-01

    Scientific model-building can be a time intensive and painstaking process, often involving the development of large complex computer programs. Despite the effort involved, scientific models cannot be distributed easily and shared with other scientists. In general, implemented scientific models are complicated, idiosyncratic, and difficult for anyone but the original scientist/programmer to understand. We propose to construct a scientific modeling software tool that serves as an aid to the scientist in developing, using and sharing models. The proposed tool will include an interactive intelligent graphical interface and a high-level domain-specific modeling language. As a test bed for this research, we propose to develop a software prototype in the domain of planetary atmospheric modeling.

  10. A visiting scientist program in atmospheric sciences for the Goddard Space Flight Center

    NASA Technical Reports Server (NTRS)

    Davis, M. H.

    1989-01-01

    A visiting scientist program was conducted in the atmospheric sciences and related areas at the Goddard Laboratory for Atmospheres. Research was performed in mathematical analysis as applied to computer modeling of the atmospheres; development of atmospheric modeling programs; analysis of remotely sensed atmospheric, surface, and oceanic data and its incorporation into atmospheric models; development of advanced remote sensing instrumentation; and related research areas. The specific research efforts are detailed by tasks.

  11. Global Land Information System (GLIS)

    USGS Publications Warehouse

    ,

    1992-01-01

    The Global Land Information System (GLIS) is an interactive computer system developed by the U.S. Geological Survey (USGS) for scientists seeking sources of information about the Earth's land surfaces. GLIS contains "metadata," that is, descriptive information about data sets. Through GLIS, scientists can evaluate data sets, determine their availability, and place online requests for products. GLIS is more, however, than a mere list of products. It offers online samples of earth science data that may be ordered through the system.

  12. UNH Data Cooperative: A Cyber Infrastructure for Earth System Studies

    NASA Astrophysics Data System (ADS)

    Braswell, B. H.; Fekete, B. M.; Prusevich, A.; Gliden, S.; Magill, A.; Vorosmarty, C. J.

    2007-12-01

Earth system scientists and managers have a continuously growing demand for a wide array of earth observations derived from various data sources including (a) modern satellite retrievals, (b) "in-situ" records, (c) various simulation outputs, and (d) assimilated data products combining model results with observational records. The sheer quantity of data and formatting inconsistencies make it difficult for users to take full advantage of this important information resource. Thus the system could benefit from a thorough retooling of our current data processing procedures and infrastructure. Emerging technologies like OPeNDAP and OGC map services, open standard data formats (NetCDF, HDF), and data cataloging systems (NASA-Echo, Global Change Master Directory, etc.) are providing the basis for a new approach to data management and processing, where web services are increasingly designed to serve computer-to-computer communications without human interaction and complex analysis can be carried out over distributed computer resources interconnected via cyberinfrastructure. The UNH Earth System Data Collaborative is designed to utilize these emerging web technologies to offer new means of access to earth system data. While the UNH Data Collaborative serves a wide array of data, ranging from weather station data (Climate Portal) to ocean buoy records and ship tracks (Portsmouth Harbor Initiative) to land cover characteristics, the underlying data architecture shares common components for data mining and data dissemination via web services. Perhaps the most unique element of the UNH Data Cooperative's IT infrastructure is its prototype modeling environment for regional ecosystem surveillance over the Northeast corridor, which allows the integration of complex earth system model components with the Cooperative's data services.
While the complexity of the IT infrastructure needed to perform complex computations is continuously increasing, scientists are often forced to spend a considerable amount of time solving basic data management and preprocessing tasks and dealing with low-level computational design problems such as the parallelization of model codes. Our modeling infrastructure is designed to take care of the bulk of the common tasks found in complex earth system models, such as I/O handling, computational domain and time management, and parallel execution of the modeling tasks. The modeling infrastructure allows scientists to focus on the numerical implementation of the physical processes on single computational objects (typically grid cells) while the framework takes care of the preprocessing of input data, the establishment of data exchange between computational objects, and the execution of the science code. In our presentation, we will discuss the key concepts of our modeling infrastructure. We will demonstrate the integration of our modeling framework with data services offered by the UNH Earth System Data Collaborative via web interfaces. We will lay out the road map to turn our prototype modeling environment into a true community framework for a wide range of earth system scientists and environmental managers.
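    The division of labor the abstract describes, scientist-written numerics per grid cell while the framework owns the domain loop and parallel execution, can be sketched as a toy. This is an illustrative skeleton, not UNH's actual modeling infrastructure:

    ```python
    # Conceptual sketch: the scientist supplies only a per-cell update function;
    # a minimal "framework" handles iterating the domain over timesteps and
    # running the updates in parallel. Not the UNH framework itself.
    from concurrent.futures import ThreadPoolExecutor

    def run_model(cells, cell_update, timesteps, workers=2):
        """Framework side: advance every cell through the given timesteps."""
        with ThreadPoolExecutor(max_workers=workers) as pool:
            for _ in range(timesteps):
                cells = list(pool.map(cell_update, cells))
        return cells

    # Science side: a trivial decay process acting on a single grid cell.
    def decay(value):
        return value * 0.5

    print(run_model([8.0, 4.0], decay, timesteps=2))  # → [2.0, 1.0]
    ```

    The science code never touches threading or domain bookkeeping, which is the separation the abstract argues for.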

  13. The iPlant Collaborative: Cyberinfrastructure for Plant Biology.

    PubMed

    Goff, Stephen A; Vaughn, Matthew; McKay, Sheldon; Lyons, Eric; Stapleton, Ann E; Gessler, Damian; Matasci, Naim; Wang, Liya; Hanlon, Matthew; Lenards, Andrew; Muir, Andy; Merchant, Nirav; Lowry, Sonya; Mock, Stephen; Helmke, Matthew; Kubach, Adam; Narro, Martha; Hopkins, Nicole; Micklos, David; Hilgert, Uwe; Gonzales, Michael; Jordan, Chris; Skidmore, Edwin; Dooley, Rion; Cazes, John; McLay, Robert; Lu, Zhenyuan; Pasternak, Shiran; Koesterke, Lars; Piel, William H; Grene, Ruth; Noutsos, Christos; Gendler, Karla; Feng, Xin; Tang, Chunlao; Lent, Monica; Kim, Seung-Jin; Kvilekval, Kristian; Manjunath, B S; Tannen, Val; Stamatakis, Alexandros; Sanderson, Michael; Welch, Stephen M; Cranston, Karen A; Soltis, Pamela; Soltis, Doug; O'Meara, Brian; Ane, Cecile; Brutnell, Tom; Kleibenstein, Daniel J; White, Jeffery W; Leebens-Mack, James; Donoghue, Michael J; Spalding, Edgar P; Vision, Todd J; Myers, Christopher R; Lowenthal, David; Enquist, Brian J; Boyle, Brad; Akoglu, Ali; Andrews, Greg; Ram, Sudha; Ware, Doreen; Stein, Lincoln; Stanzione, Dan

    2011-01-01

    The iPlant Collaborative (iPlant) is a United States National Science Foundation (NSF) funded project that aims to create an innovative, comprehensive, and foundational cyberinfrastructure in support of plant biology research (PSCIC, 2006). iPlant is developing cyberinfrastructure that uniquely enables scientists throughout the diverse fields that comprise plant biology to address Grand Challenges in new ways, to stimulate and facilitate cross-disciplinary research, to promote biology and computer science research interactions, and to train the next generation of scientists on the use of cyberinfrastructure in research and education. Meeting humanity's projected demands for agricultural and forest products and the expectation that natural ecosystems be managed sustainably will require synergies from the application of information technologies. The iPlant cyberinfrastructure design is based on an unprecedented period of research community input, and leverages developments in high-performance computing, data storage, and cyberinfrastructure for the physical sciences. iPlant is an open-source project with application programming interfaces that allow the community to extend the infrastructure to meet its needs. iPlant is sponsoring community-driven workshops addressing specific scientific questions via analysis tool integration and hypothesis testing. These workshops teach researchers how to add bioinformatics tools and/or datasets into the iPlant cyberinfrastructure enabling plant scientists to perform complex analyses on large datasets without the need to master the command-line or high-performance computational services.

  14. The iPlant Collaborative: Cyberinfrastructure for Plant Biology

    PubMed Central

    Goff, Stephen A.; Vaughn, Matthew; McKay, Sheldon; Lyons, Eric; Stapleton, Ann E.; Gessler, Damian; Matasci, Naim; Wang, Liya; Hanlon, Matthew; Lenards, Andrew; Muir, Andy; Merchant, Nirav; Lowry, Sonya; Mock, Stephen; Helmke, Matthew; Kubach, Adam; Narro, Martha; Hopkins, Nicole; Micklos, David; Hilgert, Uwe; Gonzales, Michael; Jordan, Chris; Skidmore, Edwin; Dooley, Rion; Cazes, John; McLay, Robert; Lu, Zhenyuan; Pasternak, Shiran; Koesterke, Lars; Piel, William H.; Grene, Ruth; Noutsos, Christos; Gendler, Karla; Feng, Xin; Tang, Chunlao; Lent, Monica; Kim, Seung-Jin; Kvilekval, Kristian; Manjunath, B. S.; Tannen, Val; Stamatakis, Alexandros; Sanderson, Michael; Welch, Stephen M.; Cranston, Karen A.; Soltis, Pamela; Soltis, Doug; O'Meara, Brian; Ane, Cecile; Brutnell, Tom; Kleibenstein, Daniel J.; White, Jeffery W.; Leebens-Mack, James; Donoghue, Michael J.; Spalding, Edgar P.; Vision, Todd J.; Myers, Christopher R.; Lowenthal, David; Enquist, Brian J.; Boyle, Brad; Akoglu, Ali; Andrews, Greg; Ram, Sudha; Ware, Doreen; Stein, Lincoln; Stanzione, Dan

    2011-01-01

    The iPlant Collaborative (iPlant) is a United States National Science Foundation (NSF) funded project that aims to create an innovative, comprehensive, and foundational cyberinfrastructure in support of plant biology research (PSCIC, 2006). iPlant is developing cyberinfrastructure that uniquely enables scientists throughout the diverse fields that comprise plant biology to address Grand Challenges in new ways, to stimulate and facilitate cross-disciplinary research, to promote biology and computer science research interactions, and to train the next generation of scientists on the use of cyberinfrastructure in research and education. Meeting humanity's projected demands for agricultural and forest products and the expectation that natural ecosystems be managed sustainably will require synergies from the application of information technologies. The iPlant cyberinfrastructure design is based on an unprecedented period of research community input, and leverages developments in high-performance computing, data storage, and cyberinfrastructure for the physical sciences. iPlant is an open-source project with application programming interfaces that allow the community to extend the infrastructure to meet its needs. iPlant is sponsoring community-driven workshops addressing specific scientific questions via analysis tool integration and hypothesis testing. These workshops teach researchers how to add bioinformatics tools and/or datasets into the iPlant cyberinfrastructure enabling plant scientists to perform complex analyses on large datasets without the need to master the command-line or high-performance computational services. PMID:22645531

  15. ArrayBridge: Interweaving declarative array processing with high-performance computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xing, Haoyuan; Floratos, Sofoklis; Blanas, Spyros

Scientists are increasingly turning to datacenter-scale computers to produce and analyze massive arrays. Despite decades of database research that extols the virtues of declarative query processing, scientists still write, debug and parallelize imperative HPC kernels even for the most mundane queries. This impedance mismatch has been partly attributed to the cumbersome data loading process; in response, the database community has proposed in situ mechanisms to access data in scientific file formats. Scientists, however, desire more than a passive access method that reads arrays from files. This paper describes ArrayBridge, a bi-directional array view mechanism for scientific file formats, that aims to make declarative array manipulations interoperable with imperative file-centric analyses. Our prototype implementation of ArrayBridge uses HDF5 as the underlying array storage library and seamlessly integrates into the SciDB open-source array database system. In addition to fast querying over external array objects, ArrayBridge produces arrays in the HDF5 file format just as easily as it can read from it. ArrayBridge also supports time travel queries from imperative kernels through the unmodified HDF5 API, and automatically deduplicates between array versions for space efficiency. Our extensive performance evaluation in NERSC, a large-scale scientific computing facility, shows that ArrayBridge exhibits performance and I/O scalability statistically indistinguishable from those of the native SciDB storage engine.
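    The core idea of a bi-directional array view, one interface through which the same on-disk array is both written by imperative tools and read for declarative-style manipulation, can be sketched minimally. A raw binary file stands in for HDF5 here, and the `ArrayView` class is an invented illustration, not ArrayBridge's API:

    ```python
    # Minimal sketch of a bi-directional array view: the same on-disk array is
    # written by "file-centric" code and read back for in-memory manipulation.
    # A flat binary file stands in for HDF5; this is not ArrayBridge itself.
    import array
    import os
    import tempfile

    class ArrayView:
        def __init__(self, path, typecode="d"):
            self.path = path
            self.typecode = typecode  # "d" = 8-byte float

        def write(self, values):
            """Imperative side: materialize an array into the file format."""
            with open(self.path, "wb") as f:
                array.array(self.typecode, values).tofile(f)

        def read(self):
            """Declarative side: load the external array for querying."""
            a = array.array(self.typecode)
            with open(self.path, "rb") as f:
                a.fromfile(f, os.path.getsize(self.path) // a.itemsize)
            return list(a)

    path = os.path.join(tempfile.mkdtemp(), "demo.bin")
    view = ArrayView(path)
    view.write([1.0, 2.0, 3.0])
    print(view.read())  # → [1.0, 2.0, 3.0]
    ```

    The round trip through one object is the point: neither side needs to know whether the other is a database engine or an HPC kernel.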

  16. Goddard Visiting Scientist Program

    NASA Technical Reports Server (NTRS)

    2000-01-01

Under this Indefinite Delivery Indefinite Quantity (IDIQ) contract, USRA was expected to provide short term (from 1 day up to 1 year) personnel as required to provide a Visiting Scientists Program to support the Earth Sciences Directorate (Code 900) at the Goddard Space Flight Center. The Contractor was to have a pool, or have access to a pool, of scientific talent, both domestic and international, at all levels (graduate student to senior scientist), that would support the technical requirements of the following laboratories and divisions within Code 900: 1) Global Change Data Center (902); 2) Laboratory for Atmospheres (Code 910); 3) Laboratory for Terrestrial Physics (Code 920); 4) Space Data and Computing Division (Code 930); 5) Laboratory for Hydrospheric Processes (Code 970). The research activities described below for each organization within Code 900 were intended to comprise the general scope of effort covered under the Visiting Scientist Program.

  17. The technical communication practices of Russian and U.S. aerospace engineers and scientists

    NASA Technical Reports Server (NTRS)

    Pinelli, Thomas E.; Barclay, Rebecca O.; Keene, Michael L.; Flammia, Madelyn; Kennedy, John M.

    1993-01-01

As part of Phase 4 of the NASA/DoD Aerospace Knowledge Diffusion Research Project, two studies were conducted that investigated the technical communication practices of Russian and U.S. aerospace engineers and scientists. Both studies had the same five objectives: first, to solicit the opinions of aerospace engineers and scientists regarding the importance of technical communication to their professions; second, to determine the use and production of technical communication by aerospace engineers and scientists; third, to seek their views about the appropriate content of the undergraduate course in technical communication; fourth, to determine aerospace engineers' and scientists' use of libraries, technical information centers, and on-line databases; and fifth, to determine the use and importance of computer and information technology to them. A self-administered questionnaire was distributed to Russian aerospace engineers and scientists at the Central Aero-Hydrodynamic Institute (TsAGI) and to their U.S. counterparts at the NASA Ames Research Center and the NASA Langley Research Center. The completion rates for the Russian and U.S. surveys were 64 and 61 percent, respectively. Responses of the Russian and U.S. participants to selected questions are presented in this paper.

  18. NASA/DOD Aerospace Knowledge Diffusion Research Project. Paper 16: A comparison of the technical communications practices of Russian and US aerospace engineers and scientists

    NASA Technical Reports Server (NTRS)

    Pinelli, Thomas E.; Kennedy, John M.; Barclay, Rebecca O.

    1993-01-01

    As part of Phase 4 of the NASA/DOD Aerospace Knowledge Diffusion Project, two studies were conducted that investigated the technical communications practices of Russian and U.S. aerospace engineers and scientists. Both studies have the same five objectives: first, to solicit the opinions of aerospace engineers and scientists regarding the importance of technical communications to their profession; second, to determine the use and production of technical communications by aerospace engineers and scientists; third, to seek their views about the appropriate content of an undergraduate course in technical communications; fourth, to determine aerospace engineers' and scientists' use of libraries, technical information centers, and on-line data bases; and fifth, to determine the use and importance of computer and information technology to them. A self-administered questionnaire was distributed to aerospace engineers and scientists at the Central Aero-Hydrodynamic Institute (TsAGI), NASA ARC, and NASA LaRC. The completion rates for the Russian and U.S. surveys were 64 and 61 percent, respectively. The responses of the Russian and U.S. participants, to selected questions, are presented in this report.

  19. NASA/DOD Aerospace Knowledge Diffusion Research Project. Paper 28: The technical communication practices of Russian and US aerospace engineers and scientists

    NASA Technical Reports Server (NTRS)

    Pinelli, Thomas E.; Barclay, Rebecca O.; Keene, Michael L.; Flammia, Madelyn; Kennedy, John M.

    1993-01-01

As part of Phase 4 of the NASA/DoD Aerospace Knowledge Diffusion Research Project, two studies were conducted that investigated the technical communication practices of Russian and U.S. aerospace engineers and scientists. Both studies had the same five objectives: first, to solicit the opinions of aerospace engineers and scientists regarding the importance of technical communication to their professions; second, to determine the use and production of technical communication by aerospace engineers and scientists; third, to seek their views about the appropriate content of the undergraduate course in technical communication; fourth, to determine aerospace engineers' and scientists' use of libraries, technical information centers, and on-line databases; and fifth, to determine the use and importance of computer and information technology to them. A self-administered questionnaire was distributed to Russian aerospace engineers and scientists at the Central Aero-Hydrodynamic Institute (TsAGI) and to their U.S. counterparts at the NASA Ames Research Center and the NASA Langley Research Center. The completion rates for the Russian and U.S. surveys were 64 and 61 percent, respectively. Responses of the Russian and U.S. participants to selected questions are presented in this paper.

  20. LHC Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lincoln, Don

The LHC is the world’s highest energy particle accelerator and scientists use it to record an unprecedented amount of data. This data is recorded in electronic format and requires an enormous computational infrastructure to convert the raw data into conclusions about the fundamental rules that govern matter. In this video, Fermilab’s Dr. Don Lincoln gives us a sense of just how much data is involved and the incredible computer resources that make it all possible.

  1. Computer Models of Proteins

    NASA Technical Reports Server (NTRS)

    2000-01-01

Dr. Marc Pusey (seated) and Dr. Craig Kundrot use computers to analyze x-ray maps and generate three-dimensional models of protein structures. With this information, scientists at Marshall Space Flight Center can learn how proteins are made and how they work. The computer screen depicts a protein structure as a ball-and-stick model. Other models depict the actual volume occupied by the atoms, or the ribbon-like structures that are crucial to a protein's function.

  2. Artificial-life researchers try to create social reality.

    PubMed

    Flam, F

    1994-08-12

    Some scientists, among them cosmologist Stephen Hawking, argue that computer viruses are alive. A better case might be made for many of the self-replicating silicon-based creatures featured at the fourth Conference on Artificial Life, held on 5 to 8 July in Boston. Researchers from computer science, biology, and other disciplines presented computer programs that, among other things, evolved cooperative strategies in a selfish world and recreated themselves in ever more complex forms.
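    The self-replicating programs the conference featured have a classic minimal ancestor: the quine, a program whose output is exactly its own source. The standard two-line Python version below is a generic illustration, not one of the conference entries:

    ```python
    # The two lines below print themselves verbatim (this comment excluded):
    # the minimal form of self-replication in software.
    s = 's = %r\nprint(s %% s)'
    print(s % s)
    ```

    Substituting the string into itself via `%r` is what closes the loop: the printed text rebuilds both the data and the code that prints it.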

  3. The Development of University Computing in Sweden 1965-1985

    NASA Astrophysics Data System (ADS)

    Dahlstrand, Ingemar

    In 1965-70 the government agency, Statskontoret, set up five university computing centers, as service bureaux financed by grants earmarked for computer use. The centers were well equipped and staffed and caused a surge in computer use. When the yearly flow of grant money stagnated at 25 million Swedish crowns, the centers had to find external income to survive and acquire time-sharing. But the charging system led to the computers not being fully used. The computer scientists lacked equipment for laboratory use. The centers were decentralized and the earmarking abolished. Eventually they got new tasks like running computers owned by the departments, and serving the university administration.

  4. Space station Simulation Computer System (SCS) study for NASA/MSFC. Volume 5: Study analysis report

    NASA Technical Reports Server (NTRS)

    1989-01-01

    The Simulation Computer System (SCS) is the computer hardware, software, and workstations that will support the Payload Training Complex (PTC) at the Marshall Space Flight Center (MSFC). The PTC will train the space station payload scientists, station scientists, and ground controllers to operate the wide variety of experiments that will be on-board the Freedom Space Station. The further analysis performed on the SCS study as part of task 2-Perform Studies and Parametric Analysis-of the SCS study contract is summarized. These analyses were performed to resolve open issues remaining after the completion of task 1, and the publishing of the SCS study issues report. The results of these studies provide inputs into SCS task 3-Develop and present SCS requirements, and SCS task 4-develop SCS conceptual designs. The purpose of these studies is to resolve the issues into usable requirements given the best available information at the time of the study. A list of all the SCS study issues is given.

  5. TomoBank: a tomographic data repository for computational x-ray science

    DOE PAGES

    De Carlo, Francesco; Gürsoy, Doğa; Ching, Daniel J.; ...

    2018-02-08

There is a widening gap between the fast advancement of computational methods for tomographic reconstruction and their successful implementation in production software at various synchrotron facilities. This is due in part to the lack of readily available instrument datasets and phantoms representative of real materials for validation and comparison of new numerical methods. Recent advancements in detector technology made sub-second and multi-energy tomographic data collection possible [1], but also increased the demand to develop new reconstruction methods able to handle in-situ [2] and dynamic systems [3] that can be quickly incorporated in beamline production software [4]. The X-ray Tomography Data Bank, tomoBank, provides a repository of experimental and simulated datasets with the aim to foster collaboration among computational scientists, beamline scientists, and experimentalists and to accelerate the development and implementation of tomographic reconstruction methods for synchrotron facility production software by providing easy access to challenging datasets and their descriptors.

  6. Automating the parallel processing of fluid and structural dynamics calculations

    NASA Technical Reports Server (NTRS)

    Arpasi, Dale J.; Cole, Gary L.

    1987-01-01

The NASA Lewis Research Center is actively involved in the development of expert system technology to assist users in applying parallel processing to computational fluid and structural dynamic analysis. The goal of this effort is to eliminate the necessity for the physical scientist to become a computer scientist in order to effectively use the computer as a research tool. Programming and operating software utilities have previously been developed to solve systems of ordinary nonlinear differential equations on parallel scalar processors. Current efforts are aimed at extending these capabilities to systems of partial differential equations that describe the complex behavior of fluids and structures within aerospace propulsion systems. This paper presents some important considerations in the redesign, in particular, the need for algorithms and software utilities that can automatically identify data flow patterns in the application program and partition and allocate calculations to the parallel processors. A library-oriented multiprocessing concept for integrating the hardware and software functions is described.

  7. Metrics and the effective computational scientist: process, quality and communication.

    PubMed

    Baldwin, Eric T

    2012-09-01

    Recent treatments of computational knowledge worker productivity have focused upon the value the discipline brings to drug discovery using positive anecdotes. While this big picture approach provides important validation of the contributions of these knowledge workers, the impact accounts do not provide the granular detail that can help individuals and teams perform better. I suggest balancing the impact-focus with quantitative measures that can inform the development of scientists. Measuring the quality of work, analyzing and improving processes, and the critical evaluation of communication can provide immediate performance feedback. The introduction of quantitative measures can complement the longer term reporting of impacts on drug discovery. These metric data can document effectiveness trends and can provide a stronger foundation for the impact dialogue. Copyright © 2012 Elsevier Ltd. All rights reserved.

  8. The Physics of Information Technology

    NASA Astrophysics Data System (ADS)

    Gershenfeld, Neil

    2000-10-01

The Physics of Information Technology explores the familiar devices that we use to collect, transform, transmit, and interact with electronic information. Many such devices operate surprisingly close to fundamental physical limits. Understanding how such devices work, and how they can (and cannot) be improved, requires deep insight into the character of physical law as well as engineering practice. The book starts with an introduction to units, forces, and the probabilistic foundations of noise and signaling, then progresses through the electromagnetics of wired and wireless communications, and the quantum mechanics of electronic, optical, and magnetic materials, to discussions of mechanisms for computation, storage, sensing, and display. This self-contained volume will help both physical scientists and computer scientists see beyond the conventional division between hardware and software to understand the implications of physical theory for information manipulation.

  9. Comparing Active Game-Playing Scores and Academic Performances of Elementary School Students

    ERIC Educational Resources Information Center

    Kert, Serhat Bahadir; Köskeroglu Büyükimdat, Meryem; Uzun, Ahmet; Çayiroglu, Beytullah

    2017-01-01

    In the educational sciences, many discussions on the use of computer games occur. Most of the scientists believe that traditional computer games are time-consuming software and that game-playing activities negatively affect students' academic performance. In this study, the accuracy of this general opinion was examined by focusing on the real…

  10. The Rise of Computing Research in East Africa: The Relationship between Funding, Capacity and Research Community in a Nascent Field

    ERIC Educational Resources Information Center

    Harsh, Matthew; Bal, Ravtosh; Wetmore, Jameson; Zachary, G. Pascal; Holden, Kerry

    2018-01-01

    The emergence of vibrant research communities of computer scientists in Kenya and Uganda has occurred in the context of neoliberal privatization, commercialization, and transnational capital flows from donors and corporations. We explore how this funding environment configures research culture and research practices, which are conceptualized as…

  11. Employing Inquiry-Based Computer Simulations and Embedded Scientist Videos to Teach Challenging Climate Change and Nature of Science Concepts

    ERIC Educational Resources Information Center

    Cohen, Edward Charles

    2013-01-01

    Design based research was utilized to investigate how students use a greenhouse effect simulation in order to derive best learning practices. During this process, students recognized the authentic scientific process involving computer simulations. The simulation used is embedded within an inquiry-based technology-mediated science curriculum known…

  12. It Starts with a Hashtag… the #365scienceselfies Project

    NASA Astrophysics Data System (ADS)

    Guertin, L. A.

    2016-12-01

    The year 2016 is the year of #365scienceselfies. The idea was initially proposed on Twitter for scientists to take one selfie a day to chronicle our everyday lives. The overarching goals include expanding the public perception of who scientists are (gender, race, physical appearance, etc.) and what scientists do outside of science (hobbies, raising a family, etc.). Because the selfie culture is popular among women, the project is hoping to showcase women scientists in a positive forum. Starting January 1, 2016, graduate students, post-doctoral researchers, and faculty members began posting selfies on Twitter and Instagram with the hashtag #365scienceselfies. Some participants are posting photos close to once a day, while others are posting selfies once a week or on an occasional basis. Photos continue to appear online of scientists doing fieldwork, lab work, computer work, exercising, eating, socializing, and more. My motivation for participating in the #365scienceselfies project is to teach others, specifically the K-12 teachers and students I work with who struggle to understand the field of Earth science, about my life as a female geoscience academic. Little did I realize how much I would learn about myself and my own life. For example, there have been many, many days when I struggled to find something to take a photo of that does not involve my computer screen. Participating in the project was effective in sparking conversations and questions among my own communities and networks, especially the non-scientists that I included in daily photos (my hairdresser, Zumba instructors, etc.). The remaining challenge will be to synthesize this year of selfies (I have been doing so on a blog http://sites.psu.edu/365scienceselfies/) and to engage an even broader community in exploration and discussion of selfies, and what we can learn about the life of a geoscientist, one photo at a time.

  13. Current trends for customized biomedical software tools.

    PubMed

    Khan, Haseeb Ahmad

    2017-01-01

    In the past, biomedical scientists were solely dependent on expensive commercial software packages for various applications. However, the advent of user-friendly programming languages and open source platforms has revolutionized the development of simple and efficient customized software tools for solving specific biomedical problems. Many of these tools are designed and developed by biomedical scientists independently or with the support of computer experts and often made freely available for the benefit of scientific community. The current trends for customized biomedical software tools are highlighted in this short review.

  14. An Interdisciplinary Approach Between Medical Informatics and Social Sciences to Transdisciplinary Requirements Engineering for an Integrated Care Setting.

    PubMed

    Vielhauer, Jan; Böckmann, Britta

    2017-01-01

    Requirements engineering of software products for elderly people faces some special challenges to ensure a maximum of user acceptance. Within the scope of a research project, a web-based platform and a mobile app are approached to enable people to live in their own home as long as possible. This paper is about a developed method of interdisciplinary requirements engineering by a team of social scientists in cooperation with computer scientists.

  15. Work on the physics of ultracold atoms in Russia

    NASA Astrophysics Data System (ADS)

    Kolachevsky, N. N.; Taichenachev, A. V.

    2018-05-01

    In December 2017, the regular All-Russian Conference 'Physics of Ultracold Atoms' was held. Several dozen Russian scientists from the country's major scientific centres, as well as a number of leading foreign scientists, took part in the Conference. The Conference topics covered a wide range of pressing problems: quantum metrology, quantum gases, matter waves, spectroscopy, quantum computing, and laser cooling. This issue of Quantum Electronics publishes the papers reported at the conference and selected for the Journal by the Organising Committee.

  16. Handbook of applied mathematics for engineers and scientists

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kurtz, M.

    1991-12-31

    This book is intended as a reference for applications of mathematics in a wide range of topics of interest to engineers and scientists. An unusual feature of this book is that it covers a large number of topics, from elementary algebra, trigonometry, and calculus to computer graphics and cybernetics. The level of mathematics covered ranges from high school through about the junior level of an engineering curriculum at a major university. Throughout, the emphasis is on applications of mathematics rather than on rigorous proofs.

  17. The Natural History of the Progression of Atrophy Secondary to Stargardt Disease (ProgStar) Studies: Design and Baseline Characteristics: ProgStar Report No. 1.

    PubMed

    Strauss, Rupert W; Ho, Alex; Muñoz, Beatriz; Cideciyan, Artur V; Sahel, José-Alain; Sunness, Janet S; Birch, David G; Bernstein, Paul S; Michaelides, Michel; Traboulsi, Elias I; Zrenner, Eberhart; Sadda, SriniVas; Ervin, Ann-Margret; West, Sheila; Scholl, Hendrik P N

    2016-04-01

    To describe the design and baseline characteristics of patients enrolled into 2 natural history studies of Stargardt disease (STGD1). Multicenter retrospective and prospective cohort studies. Three hundred sixty-five unique patients aged 6 years and older at baseline harboring disease-causing variants in the ABCA4 gene and with specified ocular lesions were enrolled from 9 centers in the United States and Europe. In the retrospective study, patients contributed medical record data from at least 2 and up to 4 visits for at least 1 examination modality: fundus autofluorescence (FAF), spectral-domain (SD) optical coherence tomography (SD OCT), and/or microperimetry (MP). The total observational period was at least 2 years and up to 5 years between single visits. Demographic and visual acuity (VA) data also were obtained. In the prospective study, eligible patients were examined at baseline using a standard protocol, with 6-month follow-up visits planned for a 2-year period for serial Early Treatment Diabetic Retinopathy Study (ETDRS) best-corrected VA, SD OCT, FAF, and MP. Design and rationale of a multicenter study to determine the progression of STGD1 in 2 large retrospective and prospective international cohorts. Detailed baseline characteristics of both cohorts are presented, including demographics, and structural and functional retinal metrics. Into the retrospective study, 251 patients (458 eyes) were enrolled; mean follow-up ± standard deviation was 3.9±1.6 years. At baseline, 36% had no or mild VA loss, and 47% of the study eyes had areas of definitely decreased autofluorescence (DDAF) with an average lesion area of 2.5±2.9 mm(2) (range, 0.02-16.03 mm(2)). Two hundred fifty-nine patients (489 eyes) were enrolled in the prospective study. At baseline, 20% had no or mild VA loss, and 64% had areas of DDAF with an average lesion area of 4.0±4.4 mm(2) (range, 0.03-24.24 mm(2)). The mean retinal sensitivity with MP was 10.8±5.0 dB. 
The ProgStar cohorts have baseline characteristics that encompass a wide range of disease severity and are expected to provide valuable data on progression based on serial quantitative measurements derived from multiple methods, which will be critical to the design of planned clinical trials. Copyright © 2016 American Academy of Ophthalmology. Published by Elsevier Inc. All rights reserved.
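The baseline lesion metrics above are reported as a mean ± standard deviation together with a range. As a minimal sketch of how such summary descriptors are computed (the lesion-area values below are invented for illustration and are not ProgStar data):

```python
import math

def describe(values):
    """Mean, sample standard deviation, and range of a list of measurements."""
    n = len(values)
    mean = sum(values) / n
    # Sample (n-1) standard deviation, as typically reported in study baselines.
    sd = math.sqrt(sum((v - mean) ** 2 for v in values) / (n - 1))
    return mean, sd, (min(values), max(values))

areas_mm2 = [0.02, 1.1, 2.4, 3.0, 5.6]   # hypothetical DDAF lesion areas
mean, sd, rng = describe(areas_mm2)
print(f"{mean:.1f}±{sd:.1f} mm² (range, {rng[0]:.2f}-{rng[1]:.2f} mm²)")
# → 2.4±2.1 mm² (range, 0.02-5.60 mm²)
```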

  18. Computational nuclear quantum many-body problem: The UNEDF project

    NASA Astrophysics Data System (ADS)

    Bogner, S.; Bulgac, A.; Carlson, J.; Engel, J.; Fann, G.; Furnstahl, R. J.; Gandolfi, S.; Hagen, G.; Horoi, M.; Johnson, C.; Kortelainen, M.; Lusk, E.; Maris, P.; Nam, H.; Navratil, P.; Nazarewicz, W.; Ng, E.; Nobre, G. P. A.; Ormand, E.; Papenbrock, T.; Pei, J.; Pieper, S. C.; Quaglioni, S.; Roche, K. J.; Sarich, J.; Schunck, N.; Sosonkina, M.; Terasaki, J.; Thompson, I.; Vary, J. P.; Wild, S. M.

    2013-10-01

    The UNEDF project was a large-scale collaborative effort that applied high-performance computing to the nuclear quantum many-body problem. The primary focus of the project was on constructing, validating, and applying an optimized nuclear energy density functional, which entailed a wide range of pioneering developments in microscopic nuclear structure and reactions, algorithms, high-performance computing, and uncertainty quantification. UNEDF demonstrated that close associations among nuclear physicists, mathematicians, and computer scientists can lead to novel physics outcomes built on algorithmic innovations and computational developments. This review showcases a wide range of UNEDF science results to illustrate this interplay.

  19. Research in progress and other activities of the Institute for Computer Applications in Science and Engineering

    NASA Technical Reports Server (NTRS)

    1993-01-01

    This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics and computer science during the period April 1, 1993 through September 30, 1993. The major categories of the current ICASE research program are: (1) applied and numerical mathematics, including numerical analysis and algorithm development; (2) theoretical and computational research in fluid mechanics in selected areas of interest to LaRC, including acoustic and combustion; (3) experimental research in transition and turbulence and aerodynamics involving LaRC facilities and scientists; and (4) computer science.

  20. Research in progress in applied mathematics, numerical analysis, fluid mechanics, and computer science

    NASA Technical Reports Server (NTRS)

    1994-01-01

    This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, fluid mechanics, and computer science during the period October 1, 1993 through March 31, 1994. The major categories of the current ICASE research program are: (1) applied and numerical mathematics, including numerical analysis and algorithm development; (2) theoretical and computational research in fluid mechanics in selected areas of interest to LaRC, including acoustics and combustion; (3) experimental research in transition and turbulence and aerodynamics involving LaRC facilities and scientists; and (4) computer science.

  1. Changing the face of science: Lessons from the 2017 Science-A-Thon

    NASA Astrophysics Data System (ADS)

    Barnes, R. T.; Licker, R.; Burt, M. A.; Holloway, T.

    2017-12-01

    Studies have shown that over two-thirds of Americans cannot name a living scientist. This disconnect is a concern for science and scientists, considering the large role of public funding for science and the importance of science in many policy issues. As a large-scale public outreach initiative and fundraiser, the Earth Science Women's Network (ESWN) launched "Science-A-Thon" on July 13, 2017. This "day of science" invited participants to share 12 photos over 12 hours of a day, including both personal routines and professional endeavors. Over 200 scientists participated, with the #DayofScience hashtag trending on Twitter for the day. Earth scientists represented the largest portion of participants, but the event also engaged cancer biologists, computer scientists, and more, including scientists from more than 10 countries. Science-A-Thon builds on the success and visibility of other social media campaigns, such as #actuallivingscientist and #DresslikeaWoman. Importantly, these efforts share a common goal: by providing diverse images of scientists, we can shift the public perception of who a scientist is and what science looks like in the real world. This type of public engagement offers a wide range of potential role models for students and individual stories to increase public engagement with science. Social media campaigns such as this shift the public perception of who scientists are, why they do what they do, and what they do each day. The conversations emerging from Science-A-Thon included scientists talking about (1) their science and motivation, (2) the purpose of and need for ESWN, and (3) why they chose to participate, all of which increased the reach of the social media campaign and fundraiser.

  2. Intern Programs

    Science.gov Websites

    Internships span science, engineering, and computing. Working with Fermilab scientists or engineers, interns may spend four quarters at Fermilab, alternating periods of full-time study at their schools with full-time work at the laboratory.

  3. A Computing Infrastructure for Supporting Climate Studies

    NASA Astrophysics Data System (ADS)

    Yang, C.; Bambacus, M.; Freeman, S. M.; Huang, Q.; Li, J.; Sun, M.; Xu, C.; Wojcik, G. S.; Cahalan, R. F.; NASA Climate @ Home Project Team

    2011-12-01

    Climate change is one of the major challenges facing the planet in the 21st century. Scientists build many models to simulate the past and predict climate change for the next decades or century. Most of the models run at low resolution, with some targeting high resolution in support of practical climate change preparedness. To calibrate and validate the models, millions of model runs are needed to find the best simulation and configuration. This paper introduces the NASA Climate@Home project, which builds a virtual supercomputer based on advanced computing technologies such as cloud computing and grid computing. The Climate@Home computing infrastructure includes several aspects: 1) a cloud computing platform is utilized to manage potential spikes in access to centralized components, such as the grid computing server that dispatches model runs and collects results; 2) a grid computing engine is developed based on MapReduce to dispatch models and model configurations and to collect simulation results and contribution statistics; 3) a portal serves as the entry point to the project, providing management, sharing, and data exploration for end users; 4) scientists can access customized tools to configure model runs and visualize model results; 5) the public can follow Twitter and Facebook for the latest news about the project. This paper will introduce the latest progress of the project and demonstrate the operational system during the AGU fall meeting. It will also discuss how this technology can become a trailblazer for other climate studies and relevant sciences, and it will share how the challenges in computation and software integration were solved.
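The MapReduce-style dispatch described in point 2) can be sketched in miniature. The following is an illustrative sketch only; the function names and the toy scoring formula are invented for this example and are not from the Climate@Home codebase. Each model configuration is "mapped" to a run, and the run results are "reduced" into a best-configuration summary with contribution statistics:

```python
from concurrent.futures import ThreadPoolExecutor
from functools import reduce

def run_model(config):
    """Map step: stand-in for one model run (hypothetical toy model)."""
    resolution, sensitivity = config
    # Toy skill score: peaks when the sensitivity parameter equals 3.0.
    score = resolution / (abs(sensitivity - 3.0) + 1.0)
    return {"config": config, "score": score, "runs": 1}

def merge(a, b):
    """Reduce step: keep the best-scoring configuration, tally total runs."""
    best = a if a["score"] >= b["score"] else b
    return {"config": best["config"], "score": best["score"],
            "runs": a["runs"] + b["runs"]}

# A small grid of (resolution, sensitivity) configurations to calibrate.
configs = [(res, sens) for res in (1, 2, 4) for sens in (2.0, 3.0, 4.5)]

with ThreadPoolExecutor() as pool:       # dispatch runs concurrently
    results = list(pool.map(run_model, configs))

summary = reduce(merge, results)         # collect results and statistics
print(summary["runs"], summary["config"])  # → 9 (4, 3.0)
```

In the real system the map step would launch a climate model run on a volunteer or grid node rather than evaluate a formula, but the dispatch/collect structure is the same.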

  4. RIACS

    NASA Technical Reports Server (NTRS)

    Moore, Robert C.

    1998-01-01

    The Research Institute for Advanced Computer Science (RIACS) was established by the Universities Space Research Association (USRA) at the NASA Ames Research Center (ARC) on June 6, 1983. RIACS is privately operated by USRA, a consortium of universities that serves as a bridge between NASA and the academic community. Under a five-year co-operative agreement with NASA, research at RIACS is focused on areas that are strategically enabling to the Ames Research Center's role as NASA's Center of Excellence for Information Technology. RIACS is chartered to carry out research and development in computer science, devoted in the main to tasks that are strategically enabling with respect to NASA's bold missions in space exploration and aeronautics. There are three foci for this work: (1) Automated Reasoning, (2) Human-Centered Computing, and (3) High Performance Computing and Networking. RIACS has the additional goal of broadening the base of researchers in these areas of importance to the nation's space and aeronautics enterprises. Through its visiting scientist program, RIACS facilitates the participation of university-based researchers, including both faculty and students, in the research activities of NASA and RIACS. RIACS researchers work in close collaboration with NASA computer scientists on projects such as the Remote Agent Experiment on the Deep Space One mission and Super-Resolution Surface Modeling.

  5. LHC Computing

    ScienceCinema

    Lincoln, Don

    2018-01-16

    The LHC is the world’s highest-energy particle accelerator, and scientists use it to record an unprecedented amount of data. This data is recorded in electronic format, and it requires an enormous computational infrastructure to convert the raw data into conclusions about the fundamental rules that govern matter. In this video, Fermilab’s Dr. Don Lincoln gives us a sense of just how much data is involved and the incredible computer resources that make it all possible.

  6. Preface: SciDAC 2005

    NASA Astrophysics Data System (ADS)

    Mezzacappa, Anthony

    2005-01-01

    On 26-30 June 2005 at the Grand Hyatt on Union Square in San Francisco several hundred computational scientists from around the world came together for what can certainly be described as a celebration of computational science. Scientists from the SciDAC Program and scientists from other agencies and nations were joined by applied mathematicians and computer scientists to highlight the many successes in the past year where computation has led to scientific discovery in a variety of fields: lattice quantum chromodynamics, accelerator modeling, chemistry, biology, materials science, Earth and climate science, astrophysics, and combustion and fusion energy science. Also highlighted were the advances in numerical methods and computer science, and the multidisciplinary collaboration cutting across science, mathematics, and computer science that enabled these discoveries. The SciDAC Program was conceived and funded by the US Department of Energy Office of Science. It is the Office of Science's premier computational science program founded on what is arguably the perfect formula: the priority and focus is science and scientific discovery, with the understanding that the full arsenal of `enabling technologies' in applied mathematics and computer science must be brought to bear if we are to have any hope of attacking and ultimately solving today's computational Grand Challenge problems. The SciDAC Program has been in existence for four years, and many of the computational scientists funded by this program will tell you that the program has given them the hope of addressing their scientific problems in full realism for the very first time. Many of these scientists will also tell you that SciDAC has also fundamentally changed the way they do computational science. We begin this volume with one of DOE's great traditions, and core missions: energy research. As we will see, computation has been seminal to the critical advances that have been made in this arena. 
Of course, to understand our world, whether it is to understand its very nature or to understand it so as to control it for practical application, will require explorations on all of its scales. Computational science has been no less an important tool in this arena than it has been in the arena of energy research. From explorations of quantum chromodynamics, the fundamental theory that describes how quarks make up the protons and neutrons of which we are composed, to explorations of the complex biomolecules that are the building blocks of life, to explorations of some of the most violent phenomena in our universe and of the Universe itself, computation has provided not only significant insight, but often the only means by which we have been able to explore these complex, multicomponent systems and by which we have been able to achieve scientific discovery and understanding. While our ultimate target remains scientific discovery, it certainly can be said that at a fundamental level the world is mathematical. Equations ultimately govern the evolution of the systems of interest to us, be they physical, chemical, or biological systems. The development and choice of discretizations of these underlying equations is often a critical deciding factor in whether or not one is able to model such systems stably, faithfully, and practically, and in turn, the algorithms to solve the resultant discrete equations are the complementary, critical ingredient in the recipe to model the natural world. The use of parallel computing platforms, especially at the TeraScale, and the trend toward even larger numbers of processors, continue to present significant challenges in the development and implementation of these algorithms. Computational scientists often speak of their `workflows'. 
A workflow, as the name suggests, is the sum total of all complex and interlocking tasks, from simulation set up, execution, and I/O, to visualization and scientific discovery, through which the advancement in our understanding of the natural world is realized. For the computational scientist, enabling such workflows presents myriad, significant challenges, and it is computer scientists that are called upon at such times to address these challenges. Simulations are currently generating data at the staggering rate of tens of TeraBytes per simulation, over the course of days. In the next few years, these data generation rates are expected to climb exponentially to hundreds of TeraBytes per simulation, performed over the course of months. The output, management, movement, analysis, and visualization of these data will be our key to unlocking the scientific discoveries buried within the data. And there is no hope of generating such data to begin with, or of scientific discovery, without stable computing platforms and a sufficiently high and sustained performance of scientific applications codes on them. Thus, scientific discovery in the realm of computational science at the TeraScale and beyond will occur at the intersection of science, applied mathematics, and computer science. The SciDAC Program was constructed to mirror this reality, and the pages that follow are a testament to the efficacy of such an approach. We would like to acknowledge the individuals on whose talents and efforts the success of SciDAC 2005 was based. 
Special thanks go to Betsy Riley for her work on the SciDAC 2005 Web site and meeting agenda, for lining up our corporate sponsors, for coordinating all media communications, and for her efforts in processing the proceedings contributions, to Sherry Hempfling for coordinating the overall SciDAC 2005 meeting planning, for handling a significant share of its associated communications, and for coordinating with the ORNL Conference Center and Grand Hyatt, to Angela Harris for producing many of the documents and records on which our meeting planning was based and for her efforts in coordinating with ORNL Graphics Services, to Angie Beach of the ORNL Conference Center for her efforts in procurement and setting up and executing the contracts with the hotel, and to John Bui and John Smith for their superb wireless networking and A/V set up and support. We are grateful for the relentless efforts of all of these individuals, their remarkable talents, and for the joy of working with them during this past year. They were the cornerstones of SciDAC 2005. Thanks also go to Kymba A'Hearn and Patty Boyd for on-site registration, Brittany Hagen for administrative support, Bruce Johnston for netcast support, Tim Jones for help with the proceedings and Web site, Sherry Lamb for housing and registration, Cindy Lathum for Web site design, Carolyn Peters for on-site registration, and Dami Rich for graphic design. And we would like to express our appreciation to the Oak Ridge National Laboratory, especially Jeff Nichols, the Argonne National Laboratory, the Lawrence Berkeley National Laboratory, and to our corporate sponsors, Cray, IBM, Intel, and SGI, for their support. We would like to extend special thanks also to our plenary speakers, technical speakers, poster presenters, and panelists for all of their efforts on behalf of SciDAC 2005 and for their remarkable achievements and contributions. 
We would like to express our deep appreciation to Lali Chatterjee, Graham Douglas and Margaret Smith of Institute of Physics Publishing, who worked tirelessly in order to provide us with this finished volume within two months, which is nothing short of miraculous. Finally, we wish to express our heartfelt thanks to Michael Strayer, SciDAC Director, whose vision it was to focus SciDAC 2005 on scientific discovery, around which all of the excitement we experienced revolved, and to our DOE SciDAC program managers, especially Fred Johnson, for their support, input, and help throughout.

  7. Predicting the whirlwind

    NASA Astrophysics Data System (ADS)

    Ornes, Stephen

    2017-07-01

    Across the wide open plains of the central US and inside air-conditioned computer laboratories, scientists of different stripes are probing one of nature's most devastating phenomena: tornadoes. Stephen Ornes offers a snapshot of their work

  8. Biotechnology awareness study, Part 1: Where scientists get their information.

    PubMed Central

    Grefsheim, S; Franklin, J; Cunningham, D

    1991-01-01

    A model study, funded by the National Library of Medicine (NLM) and conducted by the Southeastern/Atlantic Regional Medical Library (RML) and the University of Maryland Health Sciences Library, attempted to assess the information needs of researchers in the developing field of biotechnology and to determine the resources available to meet those needs in major academic health sciences centers. Nine medical schools in RML Region 2 were selected to participate in a biotechnology awareness study. A survey was conducted of the nine medical school libraries to assess their support of biotechnology research. To identify the information needs of scientists engaged in biotechnology-related research at the schools, a written survey was sent to the deans of the nine institutions and selected scientists they had identified. This was followed by individual, in-depth interviews with both the deans and scientists surveyed. In general, scientists obtained information from three major sources: their own experiments, personal communication with other scientists, and textual material (print or electronic). For textual information, most study participants relied on personal journal subscriptions. Tangential journals were scanned in the department's library. Only a few of these scientists came to the health sciences library on a regular basis. Further, the study found that personal computers have had a major impact on how biotechnologists get and use information. Implications of these findings for libraries and librarians are discussed. PMID:1998818

  9. Frontier Scientists' project probes audience science interests with website, social media, TV broadcast, game, and pop-up book

    NASA Astrophysics Data System (ADS)

    O'Connell, E. A.

    2017-12-01

    The Frontier Scientists National Science Foundation project titled Science in Alaska: Using Multimedia to Support Science Education produced research products in several formats: short and long videos, blogs, social media, a computer game, and a pop-up book. These formats reached distinctly different audiences. Internet users, public TV viewers, gamers, schools, and parents and young children were drawn to Frontier Scientists' research in direct and indirect ways. The analytics (our big data) derived from this media broadcast have given us insight into what works, what doesn't, and next steps. We have evidence for what is needed to present science as an interesting, vital, and necessary component of the general public's daily information diet and as an important tool for scientists to publicize research and to thrive in their careers. Collaborations with scientists at several universities, the USGS, Native organizations, tourism organizations, and Alaska museums promoted the accuracy of the videos and increased viewing. For example, Erin Marbarger, at the Anchorage Museum, edited the pop-up book titled The Adventures of Apun the Arctic Fox and provided Spark!Lab to test parent and child interest in it. Even without a marketing budget, Frontier Scientists' minimal publicity during the three-year project still drew an audience. Frontier Scientists was awarded Best Website 2016 by the Alaska Press Club and won a number of awards for short videos and TV programs.

  10. Gendered by Design? Information Technology and Office Systems. Gender and Society: Feminist Perspectives on the Past and Present Series.

    ERIC Educational Resources Information Center

    Green, Eileen; And Others

    This international collection of essays brings together two important and growing areas of research and debate: the sociology of gender relations in the workplace and the expanding body of interdisciplinary research into the design of computer systems. Feminists, computer scientists, and sociologists explore the impact of gender relations upon…

  11. Framework for Intelligent Teaching and Training Systems -- A Study of Systems

    ERIC Educational Resources Information Center

    Graf von Malotky, Nikolaj Troels; Martens, Alke

    2016-01-01

    Intelligent Tutoring Systems have been state of the art in eLearning since the late 1980s. The earliest systems were developed by teams of psychologists and computer scientists, with the goal of investigating learning processes and, later on, of intelligently supporting teaching and training with computers. Over the years, the eLearning hype…

  12. Technology Needs for Teachers Web Development and Curriculum Adaptations

    NASA Technical Reports Server (NTRS)

    Carroll, Christy J.

    1999-01-01

    Computer-based mathematics and science curricula focusing on NASA inventions and technologies will enhance current teacher knowledge and skills. Materials and interactive software developed by educators will allow students to integrate their various courses, to work cooperatively, and to collaborate with both NASA scientists and students at other locations by using computer networks, email and the World Wide Web.

  13. Evaluation of the Effectiveness of the Storage and Distribution Entry-Level Computer-Based Training (CBT) Program

    DTIC Science & Technology

    1990-09-01

    learning occurs when this final link is made into long-term memory (13:79). Cognitive scientists realize the role of the trainee as a passive receiver of...of property on the computer, and when they did, this piece of paperwork printed out on their printer. Someone from the receiving section brought this

  14. TomoBank: a tomographic data repository for computational x-ray science

    NASA Astrophysics Data System (ADS)

    De Carlo, Francesco; Gürsoy, Doğa; Ching, Daniel J.; Joost Batenburg, K.; Ludwig, Wolfgang; Mancini, Lucia; Marone, Federica; Mokso, Rajmund; Pelt, Daniël M.; Sijbers, Jan; Rivers, Mark

    2018-03-01

    There is a widening gap between the fast advancement of computational methods for tomographic reconstruction and their successful implementation in production software at various synchrotron facilities. This is due in part to the lack of readily available instrument datasets and phantoms representative of real materials for validation and comparison of new numerical methods. Recent advancements in detector technology have made sub-second and multi-energy tomographic data collection possible (Gibbs et al 2015 Sci. Rep. 5 11824), but have also increased the demand to develop new reconstruction methods able to handle in situ (Pelt and Batenburg 2013 IEEE Trans. Image Process. 22 5238-51) and dynamic systems (Mohan et al 2015 IEEE Trans. Comput. Imaging 1 96-111) that can be quickly incorporated in beamline production software (Gürsoy et al 2014 J. Synchrotron Radiat. 21 1188-93). The x-ray tomography data bank, tomoBank, provides a repository of experimental and simulated datasets with the aim to foster collaboration among computational scientists, beamline scientists, and experimentalists and to accelerate the development and implementation of tomographic reconstruction methods for synchrotron facility production software by providing easy access to challenging datasets and their descriptors.
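The validation loop that tomoBank is meant to support (phantom in, reconstruction out, compare against ground truth) can be illustrated with a deliberately minimal sketch. This toy example has nothing to do with tomoBank's actual file format or tooling: it builds a synthetic phantom, takes two parallel-beam projections, and applies a plain unfiltered backprojection.

```python
import numpy as np

def make_phantom(n=10):
    """Synthetic binary phantom: a small bright block on a dark background."""
    p = np.zeros((n, n))
    p[2:5, 5:8] = 1.0
    return p

def project(phantom):
    """Two parallel-beam projections (0 and 90 degrees): line sums along rows/columns."""
    return phantom.sum(axis=0), phantom.sum(axis=1)

def backproject(p0, p90):
    """Unfiltered backprojection: smear each projection back across the grid and average."""
    return (p0[None, :] + p90[:, None]) / 2.0

phantom = make_phantom()
p0, p90 = project(phantom)
recon = backproject(p0, p90)

# The (blurry) two-angle reconstruction still peaks inside the true block,
# which is the kind of ground-truth check a phantom dataset enables.
peak = np.unravel_index(np.argmax(recon), recon.shape)
print(peak)
```

Real reconstruction methods use many angles and filtering; the point here is only the workflow a shared phantom repository makes reproducible.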

  15. Accessible microscopy workstation for students and scientists with mobility impairments.

    PubMed

    Duerstock, Bradley S

    2006-01-01

    An integrated accessible microscopy workstation was designed and developed to allow persons with mobility impairments to control all aspects of light microscopy with minimal human assistance. This system, named AccessScope, is capable of performing brightfield and fluorescence microscopy, image analysis, and tissue morphometry requisite for undergraduate science courses to graduate-level research. An accessible microscope is necessary for students and scientists with mobility impairments to be able to use a microscope independently to better understand microscopical imaging concepts and cell biology. This knowledge is not always apparent by simply viewing a catalog of histological images. The ability to operate a microscope independently eliminates the need to hire an assistant or rely on a classmate and permits one to take practical laboratory examinations by oneself. Independent microscope handling is also crucial for graduate students and scientists with disabilities to perform scientific research. By making a personal computer the user interface for controlling AccessScope functions, different upper limb mobility impairments could be accommodated by using various computer input devices and assistive technology software. Participants with a range of upper limb mobility impairments evaluated the prototype microscopy workstation. They were able to control all microscopy functions, including loading different slides, without assistance.

  16. Rapid Analysis of Energetic and Geo-Materials Using Laser Induced Breakdown Spectroscopy

    DTIC Science & Technology

    2013-04-01

    et al., Anal Bioanal Chem (2006) 385, 316. 5. Mohamed, W. T. Y., Prog Phys (2007) 2, 42. 6. Elhassan, A., et al., Spectrochim Acta B (2008) 63...Anal (2005) 5, 21. 20. Anzano, J. M., et al., Anal Chim Acta (2006) 575, 230. 21. Rusak, D. A., et al., TrAC Trend Anal Chem (1998) 17, 453. 22. Martin...Spectrosc Reviews (2004) 39, 27. 25. Winefordner, J. D., et al., J Anal Atom Spectrom (2004) 19, 1061. 26. Cremers, D. A., and Radziemski, L. J.,

  17. Evaluation of Quantitative Anti-F1 IgG and Anti-V IgG ELISAs for use as an in Vitro-Based Potency Assay of Plague Vaccine in Mice

    DTIC Science & Technology

    2008-04-01

    Andrews GP, Welkos SL, Friedlander AM, et al. Protection of mice from fatal bubonic and pneumonic plague by passive immunization with monoclonal...SL, Andrews GP, Adamovicz J, et al. Protection against experimental bubonic and pneumonic plague by a recombinant capsular F1-V antigen fusion...fusion protein as vaccine antigen against bubonic and pneumonic plague. Biotechnol Prog 2005; 21:1490-510. [21] Simpson WJ, Thomas RE, Schwan TG

  18. A TBA approach to thermal transport in the XXZ Heisenberg model

    NASA Astrophysics Data System (ADS)

    Zotos, X.

    2017-10-01

    We show that the thermal Drude weight and magnetothermal coefficient of the 1D easy-plane Heisenberg model can be evaluated by an extension of the Bethe ansatz thermodynamics formulation by Takahashi and Suzuki (1972 Prog. Theor. Phys. 48 2187). They have earlier been obtained by the quantum transfer matrix method (Klümper 1999 Z. Phys. B 91 507). Furthermore, this approach can be applied to the study of the far-out-of-equilibrium energy current generated at the interface between two semi-infinite chains held at different temperatures.

  19. A Glow Discharge Ion Source with Fourier Transform Ion Cyclotron Resonance Mass Spectrometric Detection

    DTIC Science & Technology

    1991-05-10

    Hall, D. Mikrochim. Acta 1987, 1, 275. 26. Harrison, W.W.; Bentz, B.L. Prog. Analyt. Spectrosc. 1988, L19 53. 27. Harrison, W.W.; Barshick, C.M...Innovation, and Applications. ACS Symp. Series; Buchanan, M.V., Ed.; American Chemical Society: Washington, 1987; 359, p 1. 3. Wilkins, C.L.; Chowdhury, A.K...J.L. In Gas Phase Ion Chemistry; Bowers, M.T., Ed.; Academic: New York, 1984; Vol. 3, p 41. 6. Dunbar, R.C. In Gas Phase Ion Chemistry; Bowers, M.T

  20. Computational Materials Science | Materials Science | NREL

    Science.gov Websites

    Research topics include water splitting and fuel cells; nanoparticles for thermal storage; new materials for high-capacity…; and theoretical methodologies for studying complex materials. Contact: Stephan Lany, Staff Scientist.

  1. Accessing and visualizing scientific spatiotemporal data

    NASA Technical Reports Server (NTRS)

    Katz, Daniel S.; Bergou, Attila; Berriman, G. Bruce; Block, Gary L.; Collier, Jim; Curkendall, David W.; Good, John; Husman, Laura; Jacob, Joseph C.; Laity, Anastasia; hide

    2004-01-01

    This paper discusses work done by JPL's Parallel Applications Technologies Group in helping scientists access and visualize very large data sets through the use of multiple computing resources, such as parallel supercomputers, clusters, and grids.

  2. Users guide for information retrieval using APL

    NASA Technical Reports Server (NTRS)

    Shapiro, A.

    1974-01-01

    A Programming Language (APL) is a precise, concise, and powerful computer programming language. Several features make APL useful to managers and other potential computer users. APL is interactive; therefore, the user can communicate with his program or data base in near real-time. This, coupled with the fact that APL has excellent debugging features, reduces program checkout time to minutes or hours rather than days or months. Of particular importance is the fact that APL can be utilized as a management science tool using such techniques as operations research, statistical analysis, and forecasting. The gap between the scientist and the manager could be narrowed by showing how APL can be used to do what each needs to do: retrieve information. When that information must be retrieved rapidly, APL is ideally suited to the task.

  3. Toward the Geoscience Paper of the Future: Best practices for documenting and sharing research from data to software to provenance

    NASA Astrophysics Data System (ADS)

    Gil, Yolanda; David, Cédric H.; Demir, Ibrahim; Essawy, Bakinam T.; Fulweiler, Robinson W.; Goodall, Jonathan L.; Karlstrom, Leif; Lee, Huikyo; Mills, Heath J.; Oh, Ji-Hyun; Pierce, Suzanne A.; Pope, Allen; Tzeng, Mimi W.; Villamizar, Sandra R.; Yu, Xuan

    2016-10-01

    Geoscientists now live in a world rich with digital data and methods, and their computational research cannot be fully captured in traditional publications. The Geoscience Paper of the Future (GPF) presents an approach to fully document, share, and cite all their research products including data, software, and computational provenance. This article proposes best practices for GPF authors to make data, software, and methods openly accessible, citable, and well documented. The publication of digital objects empowers scientists to manage their research products as valuable scientific assets in an open and transparent way that enables broader access by other scientists, students, decision makers, and the public. Improving documentation and dissemination of research will accelerate the pace of scientific discovery by improving the ability of others to build upon published work.

  4. International Symposium on Grids and Clouds (ISGC) 2014

    NASA Astrophysics Data System (ADS)

    The International Symposium on Grids and Clouds (ISGC) 2014 will be held at Academia Sinica in Taipei, Taiwan from 23-28 March 2014, with co-located events and workshops. The conference is hosted by the Academia Sinica Grid Computing Centre (ASGC). “Bringing the data scientist to global e-Infrastructures” is the theme of ISGC 2014. The last decade has seen phenomenal growth in the production of data in all forms by all research communities, producing a deluge of data from which information and knowledge need to be extracted. Key to this success will be the data scientist, educated to use advanced algorithms, applications, and infrastructures, collaborating internationally to tackle society’s challenges. ISGC 2014 will bring together researchers working in all aspects of data science from different disciplines around the world to collaborate and educate themselves in the latest achievements and techniques being used to tackle the data deluge. In addition to the regular workshops, technical presentations, and plenary keynotes, ISGC this year will focus on how to grow the data science community by considering the educational foundation needed for tomorrow’s data scientist. Topics of discussion include Physics (including HEP) and Engineering Applications; Biomedicine & Life Sciences Applications; Earth & Environmental Sciences & Biodiversity Applications; Humanities & Social Sciences Applications; Virtual Research Environments (including middleware, tools, services, and workflows); Data Management; Big Data; Infrastructure & Operations Management; Infrastructure Clouds and Virtualisation; Interoperability; Business Models & Sustainability; Highly Distributed Computing Systems; and High Performance & Technical Computing (HPTC).

  5. ISMB/ECCB 2009 Stockholm

    PubMed Central

    Sagot, Marie-France; McKay, B.J. Morrison; Myers, Gene

    2009-01-01

    The International Society for Computational Biology (ISCB; http://www.iscb.org) presents the Seventeenth Annual International Conference on Intelligent Systems for Molecular Biology (ISMB), organized jointly with the Eighth Annual European Conference on Computational Biology (ECCB; http://bioinf.mpi-inf.mpg.de/conferences/eccb/eccb.htm), in Stockholm, Sweden, 27 June to 2 July 2009. The organizers are putting the finishing touches on the year's premier computational biology conference, with an expected attendance of 1400 computer scientists, mathematicians, statisticians, biologists and scientists from other disciplines related to and reliant on this multi-disciplinary science. ISMB/ECCB 2009 (http://www.iscb.org/ismbeccb2009/) follows the framework introduced at the ISMB/ECCB 2007 (http://www.iscb.org/ismbeccb2007/) in Vienna, and further refined at the ISMB 2008 (http://www.iscb.org/ismb2008/) in Toronto; a framework developed to specifically encourage increased participation from often under-represented disciplines at conferences on computational biology. During the main ISMB conference dates of 29 June to 2 July, keynote talks from highly regarded scientists, including ISCB Award winners, are the featured presentations that bring all attendees together twice a day. The remainder of each day offers a carefully balanced selection of parallel sessions to choose from: proceedings papers, special sessions on emerging topics, highlights of the past year's published research, special interest group meetings, technology demonstrations, workshops and several unique sessions of value to the broad audience of students, faculty and industry researchers. The display of several hundred posters for the duration of the conference has become a standard of the ISMB and ECCB conference series, and an extensive commercial exhibition showcases the latest bioinformatics publications, software, hardware and services available on the market today.
The main conference is preceded by 2 days of Special Interest Group (SIG) and Satellite meetings running in parallel to the fifth Student Council Symposium on 27 June, and in parallel to Tutorials on 28 June. All scientific sessions take place at the Stockholmsmässan/Stockholm International Fairs conference and exposition facility. Contact: bj@iscb.org PMID:19447790

  6. NASA/DOD Aerospace Knowledge Diffusion Research Project. Report 18: A comparison of the technical communication practices of aerospace engineers and scientists in India and the United States

    NASA Technical Reports Server (NTRS)

    Pinelli, Thomas E.; Barclay, Rebecca O.; Kennedy, John M.

    1993-01-01

    As part of Phase 4 of the NASA/DoD Aerospace Knowledge Diffusion Research Project, two studies were conducted that investigated the technical communications practices of Indian and U.S. aerospace engineers and scientists. Both studies have the same seven objectives: first, to solicit the opinions of aerospace engineers and scientists regarding the importance of technical communications to their profession; second, to determine the use and production of technical communications by aerospace engineers and scientists; third, to seek their views about the appropriate content of an undergraduate course in technical communications; fourth, to determine aerospace engineers' and scientists' use of libraries, technical information centers, and on-line data bases; fifth, to determine the use and importance of computer and information technology to them; sixth, to determine their use of electronic networks; and seventh, to determine their use of foreign and domestically produced technical reports. A self-administered questionnaire was distributed to aerospace engineers and scientists at the Indian Institute of Science and the NASA Langley Research Center. The completion rates for the Indian and U.S. surveys were 48 and 53 percent, respectively. Responses of the Indian and U.S. participants to selected questions are presented in this report.

  7. NASA/DOD Aerospace Knowledge Diffusion Research Project. Report 17: A comparison of the technical communication practices of Dutch and US aerospace engineers and scientists

    NASA Technical Reports Server (NTRS)

    Barclay, Rebecca O.; Pinelli, Thomas E.; Kennedy, John M.

    1993-01-01

    As part of Phase 4 of the NASA/DoD Aerospace Knowledge Diffusion Research Project, two studies were conducted that investigated the technical communications practices of Dutch and U.S. aerospace engineers and scientists. Both studies have the same seven objectives: first, to solicit the opinions of aerospace engineers and scientists regarding the importance of technical communications to their profession; second, to determine the use and production of technical communications by aerospace engineers and scientists; third, to seek their views about the appropriate content of an undergraduate course in technical communications; fourth, to determine aerospace engineers' and scientists' use of libraries, technical information centers, and on-line data bases; fifth, to determine the use and importance of computer and information technology to them; sixth, to determine their use of electronic networks; and seventh, to determine their use of foreign and domestically produced technical reports. A self-administered questionnaire was distributed to aerospace engineers and scientists at the National Aerospace Laboratory (NLR), the NASA Ames Research Center, and the NASA Langley Research Center. The completion rates for the Dutch and U.S. surveys were 55 and 61 percent, respectively. Responses of the Dutch and U.S. participants to selected questions are presented.

  8. NASA/DOD Aerospace Knowledge Diffusion Research Project. Report 29: A comparison of the technical communications practices of Japanese and US aerospace engineers and scientists

    NASA Technical Reports Server (NTRS)

    Pinelli, Thomas E.; Barclay, Rebecca O.; Kennedy, John M.

    1994-01-01

    As part of Phase 4 of the NASA/DoD Aerospace Knowledge Diffusion Research Project, two studies were conducted that investigated the technical communications practices of Japanese and U.S. aerospace engineers and scientists. Both studies have the same seven objectives: first, to solicit the opinions of aerospace engineers and scientists regarding the importance of technical communications to their profession; second, to determine the use and production of technical communications by aerospace engineers and scientists; third, to seek their views about the appropriate content of an undergraduate course in technical communications; fourth, to determine aerospace engineers' and scientists' use of libraries, technical information centers, and on-line data bases; fifth, to determine the use and importance of computer and information technology to them; sixth, to determine their use of electronic networks; and seventh, to determine their use of foreign and domestically produced technical reports. A self-administered questionnaire was distributed to aerospace engineers and scientists in Japan and at the NASA Ames Research Center and the NASA Langley Research Center. The completion rates for the Japanese and U.S. surveys were 85 and 61 percent, respectively. Responses of the Japanese and U.S. participants to selected questions are presented in this report.

  9. East-West paths to unconventional computing.

    PubMed

    Adamatzky, Andrew; Akl, Selim; Burgin, Mark; Calude, Cristian S; Costa, José Félix; Dehshibi, Mohammad Mahdi; Gunji, Yukio-Pegio; Konkoli, Zoran; MacLennan, Bruce; Marchal, Bruno; Margenstern, Maurice; Martínez, Genaro J; Mayne, Richard; Morita, Kenichi; Schumann, Andrew; Sergeyev, Yaroslav D; Sirakoulis, Georgios Ch; Stepney, Susan; Svozil, Karl; Zenil, Hector

    2017-12-01

    Unconventional computing is about breaking boundaries in thinking, acting and computing. Typical topics of this non-typical field include, but are not limited to, physics of computation, non-classical logics, new complexity measures, novel hardware, and mechanical, chemical and quantum computing. Unconventional computing encourages a new style of thinking, while practical applications are obtained from uncovering and exploiting principles and mechanisms of information processing in, and functional properties of, physical, chemical and living systems; in particular, efficient algorithms are developed, (almost) optimal architectures are designed, and working prototypes of future computing devices are manufactured. This article includes idiosyncratic accounts of 'unconventional computing' scientists reflecting on their personal experiences, what attracted them to the field, and their inspirations and discoveries.

  10. Preface: SciDAC 2006

    NASA Astrophysics Data System (ADS)

    Tang, William M., Dr.

    2006-01-01

    The second annual Scientific Discovery through Advanced Computing (SciDAC) Conference was held from June 25-29, 2006 at the new Hyatt Regency Hotel in Denver, Colorado. This conference showcased outstanding SciDAC-sponsored computational science results achieved during the past year across many scientific domains, with an emphasis on science at scale. Exciting computational science that has been accomplished outside of the SciDAC program both nationally and internationally was also featured to help foster communication between SciDAC computational scientists and those funded by other agencies. This was illustrated by many compelling examples of how domain scientists collaborated productively with applied mathematicians and computer scientists to effectively take advantage of terascale computers (capable of performing trillions of calculations per second) not only to accelerate progress in scientific discovery in a variety of fields but also to show great promise for being able to utilize the exciting petascale capabilities in the near future. The SciDAC program was originally conceived as an interdisciplinary computational science program based on the guiding principle that strong collaborative alliances between domain scientists, applied mathematicians, and computer scientists are vital to accelerated progress and associated discovery on the world's most challenging scientific problems. Associated verification and validation are essential in this successful program, which was funded by the US Department of Energy Office of Science (DOE OS) five years ago. As is made clear in many of the papers in these proceedings, SciDAC has fundamentally changed the way that computational science is now carried out in response to the exciting challenge of making the best use of the rapid progress in the emergence of more and more powerful computational platforms. In this regard, Dr. 
Raymond Orbach, Energy Undersecretary for Science at the DOE and Director of the OS has stated: `SciDAC has strengthened the role of high-end computing in furthering science. It is defining whole new fields for discovery.' (SciDAC Review, Spring 2006, p8). Application domains within the SciDAC 2006 conference agenda encompassed a broad range of science including: (i) the DOE core mission of energy research involving combustion studies relevant to fuel efficiency and pollution issues faced today and magnetic fusion investigations impacting prospects for future energy sources; (ii) fundamental explorations into the building blocks of matter, ranging from quantum chromodynamics - the basic theory that describes how quarks make up the protons and neutrons of all matter - to the design of modern high-energy accelerators; (iii) the formidable challenges of predicting and controlling the behavior of molecules in quantum chemistry and the complex biomolecules determining the evolution of biological systems; (iv) studies of exploding stars for insights into the nature of the universe; and (v) integrated climate modeling to enable realistic analysis of earth's changing climate. Associated research has made it quite clear that advanced computation is often the only means by which timely progress is feasible when dealing with these complex, multi-component physical, chemical, and biological systems operating over huge ranges of temporal and spatial scales. Working with the domain scientists, applied mathematicians and computer scientists have continued to develop the discretizations of the underlying equations and the complementary algorithms to enable improvements in solutions on modern parallel computing platforms as they evolve from the terascale toward the petascale regime. 
Moreover, the associated tremendous growth of data generated from the terabyte to the petabyte range demands not only the advanced data analysis and visualization methods to harvest the scientific information but also the development of efficient workflow strategies which can deal with the data input/output, management, movement, and storage challenges. If scientific discovery is expected to keep apace with the continuing progression from tera- to petascale platforms, the vital alliance between domain scientists, applied mathematicians, and computer scientists will be even more crucial. During the SciDAC 2006 Conference, some of the future challenges and opportunities in interdisciplinary computational science were emphasized in the Advanced Architectures Panel and by Dr. Victor Reis, Senior Advisor to the Secretary of Energy, who gave a featured presentation on `Simulation, Computation, and the Global Nuclear Energy Partnership.' Overall, the conference provided an excellent opportunity to highlight the rising importance of computational science in the scientific enterprise and to motivate future investment in this area. As Michael Strayer, SciDAC Program Director, has noted: `While SciDAC may have started out as a specific program, Scientific Discovery through Advanced Computing has become a powerful concept for addressing some of the biggest challenges facing our nation and our world.' Looking forward to next year, the SciDAC 2007 Conference will be held from June 24-28 at the Westin Copley Plaza in Boston, Massachusetts. Chairman: David Keyes, Columbia University. The Organizing Committee for the SciDAC 2006 Conference would like to acknowledge the individuals whose talents and efforts were essential to the success of the meeting. 
Special thanks go to Betsy Riley for her leadership in building the infrastructure support for the conference, for identifying and then obtaining contributions from our corporate sponsors, for coordinating all media communications, and for her efforts in organizing and preparing the conference proceedings for publication; to Tim Jones for handling the hotel scouting, subcontracts, and exhibits and stage production; to Angela Harris for handling supplies, shipping, and tracking, poster sessions set-up, and for her efforts in coordinating and scheduling the promotional activities that took place during the conference; to John Bui and John Smith for their superb wireless networking and A/V set-up and support; to Cindy Latham for Web site design, graphic design, and quality control of proceedings submissions; and to Pamelia Nixon-Hartje of Ambassador for budget and quality control of catering. We are grateful for the highly professional dedicated efforts of all of these individuals, who were the cornerstones of the SciDAC 2006 Conference. Thanks also go to Angela Beach of the ORNL Conference Center for her efforts in executing the contracts with the hotel, Carolyn James of Colorado State for on-site registration supervision, Lora Wolfe and Brittany Hagen for administrative support at ORNL, and Dami Rich and Andrew Sproles for graphic design and production. We are also most grateful to the Oak Ridge National Laboratory, especially Jeff Nichols, and to our corporate sponsors, Data Direct Networks, Cray, IBM, SGI, and Institute of Physics Publishing for their support. We especially express our gratitude to the featured speakers, invited oral speakers, invited poster presenters, session chairs, and advanced architecture panelists and chair for their excellent contributions on behalf of SciDAC 2006. 
We would like to express our deep appreciation to Lali Chatterjee, Graham Douglas, Margaret Smith, and the production team of Institute of Physics Publishing, who worked tirelessly to publish the final conference proceedings in a timely manner. Finally, heartfelt thanks are extended to Michael Strayer, Associate Director for OASCR and SciDAC Director, and to the DOE program managers associated with SciDAC for their continuing enthusiasm and strong support for the annual SciDAC Conferences as a special venue to showcase the exciting scientific discovery achievements enabled by the interdisciplinary collaborations championed by the SciDAC program.

  11. Which risk models perform best in selecting ever-smokers for lung cancer screening?

    Cancer.gov

    A new analysis by scientists at NCI evaluates nine different individualized lung cancer risk prediction models based on their selections of ever-smokers for computed tomography (CT) lung cancer screening.
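The evaluation idea behind such comparisons (rank ever-smokers by model-predicted risk, screen the highest-risk fraction, and count how many eventual cases the selection captures) can be sketched with invented numbers. The risk scores and outcomes below are hypothetical; real models use validated coefficients and large cohorts.

```python
# Hypothetical cohort: (model-predicted lung-cancer risk, developed cancer during follow-up).
# All values are made up for illustration.
people = [
    (0.081, True), (0.002, False), (0.054, True), (0.011, False),
    (0.047, False), (0.092, True), (0.004, False), (0.038, True),
    (0.019, False), (0.066, False),
]

def sensitivity_at_top_k(cohort, k):
    """Screen the k highest-risk people; return the fraction of all cases captured."""
    ranked = sorted(cohort, key=lambda p: p[0], reverse=True)
    screened_cases = sum(case for _, case in ranked[:k])
    total_cases = sum(case for _, case in cohort)
    return screened_cases / total_cases

# Screening half the cohort by model risk captures 3 of the 4 cases here.
print(sensitivity_at_top_k(people, 5))  # → 0.75
```

Comparing models then amounts to repeating this calculation (plus calibration checks) for each model's risk scores on the same cohort.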

  12. Modality-Driven Classification and Visualization of Ensemble Variance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bensema, Kevin; Gosink, Luke; Obermaier, Harald

    Paper for the IEEE Visualization Conference. Advances in computational power now enable domain scientists to address conceptual and parametric uncertainty by running simulations multiple times in order to sufficiently sample the uncertain input space.
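The quantity such ensemble methods start from can be sketched in a few lines: given several runs of the same simulation under perturbed inputs, compute the variance at each output location. The ensemble values below are invented for illustration.

```python
from statistics import pvariance

# Hypothetical ensemble: four simulation runs sampling an uncertain input,
# each producing values at the same five grid points.
ensemble = [
    [1.0, 2.0, 3.0, 4.0, 5.0],
    [1.1, 2.3, 2.8, 4.4, 5.0],
    [0.9, 1.8, 3.1, 3.7, 5.0],
    [1.0, 2.1, 3.0, 4.1, 5.0],
]

# Per-grid-point population variance across members: the scalar field a
# variance-based classification or visualization would operate on.
variance_field = [pvariance(vals) for vals in zip(*ensemble)]
print(variance_field)
```

Points where the members agree (here the last grid point) get zero variance; points where runs diverge stand out, which is what a variance visualization highlights.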

  13. Proceedings of the Conference on Joint Problem Solving and Microcomputers (San Diego, California, March 31 - April 2, 1983). Technical Report No. 1.

    ERIC Educational Resources Information Center

    Cole, Michael; And Others

    A group of American and Japanese psychologists, anthropologists, linguists, and computer scientists gathered at the University of California, San Diego, to exchange ideas on models of joint problem solving and their special relevance to the design and implementation of computer-based systems of instruction. Much of the discussion focused on…

  14. Using Physical and Computer Simulations of Collective Behaviour as an Introduction to Modelling Concepts for Applied Biologists

    ERIC Educational Resources Information Center

    Rands, Sean A.

    2012-01-01

    Models are an important tool in science: not only do they act as a convenient device for describing a system or problem, but they also act as a conceptual tool for framing and exploring hypotheses. Models, and in particular computer simulations, are also an important education tool for training scientists, but it is difficult to teach students the…
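A collective-behaviour simulation of the kind used in such teaching can be stripped to a minimal, hypothetical sketch (not a full boids model): at each step every agent turns partway toward the group's mean heading, and alignment emerges.

```python
import math

def step(headings, rate=0.5):
    """One update: each agent turns a fraction `rate` toward the circular mean heading."""
    mean = math.atan2(sum(math.sin(h) for h in headings),
                      sum(math.cos(h) for h in headings))
    return [h + rate * (mean - h) for h in headings]

# Four agents with scattered initial headings (radians).
headings = [0.1, 1.2, -0.8, 0.5]
for _ in range(20):
    headings = step(headings)

# After repeated local adjustment the headings cluster around a common direction:
# collective alignment from a simple individual rule.
spread = max(headings) - min(headings)
print(spread)
```

The pedagogical point matches the record's argument: a one-line individual rule, iterated, produces a group-level pattern that students can probe by varying `rate` or the initial headings.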

  15. Turbulent Flow Simulation at the Exascale: Opportunities and Challenges Workshop: August 4-5, 2015, Washington, D.C.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sprague, Michael A.; Boldyrev, Stanislav; Fischer, Paul

    This report details the impact exascale will bring to turbulent-flow simulations in applied science and technology. The need for accurate simulation of turbulent flows is evident across the DOE applied-science and engineering portfolios, including combustion, plasma physics, nuclear-reactor physics, wind energy, and atmospheric science. The workshop brought together experts in turbulent-flow simulation, computational mathematics, and high-performance computing. Building upon previous ASCR workshops on exascale computing, participants defined a research agenda and path forward that will enable scientists and engineers to continually leverage, engage, and direct advances in computational systems on the path to exascale computing.

  16. Accelerating scientific discovery : 2007 annual report.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beckman, P.; Dave, P.; Drugan, C.

    2008-11-14

    As a gateway for scientific discovery, the Argonne Leadership Computing Facility (ALCF) works hand in hand with the world's best computational scientists to advance research in a diverse span of scientific domains, ranging from chemistry, applied mathematics, and materials science to engineering physics and life sciences. Sponsored by the U.S. Department of Energy's (DOE) Office of Science, researchers are using the IBM Blue Gene/L supercomputer at the ALCF to study and explore key scientific problems that underlie important challenges facing our society. For instance, a research team at the University of California-San Diego/SDSC is studying the molecular basis of Parkinson's disease. The researchers plan to use the knowledge they gain to discover new drugs to treat the disease and to identify risk factors for other diseases that are equally prevalent. Likewise, scientists from Pratt & Whitney are using the Blue Gene to understand the complex processes within aircraft engines. Expanding our understanding of jet engine combustors is the secret to improved fuel efficiency and reduced emissions. Lessons learned from the scientific simulations of jet engine combustors have already led Pratt & Whitney to newer designs with unprecedented reductions in emissions, noise, and cost of ownership. ALCF staff members provide in-depth expertise and assistance to those using the Blue Gene/L and optimizing user applications. Both the Catalyst and Applications Performance Engineering and Data Analytics (APEDA) teams support the users' projects. In addition to working with scientists running experiments on the Blue Gene/L, we have become a nexus for the broader global community. In partnership with the Mathematics and Computer Science Division at Argonne National Laboratory, we have created an environment where the world's most challenging computational science problems can be addressed.
Our expertise in high-end scientific computing enables us to provide guidance for applications that are transitioning to petascale as well as to produce software that facilitates their development, such as the MPICH library, which provides a portable and efficient implementation of the MPI standard--the prevalent programming model for large-scale scientific applications--and the PETSc toolkit that provides a programming paradigm that eases the development of many scientific applications on high-end computers.

  17. U. S. GEOLOGICAL SURVEY LAND REMOTE SENSING ACTIVITIES.

    USGS Publications Warehouse

    Frederick, Doyle G.

    1983-01-01

    USGS uses all types of remotely sensed data, in combination with other sources of data, to support geologic analyses, hydrologic assessments, land cover mapping, image mapping, and applications research. Survey scientists use all types of remotely sensed data with ground verifications and digital topographic and cartographic data. A considerable amount of research is being done by Survey scientists on developing automated geographic information systems that can handle a wide variety of digital data. The Survey is also investigating the use of microprocessor computer systems for accessing, displaying, and analyzing digital data.

  18. Proposal for constructing an advanced software tool for planetary atmospheric modeling

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.; Sims, Michael H.; Podolak, Esther; Mckay, Christopher P.; Thompson, David E.

    1990-01-01

    Scientific model building can be a time intensive and painstaking process, often involving the development of large and complex computer programs. Despite the effort involved, scientific models cannot easily be distributed and shared with other scientists. In general, implemented scientific models are complex, idiosyncratic, and difficult for anyone but the original scientist/programmer to understand. We believe that advanced software techniques can facilitate both the model building and model sharing process. We propose to construct a scientific modeling software tool that serves as an aid to the scientist in developing and using models. The proposed tool will include an interactive intelligent graphical interface and a high level, domain specific, modeling language. As a testbed for this research, we propose development of a software prototype in the domain of planetary atmospheric modeling.

  19. [NASA/DOD Aerospace Knowledge Diffusion Research Project. Paper 4:] Technical communications in aerospace: An analysis of the practices reported by US and European aerospace engineers and scientists

    NASA Technical Reports Server (NTRS)

    Pinelli, Thomas E.; Barclay, Rebecca O.; Kennedy, John M.; Glassman, Myron

    1990-01-01

    Two pilot studies were conducted that investigated the technical communications practices of U.S. and European aerospace engineers and scientists. Both studies had the same five objectives: (1) solicit opinions regarding the importance of technical communications; (2) determine the use and production of technical communications; (3) seek views about the appropriate content of an undergraduate course in technical communications; (4) determine the use of libraries, information centers, and online databases; and (5) determine the use and importance of computer and information technology to them. A self-administered questionnaire was mailed to randomly selected aerospace engineers and scientists, with a slightly modified version sent to European colleagues. Their responses to selected questions are presented in this paper.

  20. A Grid Infrastructure for Supporting Space-based Science Operations

    NASA Technical Reports Server (NTRS)

    Bradford, Robert N.; Redman, Sandra H.; McNair, Ann R. (Technical Monitor)

    2002-01-01

    Emerging technologies for computational grid infrastructures have the potential for revolutionizing the way computers are used in all aspects of our lives. Computational grids are currently being implemented to provide large-scale, dynamic, and secure research and engineering environments based on standards and next-generation reusable software, enabling greater science and engineering productivity through shared resources and distributed computing for less cost than traditional architectures. Combined with the emerging technologies of high-performance networks, grids provide researchers, scientists, and engineers the first real opportunity for an effective distributed collaborative environment with access to resources such as computational and storage systems, instruments, and software tools and services for the most computationally challenging applications.

  1. Network-based approaches to climate knowledge discovery

    NASA Astrophysics Data System (ADS)

    Budich, Reinhard; Nyberg, Per; Weigel, Tobias

    2011-11-01

    Climate Knowledge Discovery Workshop; Hamburg, Germany, 30 March to 1 April 2011 Do complex networks combined with semantic Web technologies offer the next generation of solutions in climate science? To address this question, a first Climate Knowledge Discovery (CKD) Workshop, hosted by the German Climate Computing Center (Deutsches Klimarechenzentrum (DKRZ)), brought together climate and computer scientists from major American and European laboratories, data centers, and universities, as well as representatives from industry, the broader academic community, and the semantic Web communities. The participants, representing six countries, were concerned with large-scale Earth system modeling and computational data analysis. The motivation for the meeting was the growing problem that climate scientists generate data faster than it can be interpreted and the need to prepare for further exponential data increases. Current analysis approaches are focused primarily on traditional methods, which are best suited for large-scale phenomena and coarse-resolution data sets. The workshop focused on the open discussion of ideas and technologies to provide the next generation of solutions to cope with the increasing data volumes in climate science.

  2. Space station Simulation Computer System (SCS) study for NASA/MSFC. Volume 1: Overview and summary

    NASA Technical Reports Server (NTRS)

    1989-01-01

    NASA's Space Station Freedom Program (SSFP) planning efforts have identified a need for a payload training simulator system to serve as both a training facility and as a demonstrator to validate operational concepts. The envisioned Marshall Space Flight Center (MSFC) Payload Training Complex (PTC) required to meet this need will train the space station payload scientists, station scientists, and ground controllers to operate the wide variety of experiments that will be onboard the Space Station Freedom. The Simulation Computer System (SCS) is the computer hardware, software, and workstations that will support the Payload Training Complex at MSFC. The purpose of this SCS study is to investigate issues related to the SCS, alternative requirements, simulator approaches, and state-of-the-art technologies to develop candidate concepts and designs. This study was performed from August 1988 to October 1989. Thus, the results are based on the SSFP August 1989 baseline, i.e., the pre-Langley configuration/budget review (C/BR) baseline. Some terms, e.g., combined trainer, are being redefined. An overview of the study activities and a summary of study results are given here.

  3. Developing Data System Engineers

    NASA Astrophysics Data System (ADS)

    Behnke, J.; Byrnes, J. B.; Kobler, B.

    2011-12-01

    In the early days of general computer systems for science data processing, staff members working on NASA's data systems would most often be hired as mathematicians. Computer engineering positions were very often filled by those with electrical engineering degrees. Today, the Goddard Space Flight Center has special position descriptions for data scientists or, as they are more commonly called, data systems engineers. These staff members are required to have very diverse skills, hence the need for a generalized position description. There is always a need for data systems engineers to develop, maintain, and operate the complex data systems for Earth and space science missions. Today's data systems engineers, however, are not just mathematicians; they are computer programmers, GIS experts, software engineers, visualization experts, and so on. They represent many different degree fields. To put together distributed systems like the NASA Earth Observing System Data and Information System (EOSDIS), staff are required from many different fields. Sometimes the skilled professional is not available and must be developed in-house. This paper will address the various skills and jobs for data systems engineers at NASA. Further, it explores how to develop staff to become data scientists.

  4. Data Science Priorities for a University Hospital-Based Institute of Infectious Diseases: A Viewpoint.

    PubMed

    Valleron, Alain-Jacques

    2017-08-15

    Automation of laboratory tests, bioinformatic analysis of biological sequences, and professional data management are used routinely in a modern university hospital-based infectious diseases institute, dating back to at least the 1980s. However, the scientific methods of the 21st century are changing with the increased power and speed of computers, with the "big data" revolution having already happened in genomics and environmental science, and eventually arriving in medical informatics. Research will be increasingly "data driven," and the powerful machine learning methods whose efficiency is demonstrated in daily life will also revolutionize medical research. A university-based institute of infectious diseases must therefore not only gather excellent computer scientists and statisticians (as in the past, and as in any medical discipline), but also fully integrate the biologists and clinicians with these computer scientists, statisticians, and mathematical modelers having a broad culture in machine learning, knowledge representation, and knowledge discovery. © The Author 2017. Published by Oxford University Press for the Infectious Diseases Society of America. All rights reserved. For permissions, e-mail: journals.permissions@oup.com.

  5. Facilitating NASA Earth Science Data Processing Using Nebula Cloud Computing

    NASA Technical Reports Server (NTRS)

    Pham, Long; Chen, Aijun; Kempler, Steven; Lynnes, Christopher; Theobald, Michael; Asghar, Esfandiari; Campino, Jane; Vollmer, Bruce

    2011-01-01

    Cloud computing has been implemented in several commercial arenas. The NASA Nebula Cloud Computing platform is an Infrastructure as a Service (IaaS) built in 2008 at NASA Ames Research Center and in 2010 at GSFC. Nebula is an open source Cloud platform intended to: (a) make NASA realize significant cost savings through efficient resource utilization, reduced energy consumption, and reduced labor costs; (b) provide an easier way for NASA scientists and researchers to efficiently explore and share large and complex data sets; and (c) allow customers to provision, manage, and decommission computing capabilities on an as-needed basis.

  6. Microgravity

    NASA Image and Video Library

    1999-05-26

    Looking for a faster computer? How about an optical computer that processes data streams simultaneously and works with the speed of light? In space, NASA researchers have formed optical thin-films. By turning these thin-films into very fast optical computer components, scientists could improve computer tasks such as pattern recognition. Dr. Hossin Abdeldayem, physicist at NASA/Marshall Space Flight Center (MSFC) in Huntsville, AL, is working with lasers as part of an optical system for pattern recognition. These systems can be used for automated fingerprinting, photographic scanning, and the development of sophisticated artificial intelligence systems that can learn and evolve. Photo credit: NASA/Marshall Space Flight Center (MSFC)

  7. New European Training Network to Improve Young Scientists' Capabilities in Computational Wave Propagation

    NASA Astrophysics Data System (ADS)

    Igel, Heiner

    2004-07-01

    The European Commission recently funded a Marie-Curie Research Training Network (MCRTN) in the field of computational seismology within the 6th Framework Program. SPICE (Seismic wave Propagation and Imaging in Complex media: a European network) is coordinated by the computational seismology group of the Ludwig-Maximilians-Universität in Munich linking 14 European research institutions in total. The 4-year project will provide funding for 14 Ph.D. students (3-year projects) and 14 postdoctoral positions (2-year projects) within the various fields of computational seismology. These positions have been advertised and are currently being filled.

  8. Computational Psychometrics for Modeling System Dynamics during Stressful Disasters.

    PubMed

    Cipresso, Pietro; Bessi, Alessandro; Colombo, Desirée; Pedroli, Elisa; Riva, Giuseppe

    2017-01-01

    Disasters can be very stressful events. However, computational models of stress require data that might be very difficult to collect during disasters. Moreover, personal experiences are not repeatable, so it is not possible to collect bottom-up information when building a coherent model. To overcome these problems, we propose the use of computational models and virtual reality integration to recreate disaster situations, while examining possible dynamics in order to understand human behavior and relative consequences. By providing realistic parameters associated with disaster situations, computational scientists can work more closely with emergency responders to improve the quality of interventions in the future.

  9. Uncooled EuSbTe3 photodetector highly sensitive from ultraviolet to terahertz frequencies

    NASA Astrophysics Data System (ADS)

    Niu, Ying Y.; Wu, Dong; Su, Yu Q.; Zhu, Hai; Wang, Biao; Wang, Ying X.; Zhao, Zi R.; Zheng, Ping; Niu, Jia S.; Zhou, Hui B.; Wei, Jian; Wang, Nan L.

    2018-01-01

    Light probing from UV to THz is critical in photoelectronics and has great applications ranging from imaging and communication to medicine (Woodward et al 2002 Phys. Med. Biol. 47 3853-63; Pospischil et al 2013 Nat. Photon. 7 892-6; Martyniuk and Rogalski 2003 Prog. Quantum Electron. 27 59-210). However, room-temperature ultrabroadband photodetection spanning the visible down to the far-infrared remains challenging, mainly because of the lack of suitable photoactive materials: conventional semiconductors, such as silicon, have their photosensitive properties cut off by the bandgap and are transparent to the spectrum at the long-wavelength infrared side (Ciupa and Rogalski 1997 Opto-Electron. Rev. 5 257-66; Tonouchi 2007 Nat. Photon. 1 97-105; Sizov and Rogalski 2010 Prog. Quantum Electron. 34 278-347; Kinch 2000 J. Electron. Mater. 29 809-17). By comparison, dielectrics with a very narrow bandgap that nevertheless maintain semiconductor-like electrical conduction are promising candidates for ultrabroadband photodetection. Here we report that EuSbTe3 is highly sensitive from the ultraviolet directly to the terahertz (THz) at room temperature. High photoresponsivities of 1-8 A W-1 were reached in our prototype EuSbTe3 detectors, with low noise equivalent power (NEP) recorded, for instance ~150 pW · Hz-1/2 (at λ  =  532 nm) and ~0.6 nW · Hz-1/2 (at λ  =  118.8 µm), respectively. Our results demonstrate a promising system with direct photosensitivity extending well into the THz regime at room temperature and shed new light on exploring more sophisticated multi-band photoelectronics.
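The responsivity and NEP figures quoted above are related by simple arithmetic: NEP is the noise current spectral density divided by the responsivity. A minimal sketch with hypothetical illustrative values (the function name and the numbers are assumptions for demonstration, not the paper's measured data):

```python
def nep(noise_current_density_a_per_rthz: float, responsivity_a_per_w: float) -> float:
    """Noise-equivalent power: NEP [W/Hz^1/2] = i_n [A/Hz^1/2] / R [A/W]."""
    return noise_current_density_a_per_rthz / responsivity_a_per_w

# Hypothetical example: i_n = 600 pA/Hz^1/2 at R = 4 A/W
# gives NEP = 150 pW/Hz^1/2, i.e. the order of magnitude reported at 532 nm.
print(nep(600e-12, 4.0))  # → 1.5e-10
```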

  10. Longitudinal Changes of Fixation Location and Stability within 12 Months in Stargardt Disease: ProgStar Report No. 12.

    PubMed

    Schönbach, Etienne M; Strauss, Rupert W; Kong, Xiangrong; Muñoz, Beatriz; Ibrahim, Mohamed A; Sunness, Janet S; Birch, David G; Hahn, Gesa-Astrid; Nasser, Fadi; Zrenner, Eberhart; Sadda, SriniVas R; West, Sheila K; Scholl, Hendrik P N

    2018-06-08

    To investigate the natural history of Stargardt disease (STGD1) using fixation location and fixation stability. Multicenter, international, prospective cohort study. Fixation testing was performed using the Nidek MP-1 microperimeter as part of the prospective, multicenter, natural history study on the Progression of Stargardt disease (ProgStar). A total of 238 patients with ABCA4-related STGD1 were enrolled at baseline (bilateral enrollment in 86.6%) and underwent repeat testing at months 6 and 12. Outcome measures included the distance of the preferred retinal locus (PRL) from the fovea and the bivariate contour ellipse area (BCEA). After 12 months of follow-up, the change in the eccentricity of the PRL from the anatomical fovea was -0.0014 deg (95% CI, -0.27 deg to 0.27 deg; p = 0.99). The deterioration in the stability of fixation, as expressed by a larger BCEA encompassing 1 SD of all fixation points, was 1.21 deg² (95% CI, -1.23 deg² to 3.65 deg²; p = 0.33). Eyes with increases and decreases in PRL eccentricity and/or BCEA values were observed. Our observations point to the complexity of fixation parameters. The association of increasingly eccentric and unstable fixation with longer disease duration that is typically found in cross-sectional studies may be countered within individual patients by poorly understood processes such as neuronal adaptation. Nevertheless, fixation parameters may serve as useful secondary outcome parameters in selected cases and for counseling patients to explain changes to their visual functionality. Copyright © 2018 Elsevier Inc. All rights reserved.
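The BCEA metric used above has a standard closed form for bivariately normal fixation data: BCEA = 2kπ·σx·σy·√(1−ρ²), with k = −ln(1−p) and k ≈ 1.14 for p = 0.682 (1 SD). A minimal sketch (the function name and the synthetic sample are illustrative assumptions, not the ProgStar analysis code):

```python
import numpy as np

def bcea(x_deg, y_deg, p=0.682):
    """Bivariate contour ellipse area (deg^2) enclosing proportion p of
    fixation points, assuming a bivariate normal spread of fixations:
        BCEA = 2 * k * pi * sigma_x * sigma_y * sqrt(1 - rho^2),
    where k = -ln(1 - p); k ~= 1.14 for p = 0.682 (1 SD)."""
    x = np.asarray(x_deg, dtype=float)
    y = np.asarray(y_deg, dtype=float)
    k = -np.log(1.0 - p)
    sx, sy = x.std(ddof=1), y.std(ddof=1)
    rho = np.corrcoef(x, y)[0, 1]
    return 2.0 * k * np.pi * sx * sy * np.sqrt(1.0 - rho**2)

# Synthetic, uncorrelated fixation samples (degrees), for illustration only:
# with sigma ~0.5 deg in each axis, BCEA comes out near 1.8 deg^2.
rng = np.random.default_rng(0)
pts = rng.normal(0.0, 0.5, size=(500, 2))
print(round(bcea(pts[:, 0], pts[:, 1]), 2))
```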

  11. Analytical Cost Metrics : Days of Future Past

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Prajapati, Nirmal; Rajopadhye, Sanjay; Djidjev, Hristo Nikolov

    As we move towards the exascale era, new architectures must be capable of running massive computational problems efficiently. Scientists and researchers are continuously investing in tuning the performance of extreme-scale computational problems. These problems arise in almost all areas of computing, ranging from big data analytics, artificial intelligence, search, machine learning, virtual/augmented reality, computer vision, and image/signal processing to computational science and bioinformatics. With Moore’s law driving the evolution of hardware platforms towards exascale, the dominant performance metric (time efficiency) has now expanded to also incorporate power/energy efficiency. Therefore, the major challenge that we face in computing systems research is: “how to solve massive-scale computational problems in the most time/power/energy-efficient manner?”

  12. NASA/DOD Aerospace Knowledge Diffusion Research Project. Paper 41: Technical communication practices of Dutch and US aerospace engineers and scientists: International perspective on aerospace

    NASA Technical Reports Server (NTRS)

    Barclay, Rebecca O.; Pinelli, Thomas E.; Kennedy, John M.

    1994-01-01

    As part of Phase 4 of the NASA/DOD Aerospace Knowledge Diffusion Research Project, studies were conducted that investigated the technical communications practices of Dutch and U.S. aerospace engineers and scientists. The studies had the following objectives: (1) to solicit the opinions of aerospace engineers and scientists regarding the importance of technical communication to their professions, (2) to determine the use and production of technical communication by aerospace engineers and scientists, (3) to investigate their use of libraries and technical information centers, (4) to investigate their use of and the importance to them of computer and information technology, (5) to examine their use of electronic networks, and (6) to determine their use of foreign and domestically produced technical reports. Self-administered (mail) questionnaires were distributed to Dutch aerospace engineers and scientists at the National Aerospace Laboratory (NLR) in the Netherlands, the NASA Ames Research Center in the U.S., and the NASA Langley Research Center in the U.S. Responses of the Dutch and U.S. participants to selected questions are presented in this paper.

  13. Do Gender Differences in Perceived Prototypical Computer Scientists and Engineers Contribute to Gender Gaps in Computer Science and Engineering?

    PubMed

    Ehrlinger, Joyce; Plant, E Ashby; Hartwig, Marissa K; Vossen, Jordan J; Columb, Corey J; Brewer, Lauren E

    2018-01-01

    Women are vastly underrepresented in the fields of computer science and engineering (CS&E). We examined whether women might view the intellectual characteristics of prototypical individuals in CS&E in more stereotype-consistent ways than men might and, consequently, show less interest in CS&E. We asked 269 U.S. college students (187 women, 69.5%) to describe the prototypical computer scientist (Study 1) or engineer (Study 2) through open-ended descriptions as well as through a set of trait ratings. Participants also rated themselves on the same set of traits and rated their similarity to the prototype. Finally, participants in both studies were asked to describe their likelihood of pursuing future college courses and careers in computer science (Study 1) or engineering (Study 2). Across both studies, we found that women offered more stereotype-consistent ratings than did men of the intellectual characteristics of prototypes in CS (Study 1) and engineering (Study 2). Women also perceived themselves as less similar to the prototype than men did. Further, the observed gender differences in prototype perceptions mediated the tendency for women to report lower interest in CS&E fields relative to men. Our work highlights the importance of prototype perceptions for understanding the gender gap in CS&E and suggests avenues for interventions that may increase women's representation in these vital fields.

  14. Final report for Conference Support Grant "From Computational Biophysics to Systems Biology - CBSB12"

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hansmann, Ulrich H.E.

    2012-07-02

    This report summarizes the outcome of the international workshop From Computational Biophysics to Systems Biology (CBSB12) which was held June 3-5, 2012, at the University of Tennessee Conference Center in Knoxville, TN, and supported by DOE through the Conference Support Grant 120174. The purpose of CBSB12 was to provide a forum for the interaction between a data-mining interested systems biology community and a simulation and first-principle oriented computational biophysics/biochemistry community. CBSB12 was the sixth in a series of workshops of the same name organized in recent years, and the second that has been held in the USA. As in previous years, it gave researchers from physics, biology, and computer science an opportunity to acquaint each other with current trends in computational biophysics and systems biology, to explore venues of cooperation, and to establish together a detailed understanding of cells at a molecular level. The conference grant of $10,000 was used to cover registration fees and provide travel fellowships to selected students and postdoctoral scientists. By educating graduate students and providing a forum for young scientists to perform research into the working of cells at a molecular level, the workshop adds to DOE's mission of paving the way to exploit the abilities of living systems to capture, store and utilize energy.

  15. Mathematical techniques: A compilation

    NASA Technical Reports Server (NTRS)

    1975-01-01

    Articles on theoretical and applied mathematics are introduced. The articles cover information that might be of interest to workers in statistics and information theory, computational aids that could be used by scientists and engineers, and mathematical techniques for design and control.

  16. Multidisciplinary Computational Research

    DTIC Science & Technology

    2006-07-01

    Royal Aeronautical Society Silver Award and Busk Prize (2006); USAF Basic Research Award (2004); AFRL Fellow (1995); Outstanding Scientist of Dayton (2003); … Speaker, ASME 2006 Fluids Engineering Conference. Rizzetta, D., Gen. B. Foulois

  17. ENVIRONMENTAL BIOINFORMATICS AND COMPUTATIONAL TOXICOLOGY CENTER

    EPA Science Inventory

    The Center activities focused on integrating developmental efforts from the various research projects of the Center, and collaborative applications involving scientists from other institutions and EPA, to enhance research in critical areas. A representative sample of specif...

  18. A "Star Wars" Objector Lays His Research on the Line.

    ERIC Educational Resources Information Center

    Tobias, Sheila

    1987-01-01

    For one optical scientist, Harrison Barrett, the decision not to accept funding for research related to the Strategic Defense Initiative has meant giving up a major part of his work in optical computing. (MSE)

  19. Citizen Science

    NASA Technical Reports Server (NTRS)

    Memarsadeghi, Nargess

    2015-01-01

    Scientists and engineers constantly face new challenges, despite myriad advances in computing. More sets of data are collected today from earth and sky than there is time or resources available to carefully analyze them. Some problems either don't have fast algorithms to solve them or have solutions that must be found among millions of options, a situation akin to finding a needle in a haystack. But all hope is not lost: advances in technology and the Internet have empowered the general public to participate in the scientific process via individual computational resources and brain cognition, which isn't matched by any machine. Citizen scientists are volunteers who perform scientific work by making observations, collecting and disseminating data, making measurements, and analyzing or interpreting data without necessarily having any scientific training. In so doing, individuals from all over the world can contribute to science in ways that wouldn't have been otherwise possible.

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    De Carlo, Francesco; Gürsoy, Doğa; Ching, Daniel J.

    There is a widening gap between the fast advancement of computational methods for tomographic reconstruction and their successful implementation in production software at various synchrotron facilities. This is due in part to the lack of readily available instrument datasets and phantoms representative of real materials for validation and comparison of new numerical methods. Recent advancements in detector technology have made sub-second and multi-energy tomographic data collection possible [1], but also increased the demand to develop new reconstruction methods able to handle in-situ [2] and dynamic systems [3] that can be quickly incorporated in beamline production software [4]. The X-ray Tomography Data Bank, tomoBank, provides a repository of experimental and simulated datasets with the aim to foster collaboration among computational scientists, beamline scientists, and experimentalists and to accelerate the development and implementation of tomographic reconstruction methods for synchrotron facility production software by providing easy access to challenging datasets and their descriptors.

  1. The science of computing - The evolution of parallel processing

    NASA Technical Reports Server (NTRS)

    Denning, P. J.

    1985-01-01

    The present paper is concerned with the approaches to be employed to overcome the set of limitations in software technology which currently impedes effective use of parallel hardware technology. The process required to solve the arising problems is found to involve four different stages. At the present time, Stage One is nearly finished, while Stage Two is under way. Tentative explorations are beginning on Stage Three, and Stage Four is more distant. In Stage One, parallelism is introduced into the hardware of a single computer, which consists of one or more processors, a main storage system, a secondary storage system, and various peripheral devices. In Stage Two, parallel execution of cooperating programs on different machines becomes explicit, while in Stage Three, new languages will make parallelism implicit. In Stage Four, there will be very high level user interfaces capable of interacting with scientists at the same level of abstraction as scientists do with each other.

  2. Digital imaging of root traits (DIRT): a high-throughput computing and collaboration platform for field-based root phenomics.

    PubMed

    Das, Abhiram; Schneider, Hannah; Burridge, James; Ascanio, Ana Karine Martinez; Wojciechowski, Tobias; Topp, Christopher N; Lynch, Jonathan P; Weitz, Joshua S; Bucksch, Alexander

    2015-01-01

    Plant root systems are key drivers of plant function and yield. They are also under-explored targets to meet global food and energy demands. Many new technologies have been developed to characterize crop root system architecture (CRSA). These technologies have the potential to accelerate the progress in understanding the genetic control and environmental response of CRSA. Putting this potential into practice requires new methods and algorithms to analyze CRSA in digital images. Most prior approaches have solely focused on the estimation of root traits from images, yet no integrated platform exists that allows easy and intuitive access to trait extraction and analysis methods from images combined with storage solutions linked to metadata. Automated high-throughput phenotyping methods are increasingly used in laboratory-based efforts to link plant genotype with phenotype, whereas similar field-based studies remain predominantly manual and low-throughput. Here, we present an open-source phenomics platform, "DIRT", as a means to integrate scalable supercomputing architectures into field experiments and analysis pipelines. DIRT is an online platform that enables researchers to store images of plant roots, measure dicot and monocot root traits under field conditions, and share data and results within collaborative teams and the broader community. The DIRT platform seamlessly connects end-users with large-scale compute "commons", enabling the estimation and analysis of root phenotypes from field experiments of unprecedented size. DIRT is an automated high-throughput computing and collaboration platform for field-based crop root phenomics. The platform is accessible at http://www.dirt.iplantcollaborative.org/ and hosted on the iPlant cyber-infrastructure using high-throughput grid computing resources of the Texas Advanced Computing Center (TACC).
DIRT is a high volume central depository and high-throughput RSA trait computation platform for plant scientists working on crop roots. It enables scientists to store, manage and share crop root images with metadata and compute RSA traits from thousands of images in parallel. It makes high-throughput RSA trait computation available to the community with just a few button clicks. As such it enables plant scientists to spend more time on science rather than on technology. All stored and computed data is easily accessible to the public and broader scientific community. We hope that easy data accessibility will attract new tool developers and spur creative data usage that may even be applied to other fields of science.

  3. Visualizing a silicon quantum computer

    NASA Astrophysics Data System (ADS)

    Sanders, Barry C.; Hollenberg, Lloyd C. L.; Edmundson, Darran; Edmundson, Andrew

    2008-12-01

    Quantum computation is a fast-growing, multi-disciplinary research field. The purpose of a quantum computer is to execute quantum algorithms that efficiently solve computational problems intractable within the existing paradigm of 'classical' computing built on bits and Boolean gates. While collaboration between computer scientists, physicists, chemists, engineers, mathematicians and others is essential to the project's success, traditional disciplinary boundaries can hinder progress and make communicating the aims of quantum computing and future technologies difficult. We have developed a four-minute animation as a tool for representing, understanding and communicating a silicon-based solid-state quantum computer to a variety of audiences, either as a stand-alone animation to be used by expert presenters or embedded into a longer movie as short animated sequences. The paper includes a generally applicable recipe for successful scientific animation production.

  4. ISMB Conference Funding to Support Attendance of Early Researchers and Students

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gaasterland, Terry

    ISMB Conference Funding for Students and Young Scientists Historical Description The Intelligent Systems for Molecular Biology (ISMB) conference has provided a general forum for disseminating the latest developments in bioinformatics on an annual basis for the past 22 years. ISMB is a multidisciplinary conference that brings together scientists from computer science, molecular biology, mathematics and statistics. The goal of the ISMB meeting is to bring together biologists and computational scientists in a focus on actual biological problems, i.e., not simply theoretical calculations. The combined focus on “intelligent systems” and actual biological data makes ISMB a unique and highly important meeting. Twenty-one years of experience in holding the conference have resulted in a consistently well-organized, well-attended, and highly respected annual conference. "Intelligent systems" include any software which goes beyond straightforward, closed-form algorithms or standard database technologies, and encompasses those that view data in a symbolic fashion, learn from examples, consolidate multiple levels of abstraction, or synthesize results to be cognitively tractable to a human, including the development and application of advanced computational methods for biological problems. Relevant computational techniques include, but are not limited to: machine learning, pattern recognition, knowledge representation, databases, combinatorics, stochastic modeling, string and graph algorithms, linguistic methods, robotics, constraint satisfaction, and parallel computation. Biological areas of interest include molecular structure, genomics, molecular sequence analysis, evolution and phylogenetics, molecular interactions, metabolic pathways, regulatory networks, developmental control, and molecular biology generally.
Emphasis is placed on the validation of methods using real data sets, on practical applications in the biological sciences, and on development of novel computational techniques. The ISMB conferences are distinguished from many other conferences in computational biology or artificial intelligence by an insistence that the researchers work with real molecular biology data, not theoretical or toy examples; and from many other biological conferences by providing a forum for technical advances as they occur, which otherwise may be shunned until a firm experimental result is published. The resulting intellectual richness and cross-disciplinary diversity provide an important opportunity for both students and senior researchers. ISMB has become the premier conference series in this field with refereed, published proceedings, establishing an infrastructure to promote the growing body of research.

  5. Microgravity

    NASA Image and Video Library

    2000-04-19

    Dr. Marc Pusey (seated) and Dr. Craig Kundrot use computers to analyze x-ray maps and generate three-dimensional models of protein structures. With this information, scientists at Marshall Space Flight Center can learn how proteins are made and how they work. The computer screen depicts a protein structure as a ball-and-stick model. Other models depict the actual volume occupied by the atoms, or the ribbon-like structures that are crucial to a protein's function.

  6. Advanced Methodologies for NASA Science Missions

    NASA Astrophysics Data System (ADS)

    Hurlburt, N. E.; Feigelson, E.; Mentzel, C.

    2017-12-01

    Most of NASA's commitment to computational space science involves the organization and processing of Big Data from space-based satellites, and the calculations of advanced physical models based on these datasets. But considerable thought is also needed about which computations to perform. The science questions addressed by space data are so diverse and complex that traditional analysis procedures are often inadequate. The knowledge and skills of the statistician, applied mathematician, and algorithmic computer scientist must be incorporated into programs that currently emphasize engineering and physical science. NASA's culture and administrative mechanisms take full cognizance of the fact that major advances in space science are driven by improvements in instrumentation. But it is less well recognized that new instruments and science questions give rise to new challenges in the treatment of satellite data after it is telemetered to the ground. These issues might be divided into two stages: data reduction through software pipelines developed within NASA mission centers; and science analysis that is performed by hundreds of space scientists dispersed through NASA, U.S. universities, and abroad. Both stages benefit from the latest statistical and computational methods; in some cases, the science result is completely inaccessible using traditional procedures. This paper will review the current state of NASA and present example applications using modern methodologies.

  7. Performance Analysis, Modeling and Scaling of HPC Applications and Tools

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bhatele, Abhinav

    2016-01-13

    Efficient use of supercomputers at DOE centers is vital for maximizing system throughput, minimizing energy costs, and enabling science breakthroughs faster. This requires complementary efforts along several directions to optimize the performance of scientific simulation codes and the underlying runtimes and software stacks. This in turn requires providing scalable performance analysis tools and modeling techniques that can provide feedback to physicists and computer scientists developing the simulation codes and runtimes respectively. The PAMS project is using time allocations on supercomputers at ALCF, NERSC and OLCF to further the goals described above by performing research along the following fronts: 1. Scaling Study of HPC applications; 2. Evaluation of Programming Models; 3. Hardening of Performance Tools; 4. Performance Modeling of Irregular Codes; and 5. Statistical Analysis of Historical Performance Data. We are a team of computer and computational scientists funded by both DOE/NNSA and DOE/ASCR programs such as ECRP, XStack (Traleika Glacier, PIPER), ExaOSR (ARGO), SDMAV II (MONA) and PSAAP II (XPACC). This allocation will enable us to study big data issues when analyzing performance on leadership computing class systems and to assist the HPC community in making the most effective use of these resources.
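    Scaling studies like those in front 1 are often summarized by fitting a simple strong-scaling model to measured runtimes. The sketch below is illustrative, not PAMS code: it fits an Amdahl-style model t(p) = s + c/p (serial time s plus parallel work c divided over p processes) by ordinary least squares on synthetic timings.

```python
# Hedged sketch: fit an Amdahl-style strong-scaling model t(p) = s + c/p
# to measured runtimes. Synthetic data only; not from the PAMS project.
def fit_scaling(procs, times):
    """Least-squares fit of t = s + c * (1/p); returns (s, c)."""
    xs = [1.0 / p for p in procs]          # regress t against 1/p
    n = len(xs)
    sx, sy = sum(xs), sum(times)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, times))
    c = (n * sxy - sx * sy) / (n * sxx - sx * sx)  # slope: parallel work
    s = (sy - c * sx) / n                          # intercept: serial part
    return s, c

def predict(s, c, p):
    """Predicted runtime on p processes under the fitted model."""
    return s + c / p

# Synthetic timings generated from t = 2 + 96/p seconds.
procs = [1, 2, 4, 8, 16]
times = [predict(2.0, 96.0, p) for p in procs]
s, c = fit_scaling(procs, times)
print(round(s, 3), round(c, 3))  # 2.0 96.0
```

    The fitted intercept bounds the speedup (here runtime can never drop below ~2 s, no matter how many processes are added), which is exactly the kind of feedback such models give code developers.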

  8. Rich client data exploration and research prototyping for NOAA

    NASA Astrophysics Data System (ADS)

    Grossberg, Michael; Gladkova, Irina; Guch, Ingrid; Alabi, Paul; Shahriar, Fazlul; Bonev, George; Aizenman, Hannah

    2009-08-01

    Data from satellites and model simulations is increasing exponentially as observations and model computing power improve rapidly. Not only is technology producing more data, but it often comes from sources all over the world. Researchers and scientists who must collaborate are also located globally. This work presents a software design and technologies which will make it possible for groups of researchers to explore large data sets visually together without the need to download these data sets locally. The design will also make it possible to exploit high performance computing remotely and transparently to analyze and explore large data sets. Computer power, high quality sensing, and data storage capacity have improved at a rate that outstrips our ability to develop software applications that exploit these resources. It is impractical for NOAA scientists to download all of the satellite and model data that may be relevant to a given problem and the computing environments available to a given researcher range from supercomputers to only a web browser. The size and volume of satellite and model data are increasing exponentially. There are at least 50 multisensor satellite platforms collecting Earth science data. On the ground and in the sea there are sensor networks, as well as networks of ground based radar stations, producing a rich real-time stream of data. This new wealth of data would have limited use were it not for the arrival of large-scale high-performance computation provided by parallel computers, clusters, grids, and clouds. With these computational resources and vast archives available, it is now possible to analyze subtle relationships which are global, multi-modal and cut across many data sources. Researchers, educators, and even the general public, need tools to access, discover, and use vast data center archives and high performance computing through a simple yet flexible interface.

  9. Data scientist: the sexiest job of the 21st century.

    PubMed

    Davenport, Thomas H; Patil, D J

    2012-10-01

    Back in the 1990s, computer engineer and Wall Street "quant" were the hot occupations in business. Today data scientists are the hires firms are competing to make. As companies wrestle with unprecedented volumes and types of information, demand for these experts has raced well ahead of supply. Indeed, Greylock Partners, the VC firm that backed Facebook and LinkedIn, is so worried about the shortage of data scientists that it has a recruiting team dedicated to channeling them to the businesses in its portfolio. Data scientists are the key to realizing the opportunities presented by big data. They bring structure to it, find compelling patterns in it, and advise executives on the implications for products, processes, and decisions. They find the story buried in the data and communicate it. And they don't just deliver reports: They get at the questions at the heart of problems and devise creative approaches to them. One data scientist who was studying a fraud problem, for example, realized it was analogous to a type of DNA sequencing problem. Bringing those disparate worlds together, he crafted a solution that dramatically reduced fraud losses. In this article, Harvard Business School's Davenport and Greylock's Patil take a deep dive on what organizations need to know about data scientists: where to look for them, how to attract and develop them, and how to spot a great one.

  10. Exploring Two Approaches for an End-to-End Scientific Analysis Workflow

    NASA Astrophysics Data System (ADS)

    Dodelson, Scott; Kent, Steve; Kowalkowski, Jim; Paterno, Marc; Sehrish, Saba

    2015-12-01

    The scientific discovery process can be advanced by the integration of independently-developed programs run on disparate computing facilities into coherent workflows usable by scientists who are not experts in computing. For such advancement, we need a system which scientists can use to formulate analysis workflows, to integrate new components into these workflows, and to execute different components on resources that are best suited to run those components. In addition, we need to monitor the status of the workflow as components get scheduled and executed, and to access the intermediate and final output for visual exploration and analysis. Finally, it is important for scientists to be able to share their workflows with collaborators. We have explored two approaches for such an analysis framework for the Large Synoptic Survey Telescope (LSST) Dark Energy Science Collaboration (DESC); the first one is based on the use and extension of Galaxy, a web-based portal for biomedical research, and the second one is based on a programming language, Python. In this paper, we present a brief description of the two approaches, describe the kinds of extensions to the Galaxy system we have found necessary in order to support the wide variety of scientific analysis in the cosmology community, and discuss how similar efforts might be of benefit to the HEP community.
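    The Python-based approach can be sketched as a workflow of named, composable steps whose intermediate outputs remain inspectable, which covers the monitoring and sharing requirements described above. All names below are illustrative, not the DESC framework's actual API.

```python
# Minimal sketch of a workflow runner: ordered (name, function) steps,
# each step's intermediate output logged so the run can be monitored.
# Illustrative only; not the LSST DESC framework.
def run_workflow(steps, data):
    """Run each step in order; return the final result and a step log."""
    log = []
    for name, fn in steps:
        data = fn(data)
        log.append((name, data))  # intermediate results stay inspectable
    return data, log

# A toy three-step analysis: filter, transform, reduce.
steps = [
    ("select", lambda xs: [x for x in xs if x > 0]),
    ("square", lambda xs: [x * x for x in xs]),
    ("sum",    sum),
]
result, log = run_workflow(steps, [-1, 2, 3])
print(result)  # 13
```

    Because a workflow is just a list of steps, sharing it with collaborators or swapping one component for another amounts to editing ordinary Python data, which is the flexibility the programming-language approach trades against Galaxy's web portal.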

  11. Meet EPA Chemical Engineer Deborah Luecken

    EPA Pesticide Factsheets

    Deborah Luecken is a research scientist who develops descriptions of the chemistry that occurs in the atmosphere to form ozone and other air pollutants. She works with a group developing a computer model that can predict how air pollution is formed.

  12. NCI scientists at forefront of new prostate cancer diagnostics

    Cancer.gov

    Introduction of the UroNav was the result of nearly a decade’s research and development, principally conducted at NCI. Resembling a stylized computer workstation on wheels, the system electronically fuses together pictures from magnetic resonance imaging

  13. The Opportunities and Controversies of Reversible Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    DeBenedictis, Erik P.; Mee, Jesse K.; Frank, Michael P.

    Industry's inability to reduce logic gates' energy consumption is slowing growth in an important part of the worldwide economy. Some scientists argue that alternative approaches could greatly reduce energy consumption. However, these approaches entail myriad technical and political issues.

  14. Artificial intelligence: Learning to play Go from scratch

    NASA Astrophysics Data System (ADS)

    Singh, Satinder; Okun, Andy; Jackson, Andrew

    2017-10-01

    An artificial-intelligence program called AlphaGo Zero has mastered the game of Go without any human data or guidance. A computer scientist and two members of the American Go Association discuss the implications. See Article p.354

  15. The Opportunities and Controversies of Reversible Computing

    DOE PAGES

    DeBenedictis, Erik P.; Mee, Jesse K.; Frank, Michael P.

    2017-06-09

    Industry's inability to reduce logic gates' energy consumption is slowing growth in an important part of the worldwide economy. Some scientists argue that alternative approaches could greatly reduce energy consumption. However, these approaches entail myriad technical and political issues.

  16. Game Imaging Meets Nuclear Reality

    ScienceCinema

    Michel, Kelly; Watkins, Adam

    2018-01-16

    At Los Alamos National Laboratory, a team of artists and animators, nuclear engineers, and computer scientists is working to provide 3-D models of nuclear facilities to train IAEA safeguards inspectors and others who need fast familiarity with specific nuclear sites.

  17. Social Participation in Health 2.0

    PubMed Central

    Hesse, Bradford W.; Hansen, Derek; Finholt, Thomas; Munson, Sean; Kellogg, Wendy; Thomas, John C.

    2010-01-01

    Computer scientists are working with biomedical researchers, policy specialists, and medical practitioners to usher in a new era in healthcare. A recently convened panel of experts considered various research opportunities for technology-mediated social participation in Health 2.0. PMID:21379365

  18. Prediction of Skin Sensitization Potency Using Machine Learning Approaches

    EPA Science Inventory

    Replacing animal tests currently used for regulatory hazard classification of skin sensitizers is one of ICCVAM’s top priorities. Accordingly, U.S. federal agency scientists are developing and evaluating computational approaches to classify substances as sensitizers or nons...
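    As a toy illustration of such a computational classifier (the descriptors and data below are made up, not the agencies' actual models or ICCVAM test data), a nearest-neighbor rule assigns a substance the hazard label of its closest training example in feature space:

```python
# Toy 1-nearest-neighbor hazard classifier. Features and training data
# are hypothetical stand-ins, purely to illustrate the approach.
def nearest_neighbor(train, query):
    """Return the label of the training point closest to `query`."""
    def sq_dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    _, label = min((sq_dist(feats, query), lbl) for feats, lbl in train)
    return label

# Two made-up substances described by two made-up numeric descriptors.
train = [
    ([0.9, 0.8], "sensitizer"),
    ([0.1, 0.2], "non-sensitizer"),
]
print(nearest_neighbor(train, [0.85, 0.7]))  # sensitizer
```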

  19. NASA Center for Climate Simulation (NCCS) Advanced Technology AT5 Virtualized Infiniband Report

    NASA Technical Reports Server (NTRS)

    Thompson, John H.; Bledsoe, Benjamin C.; Wagner, Mark; Shakshober, John; Fromkin, Russ

    2013-01-01

    The NCCS is part of the Computational and Information Sciences and Technology Office (CISTO) of Goddard Space Flight Center's (GSFC) Sciences and Exploration Directorate. The NCCS's mission is to enable scientists to increase their understanding of the Earth, the solar system, and the universe by supplying state-of-the-art high performance computing (HPC) solutions. To accomplish this mission, the NCCS (https://www.nccs.nasa.gov) provides high performance compute engines, mass storage, and network solutions to meet the specialized needs of the Earth and space science user communities.

  20. Cumulative reports and publications

    NASA Technical Reports Server (NTRS)

    1993-01-01

    A complete list of Institute for Computer Applications in Science and Engineering (ICASE) reports is presented. Since ICASE reports are intended to be preprints of articles that will appear in journals or conference proceedings, the published reference is included when it is available. The major categories of the current ICASE research program are: applied and numerical mathematics, including numerical analysis and algorithm development; theoretical and computational research in fluid mechanics in selected areas of interest to LaRC, including acoustics and combustion; experimental research in transition and turbulence and aerodynamics involving LaRC facilities and scientists; and computer science.
