47 CFR 52.20 - Thousands-block number pooling.
Code of Federal Regulations, 2010 CFR
2010-10-01
... separated into ten sequential blocks of 1,000 numbers each (thousands-blocks), and allocated separately... required to participate in thousands-block number pooling shall donate thousands-blocks with ten percent or... ten percent or less contaminated, as an initial block or footprint block. (d) Thousands-Block Pooling...
Krauss, Ken W.; From, Andrew S.; Doyle, Thomas W.; Doyle, Terry J.; Barry, Michael J.
2011-01-01
The Ten Thousand Islands region of southwestern Florida, USA, is a major feeding and resting destination for breeding, migrating, and wintering birds. Many species of waterbirds rely specifically on marshes as foraging habitat, making mangrove encroachment a concern for wildlife managers. With the alteration of freshwater flow and sea-level rise trends for the region, mangroves have migrated upstream into traditionally salt and brackish marshes, mirroring similar descriptions around the world. Aside from localized freezes in some years, very little seems to be preventing mangrove encroachment. We mapped changes in mangrove stand boundaries from the Gulf of Mexico inland to the northern boundary of Ten Thousand Islands National Wildlife Refuge (TTINWR) from 1927 to 2005, and determined the area of mangroves to be approximately 7,281 hectares in 2005, representing a 1,878-hectare increase since 1927. The overall change represents an approximately 35% increase in mangrove coverage on TTINWR over 78 years. Sea-level rise is likely the primary driver of this change; however, the construction of new waterways facilitates the dispersal of mangrove propagules into new areas by extending tidal influence, exacerbating encroachment. Reduced volume of freshwater delivery to TTINWR via overland flow and localized rainfall may influence the balance between marsh and mangrove as well, potentially offering some options to managers interested in conserving marsh over mangrove.
DITTY - a computer program for calculating population dose integrated over ten thousand years
DOE Office of Scientific and Technical Information (OSTI.GOV)
Napier, B.A.; Peloquin, R.A.; Strenge, D.L.
The computer program DITTY (Dose Integrated Over Ten Thousand Years) was developed to determine the collective dose from long-term nuclear waste disposal sites resulting from ground-water pathways. DITTY estimates the time integral of collective dose over a ten-thousand-year period for time-variant radionuclide releases to surface waters, wells, or the atmosphere. This document includes the following information on DITTY: a description of the mathematical models, program designs, data file requirements, input preparation, output interpretations, sample problems, and program-generated diagnostic messages.
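The core quantity DITTY computes can be illustrated with a short numerical sketch. This is not the program's actual code; the dose-rate function below is a hypothetical placeholder, and only the time integral over the ten-thousand-year window reflects the calculation described.

```python
# Minimal sketch (not DITTY itself): integrate a time-variant collective dose
# rate over a ten-thousand-year period. dose_rate is a hypothetical stand-in
# for DITTY's ground-water pathway models.
import numpy as np

def collective_dose_integral(dose_rate, t_end_years=10_000.0, n_steps=100_000):
    """Trapezoid-rule time integral of a collective dose rate (person-Sv/yr)."""
    t = np.linspace(0.0, t_end_years, n_steps + 1)
    return np.trapz(dose_rate(t), t)  # person-Sv accumulated over the window

# Example: a release dominated by a nuclide with an assumed 30-year half-life.
lam = np.log(2) / 30.0
print(f"{collective_dose_integral(lambda t: 5.0 * np.exp(-lam * t)):.1f} person-Sv")
```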
Raccoon removal reduces sea turtle nest depredation in the Ten Thousand Islands of Florida
Garmestani, A.S.; Percival, H.F.
2005-01-01
Predation by raccoons, Procyon lotor marinus (L.), is the primary cause of sea turtle nest loss in the Ten Thousand Islands archipelago. Four islands within Ten Thousand Islands National Wildlife Refuge were surveyed for sea turtle nesting activity from 1991-95. Raccoons depredated 76-100% of nests on Panther Key from 1991-94, until 14 raccoons were removed in 1995, resulting in 0% depredation; 2 more were removed in 1996, again resulting in 0% depredation. Raccoon removal may be an effective management option for increasing sea turtle nest survival on barrier islands.
Strategies to identify microRNA targets: New advances
USDA-ARS?s Scientific Manuscript database
MicroRNAs (miRNAs) are small regulatory RNA molecules functioning to modulate gene expression at the post-transcriptional level, and playing an important role in many developmental and physiological processes. Ten thousand miRNAs have been discovered in various organisms. Although considerable progr...
ERIC Educational Resources Information Center
Snyder, Robin M.
2014-01-01
Just as high quality laser printing started in the tens of thousands of dollars and can now be purchased for under $100, so too has 3D printing technology started in the tens of thousands of dollars and is now in the thousand dollar range. Current 3D printing technology takes 2D printing into a third dimension. Many 3D printers are…
2010-06-11
International Labour Organisation Office for the Coordination of Humanitarian Affairs Peacebuilding Support Office United Kingdom Cabinet Office... absentees numbered in the tens of thousands (GAO 2005b, 7). Additionally, the total number of forces is misleading because both the trained
Core Principles for Transforming Remediation within a Comprehensive: Student Success Strategy
ERIC Educational Resources Information Center
Achieving the Dream, 2015
2015-01-01
Colleges and postsecondary systems across the nation are demonstrating remarkable progress in phasing out standalone or multi-course remediation sequences, resulting in tens of thousands of students more quickly enrolling in and completing college-level courses. These organizations have collaborated to describe the principles they see in common…
International Students in Australia: Read Ten Thousand Volumes of Books and Walk Ten Thousand Miles
ERIC Educational Resources Information Center
Arkoudis, Sophie; Tran, Ly Thi
2007-01-01
A number of international students, predominately from Asian countries, are present in universities in the UK, United States, and Australia. There is little research exploring their experiences as they negotiate the disciplinary requirements of their courses. This paper investigates students' agency as they write their first assignment for their…
77 FR 64019 - National School Lunch Week, 2012
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-17
... meals for tens of millions of students every day. These meals are a vital source of fruits, vegetables... Michelle Obama's Let's Move! initiative, we are continuing to bring together stakeholders at every level of... hand this twelfth day of October, in the year of our Lord two thousand twelve, and of the Independence...
Estuarine River Data for the Ten Thousand Islands Area, Florida, Water Year 2005
Byrne, Michael J.; Patino, Eduardo
2008-01-01
The U.S. Geological Survey collected stream discharge, stage, salinity, and water-temperature data near the mouths of 11 tributaries flowing into the Ten Thousand Islands area of Florida from October 2004 to June 2005. Maximum positive discharge from Barron River and Faka Union River was 6,000 and 3,200 ft^3/s, respectively; no other tributary exceeded 2,600 ft^3/s. Salinity variation was greatest at Barron River and Faka Union River, ranging from 2 to 37 ppt, and from 3 to 34 ppt, respectively. Salinity maximums were greatest at Wood River and Little Wood River, each exceeding 40 ppt. All data were collected prior to the commencement of the Picayune Strand Restoration Project, which is designed to establish a more natural flow regime to the tributaries of the Ten Thousand Islands area.
ERIC Educational Resources Information Center
Rahimian, Hamid; Kazemi, Mojtaba; Abbspour, Abbas
2017-01-01
This research aims to determine the effectiveness of training based on the learning organization in the staff of the cement industry with a production capacity of over ten thousand tons. The purpose of this study is to propose a training model based on the learning organization. For this purpose, the factors of organizational learning were introduced by…
DOT National Transportation Integrated Search
1966-09-01
General aviation pilots are increasingly ascending to altitudes exceeding ten thousand feet. As one becomes exposed to heights above twelve thousand feet, blood oxygen saturation diminishes in accordance with a predictable schedule. Recommended measur...
Perez, Cristina R; Moye, John K; Cacela, Dave; Dean, Karen M; Pritsos, Chris A
2017-12-01
In 2010, the Deepwater Horizon oil spill released 134 million gallons of crude oil into the Gulf of Mexico, making it the largest oil spill in US history. The three-month oil spill left tens of thousands of birds dead; however, the fate of tens of thousands of other migratory birds that were affected but did not immediately die is unknown. We used the homing pigeon as a surrogate species for migratory birds to investigate the effects of a single external oiling event on the flight performance of birds. Data from GPS data loggers revealed that lightly oiled pigeons took significantly longer to return home and spent more time stopped en route than unoiled birds. This suggests that migratory birds affected by the oil spill could have experienced long-term flight impairment and delayed arrival to breeding, wintering, or crucial stopover sites and subsequently suffered reductions in survival and reproductive success. Copyright © 2017 Elsevier Inc. All rights reserved.
An aeromagnetic survey in the Valley of Ten Thousand Smokes, Alaska. M.S. Thesis
NASA Technical Reports Server (NTRS)
Anma, K.
1971-01-01
Geologic and magnetic studies of the Katmai area have further demonstrated the close relationship between the Katmai Caldera, Novarupta plug, and the pyroclastic flows in the Valley of Ten Thousand Smokes. The magnetic fields observed appear to be associated with the thickness of the pyroclastic flow and the different rock units within it for lower flight levels, and also the contrast between the valley fill and the rock units at the Valley margins. Consistent magnetic anomalies are associated with the larger fumarole lines, which were presumably sites of large scale activity, while the smaller fumaroles are not usually seen in the aeromagnetic map. A possible correlation between low positive anomalies and nuee ardente deposits was revealed by the aeromagnetic survey, but was not strong. A ground survey was also carried out in several parts of the Valley with a view to detailed delineation of the magnetic signatures of the pyroclastic flow, as an aid to interpreting the aeromagnetic data.
ERIC Educational Resources Information Center
Carnegie Foundation for the Advancement of Teaching, 2017
2017-01-01
The Carnegie Foundation launched its Math Pathways initiative nearly six years ago at 29 colleges across the country with the aim of improving success rates in developmental math. Tens of thousands of students a year, who need additional preparation for college-level math, are shut out of earning degrees and fulfilling careers due to the huge…
Tomlinson, Robert
2018-05-01
Reacting to a never event is difficult and often embarrassing for the staff involved. East Lancashire Hospitals NHS Trust has demonstrated that treating staff with respect after a never event creates an open culture that encourages problem solving and service improvement. The approach has allowed learning to be shared and paved the way for the trust to be the first in the UK to launch the patient-centric behavioural noise reduction strategy 'Below ten thousand'.
Inductive System for Reliable Magnesium Level Detection in a Titanium Reduction Reactor
NASA Astrophysics Data System (ADS)
Krauter, Nico; Eckert, Sven; Gundrum, Thomas; Stefani, Frank; Wondrak, Thomas; Frick, Peter; Khalilov, Ruslan; Teimurazov, Andrei
2018-05-01
The determination of the magnesium level in a titanium reduction retort by inductive methods is often hampered by the formation of titanium sponge rings, which disturb the propagation of electromagnetic signals between excitation and receiver coils. We present a new method for the reliable identification of the magnesium level which explicitly takes into account the presence of sponge rings with unknown geometry and conductivity. The inverse problem is solved by a look-up-table method, based on the solution of the inductive forward problems for several tens of thousands of parameter combinations.
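The look-up-table inversion described here is straightforward to sketch: precompute the forward model on a parameter grid, then match a measurement to its nearest precomputed signal vector. In this minimal sketch the forward model is a made-up placeholder (the paper's is an electromagnetic solver), and the grid dimensions are illustrative only.

```python
# Hedged sketch of a look-up-table inversion: precompute forward-model receiver
# signals for many parameter combinations, then pick the combination whose
# predicted signals best match a measurement. forward_model is a hypothetical
# placeholder, not the paper's solver.
import itertools
import numpy as np

def forward_model(level, ring_radius, ring_conductivity):
    # Placeholder physics: any function mapping parameters -> coil signal vector.
    return np.array([level + 0.1 * ring_radius,
                     level * ring_conductivity,
                     ring_radius - 0.05 * level])

grid = list(itertools.product(np.linspace(0.0, 1.0, 50),   # Mg level (normalized)
                              np.linspace(0.1, 0.5, 20),   # sponge-ring radius
                              np.linspace(0.5, 2.0, 20)))  # ring conductivity
table = np.array([forward_model(*p) for p in grid])        # 20,000 combinations

def invert(measured_signals):
    """Return the parameter set whose predicted signals are closest (L2 norm)."""
    best = np.argmin(np.linalg.norm(table - measured_signals, axis=1))
    return grid[best]

level, radius, sigma = invert(np.array([0.62, 0.90, 0.27]))
print(f"estimated Mg level ≈ {level:.2f}")
```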
Fann, Neal; Nolte, Christopher G; Dolwick, Patrick; Spero, Tanya L; Brown, Amanda Curry; Phillips, Sharon; Anenberg, Susan
2015-05-01
In this United States-focused analysis we use outputs from two general circulation models (GCMs) driven by different greenhouse gas forcing scenarios as inputs to regional climate and chemical transport models to investigate potential changes in near-term U.S. air quality due to climate change. We conduct multiyear simulations to account for interannual variability and characterize the near-term influence of a changing climate on tropospheric ozone-related health impacts near the year 2030, which is a policy-relevant time frame that is subject to fewer uncertainties than other approaches employed in the literature. We adopt a 2030 emissions inventory that accounts for fully implementing anthropogenic emissions controls required by federal, state, and/or local policies, which is projected to strongly influence future ozone levels. We quantify a comprehensive suite of ozone-related mortality and morbidity impacts including emergency department visits, hospital admissions, acute respiratory symptoms, and lost school days, and estimate the economic value of these impacts. Both GCMs project average daily maximum temperature increases of 1-4°C and daily 8-hr maximum ozone increases of 1-5 ppb at 2030, though each climate scenario produces ozone levels that vary greatly over space and time. We estimate tens to thousands of additional ozone-related premature deaths and illnesses per year for these two scenarios and calculate an economic burden of these health outcomes of hundreds of millions to tens of billions of U.S. dollars (2010$). Near-term changes to the climate have the potential to greatly affect ground-level ozone. Using a 2030 emission inventory with regional climate fields downscaled from two general circulation models, we project mean temperature increases of 1 to 4°C and climate-driven mean daily 8-hr maximum ozone increases of 1-5 ppb, though each climate scenario produces ozone levels that vary significantly over space and time. These increased ozone levels are estimated to result in tens to thousands of ozone-related premature deaths and illnesses per year and an economic burden of hundreds of millions to tens of billions of U.S. dollars (2010$).
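The step from ozone changes to health outcomes in analyses like this one typically runs a concentration-response function over the exposed population. The sketch below uses the common log-linear form; every coefficient in it is a made-up placeholder, not a value from this study.

```python
# Illustrative health-impact calculation (BenMAP-style log-linear
# concentration-response function). All numbers are hypothetical placeholders.
import math

def excess_deaths(baseline_rate, beta, delta_ozone_ppb, population):
    """Attributable deaths = y0 * (1 - exp(-beta * dC)) * pop."""
    return baseline_rate * (1.0 - math.exp(-beta * delta_ozone_ppb)) * population

# Hypothetical: a 3 ppb mean ozone increase over 100 million exposed people,
# baseline mortality 0.008/yr, beta = 0.0004 per ppb.
print(f"{excess_deaths(0.008, 4e-4, 3.0, 1e8):.0f} additional deaths/yr")
```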
Straw man trade between multi-junction, gallium arsenide, and silicon solar cells
NASA Technical Reports Server (NTRS)
Gaddy, Edward M.
1995-01-01
Multi-junction (MJ), gallium arsenide (GaAs), and silicon (Si) solar cells have respective test efficiencies of approximately 24%, 18.5%, and 14.8%. Multi-junction and gallium arsenide solar cells weigh more than silicon solar cells and cost approximately five times as much per unit power at the cell level. A straw man trade is performed for the TRMM spacecraft to determine which of these cell types would have offered an overall performance and price advantage to the spacecraft. A straw man trade is also performed for the multi-junction cells under the assumption that they will cost over ten times that of silicon cells at the cell level. The trade shows that the TRMM project, less the cost of the instrument, ground systems, and mission operations, would spend approximately $552 thousand per kilogram to launch and service science in the case of the spacecraft equipped with silicon solar cells. If these cells are changed out for gallium arsenide solar cells, an additional 31 kilograms of science can be launched and serviced at a price of approximately $90 thousand per kilogram. The weight reduction is shown to derive from the smaller area of the array and hence reductions in the weight of the array substrate and supporting structure. If the silicon solar cells are changed out for multi-junction solar cells, an additional 45 kilograms of science above the silicon baseline can be launched and serviced at a price of approximately $58 thousand per kilogram. The trade shows that even if the multi-junction arrays are priced at over ten times that of silicon cells, a price much higher than projected, the additional 45 kilograms of science are launched and serviced at $182 thousand per kilogram. This is still much less than the original $552 thousand per kilogram to launch and service the science. Data and qualitative factors are presented to show that these figures are subject to a great deal of uncertainty. Nonetheless, the benefit of the higher efficiency solar cells for TRMM is far greater than the uncertainties in the analysis.
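The arithmetic behind this trade reduces to a marginal price per kilogram of added science. In the sketch below, only the $552 thousand/kg baseline and the 31/45 kg science gains come from the abstract; the extra-array-cost figures are back-computed placeholders for illustration.

```python
# Worked sketch of the cost-per-kilogram bookkeeping behind a straw-man trade.
# Only the $552K/kg baseline and the 31/45 kg science gains come from the
# abstract; the extra-array-cost figures below are back-computed placeholders.
def marginal_price_per_kg(extra_array_cost_k, extra_science_kg):
    """Price of each additional kilogram of science bought by a better array."""
    return extra_array_cost_k / extra_science_kg

BASELINE_K_PER_KG = 552.0  # $K/kg to launch and service science with Si arrays
cases = [("GaAs", 2790.0, 31.0), ("multi-junction", 2610.0, 45.0)]
for name, extra_cost_k, extra_kg in cases:
    m = marginal_price_per_kg(extra_cost_k, extra_kg)
    print(f"{name}: ${m:.0f}K per added kg vs ${BASELINE_K_PER_KG:.0f}K/kg baseline")
```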
NASA Astrophysics Data System (ADS)
Allen, R. L.
2016-12-01
Computer enhancement of side-scanning sonar plots revealed images of massive art, apparent ruins of cities, and subsea temples. Some images are about four to twenty kilometers in length. Present water depths imply that many of the finds must have been created over ten thousand years ago. Also, large carvings of giant sloths, Ice Age elk, mammoths, mastodons, and other cold climate creatures concurrently indicate great age. In offshore areas of North America, some human faces have beards and what appear to be Caucasian characteristics that clearly contrast with the native tribal images. A few images have possible physical appearances associated with Polynesians. Contacts and at least limited migrations must have occurred much further in the ancient past than previously believed. Greatly rising sea levels and radical changes away from late Ice Age climates had to be devastating to very ancient civilizations. Many images indicate that these cultures were capable of construction and massive art at or near the technological level of the Old Kingdom in Egypt. Paleoastronomy is obvious in some plots. Major concerns are how to further evaluate, catalog, protect, and conserve the creations of those cultures.
Predicting point-of-departure values from the ToxCast data (TDS)
There are fewer than two thousand health assessments available for the tens of thousands of chemicals in commerce today. Traditional toxicity testing takes time, money, and resources, leading in part to this large discrepancy. Faster and more efficient ways of understanding adverse...
Thousands of military personnel and tens of thousands of civilian workers perform tank entry procedures. OSHA regulations (1910.146) require the internal atmosphere be tested, with a calibrated direct-reading instrument, for oxygen content, flammable gases and vapors, and poten...
Madison County energy conservation study : 2012-2013 survey of roadside vegetation : [summary
DOT National Transportation Integrated Search
2014-02-01
The many thousands of miles of roads in Florida's State Highway System (SHS) are flanked by tens of thousands of acres of planted right-of-way and medians. The nature of the plants and soils in the right-of-way is important in helping to ...
[Actinomycetes of the genus Micromonospora in meadow ecosystems].
Zenova, G M; Zviagintsev, D G
2002-01-01
Investigations showed that micromonosporas, along with streptomycetes, are the major inhabitants of floodplain meadow ecosystems, where their population varies from tens of thousands to hundreds of thousands of CFU per g substrate. In spring, the population of micromonosporas in soil and on the plant roots was found to be denser than that of streptomycetes.
Thousands of military personnel and tens of thousands of civilian workers perform jet fuel tank entry procedures. Before entering the confined space of a jet fuel tank, OSHA regulations (29CFR1910.146) require the internal atmosphere be tested with a calibrated, direct-reading...
Ten Thousand Years of Solitude
DOE Office of Scientific and Technical Information (OSTI.GOV)
Benford, G.; Kirkwood, C.W.; Harry, O.
1991-03-01
This report documents the authors' work as an expert team advising the US Department of Energy on modes of inadvertent intrusion over the next 10,000 years into the Waste Isolation Pilot Project (WIPP) nuclear waste repository. Credible types of potential future accidental intrusion into the WIPP are estimated as a basis for creating warning markers to prevent inadvertent intrusion. A six-step process is used to structure possible scenarios for such intrusion, and it is concluded that the probability of inadvertent intrusion into the WIPP repository over the next ten thousand years lies between one and twenty-five percent. 3 figs., 5 tabs.
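A back-of-envelope way to read the 1-25% figure (not the expert team's six-step method) is to model intrusion as a constant small annual probability and invert the ten-thousand-year total, assuming independent years:

```python
# Sketch, not from the report: if intrusion has constant annual probability p,
# the chance of at least one intrusion in N years is 1 - (1-p)^N. Inverting the
# report's 1-25% range over 10,000 years gives the implied annual probabilities.
def prob_over_years(p_annual, years=10_000):
    return 1.0 - (1.0 - p_annual) ** years

def implied_annual(p_total, years=10_000):
    return 1.0 - (1.0 - p_total) ** (1.0 / years)

for p_total in (0.01, 0.25):
    print(f"P(10k yr) = {p_total:.0%} -> annual p ≈ {implied_annual(p_total):.2e}")
```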
Large scale exact quantum dynamics calculations: Ten thousand quantum states of acetonitrile
NASA Astrophysics Data System (ADS)
Halverson, Thomas; Poirier, Bill
2015-03-01
'Exact' quantum dynamics (EQD) calculations of the vibrational spectrum of acetonitrile (CH3CN) are performed, using two different methods: (1) phase-space-truncated momentum-symmetrized Gaussian basis and (2) correlated truncated harmonic oscillator basis. In both cases, a simple classical phase space picture is used to optimize the selection of individual basis functions, leading to drastic reductions in basis size in comparison with existing methods. Massive parallelization is also employed. Together, these tools, implemented into a single, easy-to-use computer code, enable a calculation of tens of thousands of vibrational states of CH3CN to an accuracy of 0.001-10 cm^-1.
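The basis-truncation idea can be sketched simply: keep only product basis functions whose zeroth-order energy falls below a cutoff. The sketch below uses hypothetical frequencies and a toy six-mode model (acetonitrile itself has twelve vibrational modes), so the numbers are illustrative only.

```python
# Hedged sketch of energy-based basis truncation: retain only those harmonic-
# oscillator product functions whose excitation energy above the zero point
# lies below a cutoff. Frequencies and cutoff are toy values.
from itertools import product

freqs = [3000.0, 2250.0, 1450.0, 1050.0, 920.0, 360.0]  # cm^-1, hypothetical
E_MAX = 9000.0  # excitation-energy cutoff above the zero point, cm^-1

def excitation_energy(ns):
    return sum(w * n for w, n in zip(freqs, ns))

basis = [ns for ns in product(range(8), repeat=len(freqs))
         if excitation_energy(ns) <= E_MAX]
print(f"kept {len(basis)} of {8 ** len(freqs)} product functions")
```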
Streamflow and water well responses to earthquakes.
Montgomery, David R; Manga, Michael
2003-06-27
Earthquake-induced crustal deformation and ground shaking can alter stream flow and water levels in wells through consolidation of surficial deposits, fracturing of solid rocks, aquifer deformation, and the clearing of fracture-filling material. Although local conditions affect the type and amplitude of response, a compilation of reported observations of hydrological response to earthquakes indicates that the maximum distance to which changes in stream flow and water levels in wells have been reported is related to earthquake magnitude. Detectable streamflow changes occur in areas within tens to hundreds of kilometers of the epicenter, whereas changes in groundwater levels in wells can occur hundreds to thousands of kilometers from earthquake epicenters.
ERIC Educational Resources Information Center
Wager, J. James
2012-01-01
Thousands, if not tens of thousands, of books, monographs, and articles have been written on the subject of leadership. A Google search of the word returns nearly a half-billion Web sites. As a professional who has spent nearly 40 years in the higher education sector, the author has been blessed with opportunities to view and practice leadership…
2013-10-03
the Stanford NLP Suite to create annotated dictionaries based on word morphologies; the human descriptions provide the input. The predicted... keywords from the low-level topic models are labeled through these dictionaries. For more than two POS for the same morphology, we prefer verbs, but other... redundancy, particularly retaining subjects like "man," "woman" etc. and verb morphologies (which otherwise stem to the same prefix) as proxies for ten
ERIC Educational Resources Information Center
Murphy, Edward; Bell, Randy L.
2005-01-01
On any night, the stars seen in the sky can be as close to Earth as a few light-years or as distant as a few thousand light-years. Distances this large are hard to comprehend. The stars are so far away that the fastest spacecraft would take tens of thousands of years to reach even the nearest one. Yet, astronomers have been able to accurately…
Building to Scale: An Analysis of Web-Based Services in CIC (Big Ten) Libraries.
ERIC Educational Resources Information Center
Dewey, Barbara I.
Advancing library services in large universities requires creative approaches for "building to scale." This is the case for the Committee on Institutional Cooperation (CIC, or Big Ten) libraries, whose home institutions serve thousands of students, faculty, staff, and others. Developing virtual Web-based services is an increasingly viable…
Li, Hongdong; Zhang, Yang; Guan, Yuanfang; Menon, Rajasree; Omenn, Gilbert S
2017-01-01
Tens of thousands of splice isoforms of proteins have been catalogued as predicted sequences from transcripts in humans and other species. Relatively few have been characterized biochemically or structurally. With the extensive development of protein bioinformatics, the characterization and modeling of isoform features, isoform functions, and isoform-level networks have advanced notably. Here we present applications of the I-TASSER family of algorithms for folding and functional predictions and the IsoFunc, MIsoMine, and Hisonet data resources for isoform-level analyses of network and pathway-based functional predictions and protein-protein interactions. Hopefully, predictions and insights from protein bioinformatics will stimulate many experimental validation studies.
Flow Cytometry: Impact on Early Drug Discovery.
Edwards, Bruce S; Sklar, Larry A
2015-07-01
Modern flow cytometers can make optical measurements of 10 or more parameters per cell at tens of thousands of cells per second and more than five orders of magnitude dynamic range. Although flow cytometry is used in most drug discovery stages, "sip-and-spit" sampling technology has restricted it to low-sample-throughput applications. The advent of HyperCyt sampling technology has recently made possible primary screening applications in which tens of thousands of compounds are analyzed per day. Target-multiplexing methodologies in combination with extended multiparameter analyses enable profiling of lead candidates early in the discovery process, when the greatest numbers of candidates are available for evaluation. The ability to sample small volumes with negligible waste reduces reagent costs, compound usage, and consumption of cells. Improved compound library formatting strategies can further extend primary screening opportunities when samples are scarce. Dozens of targets have been screened in 384- and 1536-well assay formats, predominantly in academic screening lab settings. In concert with commercial platform evolution and trending drug discovery strategies, HyperCyt-based systems are now finding their way into mainstream screening labs. Recent advances in flow-based imaging, mass spectrometry, and parallel sample processing promise dramatically expanded single-cell profiling capabilities to bolster systems-level approaches to drug discovery. © 2015 Society for Laboratory Automation and Screening.
Middleweight black holes found at last
NASA Astrophysics Data System (ADS)
Clery, Daniel
2018-06-01
How did giant black holes grow so big? Astronomers have long had evidence of baby black holes with masses of no more than tens of suns, and of million- or billion-solar-mass behemoths lurking at the centers of galaxies. But middle-size ones, weighing thousands or tens of thousands of suns, seemed to be missing. Their absence forced theorists to propose that supermassive black holes didn't grow gradually by slowly consuming matter, but somehow emerged as ready-made giants. Now, astronomers appear to have located some missing middleweights. An international team has scoured an archive of galaxy spectra and found more than 300 small galaxies that have the signature of intermediate mass black holes in their cores, opening new questions for theorists.
Mechanisms and modelling of waste-cement and cement-host rock interactions
NASA Astrophysics Data System (ADS)
2017-06-01
Safe and sustainable disposal of hazardous and radioactive waste is a major concern in today's industrial societies. Hazardous waste forms originate from residues of thermal treatment of waste, with fossil fuel combustion and ferrous/non-ferrous metal smelting being the most important sources in terms of waste production. Low- and intermediate-level radioactive waste is produced in the course of nuclear applications in research and energy production. For both waste forms, encapsulation in alkaline, cement-based matrices is considered to ensure long-term safe disposal. Cementitious materials are in routine use as industrial materials and have mainly been studied with respect to their evolution over a typical service life of several decades. Use of these materials in waste management applications, however, requires assessments of their performance over much longer time periods, on the order of thousands to several tens of thousands of years.
Online tools for nucleosynthesis studies
NASA Astrophysics Data System (ADS)
Göbel, K.; Glorius, J.; Koloczek, A.; Pignatari, M.; Plag, R.; Reifarth, R.; Ritter, C.; Schmidt, S.; Sonnabend, K.; Thomas, B.; Travaglio, C.
2018-01-01
The nucleosynthesis of the elements between iron and uranium involves many different astrophysical scenarios covering wide ranges of temperatures and densities. Thousands of nuclei and tens of thousands of reaction rates have to be included in the corresponding simulations. We investigate the impact of single rates on the predicted abundance distributions with post-processing nucleosynthesis simulations. We present online tools, which allow the investigation of sensitivities and integrated mass fluxes in different astrophysical scenarios.
Fabrication of arrayed Si nanowire-based nano-floating gate memory devices on flexible plastics.
Yoon, Changjoon; Jeon, Youngin; Yun, Junggwon; Kim, Sangsig
2012-01-01
Arrayed Si nanowire (NW)-based nano-floating gate memory (NFGM) devices with Pt nanoparticles (NPs) embedded in Al2O3 gate layers are successfully constructed on flexible plastics by top-down approaches. Ten arrayed Si NW-based NFGM devices are positioned on the first level. Cross-linked poly-4-vinylphenol (PVP) layers are spin-coated on them as isolation layers between the first and second levels, and another ten devices are stacked on the cross-linked PVP isolation layers. The electrical characteristics of the representative Si NW-based NFGM devices on the first and second levels exhibit threshold voltage shifts, indicating the trapping and detrapping of electrons in their NP nodes. They have an average threshold voltage shift of 2.5 V with good retention times of more than 5 × 10^4 s. Moreover, most of the devices successfully retain their electrical characteristics after about one thousand bending cycles. These well-arrayed and stacked Si NW-based NFGM devices demonstrate the potential of nanowire-based devices for large-scale integration.
Optical dipole forces: Working together
NASA Astrophysics Data System (ADS)
Aiello, Clarice D.
2017-03-01
Strength lies in numbers and in teamwork: tens of thousands of artificial atoms tightly packed in a nanodiamond act cooperatively, enhancing the optical trapping forces beyond the expected classical bulk polarizability contribution.
Yucca Mountain, Nevada - A proposed geologic repository for high-level radioactive waste
Levich, R.A.; Stuckless, J.S.
2006-01-01
Yucca Mountain in Nevada represents the proposed solution to what has been a lengthy national effort to dispose of high-level radioactive waste, waste which must be isolated from the biosphere for tens of thousands of years. This chapter reviews the background of that national effort and includes some discussion of international work in order to provide a more complete framework for the problem of waste disposal. Other chapters provide the regional geologic setting, the geology of the Yucca Mountain site, the tectonics, and climate (past, present, and future). These last two chapters are integral to prediction of long-term waste isolation. © 2007 Geological Society of America. All rights reserved.
NASA Astrophysics Data System (ADS)
Shapley, Alan H.; Hart, Pembroke J.
One of the lasting heritages of the International Geophysical Year (1957-58) is the system of world data centers (WDC) through which there has been international exchange of a wide variety of geophysical data on a continuing basis. This voluntary exchange mechanism has been remarkably successful. The basic operating costs of the centers are provided by the host country. The international exchanges are mainly by barter. The data providers number in the thousands and the users in the tens of thousands.
Dissecting the genetics of complex traits using summary association statistics.
Pasaniuc, Bogdan; Price, Alkes L
2017-02-01
During the past decade, genome-wide association studies (GWAS) have been used to successfully identify tens of thousands of genetic variants associated with complex traits and diseases. These studies have produced extensive repositories of genetic variation and trait measurements across large numbers of individuals, providing tremendous opportunities for further analyses. However, privacy concerns and other logistical considerations often limit access to individual-level genetic data, motivating the development of methods that analyse summary association statistics. Here, we review recent progress on statistical methods that leverage summary association data to gain insights into the genetic basis of complex traits and diseases.
Updated Starshade Technology Gap List
NASA Astrophysics Data System (ADS)
Crill, Brendan P.; Siegler, Nicholas
2017-01-01
NASA's Exoplanet Exploration Program (ExEP) guides the development of technology that enables the direct imaging and characterization of exo-Earths in the habitable zone of their stars, for future space observatories. Here we present the Starshade portion of the 2017 ExEP Enabling Technology Gap List, an annual update to ExEP's list of technology to be advanced in the next 1-5 years. A Starshade is an external occulter on an independent spacecraft, allowing a space telescope to achieve exo-Earth imaging contrast requirements by blocking starlight before it enters the telescope. Building and operating a Starshade requires new technology: the occulter is a structure tens of meters in diameter that must be positioned precisely at a distance of tens of thousands of kilometers from the telescope. We review the current state-of-the-art performance and the performance level that must be achieved for a Starshade.
Hydrothermal plumes over spreading-center axes: Global distributions and geological inferences
NASA Astrophysics Data System (ADS)
Baker, Edward T.; German, Christopher R.; Elderfield, Henry
Seafloor hydrothermal circulation is the principal agent of energy and mass exchange between the ocean and the earth's crust. Discharging fluids cool hot rock, construct mineral deposits, nurture biological communities, alter deep-sea mixing and circulation patterns, and profoundly influence ocean chemistry and biology. Although the active discharge orifices themselves cover only a minuscule percentage of the ridge-axis seafloor, the investigation and quantification of their effects is enhanced as a consequence of the mixing process that forms hydrothermal plumes. Hydrothermal fluids discharged from vents are rapidly diluted with ambient seawater by factors of 10^4-10^5 [Lupton et al., 1985]. During dilution, the mixture rises tens to hundreds of meters to a level of neutral buoyancy, eventually spreading laterally as a distinct hydrographic and chemical layer with a spatial scale of tens to thousands of kilometers [e.g., Lupton and Craig, 1981; Baker and Massoth, 1987; Speer and Rona, 1989].
15 CFR 950.8 - Satellite Data Services Division (SDSD).
Code of Federal Regulations, 2012 CFR
2012-01-01
... Technology Satellites (ATS) I and III geostationary research spacecraft; tens of thousands of images from the... geostationary spacecraft. In addition to visible light imagery, infrared data are available from the NIMBUS...
Designing Agent Utilities for Coordinated, Scalable and Robust Multi-Agent Systems
NASA Technical Reports Server (NTRS)
Tumer, Kagan
2005-01-01
Coordinating the behavior of a large number of agents to achieve a system level goal poses unique design challenges. In particular, problems of scaling (number of agents in the thousands to tens of thousands), observability (agents have limited sensing capabilities), and robustness (the agents are unreliable) make it impossible to simply apply methods developed for small multi-agent systems composed of reliable agents. To address these problems, we present an approach based on deriving agent goals that are aligned with the overall system goal, and can be computed using information readily available to the agents. Then, each agent uses a simple reinforcement learning algorithm to pursue its own goals. Because of the way in which those goals are derived, there is no need to use difficult to scale external mechanisms to force collaboration or coordination among the agents, or to ensure that agents actively attempt to appropriate the tasks of agents that suffered failures. To present these results in a concrete setting, we focus on the problem of finding the sub-set of a set of imperfect devices that results in the best aggregate device. This is a large distributed agent coordination problem where each agent (e.g., device) needs to determine whether to be part of the aggregate device. Our results show that the approach proposed in this work provides improvements of over an order of magnitude over both traditional search methods and traditional multi-agent methods. Furthermore, the results show that even in extreme cases of agent failures (i.e., half the agents failed midway through the simulation) the system's performance degrades gracefully and still outperforms a failure-free and centralized search algorithm. The results also show that the gains increase as the size of the system (e.g., number of agents) increases. This latter result is particularly encouraging and suggests that this method is ideally suited for domains where the number of agents is currently in the thousands and will reach tens or hundreds of thousands in the near future.
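One standard construction for agent goals that are aligned with the system goal, in the spirit described here, is the difference utility D_i = G(z) - G(z_{-i}), an agent's marginal contribution to the system objective. The sketch below applies it to the device-selection problem; the aggregate-quality function is a hypothetical stand-in, not the paper's.

```python
# Minimal sketch of aligned agent utilities for device selection, using the
# difference-utility form: reward each agent by G(z) - G(z with itself removed).
# aggregate_quality is a made-up placeholder objective.
import random

def aggregate_quality(qualities):
    """Hypothetical system objective G: quality of the aggregate device."""
    if not qualities:
        return 0.0
    mean_q = sum(qualities) / len(qualities)
    return mean_q - 0.001 * (len(qualities) - 500) ** 2  # penalize size mismatch

def difference_utility(i, qualities):
    """D_i = G(z) - G(z_{-i}): agent i's marginal contribution to G."""
    return aggregate_quality(qualities) - aggregate_quality(
        qualities[:i] + qualities[i + 1:])

random.seed(0)
devices = [random.random() for _ in range(1000)]
chosen = [q for q in devices if q > 0.5]   # one candidate subset
print(f"D for first chosen agent: {difference_utility(0, chosen):.5f}")
```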
Exploring the Influence of Dynamic Disorder on Excitons in Solid Pentacene
NASA Astrophysics Data System (ADS)
Wang, Zhiping; Sharifzadeh, Sahar; Doak, Peter; Lu, Zhenfei; Neaton, Jeffrey
2014-03-01
A complete understanding of the spectroscopic and charge transport properties of organic semiconductors requires knowledge of the role of thermal fluctuations and dynamic disorder. We present a first-principles theoretical study aimed at understanding the degree to which dynamic disorder at room temperature results in energy level broadening and excited-state localization within bulk crystalline pentacene. Ab initio molecular dynamics simulations are well-equilibrated for 7-9 ps, and tens of thousands of structural snapshots, taken at 0.5 fs intervals, provide input for many-body perturbation theory within the GW approximation and Bethe-Salpeter equation (BSE) approach. The GW-corrected density of states, including thousands of snapshots, indicates that thermal fluctuations significantly broaden the valence and conduction states by >0.2 eV. Additionally, we investigate the nature and energy of the lowest energy singlet and triplet excitons, computed for a set of uncorrelated and energetically preferred structures. This work was supported by DOE; computational resources were provided by NERSC.
Our Globally Changing Climate. Chapter 1
NASA Technical Reports Server (NTRS)
Wuebbles, D. J.; Easterling, D. R.; Hayhoe, K.; Knutson, T.; Kopp, R. E.; Kossin, J. P.; Kunkel, K. E.; LeGrande, A. N.; Mears, C.; Sweet, W. V.;
2017-01-01
Since the Third U.S. National Climate Assessment (NCA3) was published in May 2014, new observations along multiple lines of evidence have strengthened the conclusion that Earth's climate is changing at a pace and in a pattern not explainable by natural influences. While this report focuses especially on observed and projected future changes for the United States, it is important to understand those changes in the global context (this chapter). The world has warmed over the last 150 years, especially over the last six decades, and that warming has triggered many other changes to Earth's climate. Evidence for a changing climate abounds, from the top of the atmosphere to the depths of the oceans. Thousands of studies conducted by tens of thousands of scientists around the world have documented changes in surface, atmospheric, and oceanic temperatures; melting glaciers; disappearing snow cover; shrinking sea ice; rising sea level; and an increase in atmospheric water vapor. Rainfall patterns and storms are changing, and the occurrence of droughts is shifting.
High-resolution 18 cm spectra of OH/IR stars
NASA Astrophysics Data System (ADS)
Fix, John D.
1987-02-01
High-velocity-resolution, high-signal-to-noise spectra have been obtained for the 18 cm maser emission lines from a number of optically visible OH/IR stars. The spectra have been interpreted in terms of a recent model by Alcock and Ross (1986), in which OH/IR stars lose mass in discrete elements rather than by a continuous wind. Comparison of the observed spectra with synthetic spectra shows that the lines are the composite emission from thousands or tens of thousands of individual elements.
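The composite-spectrum comparison can be sketched by summing narrow Gaussian components from many discrete elements, as in the Alcock and Ross picture. All distributions and parameters below are hypothetical, chosen only to illustrate how thousands of elements blend into a smooth composite line.

```python
# Illustrative synthesis of a composite maser line profile as the sum of
# emission from thousands of discrete mass-loss elements, each a narrow
# Gaussian at a random line-of-sight velocity. All parameters are toy values.
import numpy as np

rng = np.random.default_rng(0)
n_elements = 10_000
v = np.linspace(-20.0, 20.0, 2000)                  # velocity grid, km/s
centers = rng.uniform(-15.0, 15.0, n_elements)      # element LOS velocities
widths = rng.uniform(0.1, 0.3, n_elements)          # narrow intrinsic widths
amps = rng.lognormal(0.0, 1.0, n_elements)          # element brightnesses

spectrum = np.zeros_like(v)
for c, w, a in zip(centers, widths, amps):
    spectrum += a * np.exp(-0.5 * ((v - c) / w) ** 2)
print(f"peak/mean flux ratio: {spectrum.max() / spectrum.mean():.1f}")
```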
77 FR 19648 - Receipt of Application for a Permit Modification
Federal Register 2010, 2011, 2012, 2013, 2014
2012-04-02
... primarily in the region of the Florida coast from Naples to Key West, encompassing the Ten Thousand Islands... following offices: Permits and Conservation Division, Office of Protected Resources, NMFS, 1315 East-West...
US Army Evaluations: A Study of Inaccurate and Inflated Reporting
2012-04-26
decisions such as promotions directly impacting the careers of tens of thousands of the Army's leaders, both officer and NCO, has few equals in the... accurate, and equitable performance ratings throughout the Army. Many of the revisions were caused by the inability of selection boards to discern a... should be assigned a numerical percentage; superior equals top ten percent, excellence equals top twenty-five percent, success equals top fifty
Federal Register 2010, 2011, 2012, 2013, 2014
2011-01-24
... susceptibility of enteric bacteria to antimicrobial agents of medical importance. The NARMS program, established... infected with these bacteria, resulting in tens of thousands of hospitalizations and hundreds of deaths...
75 FR 75845 - National Impaired Driving Prevention Month, 2010
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-07
... United States of America A Proclamation Every day, millions of Americans travel on our Nation's roadways... hand this first day of December, in the year of our Lord two thousand ten, and of the Independence of...
76 FR 45395 - National Korean War Veterans Armistice Day, 2011
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-29
... Agreement at Panmunjom secured the border near the 38th parallel. Together, American service members and... cause of freedom and stability in East Asia and around the world. Today, we honor the tens of thousands...
Tue, Nguyen Minh; Takahashi, Shin; Subramanian, Annamalai; Sakai, Shinichi; Tanabe, Shinsuke
2013-07-01
E-waste recycling using uncontrolled processes is a major source of dioxin-related compounds (DRCs), including not only the regulated polychlorinated dibenzo-p-dioxins/dibenzofurans (PCDD/Fs) and dioxin-like polychlorinated biphenyls (DL-PCBs) but also non-regulated brominated and mixed halogenated compounds (PBDD/Fs and PXDD/Fs). Various studies at informal e-waste recycling sites (EWRSs) in Asian developing countries found soil contamination levels of PCDD/Fs ranging from tens to ten thousand picograms TCDD-equivalents (TEQ) per gram, and those of DL-PCBs up to hundreds of picograms TEQ per gram. The air concentration of PCDD/Fs was reported as high as 50 pg TEQ per m^3 in Guiyu, the largest Chinese EWRS. Non-regulated compounds also contributed substantially to the total DL toxicity of the DRC mixtures from e-waste, as evidenced by the high TEQ levels estimated for the currently identifiable PBDD/Fs as well as the large portion of unexplained bioassay-derived TEQ levels in soils/dusts from EWRSs. Considering the high exposure levels estimated for EWRS residents, especially children, comprehensive emission inventories of DRCs from informal e-waste recycling, the identities and toxic potencies of unidentified DRCs released, and their impacts on human health need to be investigated in future studies.
Physical and Chemical Properties of Anthropogenic Aerosols: An Overview
Aerosol chemical composition is complex. Combustion aerosols can comprise tens of thousands of organic compounds, refractory brown and black carbon, heavy metals, cations, anions, salts, and other inorganic phases. Aerosol organic matter normally contains semivolatile material th...
Connectivity-enhanced route selection and adaptive control for the Chevrolet Volt
Gonder, Jeffrey; Wood, Eric; Rajagopalan, Sai
2016-01-01
The National Renewable Energy Laboratory and General Motors evaluated connectivity-enabled efficiency enhancements for the Chevrolet Volt. A high-level model was developed to predict vehicle fuel and electricity consumption based on driving characteristics and vehicle state inputs. These techniques were leveraged to optimize energy efficiency via green routing and intelligent control mode scheduling, which were evaluated using prospective driving routes between tens of thousands of real-world origin/destination pairs. The overall energy savings potential of green routing and intelligent mode scheduling was estimated at 5% and 3%, respectively. Furthermore, these represent substantial opportunities considering that they only require software adjustments to implement.
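Green routing in this setting amounts to scoring each candidate route with a vehicle energy model and picking the minimum. The sketch below is not NREL's model; the regression form and every coefficient are placeholders.

```python
# Minimal green-routing sketch (hypothetical energy model, not NREL's):
# predict energy for each candidate route from simple driving characteristics,
# then choose the route with the lowest prediction.
def predicted_energy_kwh(distance_km, mean_speed_kph, stops_per_km):
    return 0.15 * distance_km + 0.002 * mean_speed_kph ** 1.5 + 0.8 * stops_per_km

routes = {
    "highway": (18.0, 95.0, 0.1),   # (distance, mean speed, stops/km), toy data
    "arterial": (14.0, 55.0, 0.6),
    "local": (12.5, 35.0, 1.4),
}
best = min(routes, key=lambda r: predicted_energy_kwh(*routes[r]))
print(best, {r: round(predicted_energy_kwh(*v), 2) for r, v in routes.items()})
```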
Mining the human phenome using allelic scores that index biological intermediates.
Evans, David M; Brion, Marie Jo A; Paternoster, Lavinia; Kemp, John P; McMahon, George; Munafò, Marcus; Whitfield, John B; Medland, Sarah E; Montgomery, Grant W; Timpson, Nicholas J; St Pourcain, Beate; Lawlor, Debbie A; Martin, Nicholas G; Dehghan, Abbas; Hirschhorn, Joel; Smith, George Davey
2013-10-01
It is common practice in genome-wide association studies (GWAS) to focus on the relationship between disease risk and genetic variants one marker at a time. When relevant genes are identified it is often possible to implicate biological intermediates and pathways likely to be involved in disease aetiology. However, single genetic variants typically explain small amounts of disease risk. Our idea is to construct allelic scores that explain greater proportions of the variance in biological intermediates, and subsequently use these scores to data mine GWAS. To investigate the approach's properties, we indexed three biological intermediates where the results of large GWAS meta-analyses were available: body mass index, C-reactive protein and low density lipoprotein levels. We generated allelic scores in the Avon Longitudinal Study of Parents and Children, and in publicly available data from the first Wellcome Trust Case Control Consortium. We compared the explanatory ability of allelic scores in terms of their capacity to proxy for the intermediate of interest, and the extent to which they associated with disease. We found that allelic scores derived from known variants and allelic scores derived from hundreds of thousands of genetic markers explained significant portions of the variance in biological intermediates of interest, and many of these scores showed expected correlations with disease. Genome-wide allelic scores however tended to lack specificity suggesting that they should be used with caution and perhaps only to proxy biological intermediates for which there are no known individual variants. Power calculations confirm the feasibility of extending our strategy to the analysis of tens of thousands of molecular phenotypes in large genome-wide meta-analyses. We conclude that our method represents a simple way in which potentially tens of thousands of molecular phenotypes could be screened for causal relationships with disease without having to expensively measure these variables in individual disease collections.
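The allelic-score construction itself is a weighted sum of allele dosages, with weights taken from GWAS effect-size estimates. A minimal sketch with toy data:

```python
# Minimal allelic (polygenic) score sketch: score = dosages @ betas, where
# dosages count risk alleles (0/1/2) and betas are GWAS effect sizes.
# All data below are toy placeholders.
import numpy as np

def allelic_score(dosages, betas):
    """dosages: (n_individuals, n_snps) matrix of 0/1/2; betas: effect sizes."""
    return dosages @ betas

rng = np.random.default_rng(1)
dosages = rng.integers(0, 3, size=(5, 8)).astype(float)  # 5 people, 8 SNPs
betas = rng.normal(0.0, 0.05, size=8)                    # per-allele effects
print(allelic_score(dosages, betas))  # one score per individual
```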
Improving data quality in neuronal population recordings
Harris, Kenneth D.; Quian Quiroga, Rodrigo; Freeman, Jeremy; Smith, Spencer
2017-01-01
Understanding how the brain operates requires understanding how large sets of neurons function together. Modern recording technology makes it possible to simultaneously record the activity of hundreds of neurons, and technological developments will soon allow recording of thousands or tens of thousands. As with all experimental techniques, these methods are subject to confounds that complicate the interpretation of such recordings, and could lead to erroneous scientific conclusions. Here, we discuss methods for assessing and improving the quality of data from these techniques, and outline likely future directions in this field. PMID:27571195
Aviation Security: Slow Progress in Addressing Long-Standing Screener Performance Problems
2000-03-16
aviation security, in particular airport screeners. Securing an air transportation system the size of this nation's, with hundreds of airports, thousands of aircraft, and tens of thousands of flights daily carrying millions of passengers and pieces of baggage, is a difficult task. Events over the past decade have shown that the threat of terrorism against the United States is an ever-present danger. Aviation is an attractive target for terrorists, and because the air transportation system is critical to the nation's well-being, protecting it is an important
Cosmic impact: What are the odds?
NASA Astrophysics Data System (ADS)
Harris, A. W.
2009-12-01
Firestone et al. (PNAS 104, 16016-16021, 2007) propose that the impact of a ~4 km diameter comet (or multiple bodies making up a similar mass) led to the Younger Dryas cooling and extinction of megafauna in North America, 12,900 years ago. Even more provocatively, Firestone et al. (Cycle of Cosmic Catastrophes, Bear & Co. Books, 2006, 392pp) suggest that a nearby supernova may have produced a comet shower leading to the impact event, either by disturbing the Oort Cloud or by direct injection of 4-km comet-like bodies to the solar neighborhood. Here we show: (a) A supernova shockwave or mass ejection is not capable of triggering a shower of comets from the Oort Cloud. (b) An Oort Cloud shower from whatever cause would take 100,000 years or more for the perturbed comets to arrive in the inner solar system, and the peak flux would persist for some hundreds of thousands more years. (c) Even if all 20 solar masses or so of ejected matter from a SN were in the form of 4-km diameter balls, the probability of even one such ball hitting the Earth from an event 100 light years away would be about 3 × 10^-5. (d) A 4-km diameter ball traveling fast enough to get here from 100 LY away in some tens of thousands of years would deliver the energy of a 50 km diameter impactor traveling at typical Earth-impact velocity (~20 km/sec). We review the current impact flux on the Earth from asteroids and comets, and show that the probability of an impact of a 4-km diameter asteroid in an interval of 13,000 years is about one in a thousand, and the probability of a comet impact of that size is a few in a million. An "impact shower" caused by the injection or breakup of comets or asteroids in the inner solar system by whatever means would take tens to hundreds of thousands of years to clear out; thus the population of NEOs we see now with our telescopic surveys is what we've had for the last few tens of thousands of years, at least. Faced with such low odds, the evidence that such a large cosmic impact is the cause of the Younger Dryas boundary and cooling, and that there is no other possible cause, needs to be extraordinary indeed.
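The quoted odds follow from a standard constant-rate (Poisson) impact model, which is easy to check; the sketch below is back-of-envelope arithmetic, not the authors' code.

```python
# Constant-rate impact model: P(>=1 impact in T years) = 1 - exp(-rate * T).
# The 1-in-1000 chance over 13,000 years from the abstract implies the annual
# rate computed below.
import math

def impact_prob(rate_per_year, years):
    return 1.0 - math.exp(-rate_per_year * years)

rate = -math.log(1.0 - 1e-3) / 13_000  # invert the quoted 13,000-yr probability
print(f"implied rate ≈ {rate:.2e}/yr; check: {impact_prob(rate, 13_000):.4f}")
```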
Estimation of toxicity using the Toxicity Estimation Software Tool (TEST)
Tens of thousands of chemicals are currently in commerce, and hundreds more are introduced every year. Since experimental measurements of toxicity are extremely time consuming and expensive, it is imperative that alternative methods to estimate toxicity are developed.
Developing Non-Targeted Measurement Methods to Characterize the Human Exposome
The exposome represents all exposures experienced by an individual during their lifetime. Registered chemicals currently number in the tens-of-thousands, and therefore comprise a significant portion of the human exposome. To date, quantitative monitoring methods have been develop...
Atmospheric Science Data Center
2013-04-15
... of which occurred north of Khartoum. According to the Food and Agriculture Organization of the United Nations, tens of thousands of ... fled their homes, and the number of people in need of urgent food assistance in Sudan, estimated at three million earlier in the year, was ...
ERIC Educational Resources Information Center
Wilson, David L.
1994-01-01
College administrators recently appealed to students and faculty to change their computer passwords after security experts announced that tens of thousands had been stolen by computer hackers. Federal officials are investigating. Such attacks are not uncommon, but the most effective solutions are either inconvenient or cumbersome. (MSE)
Federal Initiative: Tick-Borne Disease Integrated Pest Management White Paper
The numbers of human cases of Lyme disease and other tick-borne diseases (TBDs) reported each year to CDC have been increasing steadily in the United States (US), currently totaling tens of thousands of diagnosed human cases annually.
Exposure-Based Prioritization of Chemicals for Risk Assessment
Manufactured chemicals are used extensively to produce a wide variety of consumer goods and are required by important industrial sectors. Presently, information is insufficient to estimate risks posed to human health and the environment from the over ten thousand chemical substan...
Ashy Aftermath of Indonesian Volcano Eruption seen by NASA Spacecraft
2014-02-23
On Feb. 13, 2014, a violent eruption of the Kelud stratovolcano in Java, Indonesia, sent volcanic ash over an area of 70,000 square miles, prompting the evacuation of tens of thousands of people. This image is from NASA's Terra spacecraft.
Takemori, Nobuaki; Takemori, Ayako; Tanaka, Yuki; Endo, Yaeta; Hurst, Jane L.; Gómez-Baena, Guadalupe; Harman, Victoria M.; Beynon, Robert J.
2017-01-01
A major challenge in proteomics is the accurate absolute quantification of large numbers of proteins. QconCATs, artificial proteins that are concatenations of multiple standard peptides, are well established as an efficient means to generate standards for proteome quantification. Previously, QconCATs have been expressed in bacteria, but we now describe QconCAT expression in a robust, cell-free system. The new expression approach rescues QconCATs that previously could not be expressed in bacteria and can reduce the incidence of proteolytic damage to QconCATs. Moreover, it is possible to cosynthesize QconCATs in a highly multiplexed translation reaction, coexpressing tens or hundreds of QconCATs simultaneously. By obviating bacterial culture and through the gain of high-level multiplexing, it is now possible to generate tens of thousands of standard peptides in a matter of weeks, rendering absolute quantification of a complex proteome highly achievable in a reproducible, broadly deployable system. PMID:29055021
Predicting organ toxicity using in vitro bioactivity data and chemical structure
Animal testing alone cannot practically evaluate the health hazard posed by tens of thousands of environmental chemicals. Computational approaches together with high-throughput experimental data may provide more efficient means to predict chemical toxicity. Here, we use a superv...
Acupuncture Reduces Breast Cancer Joint Pain | Division of Cancer Prevention
In the largest, most rigorous study of its kind, acupuncture was found to significantly reduce the debilitating joint pain experienced by tens of thousands of women each year while being treated for early stage breast cancer with aromatase inhibitors (AIs). |
TOXCAST: A TOOL FOR THE PRIORITIZATION OF CHEMICALS FOR TOXICOLOGICAL EVALUATION
Due to various legislative mandates, the US EPA is faced with evaluating the potential of tens of thousands of chemicals (e.g., high production volume chemicals, pesticidal inerts, and drinking water contaminants) to cause adverse human health and environmental effects.
TOXCAST: A PROGRAM FOR PRIORITIZING TOXICITY TESTING OF ENVIRONMENTAL CHEMICALS
Evaluating the potential of tens of thousands of chemicals for risk to human health and the environment is beyond the resource limits of the Environmental Protection Agency. The EPA's ToxCast program will explore alternative methods comprising computational chemistry, high-throug...
3 CFR 8695 - Proclamation 8695 of July 26, 2011. National Korean War Veterans Armistice Day, 2011
Code of Federal Regulations, 2012 CFR
2012-01-01
... Armistice Agreement at Panmunjom secured the border near the 38th parallel. Together, American service... cause of freedom and stability in East Asia and around the world. Today, we honor the tens of thousands...
Analysis of the chemical and physical properties of combustion aerosols: Properties overview
Aerosol chemical composition is remarkably complex. Combustion aerosols can comprise tens of thousands of organic compounds and fragments, refractory carbon, metals, cations, anions, salts, and other inorganic phases and substituents [Hays et al., 2004]. Aerosol organic matter no...
Etoile Project: Social Intelligent ICT-System for very large scale education in complex systems
NASA Astrophysics Data System (ADS)
Bourgine, P.; Johnson, J.
2009-04-01
The project will devise new theory and implement new ICT-based methods of delivering high-quality, low-cost postgraduate education to many thousands of people in a scalable way, with the cost of each extra student being negligible (< a few Euros). The research will create an in vivo laboratory of one to ten thousand postgraduate students studying courses in complex systems. This community is chosen because it is large and interdisciplinary and there is a known requirement for courses for thousands of students across Europe. The project involves every aspect of course production and delivery. Within this, the research will focus on the creation of a Socially Intelligent Resource Mining system to gather large volumes of high-quality educational resources from the internet; new methods to deconstruct these to produce a semantically tagged Learning Object Database; a Living Course Ecology to support the creation and maintenance of evolving course materials; systems to deliver courses; and a 'socially intelligent assessment system'. The system will be tested on one to ten thousand postgraduate students in Europe working towards the Complex System Society's title of European PhD in Complex Systems. Étoile will have a very high impact both scientifically and socially by (i) providing new scalable ICT-based methods for very low cost scientific education, (ii) creating new mathematical and statistical theory for the multiscale dynamics of complex systems, (iii) providing a working example of adaptation and emergence in complex socio-technical systems, and (iv) making a major educational contribution to European complex systems science and its applications.
NASA Astrophysics Data System (ADS)
Draisma, Stefano G. A.; Prud'homme van Reine, Willem F.; Herandarudewi, Sekar M. C.; Hoeksema, Bert W.
2018-01-01
The Jakarta Bay - Thousand Islands reef complex extends more than 80 km in a northwesterly direction from the major conurbation of Jakarta (Indonesia) along a pronounced inshore-to-offshore environmental gradient. The present study aims to determine to what extent environmental factors can explain the composition of macroalgal communities on the reefs off Jakarta. To that end, the presence-absence of 67 macroalgal taxa was recorded at 27 sampling sites along the inshore-offshore disturbance gradient and analysed together with substrate and water quality variables. The macroalgal richness pattern matches the pattern of other reef taxa. The 27 sites could be assigned to one of four geographical zones with 85% certainty based on their macroalgal taxon assemblages. These four zones (i.e., Jakarta Bay and, respectively, the South, Central, and North Thousand Islands) had significantly different macroalgal assemblages, except for the North and South zones. Along the nearshore gradient there was a greater shift in taxon composition than within the central Thousand Islands. The patterns of ten habitat and water quality variables resembled the macroalgal diversity patterns by 56%. All ten variables together explained 69% of the variation in macroalgal composition. Shelf depth, % sand cover, gelbstoff/detrital material, chlorophyll a concentration, seawater surface temperature, and % dead coral cover were the best predictors of seaweed flora composition. Furthermore, 44 macroalgal species represented new records for the area. The present study provides important baseline data on macroalgae for comparison in future biodiversity assessments in the area and elsewhere in the region.
Nuclear waste's human dimension
DOE Office of Scientific and Technical Information (OSTI.GOV)
Erikson, K.; Colglazier, E.W.; White, G.F.
1994-12-31
The United States has pinned its hopes for a permanent underground repository for its high-level nuclear wastes on Yucca Mountain, Nevada. Nevertheless, the Department of Energy's (DOE) site research efforts have failed "to adequately consider human behavior and emotions," write Kai Erikson of Yale University, E. William Colglazier of the National Academy of Sciences, and Gilbert F. White of the University of Colorado. The authors maintain that it is impossible to predict changes in geology, seismology, and hydrology that may affect the Yucca Mountain area over the next 1,000 years. Predicting human behavior in that time frame remains even more daunting, they insist. They admit that "DOE...has been given the impossible assignment to take tens of thousands of metric tons of the most hazardous materials ever created and, in the face of growing opposition, entomb them so that they will do little harm for thousands of years." The researchers suggest that the government seek a secure, retrievable storage arrangement while it continues its search for safer long-term options.
Diversity arrays technology: a generic genome profiling technology on open platforms.
Kilian, Andrzej; Wenzl, Peter; Huttner, Eric; Carling, Jason; Xia, Ling; Blois, Hélène; Caig, Vanessa; Heller-Uszynska, Katarzyna; Jaccoud, Damian; Hopper, Colleen; Aschenbrenner-Kilian, Malgorzata; Evers, Margaret; Peng, Kaiman; Cayla, Cyril; Hok, Puthick; Uszynski, Grzegorz
2012-01-01
In the last 20 years, we have observed an exponential growth of DNA sequence data and a similar increase in the volume of DNA polymorphism data generated by numerous molecular marker technologies. Most of the investment, and therefore progress, has concentrated on the human genome and the genomes of selected model species. Diversity Arrays Technology (DArT), developed over a decade ago, was among the first "democratizing" genotyping technologies, as its performance was primarily driven by the level of DNA sequence variation in the species rather than by the level of financial investment. DArT has also proved more robust to genome size and ploidy-level differences than other high-throughput genotyping technologies among the approximately 60 organisms for which DArT has been developed to date. The success of DArT in a number of organisms, including a wide range of "orphan crops," can be attributed to the simplicity of the underlying concepts: DArT combines genome complexity reduction methods enriching for genic regions with a highly parallel assay readout on a number of "open-access" microarray platforms. The quantitative nature of the assay has enabled a number of applications in which allelic frequencies can be estimated from DArT arrays. A typical DArT assay tests tens of thousands of genomic loci for polymorphism, with the final number of markers reported (hundreds to thousands) reflecting the level of DNA sequence variation in the tested loci. Detailed DArT methods and protocols, a range of application examples, and DArT's evolution path are presented.
Teach Astronomy: An Educational Resource for Formal and Informal Learners
NASA Astrophysics Data System (ADS)
Impey, Chris David
2018-01-01
Teach Astronomy is an educational resource, available in the form of a user-friendly, platform-agnostic website. Ideal for college-level, introductory astronomy courses, Teach Astronomy can be a valuable reference for astronomers at all levels, especially informal learners. Over the past year, multiple changes have been made to the infrastructure behind Teach Astronomy to provide high availability to our tens of thousands of unique monthly users, as well as to support new features. Teach Astronomy contains interactive tools which supplement the free textbook, such as a Quiz Tool with real-time feedback. The site also provides a searchable collection of Chris Impey's responses to questions frequently asked by our users. The developers and educators behind Teach Astronomy are working to create an environment which encourages astronomy students of all levels to continue to increase their knowledge and help others learn.
Organizational capacity of nonprofit social service agencies.
Paynter, Sharon; Berner, Maureen
2014-01-01
The U.S. social safety net is formed by governmental and nonprofit organizations, which are trying to respond to record levels of need. This is especially true for local-level organizations, such as food pantries. The organizational capacity literature has not covered front-line, local, mostly volunteer, low-resource organizations in the same depth as larger ones. This analysis considers whether grassroots nonprofit organizations have the ability to be a strong component of the social safety net. Based on the literature on organizational capacity, a model is developed to examine how service delivery at the local level is affected by organizational capacity. Surprisingly, we find that few of the characteristics previously identified as important are statistically significant in this study. Even when they are, the material effect is negligible. Current organizational capacity research may apply to larger nonprofits, but not to the tens of thousands of small community nonprofits, a significant limitation of the research to date.
Utility of acoustical detection of Coptotermes Formosanus (Isoptera: Rhinotermitidae)
USDA-ARS?s Scientific Manuscript database
The AED 2000 and 2010 are extremely sensitive listening devices which can effectively detect and monitor termite activity through a wave guide (e.g. bolt) both qualitatively and quantitatively. Experiments conducted with one to ten thousand termites from differing colonies infesting wood in buckets...
Mining Human Biomonitoring Data to Identify Prevalent Chemical Mixtures (SOT abstract)
Through food, water, air, and consumer products, humans are exposed to tens of thousands of environmental chemicals, and most of these have not been evaluated to determine their potential toxicities. In recent years, high-throughput screening (HTS) methods have been developed tha...
Exposure Considerations for Chemical Prioritization and Toxicity Testing
Globally there is a need to characterize potential risk to human health and the environment that arises from the manufacture and use of tens of thousands of chemicals. Currently, a significant research effort is underway to apply new technologies to screen and prioritize chemica...
Source-to-Dose Modeling of Phthalates: Lessons for Prioritization
Globally there is a need to characterize potential risk to human health and the environment that arises from the manufacture and use of tens of thousands of chemicals. The US EPA is developing methods for using computational chemistry, high-throughput screening, and toxicogenomi...
20180312 - Structure-based QSAR Models to Predict Systemic Toxicity Points of Departure (SOT)
Human health risk assessment associated with environmental chemical exposure is limited by the tens of thousands of chemicals with little or no experimental in vivo toxicity data. Data gap filling techniques, such as quantitative structure activity relationship (QSAR) models base...
Quantitative Prediction of Systemic Toxicity Points of Departure (OpenTox USA 2017)
Human health risk assessment associated with environmental chemical exposure is limited by the tens of thousands of chemicals with little or no experimental in vivo toxicity data. Data gap filling techniques, such as quantitative models based on chemical structure information, are c...
Developing an Experimental Model of Vascular Toxicity in Embryonic Zebrafish
Tamara Tal, Integrated Systems Toxicology Division, U.S. EPA. Background: There are tens of thousands of chemicals that have yet to be fully evaluated for their toxicity by validated in vivo testing ...
Extraterrestrial Communications.
ERIC Educational Resources Information Center
Deardorff, James W.
1987-01-01
Discusses the embargo hypothesis--the theory that Earth is apparently free from alien exploitation because of a presumed cosmic quarantine against this planet--which implies that, instead of being only a few hundred years technologically in advance of earthly civilization, extraterrestrials in charge are likely tens of thousands of years in…
NASA Astrophysics Data System (ADS)
Baek, Seung Ki; Minnhagen, Petter; Kim, Beom Jun
2011-07-01
In Korean culture, the names of family members are recorded in special family books. This makes it possible to follow the distribution of Korean family names far back in history. It is shown here that these name distributions are well described by a simple null model, the random group formation (RGF) model. This model makes it possible to predict how the name distributions change and these predictions are shown to be borne out. In particular, the RGF model predicts that for married women entering a collection of family books in a certain year, the occurrence of the most common family name 'Kim' should be directly proportional to the total number of married women with the same proportionality constant for all the years. This prediction is also borne out to a high degree. We speculate that it reflects some inherent social stability in the Korean culture. In addition, we obtain an estimate of the total population of the Korean culture down to the year 500 AD, based on the RGF model, and find about ten thousand Kims.
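The proportionality prediction is simple enough to test directly: under the RGF model the yearly count of 'Kim' entries should be a fixed fraction of the yearly total of married women entering the books. A minimal sketch of such a test follows; the numbers are invented for illustration and are not data from the paper.

```python
# Fit the RGF-style prediction kims = c * total (a line through the origin)
# to yearly counts. The (total, kims) pairs are made-up illustrative values.
import numpy as np

total = np.array([1200, 3400, 5100, 8000, 12000], dtype=float)  # married women/year
kims  = np.array([ 260,  740, 1090, 1750,  2600], dtype=float)  # 'Kim' entries/year

# Least-squares estimate of the proportionality constant.
c = (total @ kims) / (total @ total)
residual = kims - c * total

print(f"fitted proportionality constant c ~ {c:.3f}")
print(f"relative scatter around the fit: {np.std(residual / kims):.2%}")
```

A constant c that holds across all years, with small scatter, is the signature of the "inherent social stability" the abstract describes.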
The Eleventh Plague: The Politics of Biological and Chemical Warfare
NASA Astrophysics Data System (ADS)
Kovac, Jeffrey
1997-07-01
Leonard A. Cole. W. H. Freeman: New York, 1997. 250 pp. ISBN 0-7167-2950-4. $22.95 hc. The Eleventh Plague begins with a recitation of the ten plagues brought down upon Egypt, part of the Passover Seder celebrated each spring by Jews all over the world. Spring is also the anniversary of the first use of chemical weapons. On April 22, 1915, German soldiers released chlorine gas from 5,739 cylinders installed along the battle line at Ypres in southeastern Belgium. Germany achieved complete surprise. The gas drifted across no man's land, causing widespread terror and creating ten thousand serious casualties and five thousand deaths. Chlorine, of course, was a poor weapon, easily neutralized, but German scientists, including future Nobel laureates Fritz Haber, Otto Hahn, and James Franck, and the German chemical industry created ever more dangerous chemical weapons, culminating with the introduction of mustard gas in 1917. Despite cries of moral outrage, the Allies countered with their own chemical weapons efforts. The eleventh plague had been unleashed.
Robben, Antonius C G M
2014-01-01
This article uses the dual process model (DPM) in an analysis of the national mourning of the tens of thousands of disappeared in Chile and Argentina by adapting the model from the individual to the collective level, where society as a whole is bereaved. Perpetrators are also involved in the national mourning process as members of a bereaved society. This article aims to (a) demonstrate the DPM's significance for the analysis of national mourning in post-conflict societies and (b) explain oscillations between loss orientation and restoration orientation in coping with massive losses that seem contradictory from a grief work perspective.
Simulation of Chronic Liver Injury Due to Environmental Chemicals
US EPA Virtual Liver (v-Liver) is a cellular systems model of hepatic tissues to predict the effects of chronic exposure to chemicals. Tens of thousands of chemicals are currently in commerce and hundreds more are introduced every year. Few of these chemicals have been adequate...
EDSP Prioritization: Collaborative Estrogen Receptor Activity Prediction Project (CERAPP) (SOT)
Humans are potentially exposed to tens of thousands of man-made chemicals in the environment. It is well known that some environmental chemicals mimic natural hormones and thus have the potential to be endocrine disruptors. Most of these environmental chemicals have never been te...
Invited presentation at Dalton College, Dalton, GA, to the Alliance for Innovation & Sustainability, April 20, 2017. U.S. EPA's Computational Toxicology Program: Innovation Powered by Chemistry. It is estimated that tens of thousands of commercial and industrial chemicals are ...
Characterization of the potential adverse effects is lacking for tens of thousands of chemicals that are present in the environment, and characterization of developmental neurotoxicity (DNT) hazard lags behind that of other adverse outcomes (e.g. hepatotoxicity). This is due in p...
Theoretical Framework to Extend Adverse Outcome Pathways to Include Pharmacokinetic Considerations
Adverse Outcome Pathways (AOPs) have generated intense interest for their utility in linking known population outcomes to a molecular initiating event (MIE) that can be quantified using in vitro methods. While there are tens of thousands of chemicals in commercial use, biology h...
Recessionary Layoffs in Museum Education: Survey Results and Implications
ERIC Educational Resources Information Center
Kley, Ron
2009-01-01
A recent survey of recession-driven museum staff reductions suggests the possible loss of tens of thousands of museum personnel nationwide and identifies educators as among those most severely impacted. Survey findings are summarized, and the implications for both affected personnel and downsized institutions are considered.
SPATIAL ASSOCIATION BETWEEN SPECIATED FINE PARTICLES AND MORTALITY
Particulate matter (PM) has been linked to a range of serious cardiovascular and respiratory health problems. Some of the recent epidemiologic studies suggest that exposures to PM may result in tens of thousands of excess deaths per year and many more cases of illness among the ...
Booth, Amanda C.; Soderqvist, Lars E.
2016-12-12
Freshwater flow to the Ten Thousand Islands estuary has been altered by the construction of the Tamiami Trail and the Southern Golden Gate Estates. The Picayune Strand Restoration Project, which is associated with the Comprehensive Everglades Restoration Plan, has been implemented to improve freshwater delivery to the Ten Thousand Islands estuary by removing hundreds of miles of roads, emplacing hundreds of canal plugs, removing exotic vegetation, and constructing three pump stations. Quantifying the tributary flows and salinity patterns prior to, during, and after the restoration is essential to assessing the effectiveness of upstream restoration efforts.

Tributary flow and salinity patterns during preliminary restoration efforts and prior to the installation of pump stations were analyzed to provide baseline data and a preliminary analysis of changes due to restoration efforts. The study assessed streamflow and salinity data for water years¹ 2007–14 for the Faka Union River (canal flow included), East River, Little Wood River, Pumpkin River, and Blackwater River. Salinity data from the Palm River and Faka Union Boundary water-quality stations were also assessed.

Faka Union River was the dominant contributor of freshwater to the Ten Thousand Islands estuary during water years 2007–14, followed by Little Wood and East Rivers. Pumpkin River and Blackwater River were the least substantial contributors of freshwater flow. The lowest annual flow volumes, the highest annual mean salinities, and the highest percentage of salinity values greater than 35 parts per thousand (ppt) occurred in water year 2011 at all sites with available data, corresponding with the lowest annual rainfall during the study. The highest annual flow volumes and the lowest percentage of salinities greater than 35 ppt occurred in water year 2013 for all sites with available data, corresponding with the highest rainfall during the study.

In water year 2014, the percentage of monitored annual flow contributed by East River increased and the percentage of flow contributed by Faka Union River decreased, compared to the earlier years. No changes in annual flow occurred at any sites west of Faka Union River. No changes in the relative flow contributions were observed during the wet season; however, the relative amounts of streamflow increased during the dry season at East River in 2014. East River had only 1 month of negative flow in 2014 compared to 6 months in 2011 and 7 months in 2008. Higher dry season flows in East River may be in response to restoration efforts. The sites to the west of Faka Union River had higher salinities on average than Faka Union River and East River. Faka Union River had the highest range in salinities, and Faka Union Boundary had the lowest range in salinities. Pumpkin River was the tributary with the lowest range in salinities.

¹Water year is defined as the 12-month period from October 1, for any given year, through September 30 of the following year.
Opportunities and Challenges of Linking Scientific Core Samples to the Geoscience Data Ecosystem
NASA Astrophysics Data System (ADS)
Noren, A. J.
2016-12-01
Core samples generated in scientific drilling and coring are critical for the advancement of the Earth Sciences. The scientific themes enabled by analysis of these samples are diverse, and include plate tectonics, ocean circulation, Earth-life system interactions (paleoclimate, paleobiology, paleoanthropology), Critical Zone processes, geothermal systems, deep biosphere, and many others, and substantial resources are invested in their collection and analysis. Linking core samples to researchers, datasets, publications, and funding agencies through registration of globally unique identifiers such as International Geo Sample Numbers (IGSNs) offers great potential for advancing several frontiers. These include maximizing sample discoverability, access, reuse, and return on investment; a means for credit to researchers; and documentation of project outputs to funding agencies. Thousands of kilometers of core samples and billions of derivative subsamples have been generated through thousands of investigators' projects, yet the vast majority of these samples are curated at only a small number of facilities. These numbers, combined with the substantial similarity in sample types, make core samples a compelling target for IGSN implementation. However, differences between core sample communities and other geoscience disciplines continue to create barriers to implementation. Core samples involve parent-child relationships spanning 8 or more generations, an exponential increase in sample numbers between levels in the hierarchy, concepts related to depth/position in the sample, requirements for associating data derived from core scanning and lithologic description with data derived from subsample analysis, and publications based on tens of thousands of co-registered scan data points and thousands of analyses of subsamples. These characteristics require specialized resources for accurate and consistent assignment of IGSNs, and a community of practice to establish norms, workflows, and infrastructure to support implementation.
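The parent-child structure described above is essentially a depth-annotated tree. The sketch below is hypothetical, not IGSN registry software, and its identifiers are invented; it only illustrates how a core, its sections, and their subsamples might be registered so that every generation carries its own identifier and a depth interval that nests within its parent's.

```python
# Illustrative parent-child sample hierarchy with IGSN-like identifiers.
from dataclasses import dataclass, field

@dataclass
class Sample:
    igsn: str           # globally unique identifier (values here are invented)
    top_m: float        # depth of top, meters below the parent's datum
    bottom_m: float     # depth of bottom
    children: list = field(default_factory=list)

    def register_child(self, igsn: str, top_m: float, bottom_m: float) -> "Sample":
        """Register a child sample, enforcing that it nests inside the parent."""
        assert self.top_m <= top_m <= bottom_m <= self.bottom_m, "child must nest in parent"
        child = Sample(igsn, top_m, bottom_m)
        self.children.append(child)
        return child

core = Sample("XYZ000001", 0.0, 100.0)               # parent core
sec  = core.register_child("XYZ000002", 0.0, 1.5)    # 1.5-m core section
sub  = sec.register_child("XYZ000003", 0.40, 0.42)   # 2-cm subsample for analysis
```

Eight or more such generations, with sample counts growing exponentially at each level, is what makes consistent identifier assignment and community workflow norms so important.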
77 FR 60605 - National Breast Cancer Awareness Month, 2012
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-04
... National Breast Cancer Awareness Month, 2012 By the President of the United States of America A Proclamation Breast cancer touches the lives of Americans from every background and in every community across...,000 women will be diagnosed with breast cancer this year, and tens of thousands are expected to lose...
Predictive Toxicology and In Vitro to In Vivo Extrapolation (AsiaTox2015)
A significant challenge in toxicology is the “too many chemicals” problem. Humans and environmental species are exposed to as many as tens of thousands of chemicals, few of which have been thoroughly tested using standard in vivo test methods. This talk will discuss several appro...
Humans are potentially exposed to tens of thousands of man-made chemicals in the environment. It is well known that some environmental chemicals mimic natural hormones and thus have the potential to be endocrine disruptors. Most of these environmental chemicals have never been te...
Humans are potentially exposed to tens of thousands of man-made chemicals in the environment, some of which may mimic natural endocrine hormones and thus have the potential to be endocrine disruptors. Predictive in silico tools can be used to quickly and efficiently evaluate thes...
Spatial Ability: A Neglected Talent in Educational and Occupational Settings
ERIC Educational Resources Information Center
Kell, Harrison J.; Lubinski, David
2013-01-01
For over 60 years, longitudinal research on tens of thousands of high ability and intellectually precocious youth has consistently revealed the importance of spatial ability for hands-on creative accomplishments and the development of expertise in science, technology, engineering, and mathematical (STEM) disciplines. Yet, individual differences in…
ERIC Educational Resources Information Center
Meisenhelder, Susan
2013-01-01
The push for increased use of online teaching in colleges and universities has been gaining momentum for some time, but even in that context the recent enthusiasm for MOOCs (Massive Open Online Courses), free online courses that often enroll tens of thousands of students, is remarkable and rightly dubbed "MOOC Mania." As with so many…
25 CFR 141.13 - Amusement company licenses.
Code of Federal Regulations, 2011 CFR
2011-04-01
... licenses. (a) No person may operate a portable dance pavilion, mechanical amusement device such as a ferris... amount not exceeding ten thousand dollars ($10,000) and a personal injury and property damage liability... to the tribe and for the protection of the public against personal injury and property damage by bond...
25 CFR 141.13 - Amusement company licenses.
Code of Federal Regulations, 2010 CFR
2010-04-01
... licenses. (a) No person may operate a portable dance pavilion, mechanical amusement device such as a ferris... amount not exceeding ten thousand dollars ($10,000) and a personal injury and property damage liability... to the tribe and for the protection of the public against personal injury and property damage by bond...
A Conceptual Framework for U.S. EPA’s National Exposure Research Laboratory
Fulfilling the U.S. EPA mission to protect human health and the environment carries with it the challenge of understanding exposures for tens of thousands of chemical contaminants, a wide range of biological stressors, and many physical stressors. The U.S. EPA’s National Exposur...
The EPA ToxCast Program: Developing Predictive Bioactivity Signatures for Chemicals
There are tens of thousands of chemicals used in the environment for which little or no toxicology information is available. Current testing paradigms that use large numbers of animals to perform in vivo toxicology are too slow and expensive to apply to this large number of chemicals...
The Second Phase of ToxCast and Initial Applications to Chemical Prioritization
Tens of thousands of chemicals and other contaminants exist in our environment, but only a fraction of these have been characterized for their potential hazard to humans. ToxCast is focused on closing this data gap and improving the management of chemical risk through a high thro...
Sudden Oak Death - Western (Pest Alert)
Susan Frankel
2002-01-01
Tens of thousands of tanoak (Lithocarpus densiflorus), coast live oak (Quercus agrifolia), California black oak (Quercus kelloggii), Shreve oak (Quercus parvula var. shrevei), and madrone (Arbutus menziesii) have been killed by a newly identified species, Phytophthora ramorum, which causes Sudden Oak Death. Sudden Oak Death was first reported in 1995 in central coastal...
Visualizing the Solute Vaporization Interference in Flame Atomic Absorption Spectroscopy
ERIC Educational Resources Information Center
Dockery, Christopher R.; Blew, Michael J.; Goode, Scott R.
2008-01-01
Every day, tens of thousands of chemists use analytical atomic spectroscopy in their work, often without knowledge of possible interferences. We present a unique approach to study these interferences by using modern response surface methods to visualize an interference in which aluminum depresses the calcium atomic absorption signal. Calcium…
Development of a Context-Rich Database of ToxCast Assay Annotations (SOT)
Major concerns exist for the large number of environmental chemicals which lack toxicity data. The tens of thousands of commercial substances in need of screening for potential human health effects would cost millions of dollars and several decades to test in traditional animal-b...
A Method for Identifying Prevalent Chemical Combinations in the US Population
Through the food and water they ingest, the air they breathe, and the consumer products with which they interact at home and at work, humans are exposed to tens of thousands of chemicals, many of which have not been evaluated to determine their potential toxicities. In recent yea...
Triaging Chemical Exposure Data Needs and Tools for Advancing Next-Generation Risk Assessment
The timely assessment of the risks posed to public health by tens of thousands of existing and emerging commercial chemicals is a critical challenge facing the U.S. Environmental Protection Agency and regulatory bodies worldwide. The pace of conducting risk assessments is limited...
Neuroimaging Research: from Null-Hypothesis Falsification to Out-Of-Sample Generalization
ERIC Educational Resources Information Center
Bzdok, Danilo; Varoquaux, Gaël; Thirion, Bertrand
2017-01-01
Brain-imaging technology has boosted the quantification of neurobiological phenomena underlying human mental operations and their disturbances. Since its inception, drawing inference on neurophysiological effects hinged on classical statistical methods, especially, the general linear model. The tens of thousands of variables per brain scan were…
ERIC Educational Resources Information Center
Simon, David
2008-01-01
Energy costs are projected to rise as much as 12 percent in 2008, and a facility's "carbon footprint" has become an issue of increasing importance. So, many schools and universities are taking a hard look at their energy consumption. Education facilities can save tens of thousands of dollars in yearly electric costs, and cut harmful emissions by…
Where's the Beef in Administrator Pay?
ERIC Educational Resources Information Center
Cunningham, William G.; Sperry, J. Brent
2001-01-01
Salary differences between educators and business leaders range from tens of thousands of dollars for principals to millions for superintendents. Employees valuing monetary incentives will not be attracted to or remain in the education field. Wealthy taxpayers get too many breaks. Progressive income taxes should replace skewed property taxes. (MLH)
The challenge of assessing the potential developmental health risks for the tens of thousands of environmental chemicals is beyond the capacity for resource-intensive animal protocols. Large data streams coming from high-throughput (HTS) and high-content (HCS) profiling of biolog...
Agricultural Career Education in the City of New York
ERIC Educational Resources Information Center
Chrein, George
1975-01-01
More than one thousand students in ten high schools throughout the City of New York are presently enrolled in an agricultural career program, specializing in farm production and management, ornamental horticulture, animal care, or conservation. More than 90 percent continue in occupational agriculture in the post-secondary schools. (Author/AJ)
Weight Misperception and Health Risk Behaviors among Early Adolescents
ERIC Educational Resources Information Center
Pasch, Keryn E.; Klein, Elizabeth G.; Laska, Melissa N.; Velazquez, Cayley E.; Moe, Stacey G.; Lytle, Leslie A.
2011-01-01
Objectives: To examine associations between weight misperception and youth health risk and protective factors. Methods: Three thousand ten US seventh-graders (72.1% white, mean age: 12.7 years) self-reported height, weight, risk, and protective factors. Analyses were conducted to determine cross-sectional and longitudinal associations between…
Operation and Maintenance Support Information (OMSI) Creation, Management, and Repurposing With XML
2004-09-01
engines that cost tens of thousands of dollars. There are many middleware applications on the commercial and open-source market. The "Big Four"... planners can begin an incremental planning effort early in the facility construction phase. This thesis provides a non-proprietary, no-cost solution to...
Big Results from Small Samples: Evaluation of Amplification Protocols for Gene Expression Profiling
Microarrays have revolutionized many areas of biology due to our technical ability to quantify tens of thousands of transcripts within a single experiment. However, there are still many areas that cannot benefit from this technology due to the amount of biological material needed...
Developing Young Children's Multidigit Number Sense.
ERIC Educational Resources Information Center
Diezmann, Carmel M.; English, Lyn D.
2001-01-01
This article describes a series of enrichment experiences designed to develop young (ages 5 to 8) gifted children's understanding of large numbers, central to their investigation of space travel. It describes activities designed to teach reading of large numbers and exploring numbers to a thousand and then a million. (Contains ten references.) (DB)
ERIC Educational Resources Information Center
Guilbert, Juliette
2006-01-01
This article focuses on defining the Parent Academy. The Parent Academy is a deeply ambitious, privately funded project aimed at improving students' education by improving their parents'. Since Miami-Dade County Public Schools superintendent Rudy Crew launched it last year, TPA has reached tens of thousands of parents through hundreds of free…
A Web-Hosted R Workflow to Simplify and Automate the Analysis of 16S NGS Data
Next-Generation Sequencing (NGS) produces large data sets that include tens-of-thousands of sequence reads per sample. For analysis of bacterial diversity, 16S NGS sequences are typically analyzed in a workflow containing best-of-breed bioinformatics packages that may levera...
Over the past ten years, the US government has invested in high-throughput (HT) methods to screen chemicals for biological activity. Under the interagency Tox21 consortium and the US Environmental Protection Agency’s (EPA) ToxCast™ program, thousands of chemicals have...
Romano, Michael
2003-03-24
HealthSouth and its chief executive Richard Scrushy, left, find themselves coping with a public relations nightmare after federal officials last week charged the rehabilitation giant with "massive accounting fraud" and a systematic betrayal of tens of thousands of investors.
ERIC Educational Resources Information Center
Morrison, David
1982-01-01
Discusses the effects on astronomy courses/curriculum if equal time were given to the concept that the universe was created in its present form about ten thousand years ago. Includes the full text on a resolution concerning creationism passed by the Board of Directors of the Astronomical Society of the Pacific. (Author/JN)
Predictive Modeling of Apical Toxicity Endpoints Using Data From ToxCast
The US EPA and other regulatory agencies face a daunting challenge of evaluating potential toxicity for tens of thousands of environmental chemicals about which little is currently known. The EPA’s ToxCast program is testing a novel approach to this problem by screening compounds...
Mycelial actinobacteria in salt-affected soils of arid territories of Ukraine and Russia
NASA Astrophysics Data System (ADS)
Grishko, V. N.; Syshchikova, O. V.; Zenova, G. M.; Kozhevin, P. A.; Dubrova, M. S.; Lubsanova, D. A.; Chernov, I. Yu.
2015-01-01
A high population density (up to hundreds of thousands or millions of CFU/g soil) of mycelial bacteria (actinomycetes) was determined in salt-affected soils of arid territories of Ukraine, Russia, and Turkmenistan. Of all the studied soils, the lowest amounts of actinomycetes (thousands to tens of thousands of CFU/g soil) were isolated from sor (playa) and soda solonchaks developed on the bottoms of drying salt lakes in Buryatia and in the Amu Darya Delta. Actinomycetes of the genera Streptomyces, Micromonospora, and Nocardiopsis were recorded in the studied soils. It was found that preincubation conditions greatly affect the activity of substrate consumption by the actinomycete cultures. This could be attributed to changes in the metabolism of actinomycetes as a mechanism of their adaptation to the increased osmotic pressure of the medium. The alkali tolerance of halotolerant actinomycetes isolated from the salt-affected soils was experimentally proved.
Increasing awareness about endocrine disrupting chemicals (EDCs) in the environment has driven concern about their potential impact on human health and wildlife. Tens of thousands of natural and synthetic xenobiotics are presently in commerce with little to no toxicity data and t...
HTS Data and In Silico Models for High-Throughout Risk Assessment (FutureTox II)
A significant challenge in toxicology is the “too many chemicals” problem. Humans and environmental species are exposed to as many as tens of thousands of chemicals, few of which have been thoroughly tested using standard in vivo test methods. This talk will discuss several appro...
Tens of thousands of chemicals and other man-made contaminants exist in our environment, but only a fraction of these have been characterized for their potential risk to humans and there is widespread interest in closing this data gap in order to better manage contaminant risk. C...
Enabling Easier Information Access in Online Discussion Forums
ERIC Educational Resources Information Center
Bhatia, Sumit
2013-01-01
Online discussion forums have become popular in recent times. They provide a platform for people from different parts of the world who share a common interest to come together, discuss topics of mutual interest, and seek solutions to their problems. There are hundreds of thousands of internet forums containing tens of millions of discussion threads and…
Sudden Oak Death - Eastern (Pest Alert)
Joseph O' Brien; Manfred Mielke; Steve Oak; Bruce Moltzan
2002-01-01
A phenomenon known as Sudden Oak Death was first reported in 1995 in central coastal California. Since then, tens of thousands of tanoaks (Lithocarpus densiflorus), coast live oaks (Quercus agrifolia), and California black oaks (Quercus kelloggii) have been killed by a newly identified fungus, Phytophthora ramorum. On these hosts, the fungus causes a bleeding canker on...
Teacher as Trickster on the Learner's Journey
ERIC Educational Resources Information Center
Davis, Kenneth W.; Weeden, Scott R.
2009-01-01
For tens of thousands of years, teachers have used stories to promote learning. Today's teachers can do the same. In particular, we can employ Joseph Campbell's "monomyth"--with its stages of separation, initiation, and return--as a model for structuring learning experiences. Within the monomyth, one tempting role for teachers is the sage, but we…
Creating a Sustainable University and Community through a Common Experience
ERIC Educational Resources Information Center
Lopez, Omar S.
2013-01-01
Purpose: This article aims to provide an overview of Texas State University's Common Experience, an innovative initiative that engaged tens of thousands of people in shared consideration of sustainability as a single topic during academic year 2010-2011. Design/methodology/approach: The discourse begins with an overview of the Common Experience…
Ten Qualities of a Strong Community College Leader
ERIC Educational Resources Information Center
Wheelan, Belle
2012-01-01
There are thousands of articles, books, essays, dissertations, and more devoted to leadership in higher education. All of them highlight the importance of a person "out front" who is charged with moving the organization forward and people who follow to ensure that movement takes place. The author's favorite definition of leadership is not found in…
USDA-ARS?s Scientific Manuscript database
Mosquitoes of various species mate in swarms composed of tens to thousands of flying males, yet little is known about the mechanisms of mosquito swarming. Discovering chemical cues involved in mosquito biology leads to better adaptation of disease control interventions. In this study, we aimed ...
The High Price of For-Profit Colleges
ERIC Educational Resources Information Center
Yeoman, Barry
2011-01-01
Critics say that for-profit career colleges--which, according to industry figures, enrolled 3.2 million students in the United States in 2009--have been plagued by deceptive recruiting practices that lure students into programs they could find elsewhere for much less money. Students often borrow tens of thousands of dollars to attend these…
Guest-Host Encounters in Diaspora-Heritage Tourism: The Taglit-Birthright Israel Mifgash (Encounter)
ERIC Educational Resources Information Center
Sasson, Theodore; Mittelberg, David; Hecht, Shahar; Saxe, Leonard
2011-01-01
More than 300,000 diaspora Jewish young adults and tens of thousands of their Israeli peers have participated in structured, cross-cultural encounters--"mifgashim"--in the context of an experiential education program known as Taglit-Birthright Israel. Drawing on field observations, interviews, and surveys, the formal and informal…
Characterization of the potential adverse effects is lacking for tens of thousands of chemicals that are present in the environment, and characterization of developmental neurotoxicity (DNT) hazard lags behind that of other adverse outcomes (e.g. hepatotoxicity). This is due in p...
Modeling belowground biomass of black cohosh, a medicinal forest product.
James Chamberlain; Gabrielle Ness; Christine Small; Simon Bonner; Elizabeth Hiebert
2014-01-01
Tens of thousands of kilograms of rhizomes and roots of Actaea racemosa L., a native Appalachian forest perennial, are harvested every year and used for the treatment of menopausal conditions. Sustainable management of this and other wild-harvested non-timber forest products requires the ability to effectively and reliably inventory marketable plant...
Extension Online: Utilizing Technology to Enhance Educational Outreach
ERIC Educational Resources Information Center
Green, Stephen
2012-01-01
Extension Online is an Internet-based online course platform that enables the Texas AgriLife Extension Service's Family Development and Resource Management (FDRM) unit to reach tens of thousands of users across the U.S. annually with research-based information. This article introduces readers to Extension Online by describing the history of its…
Time and Practice: Learning to Become a Geographer
ERIC Educational Resources Information Center
Downs, Roger M.
2014-01-01
A goal of geography education is fostering geographic literacy for all and building significant expertise for some. How much time and practice do students need to become literate or expert in geography? There is not an answer to this question. Using two concepts from cognitive psychology--the ideas of ten thousand hours and deliberate…
USDA-ARS?s Scientific Manuscript database
Background: Faced with tens of thousands of food choices, consumers frequently turn to promotional advertising, such as Sunday sales circulars, to make purchasing decisions. To date, little research has examined the content of sales circulars over multiple seasons. Methods: Food items from 12 months...
College Savings Plans: A Bad Gamble
ERIC Educational Resources Information Center
Carey, Kevin
2009-01-01
With all the economic pain and consternation--surging unemployment, enormous corporate bankruptcy, trillions becoming the new billions--it's easy to overlook the fact that tens of thousands of families have suddenly lost a great deal of the money they socked away to pay for college. They lost it because public officials told them to risk their…
The Big Fixes Now Needed for "No Child Left Behind"
ERIC Educational Resources Information Center
Stover, Del
2007-01-01
The underlying principles of No Child Left Behind (NCLB)--the demand for high standards, greater accountability, and the focus on long-overlooked student populations--are good. NCLB has done well for public education. Still, tens of thousands of educators nationwide are hoping that this year's reauthorization debate in Congress will lead to…
Tens of thousands of chemicals are currently in commerce, and hundreds more are introduced every year. Because current chemical testing is resource intensive, only a small fraction of chemicals have been adequately evaluated for potential human health effects. To address this ch...
Google Books Mutilates the Printed Past
ERIC Educational Resources Information Center
Musto, Ronald G.
2009-01-01
In this article, the author discusses a mutilation that he has encountered involving Google Book Search. That massive text-digitization project, working in collaboration with several of the world's most important library collections, has now made available, in both PDF and text view, tens of thousands of 19th-century titles while it awaits the…
There are tens of thousands of closed landfills in the United States, many of which are unlined and sited on alluvial deposits. Landfills are of concern because leachate contains a variety of pollutants that can contaminate ground and surface water. Data from chemical analysis...
High throughput toxicity testing (HTT) holds the promise of providing data for tens of thousands of chemicals that currently have no data due to the cost and time required for animal testing. Interpretation of these results requires information linking the perturbations seen in vi...
Abstract: There are tens of thousands of man-made chemicals to which humans are exposed, but only a fraction of these have the extensive in vivo toxicity data used in most traditional risk assessments. This lack of data, coupled with concerns about testing costs and animal use, a...
TSA and Standards-Based Learning through TECH-Know
ERIC Educational Resources Information Center
Taylor, Jerianne S.; Peterson, Richard E.; Ernst, Jeremy
2005-01-01
Career and technical student organizations (CTSOs) serve as an integral part of many career and technical education (CTE) programs across the country. Their activities and competitions make up many of the strongest CTE programs due to their co-curricular nature. With memberships ranging from tens of thousands to almost a half million, it is hard…
Betsy DeVos, the (Relatively Mainstream) Reformer
ERIC Educational Resources Information Center
McShane, Michael Q.
2017-01-01
A privatization extremist. A religious zealot. A culture warrior. The new Secretary of Education, Betsy DeVos, was painted as any or all of these things in the fevered weeks between the 2016 presidential election and her confirmation hearing. In the days following that hearing, tens of thousands of people flooded the lines of congressional…
Tens of thousands of stream kilometers around the world are degraded by a legacy of environmental impacts and acid mine drainage (AMD) caused by abandoned underground and surface mines, piles of discarded coal wastes, and tailings. Increased acidity, high concentrations of metals...
Ready, Aim, Perform! Targeted Micro-Training for Performance Intervention
ERIC Educational Resources Information Center
Carpenter, Julia; Forde, Dahlia S.; Stevens, Denise R.; Flango, Vincent; Babcock, Lisa K.
2016-01-01
The Department of Veterans Affairs has an immediate problem at hand. Tens of thousands of employees are working in a high-stress work environment where fast-paced daily production requirements are critical. Employees are faced with a tremendous backlog of veterans' claims. Unfortunately, not only are the claims extremely complex, but there is…
ERIC Educational Resources Information Center
Education Commission of the States, 2015
2015-01-01
Colleges and postsecondary systems across the nation have demonstrated remarkable progress since "Core Principles for Transforming Remediation" was published in 2012. States and institutions are phasing out stand alone or multi-course remediation sequences, resulting in tens of thousands of students more quickly enrolling in and…
Libraries Achieving Greatness: Technology at the Helm
ERIC Educational Resources Information Center
Muir, Scott P.
2009-01-01
Libraries have been around for thousands of years. Many of them are considered great because of their magnificent architecture or because of the size of their collections. This paper offers ten case studies of libraries that have used technology to achieve greatness. Because almost any library can implement technology, a library does not have to…
Violence: innate or acquired? A survey and some opinions.
Bacciagaluppi, Marco
2004-01-01
Freud's psychoanalysis and Lorenz's ethology consider human aggressiveness to be innate. According to recent archaeological excavations and evolutionary studies, human groups in the Upper Paleolithic and Early Neolithic were peaceful and cooperative. This culture was replaced ten thousand years ago by a predatory hierarchical structure, which is here viewed as a cultural variant.
CAMUS: Automatically Mapping Cyber Assets to Mission and Users (PREPRINT)
2009-10-01
which machines regularly use a particular mail server. Armed with these basic data sources – LDAP, NetFlow traffic and user logs – fuselets were created... NetFlow traffic used in the demonstration has over ten thousand unique IP Addresses and is over one gigabyte in size. A number of high performance
ERIC Educational Resources Information Center
Barnett, R. Michael
2013-01-01
After half a century of waiting, the drama was intense. Physicists slept overnight outside the auditorium to get seats for the seminar at the CERN lab in Geneva, Switzerland. Ten thousand miles away on the other side of the planet, at the world's most prestigious international particle physics conference, hundreds of physicists from every corner…
School-Aged Victims of Sexual Abuse: Implications for Educators.
ERIC Educational Resources Information Center
Wishon, Phillip M.
Each year in the United States, thousands of school-aged children become involved in sexual activities arranged by adults for purposes of pleasure and profit. Nationwide, annual profits from the child pornography industry and from female and male child prostitution are in the tens of millions of dollars. Heretofore, the majority of…
There is an urgent need to characterize potential risk to human health and the environment that arises from the manufacture and use of tens of thousands of chemicals. Computational tools and approaches for characterizing and prioritizing exposure are required: to provide input f...
NASA Technical Reports Server (NTRS)
1992-01-01
Mike Morris, former Associate Director of STAC, formed pHish Doctor, Inc. to develop and sell a pH monitor for home aquariums. The monitor, or pHish Doctor, consists of a sensor strip and color chart that continually measures pH levels in an aquarium. This is important because when the level gets too high, ammonia excreted by fish is highly toxic; at low pH, bacteria that normally break down waste products stop functioning. Sales have run into the tens of thousands of dollars. A NASA Tech Brief Technical Support Package later led to a salt water version of the system and a DoE Small Business Innovation Research (SBIR) grant for development of a sensor for sea buoys. The company, now known as Ocean Optics, Inc., is currently studying the effects of carbon dioxide buildup as well as exploring other commercial applications for the fiber optic sensor.
Gale, Trevor V; Horton, Timothy M; Grant, Donald S; Garry, Robert F
2017-09-01
Lassa fever afflicts tens of thousands of people in West Africa annually. The rapid progression of patients from febrile illness to fulminant syndrome and death provides incentive for development of clinical prognostic markers that can guide case management. The small molecule profile of serum from febrile patients triaged to the Viral Hemorrhagic Fever Ward at Kenema Government Hospital in Sierra Leone was assessed using untargeted Ultra High Performance Liquid Chromatography Mass Spectrometry. Physiological dysregulation resulting from Lassa virus (LASV) infection occurs at the small molecule level. Effects of LASV infection on pathways mediating blood coagulation, and lipid, amino acid, nucleic acid metabolism are manifest in changes in the levels of numerous metabolites in the circulation. Several compounds, including platelet activating factor (PAF), PAF-like molecules and products of heme breakdown emerged as candidates that may prove useful in diagnostic assays to inform better care of Lassa fever patients.
A Computer Graphics Human Figure Application Of Biostereometrics
NASA Astrophysics Data System (ADS)
Fetter, William A.
1980-07-01
A study of improved computer graphic representation of the human figure is being conducted under a National Science Foundation grant. Special emphasis is given to biostereometrics as a primary data base from which applications requiring a variety of levels of detail may be prepared. For example, a human figure represented by a single point can be very useful in overview plots of a population. A crude ten point figure can be adequate for queuing theory studies and simulated movement of groups. A one hundred point figure can usefully be animated to achieve different overall body activities including male and female figures. A one thousand point figure similarly animated begins to be useful in anthropometrics and kinesiology gross body movements. Extrapolations of this order-of-magnitude approach ultimately should achieve very complex data bases and a program which automatically selects the correct level of detail for the task at hand. See Summary Figure 1.
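The order-of-magnitude scheme above lends itself to a simple selection rule. A minimal sketch (ours, not from the paper; the task names and point counts are illustrative):

```python
# Illustrative sketch (not from the paper) of the order-of-magnitude
# level-of-detail idea: map each task to the rough point count it needs,
# defaulting to the richest model. Task names and counts are ours.

LOD_POINTS = {
    "population_overview": 1,     # one point per figure in a plot
    "queuing_simulation": 10,     # crude stick figure
    "group_animation": 100,       # overall body activities
    "kinesiology": 1000,          # gross body movements
}

def select_level_of_detail(task: str) -> int:
    return LOD_POINTS.get(task, 1000)

for task in LOD_POINTS:
    print(task, "->", select_level_of_detail(task), "points")
```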
Korir, Geoffrey; Karam, P Andrew
2018-06-11
In the event of a significant radiological release in a major urban area where a large number of people reside, it is inevitable that radiological screening and dose assessment must be conducted. Lives may be saved if an emergency response plan and radiological screening method are established for use in such cases. Thousands to tens of thousands of people might present themselves with some level of external contamination and/or the potential for internal contamination. Each of these individuals will require varying degrees of radiological screening, and those with a high likelihood of internal and/or external contamination will require radiological assessment to determine the need for medical attention and decontamination. This sort of radiological assessment typically requires skilled health physicists, but there are insufficient numbers of health physicists in any city to perform this function for large populations, especially since many (e.g., those at medical facilities) are likely to be engaged at their designated institutions. The aim of this paper is therefore to develop and describe the technical basis for a novel, scoring-based methodology that can be used by non-health physicists for performing radiological assessment during such radiological events.
Using ANTS to explore small body populations in the solar system.
NASA Astrophysics Data System (ADS)
Clark, P. E.; Rilee, M.; Truszkowski, W.; Curtis, S.; Marr, G.; Chapman, C.
2001-11-01
ANTS (Autonomous Nano-Technology Swarm), a NASA advanced mission concept, is a large (100 to 1000 member) swarm of pico-class (1 kg) totally autonomous spacecraft that prospect the asteroid belt. Little data are available for asteroids because the vast majority are too small to be observed except in close proximity. Light curves are available for thousands of asteroids, confirmed trajectories for tens of thousands, detailed shape models for approximately ten. Asteroids originated in the transitional region between the inner (rocky) and outer (solidified gases) solar system. Many have remained largely unmodified since formation, and thus have more primitive composition than planetary surfaces. Determination of the systematic distribution of physical and compositional properties within the asteroid population is crucial in the understanding of solar system formation. The traditional approach of using a few large spacecraft for sequential exploration could be improved. Our far more cost-effective approach utilizes distributed intelligence in a swarm of tiny highly maneuverable spacecraft, each with specialized instrument capability (e.g., advanced computing, imaging, spectrometry). NASA is at the forefront of Intelligent Software Agents (ISAs) research, performing experiments in space and on the ground to advance deliberative and collaborative autonomous control techniques. The advanced development under consideration here is in the use of ISAs at a strategic level, to explore remote frontiers of the solar system, potentially involving a large class of objects such as asteroids. Supervised clusters of spacecraft operate simultaneously within a broadly defined framework of goals to select targets (> 1000) from among available candidates while developing scenarios for studying targets. Swarm members use solar sails to fly directly to asteroids > 1 kilometer in diameter, and then perform maneuvers appropriate for the instrument carried, ranging from hovering to orbiting. Selected members return with data and are replaced as needed.
Variations in mid-ocean ridge magmatism and carbon emissions driven by glacial cycles
NASA Astrophysics Data System (ADS)
Katz, R. F.; Burley, J. M.; Huybers, P. J.; Langmuir, C. H.; Crowley, J. W.; Park, S. H.; Carbotte, S. M.; Ferguson, D.; Proistosescu, C.; Boulahanis, B.
2015-12-01
Glacial cycles transfer ~5×10^19 kg of water between the oceans and ice sheets, causing pressure changes in the upper mantle with consequences for the melting of Earth's interior. Forced with Plio-Pleistocene sea-level variations, theoretical models of mid-ocean ridge magma/mantle dynamics predict temporal variations up to 10% in melt supply to the base of the crust. Moreover, a transport model for a perfectly incompatible element suggests that CO2 emissions from mid-ocean ridges could vary by a similar proportion, though with a longer time lag. Bathymetry from the Australian-Antarctic ridge shows statistically significant spectral energy near the Milankovitch periods of 23, 41, and 100 thousand years, which is consistent with model predictions. These results suggest that abyssal hills record the magmatic response to changes in sea level. The mechanism by which variations in the rate of melt supply are expressed in the bathymetry is not understood. The same pressure variations that modulate the melting rate could also modulate the depth of the onset of silicate melting. As ice sheets grow and sea level drops, this onset deepens, causing melting at the base of the silicate melting regime. Excess highly incompatible elements like CO2 enter the melt and begin their journey to the ridge axis. Tens of thousands of years later, this additional CO2 flux is emitted into the climate system. Because of its delay with respect to sea-level change, the predicted variation in CO2 emissions could represent a restoring force on climate (and sea-level) excursions. This mechanism has a response time determined by the time scale of melt transport; it potentially introduces a resonant frequency into the climate system.
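For scale, a back-of-envelope estimate of the forcing (our illustration, not a figure from the abstract): a glacial sea-level swing of order 100 m changes the hydrostatic load on the sub-ridge mantle by roughly

```latex
% Hydrostatic load from a ~100 m glacial sea-level swing (illustrative):
\Delta P = \rho_w \, g \, \Delta h
         \approx 1000~\mathrm{kg\,m^{-3}} \times 9.8~\mathrm{m\,s^{-2}}
           \times 100~\mathrm{m} \approx 1~\mathrm{MPa}
```

which is tiny against total mantle pressure but comparable to the decompression experienced over a few tens of meters of mantle upwelling, hence plausibly a percent-level modulation of melting.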
NASA Astrophysics Data System (ADS)
Moyer, R. P.; Khan, N.; Radabaugh, K.; Engelhart, S. E.; Smoak, J. M.; Horton, B.; Rosenheim, B. E.; Kemp, A.; Chappel, A. R.; Schafer, C.; Jacobs, J. A.; Dontis, E. E.; Lynch, J.; Joyse, K.; Walker, J. S.; Halavik, B. T.; Bownik, M.
2017-12-01
Since 2014, our collaborative group has been working in coastal marshes and mangroves across Southwest Florida, including Tampa Bay, Charlotte Harbor, Ten Thousand Islands, Biscayne Bay, and the lower Florida Keys. All existing field sites were located within 50 km of Hurricane Irma's eye path, with a few sites in the Lower Florida Keys and Naples/Ten Thousand Islands region suffering direct eyewall hits. As a result, we have been conducting storm-impact and damage assessments at these locations with the primary goal of understanding how major hurricanes contribute to and/or modify the sedimentary record of mangroves and salt marshes. We have also assessed changes to the vegetative structure of the mangrove forests at each site. Preliminary findings indicate a reduction in mangrove canopy cover from 70-90% pre-storm, to 30-50% post-Irma, and a reduction in tree height of approximately 1.2 m. Sedimentary deposits consisting of fine carbonate mud up to 12 cm thick were imported into the mangroves of the lower Florida Keys, Biscayne Bay, and the Ten Thousand Islands. Import of siliciclastic mud up to 5 cm thick was observed in Charlotte Harbor. In addition to fine mud, all sites had imported tidal wrack consisting of a mixed seagrass and mangrove leaf litter, with some deposits as thick as 6 cm. In areas with newly opened canopy, a microbial layer was coating the surface of the imported wrack layer. Overwash and shoreline erosion were also documented at two sites in the lower Keys and Biscayne Bay, and will be monitored for change and recovery over the next few years. Because active research was being conducted, a wealth of pre-storm data exists; these locations are thus uniquely positioned to quantify hurricane impacts to the sedimentary record and standing biomass across a wide geographic area. Due to changes in intensity along the storm path, direct comparisons of damage metrics can be made to environmental setting, wind speed, storm surge, and distance to eyewall.
Ongoing hydrothermal heat loss from the 1912 ash-flow sheet, Valley of Ten Thousand Smokes, Alaska
Hogeweg, N.; Keith, T.E.C.; Colvard, E.M.; Ingebritsen, S.E.
2005-01-01
The June 1912 eruption of Novarupta filled nearby glacial valleys on the Alaska Peninsula with ash-flow tuff (ignimbrite), and post-eruption observations of thousands of steaming fumaroles led to the name 'Valley of Ten Thousand Smokes' (VTTS). By the late 1980s most fumarolic activity had ceased, but the discovery of thermal springs in mid-valley in 1987 suggested continued cooling of the ash-flow sheet. Data collected at the mid-valley springs between 1987 and 2001 show a statistically significant correlation between maximum observed chloride (Cl) concentration and temperature. These data also show a statistically significant decline in the maximum Cl concentration. The observed variation in stream chemistry across the sheet strongly implies that most solutes, including Cl, originate within the area of the VTTS occupied by the 1912 deposits. Numerous measurements of Cl flux in the Ukak River just below the ash-flow sheet suggest an ongoing heat loss of ~250 MW. This represents one of the largest hydrothermal heat discharges in North America. Other hydrothermal discharges of comparable magnitude are related to heat obtained from silicic magma bodies at depth, and are quasi-steady on a multidecadal time scale. However, the VTTS hydrothermal flux is not obviously related to a magma body and is clearly declining. Available data provide reasonable boundary and initial conditions for simple transient modeling. Both an analytical, conduction-only model and a numerical model predict large rates of heat loss from the sheet 90 years after deposition.
BEAM web server: a tool for structural RNA motif discovery.
Pietrosanto, Marco; Adinolfi, Marta; Casula, Riccardo; Ausiello, Gabriele; Ferrè, Fabrizio; Helmer-Citterich, Manuela
2018-03-15
RNA structural motif finding is a relevant problem that becomes computationally hard when working on high-throughput data (e.g. eCLIP, PAR-CLIP), often represented by thousands of RNA molecules. Currently, the BEAM server is the only web tool capable of handling tens of thousands of RNAs as input, with a motif discovery procedure limited only by current secondary structure prediction accuracies. The recently developed method BEAM (BEAr Motifs finder) can analyze tens of thousands of RNA molecules and identify RNA secondary structure motifs associated to a measure of their statistical significance. BEAM is extremely fast thanks to the BEAR encoding that transforms each RNA secondary structure into a string of characters. BEAM also exploits the evolutionary knowledge contained in a substitution matrix of secondary structure elements, extracted from the RFAM database of families of homologous RNAs. The BEAM web server has been designed to streamline data pre-processing by automatically handling folding and encoding of RNA sequences, giving users a choice for the preferred folding program. The server provides an intuitive and informative results page with the list of secondary structure motifs identified, the logo of each motif, its significance, graphic representation and information about its position in the RNA molecules sharing it. The web server is freely available at http://beam.uniroma2.it/ and it is implemented in NodeJS and Python with all major browsers supported. marco.pietrosanto@uniroma2.it. Supplementary data are available at Bioinformatics online.
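To see why a string encoding makes structural motif search fast, here is a toy sketch in the spirit of BEAR (the alphabet and rules below are invented for illustration; the real BEAR encoding is considerably richer):

```python
# Minimal sketch of the idea behind a structural string encoding such as
# BEAR: each position of a dot-bracket secondary structure is mapped to a
# character, so motif search reduces to plain string matching. The
# four-letter alphabet here is illustrative, not the actual BEAR encoding.

def encode_structure(dot_bracket: str) -> str:
    """Map dot-bracket symbols to a coarse structural alphabet."""
    out = []
    depth = 0
    for c in dot_bracket:
        if c == "(":
            depth += 1
            out.append("S")      # stem, opening strand
        elif c == ")":
            depth -= 1
            out.append("s")      # stem, closing strand
        elif depth > 0:
            out.append("l")      # loop enclosed by a stem
        else:
            out.append("e")      # external, unpaired region
    return "".join(out)

# Hairpin: 5 bp stem with a 4-nt loop.
print(encode_structure("(((((....)))))"))  # -> SSSSSllllsssss
```

Once structures are strings over such an alphabet, substitution matrices and fast exact or seeded matching apply directly, which is what lets tens of thousands of molecules be searched at once.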
A national voice network with satellite and small transceivers
NASA Technical Reports Server (NTRS)
Reilly, N. B.; Smith, J. G.
1978-01-01
A geostationary satellite utilizing a large multiple-beam UHF antenna is shown to be potentially capable of providing tens of thousands of voice channels for hundreds of thousands of mobile ground terminals using hand-held or vehicular-mounted transceivers with whip antennas. Inclusion of on-board network switching facilities permits full interconnection between any terminal pair within the continental United States (CONUS). Configuration tradeoff studies at selected frequencies from 150 to 1500 MHz, with antenna diameters ranging from 20 to 200 m, and CONUS-coverage multiple beams down to 0.3 deg beamwidth, establish that monthly system user costs in the range of $90 to $150, including leased and maintained ground equipment, are feasible.
Fierstein, J.; Wilson, C.J.N.
2005-01-01
The 1912 Valley of Ten Thousand Smokes (VTTS) ignimbrite was constructed from 9 compositionally distinct, sequentially emplaced packages, each with distinct proportions of rhyolite (R), dacite (D), and andesite (A) pumices that permit us to map package boundaries and flow paths from vent to distal extents. Changing pumice proportions and interbedding relationships link ignimbrite formation to coeval fall deposition during the first ~16 h (Episode I) of the eruption. Pumice compositional proportions in the ignimbrite were estimated by counts on ~100 lapilli at multiple levels in vertical sections wherever accessible and more widely over most of the ignimbrite surface in the VTTS. The initial, 100% rhyolite ignimbrite package (equivalent to regional fall Layer A and occupying ~3.5 h) was followed by packages with increasing proportions of andesite, then dacite, emplaced over ~12.5 h and equivalent to regional fall Layers B1-B3. Coeval fall deposits are locally intercalated with the ignimbrite and show parallel changes in R:D (rhyolite:dacite) proportions, but lack significant amounts of andesite. Andesite was thus dominantly a low-fountaining component in the eruption column and is preferentially represented in packages filling the VTTS north of the vent. The most extensive packages (3 and 4) occur in B1 and early B2 times where flow mobility and volume were optimized; earlier all-rhyolite flows (Package 1) were highly energetic but less voluminous, while later packages (5-9) were both less voluminous and emplaced at lower velocities. Package boundaries are expressed as one or more of the following: sharp color changes corresponding to compositional variations; persistent finer-grained basal parts of flow units; compaction swales filled by later packages; erosional channels cut by the flows that fill them; lobate accumulations of one package; and (mostly south of the vent) intercalated fall deposit layers. Clear flow-unit boundaries are best developed between ignimbrite of non-successive packages, indicating time breaks of tens of minutes to hours. Less well-defined stratification may represent rapidly emplaced successive flow units but often changes over short distances and indicates variations in localized depositional conditions.
A significant challenge in toxicology is the “too many chemicals” problem. Humans and environmental species are exposed to as many as tens of thousands of chemicals, only a small percentage of which have been tested thoroughly using standard in vivo test methods. This paper revie...
Single-cell genomics for the masses
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tringe, Susannah G.
In this issue of Nature Biotechnology, Lan et al. describe a new tool in the toolkit for studying uncultivated microbial communities, enabling orders of magnitude higher single cell genome throughput than previous methods. This is achieved by a complex droplet microfluidics workflow encompassing steps from physical cell isolation through genome sequencing, producing tens of thousands of low-coverage genomes from individual cells.
ERIC Educational Resources Information Center
Steinberg, Adria; Almeida, Cheryl
2015-01-01
Few Americans know the importance of community-based organizations, or CBOs, in helping tens of thousands of undereducated, underemployed young people find a job or go back to school. But the role of CBOs is growing more critical as the business, education, and philanthropic sectors increasingly recognize the need to enable the nation's millions…
2003-07-25
This is the first Deep Imaging Survey image taken by NASA Galaxy Evolution Explorer. On June 22 and 23, 2003, the spacecraft obtained this near ultraviolet image of the Groth region by adding multiple orbits for a total exposure time of 14,000 seconds. Tens of thousands of objects can be identified in this picture. http://photojournal.jpl.nasa.gov/catalog/PIA04627
ERIC Educational Resources Information Center
Hoang, Hai; Huang, Melrose; Sulcer, Brian; Yesilyurt, Suleyman
2017-01-01
College math is a gateway course that has become a constraining gatekeeper for tens of thousands of students annually. Every year, over 500,000 students fail developmental mathematics, preventing them from achieving their college and career goals. The Carnegie Math Pathways initiative offers students an alternative. It comprises two Pathways…
The National Association of Charter School Authorizers' Index of Essential Practices
ERIC Educational Resources Information Center
National Association of Charter School Authorizers (NJ1), 2011
2011-01-01
Authorizers are as varied as the schools they oversee. Some are responsible for just one charter, while others monitor hundreds of charters serving tens of thousands of students. Some are school districts, while others are independent statewide boards, universities, not-for-profits, or state education agencies. Regardless of their size and type,…
Inferences of Recent and Ancient Human Population History Using Genetic and Non-Genetic Data
ERIC Educational Resources Information Center
Kitchen, Andrew
2008-01-01
I have adopted complementary approaches to inferring human demographic history utilizing human and non-human genetic data as well as cultural data. These complementary approaches form an interdisciplinary perspective that allows one to make inferences of human history at varying timescales, from the events that occurred tens of thousands of years…
Announcing the First Results from Daya Bay: Discovery of a New Kind of...
...collaboration observed tens of thousands of interactions of electron antineutrinos, caught by six massive... was the sizable disappearance, equal to about six percent. Although disappearance has been observed in... "Even with only the six detectors already operating, we have more target mass than any similar...
Update: Report on Innovations in Developmental Mathematics--Moving Mathematical Graveyards
ERIC Educational Resources Information Center
Merseth, Katherine K.
2011-01-01
Every year tens of thousands of students step foot on community college campuses, many for the first time. These students all have one thing in common: hope. They enter these institutions with lofty goals and a fervent expectation that the educative experience they are about to embark upon will fundamentally improve their lives. Yet, their hopes…
Israel’s Efforts to Defeat Iran’s Nuclear Program: An Integrated Use of National Power
2013-05-03
such as Chernobyl, Fukushima, Three Mile Island or Bhopal;” would likely cause the deaths of tens of thousands of noncombatants; and spread... a Chernobyl- or Fukushima-type disaster transpire. Most Iranians are not aware of the potential risks to which they and their country are being
Be That Teacher! Breaking the Cycle for Struggling Readers
ERIC Educational Resources Information Center
Risko, Victoria J.; Walker-Dalhouse, Doris
2012-01-01
Tens of thousands of students begin each new school year with the hope that they will finally find "the" teacher who will help them succeed as readers, writers, and learners. This book shows how teachers can provide the type of differentiated instruction that struggling readers need by drawing on students' individual and cultural backgrounds, as…
ERIC Educational Resources Information Center
Ellison, L. Marc
2013-01-01
This study explores the current ability of higher education to effectively educate and support college students diagnosed with Asperger's Disorder. As the prevalence of autism spectrum disorders increased dramatically during the past decade, it is estimated that tens of thousands of individuals diagnosed with Asperger's Disorder are…
Emerging Economies Make Ripe Markets for Recruiting Industry
ERIC Educational Resources Information Center
Overland, Martha Ann
2008-01-01
Tens of thousands of international students every year use local recruiters in their homeland to help them get into colleges abroad. Despite the proliferation of the Internet, with e-mail and applications that can be submitted online, students in the developing world still heavily depend on commissioned agents to help them navigate what is to many…
Publishing landscape ecology research in the 21st Century
Eric J. Gustafson
2011-01-01
With the proliferation of journals and scientific papers, it has become impossible to sustain a familiarity with the corpus of ecological literature, which totals tens of thousands of pages per year. Given the number of papers that a well-read ecologist should read, it takes an inordinate amount of time to extract the critical details necessary to superficially...
Emergency Systems Save Tens of Thousands of Lives
NASA Technical Reports Server (NTRS)
2013-01-01
To improve distress signal communications, NASA pioneered the Search and Rescue Satellite Aided Tracking (SARSAT) system. Since its inception, the international system known as Cospas-Sarsat has resulted in the rescue of more than 30,000 people. Techno-Sciences Inc., of Beltsville, Maryland, has been involved with the ground station component of the system from its earliest days.
Student Learning, Student Achievement: How Do Teachers Measure up?
ERIC Educational Resources Information Center
National Board for Professional Teaching Standards, 2011
2011-01-01
The National Board for Professional Teaching Standards (NBPTS) welcomes the efforts of federal, state, and local policymakers to find new ways to ensure an accomplished teacher for every student in America. The National Board has advanced this mission since its inception in 1987. Today, that mission is carried out by the tens of thousands of…
Thrilling but Pointless: General JO Shelby’s 1863 Cavalry Raid
2013-12-13
painfully acute. The air seems filled with exquisite music; cities and towns rise up on every hand, crowned with spires and radiant with ten thousand... Raid. By the end of festivities, at nearly 2 a.m., Captain Hart recited a prepared poem entitled “Jo Shelby’s Raid.” The spirit of Shelby’s Brigade
How Military Service Affects Student Veteran Success at Community Colleges
ERIC Educational Resources Information Center
O'Rourke, Patrick C., Jr.
2013-01-01
Increasingly more service members are separating from the military as the United States draws down the force and moves towards a post-war era. Tens of thousands of these veterans will leverage their GI Bill tuition and housing benefits in an attempt to access Southern California community colleges and bolster their transition into mainstream…
ERIC Educational Resources Information Center
Cenziper, Debbie; Grotto, Jason
This series of articles examines the condition of public schools and public school construction in Florida's Miami and Dade Counties. To prepare the series, the Miami Herald studied thousands of pages of construction records, correspondence, school district reports, and accounting statements over 15 years. It analyzed state and national…
ERIC Educational Resources Information Center
Carlson, Scott; Lipka, Sara
2009-01-01
In today's tough economy, students and parents alike are looking for ways to save on college tuition. With sticker prices well into the tens of thousands per year at any private liberal-arts institution, the prospect of shaving a year off the typical four-year journey is an added attraction at a number of colleges, like Franklin & Marshall,…
Gravity waves in the thermosphere observed by the AE satellites
NASA Technical Reports Server (NTRS)
Gross, S. H.; Reber, C. A.; Huang, F. T.
1983-01-01
Atmospheric Explorer (AE) satellite data were used to investigate the spectral characteristics of wave-like structure observed in the neutral and ionized components of the thermosphere. Power spectral analysis using the maximum entropy method indicates the existence of a broad spectrum of scale sizes for the fluctuations, ranging from tens to thousands of kilometers.
Case Study: Youth Transitions Task Force--A Ten-Year Retrospective, Spring 2015
ERIC Educational Resources Information Center
Poulos, Jennifer; d'Entremont, Chad; Culbertson, Nina
2015-01-01
In 2004, Boston Public Schools reported that more than 8% of its students dropped out of school that year. The city faced a crisis. Thousands of students were failing to earn a high-school diploma, a necessary credential for entrance into postsecondary education and/or the twenty-first century workforce. Factors driving students' decisions to…
ERIC Educational Resources Information Center
Hauser, Daniel C.; Johnston, Alison
2016-01-01
American students graduate from college with tens of thousands of dollars in debt, leading to substantial repayment burdens and potentially inefficient shifts in spending patterns and career choices. A political trend towards austerity coupled with the rising student debt make the effective allocation of federal higher education resources and…
Earth Observations taken by the Expedition 16 Crew
2008-01-01
ISS016-E-023723 (January 2008) --- This nocturnal view of the Glendale/Phoenix/Mesa, Arizona area was photographed by one of the Expedition 16 crewmembers aboard the International Space Station. During the last week, this area has been teeming with tens of thousands of football fans here for a big football game in Glendale on Feb. 3.
Abstract: There are tens of thousands of man-made chemicals to which humans are exposed, but only a fraction of these have the extensive in vivo toxicity data used in most traditional risk assessments. This lack of data, coupled with concerns about testing costs and animal use, a...
Tripping with Stephen Gaskin: An Exploration of a Hippy Adult Educator
ERIC Educational Resources Information Center
Morley, Gabriel Patrick
2012-01-01
For the last 40 years, Stephen Gaskin has been an adult educator on the fringe, working with tens of thousands of adults in the counterculture movement in pursuit of social change regarding marijuana legalization, women's rights, environmental justice issues and beyond. Gaskin has written 11 books about his experiences teaching and learning…
Con Artists Attack Colleges with Fake Help-Desk E-Mail
ERIC Educational Resources Information Center
Young, Jeffrey R.
2008-01-01
An e-mail scam has hit tens of thousands of users at dozens of colleges over the past few weeks, leaving network administrators scrambling to respond before campus computer accounts are taken over by spammers. Students, professors, and staff members at the affected colleges received e-mail messages that purported to come from the colleges' help…
NASA Astrophysics Data System (ADS)
Yamamoto, H.; Nakajima, K.; Zhang, K.; Nanai, S.
2015-12-01
Powerful numerical codes capable of modeling complex coupled physical and chemical processes have been developed for predicting the fate of CO2 in reservoirs as well as its potential impacts on groundwater and subsurface environments. However, they are often computationally demanding, since highly non-linear models must be solved at sufficient spatial and temporal resolution. Geological heterogeneity and uncertainty further increase the challenge: two-phase flow simulations in heterogeneous media usually require much longer computational times than in homogeneous media, and uncertainties in reservoir properties may necessitate stochastic simulations with multiple realizations. Recently, massively parallel supercomputers with more than thousands of processors have become available to scientific and engineering communities. Such supercomputers are attractive to geoscientists and reservoir engineers for solving large, non-linear models at higher resolution within a reasonable time, but making them a useful tool requires tackling several practical obstacles to using large numbers of processors effectively in general-purpose reservoir simulators. We have implemented massively parallel versions of two TOUGH2 family codes (the multi-phase flow simulator TOUGH2 and the chemically reactive transport simulator TOUGHREACT) on two different types (vector and scalar) of supercomputers with a thousand to tens of thousands of processors. After completing the implementation and extensive tune-up on the supercomputers, computational performance was measured for three simulations with multi-million-cell grid models, including a simulation of the dissolution-diffusion-convection process, which requires high spatial and temporal resolution to capture the growth of small convective fingers of CO2-dissolved water into larger ones at reservoir scale. The measurements confirmed that both simulators exhibit excellent scalability, showing almost linear speedup with processor count up to over ten thousand cores. This generally allows coupled multi-physics (THC) simulations on high-resolution geologic models with multi-million-cell grids to run in practical time (e.g., less than a second per time step).
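Scalability claims of this kind are conventionally summarized as speedup and parallel efficiency. A small sketch with invented timings (not the authors' measurements):

```python
# Sketch of how strong-scaling results like those described above are
# summarized: speedup S(p) = T(1)/T(p) and parallel efficiency
# E(p) = S(p)/p. The wall-clock times below are invented, not the
# paper's data.

timings = {1: 3600.0, 256: 15.2, 4096: 1.05, 16384: 0.31}  # seconds/step

t1 = timings[1]
for p, tp in sorted(timings.items()):
    speedup = t1 / tp
    efficiency = speedup / p
    print(f"{p:>6} cores: speedup {speedup:8.1f}, efficiency {efficiency:5.2f}")
```

"Almost linear speedup" in the abstract's sense means the efficiency column stays close to 1.0 as the core count grows.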
Ten Years of Speckle Interferometry at SOAR
NASA Astrophysics Data System (ADS)
Tokovinin, Andrei
2018-03-01
Since 2007, close binary and multiple stars have been observed by speckle interferometry at the 4.1 m Southern Astrophysical Research (SOAR) telescope. The HRCam instrument, observing strategy and planning, and data processing and calibration methods, developed and improved over ten years, are presented here in a concise way. Thousands of binary stars were measured with diffraction-limited resolution (29 mas at 540 nm wavelength) and a high accuracy reaching 1 mas; 200 new pairs or subsystems were discovered. To date, HRCam has performed over 11,000 observations with high efficiency (up to 300 stars per night). An overview of the main results delivered by this instrument is given.
Benchmarking algorithms for the solution of Collisional Radiative Model (CRM) equations.
NASA Astrophysics Data System (ADS)
Klapisch, Marcel; Busquet, Michel
2007-11-01
Elements used in ICF target designs can have many charge states in the same plasma conditions, each charge state having numerous energy levels. When LTE conditions are not met, one has to solve CRM equations for the populations of energy levels, which are necessary for opacities/emissivities, Z*, etc. In the case of sparse spectra, or when configuration interaction is important (open d or f shells), statistical methods[1] are insufficient. For these cases one must resort to a detailed level CRM rate generator[2]. The equations to be solved may involve tens of thousands of levels. The system is by nature ill-conditioned. We show that some classical methods do not converge. Improvements of the latter will be compared with new algorithms[3] with respect to performance, robustness, and accuracy. [1] A Bar-Shalom, J Oreg, and M Klapisch, J. Q. S. R. T., 65, 43 (2000). [2] M Klapisch, M Busquet and A. Bar-Shalom, Proceedings of APIP'07, AIP series (to be published). [3] M Klapisch and M Busquet, High Ener. Density Phys. 3, 143 (2007)
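For readers unfamiliar with the setup, the steady-state problem has the generic form A n = 0 with a normalization constraint, where A is the rate matrix. The sketch below solves a toy 3-level system directly; the matrix entries are invented, and real CRM systems with tens of thousands of levels need sparse, carefully conditioned solvers, which is precisely the paper's subject:

```python
# Minimal sketch of the steady-state collisional-radiative problem: find
# level populations n with A @ n = 0 and sum(n) = 1, where A is the rate
# matrix. The tiny 3-level rates below are invented for illustration.
import numpy as np

# Off-diagonal rates[i, j] = transition rate from level j to level i.
rates = np.array([
    [0.0,   2.0e3, 1.0e1],
    [5.0e2, 0.0,   4.0e3],
    [1.0e0, 3.0e2, 0.0  ],
])
A = rates - np.diag(rates.sum(axis=0))   # columns now sum to zero

# Impose normalization by replacing the last equation with sum(n) = 1.
M = A.copy()
M[-1, :] = 1.0
b = np.zeros(len(M))
b[-1] = 1.0

populations = np.linalg.solve(M, b)
print(populations, populations.sum())
```

With tens of thousands of levels and rates spanning many orders of magnitude, the analogous matrix is huge, sparse, and ill-conditioned, so the direct solve shown here must give way to the iterative algorithms the abstract compares.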
Pettengill, James B; Pightling, Arthur W; Baugher, Joseph D; Rand, Hugh; Strain, Errol
2016-01-01
The adoption of whole-genome sequencing within the public health realm for molecular characterization of bacterial pathogens has been followed by an increased emphasis on real-time detection of emerging outbreaks (e.g., food-borne Salmonellosis). In turn, large databases of whole-genome sequence data are being populated. These databases currently contain tens of thousands of samples and are expected to grow to hundreds of thousands within a few years. For these databases to be of optimal use one must be able to quickly interrogate them to accurately determine the genetic distances among a set of samples. Being able to do so is challenging due to both biological (evolutionarily diverse samples) and computational (petabytes of sequence data) issues. We evaluated seven measures of genetic distance, which were estimated from either k-mer profiles (Jaccard, Euclidean, Manhattan, Mash Jaccard, and Mash distances) or nucleotide sites (NUCmer and an extended multi-locus sequence typing (MLST) scheme). When analyzing empirical data (whole-genome sequence data from 18,997 Salmonella isolates) there are features (e.g., genomic, assembly, and contamination) that cause distances inferred from k-mer profiles, which treat absent data as informative, to fail to accurately capture the distance between samples when compared to distances inferred from differences in nucleotide sites. Thus, site-based distances, like NUCmer and extended MLST, are superior in performance, but accessing the computing resources necessary to perform them may be challenging when analyzing large databases.
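As a concrete illustration of the k-mer profile distances evaluated above, a minimal Jaccard-distance sketch (toy sequences and k; production tools such as Mash estimate this from hashed k-mer sketches of whole genomes):

```python
# Jaccard distance between the k-mer sets of two sequences. Note how
# absent k-mers enlarge the union: k-mer methods treat missing data as
# informative, which is the behavior the authors find problematic.

def kmers(seq: str, k: int) -> set:
    """All overlapping substrings of length k."""
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

def jaccard_distance(a: str, b: str, k: int = 4) -> float:
    ka, kb = kmers(a, k), kmers(b, k)
    return 1.0 - len(ka & kb) / len(ka | kb)

print(jaccard_distance("ACGTACGTGACC", "ACGTACGTTACC"))
```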
Southern Pine Beetle Ecology: Populations within Stands
Matthew P. Ayres; Sharon J. Martinson; Nicholas A. Friedenberg
2011-01-01
Populations of southern pine beetle (SPB) are typically substructured into local aggregations, each with tens of thousands of individual beetles. These aggregations, known as "spots" because of their appearance during aerial surveys, are the basic unit for the monitoring and management of SPB populations in forested regions. They typically have a maximum lifespan of 1...
ERIC Educational Resources Information Center
Ollerenshaw, Alison; Aidman, Eugene; Kidd, Garry
1997-01-01
This study examined comprehension in four groups of undergraduates under text only, multimedia, and two diagram conditions of text supplementation. Results indicated that effects of text supplementation are mediated by prior knowledge and learning style: multimedia appears more beneficial to surface learners with little prior knowledge and makes…
Variable Selection Strategies for Small-area Estimation Using FIA Plots and Remotely Sensed Data
Andrew Lister; Rachel Riemann; James Westfall; Mike Hoppus
2005-01-01
The USDA Forest Service's Forest Inventory and Analysis (FIA) unit maintains a network of tens of thousands of georeferenced forest inventory plots distributed across the United States. Data collected on these plots include direct measurements of tree diameter and height and other variables. We present a technique by which FIA plot data and coregistered...
James L. Chamberlain; Gabrielle Ness; Christine J. Small; Simon J. Bonner; Elizabeth B. Hiebert
2013-01-01
Non-timber forest products, particularly herbaceous understory plants, support a multi-billion dollar industry and are extracted from forests worldwide for their therapeutic value. Tens of thousands of kilograms of rhizomes and roots of Actaea racemosa L., a native Appalachian forest perennial, are harvested every year and used for the treatment of...
The Choctaw Nation: Changing the Appearance of American Higher Education, 1830-1907
ERIC Educational Resources Information Center
Crum, Steven
2007-01-01
In September 1830 the U.S. government negotiated the Treaty of Dancing Rabbit Creek with some leaders of the Choctaw Nation. The treaty reinforced the congressional Indian Removal Act of 1830, which paved the way for the large-scale physical removal of tens of thousands of tribal people of the southeast, including many of the Choctaw. It provided…
Leaving No Worker Behind: Community Colleges Retrain the Michigan Workforce--and Themselves
ERIC Educational Resources Information Center
Hilliard, Tom
2011-01-01
In 2007, Michigan undertook a bold mission: to retrain tens of thousands of adults to qualify for jobs in emerging and expanding sectors of the economy. The state's proposal to jobless, dislocated, and low-income residents was simple but appealing: enroll in up to two years of postsecondary education, and Michigan would cover up to $5,000 in…
Which Learning Style is Most Effective in Learning Chinese as a Second Language
ERIC Educational Resources Information Center
Ren, Guanxin
2013-01-01
Chinese is not only a tonal but also a visual language represented by tens of thousands of characters which are pictographic in nature. This presents a great challenge to learners whose mother tongue is alphabetical-based such as English. To assist English-speaking background learners to learn Chinese as a Second Language (CSL) well, a good…
Sen. Udall, Mark [D-CO
2012-07-16
Senate - 07/16/2012 Submitted in the Senate, considered, and agreed to without amendment and with a preamble by Unanimous Consent.
ERIC Educational Resources Information Center
O'Gorman, Lyndal
2017-01-01
Through the multiple languages of the arts, many ideas about sustainability can be explored with young children. This paper discusses the ethical issues involved in the implementation of a research study that uses artist Chris Jordan's confronting images about sustainability. Jordan's images typically depict tens of thousands of objects such as…
3 CFR 8543 - Proclamation 8543 of July 26, 2010. National Korean War Veterans Armistice Day, 2010
Code of Federal Regulations, 2011 CFR
2011-01-01
... of the United States of America, A Proclamation. Today we celebrate the signing of the Military... respect, and this partnership is vital to peace and stability in Asia and the world. Since our Nation’s... rallied to the young republic’s defense. Tens of thousands of our Nation’s servicemembers lost their lives...
Ten Things You Should Know about Today's Student Veteran
ERIC Educational Resources Information Center
Lighthall, Alison
2012-01-01
With America's military out of Iraq, and funding for global military operations on the decline, thousands of newly discharged men and women are trying to figure out "What's next?" Most of the Soldiers, Marines, Airmen, and Sailors joined the military before their 21st birthday, and it is often the only job they have ever held. While it is true…
Opening Doors to Nursing Degrees: Time for Action. A Proposal from Ontario's Colleges
ERIC Educational Resources Information Center
Colleges Ontario, 2015
2015-01-01
This report argues that Ontario must expand the educational options for people who want to become registered nurses (RNs). It argues that the change Ontario requires is to authorize colleges to offer their own high-quality nursing degrees. Until 2005, about 70 per cent of Ontario's RNs were educated at colleges. Today, tens of thousands of RNs who…
Microlearning as Innovative Pedagogy for Mobile Learning in MOOCs
ERIC Educational Resources Information Center
Kamilali, Despina; Sofianopoulou, Chryssa
2015-01-01
MOOCs are open online courses offered by major universities, free to everyone, anywhere in the world. Hundreds or tens of thousands of learners enroll in MOOCs, but completion rates are extremely low, sometimes less than 10%. There is a need to explore new and more engaging forms of pedagogy to improve retention. Focusing on this need, this paper,…
ERIC Educational Resources Information Center
Reder, Stephen
2012-01-01
Professor Stephen Reder presented the Longitudinal Study of Adult Learning (LSAL) at The Centre's 2011 Fall Institute--IALS: Its Meaning and Impact for Policy and Practice--whose findings had implications far beyond assessment. Based on evidence from the ten-year study of more than a thousand adult high school drop-outs, Dr. Reder challenges many…
Real-Time Interactive Tree Animation.
Quigley, Ed; Yu, Yue; Huang, Jingwei; Lin, Winnie; Fedkiw, Ronald
2018-05-01
We present a novel method for posing and animating botanical tree models interactively in real time. Unlike other state-of-the-art methods which tend to produce trees that are overly flexible, bending and deforming as if they were underwater plants, our approach allows for arbitrarily high stiffness while still maintaining real-time frame rates without spurious artifacts, even on quite large trees with over ten thousand branches. This is accomplished by using an articulated rigid body model with as-stiff-as-desired rotational springs in conjunction with our newly proposed simulation technique, which is motivated both by position based dynamics and the typical algorithms for articulated rigid bodies. The efficiency of our algorithm allows us to pose and animate trees with millions of branches or alternatively simulate a small forest comprised of many highly detailed trees. Even using only a single CPU core, we can simulate ten thousand branches in real time while still maintaining quite crisp user interactivity. This has allowed us to incorporate our framework into a commodity game engine to run interactively even on a low-budget tablet. We show that our method is amenable to the incorporation of a large variety of desirable effects such as wind, leaves, fictitious forces, collisions, fracture, etc.
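To make the underlying model concrete, here is a toy sketch (ours, not the paper's algorithm) of rigid segments with rotational springs at the joints; naive explicit stepping like this loses stability as stiffness grows, which is exactly the regime the paper's position-based treatment is designed to handle:

```python
# Toy chain of rigid segments with rotational joint springs. Each joint is
# integrated independently for brevity (no inter-segment coupling), with
# naive explicit integration; all constants are illustrative.

N = 5                       # number of joints in the chain
K = 50.0                    # rotational spring stiffness
DAMP = 2.0                  # angular damping coefficient
INERTIA = 1.0               # rotational inertia per segment
DT = 1e-3                   # time step (seconds)

theta = [0.3] * N           # joint angles relative to each parent (radians)
omega = [0.0] * N           # joint angular velocities

def step():
    for i in range(N):
        # Spring torque pulls each joint back to its rest angle (0 here).
        torque = -K * theta[i] - DAMP * omega[i]
        omega[i] += DT * torque / INERTIA
        theta[i] += DT * omega[i]

for _ in range(2000):       # simulate two seconds
    step()
print([round(t, 4) for t in theta])   # angles decay toward the rest pose
```

Raising K by a few orders of magnitude forces DT down sharply for this explicit scheme, which is why a stiffness-tolerant formulation matters for real-time use.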
Abstraction networks for terminologies: Supporting management of "big knowledge".
Halper, Michael; Gu, Huanying; Perl, Yehoshua; Ochs, Christopher
2015-05-01
Terminologies and terminological systems have assumed important roles in many medical information processing environments, giving rise to the "big knowledge" challenge when terminological content comprises tens of thousands to millions of concepts arranged in a tangled web of relationships. Use and maintenance of knowledge structures on that scale can be daunting. The notion of abstraction network is presented as a means of facilitating the usability, comprehensibility, visualization, and quality assurance of terminologies. An abstraction network overlays a terminology's underlying network structure at a higher level of abstraction. In particular, it provides a more compact view of the terminology's content, avoiding the display of minutiae. General abstraction network characteristics are discussed. Moreover, the notion of meta-abstraction network, existing at an even higher level of abstraction than a typical abstraction network, is described for cases where even the abstraction network itself represents a case of "big knowledge." Various features in the design of abstraction networks are demonstrated in a methodological survey of some existing abstraction networks previously developed and deployed for a variety of terminologies. The applicability of the general abstraction-network framework is shown through use-cases of various terminologies, including the Systematized Nomenclature of Medicine - Clinical Terms (SNOMED CT), the Medical Entities Dictionary (MED), and the Unified Medical Language System (UMLS). Important characteristics of the surveyed abstraction networks are provided, e.g., the magnitude of the respective size reduction referred to as the abstraction ratio. Specific benefits of these alternative terminology-network views, particularly their use in terminology quality assurance, are discussed. Examples of meta-abstraction networks are presented. The "big knowledge" challenge constitutes the use and maintenance of terminological structures that comprise tens of thousands to millions of concepts and their attendant complexity. The notion of abstraction network has been introduced as a tool in helping to overcome this challenge, thus enhancing the usefulness of terminologies. Abstraction networks have been shown to be applicable to a variety of existing biomedical terminologies, and these alternative structural views hold promise for future expanded use with additional terminologies.
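One plausible reading of the abstraction ratio named above, as a trivial sketch (the definition and counts here are our illustration, not figures from the survey):

```python
# "Abstraction ratio" read as: size of the source terminology divided by
# the size of its abstraction-network summary. Counts are hypothetical.

def abstraction_ratio(num_concepts: int, num_abstraction_nodes: int) -> float:
    return num_concepts / num_abstraction_nodes

# A hypothetical 300,000-concept terminology summarized by 1,200 nodes:
print(abstraction_ratio(300_000, 1_200))   # -> 250.0
```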
Mahieu, Nathaniel G; Patti, Gary J
2017-10-03
When using liquid chromatography/mass spectrometry (LC/MS) to perform untargeted metabolomics, it is now routine to detect tens of thousands of features from biological samples. Poor understanding of the data, however, has complicated interpretation and masked the number of unique metabolites actually being measured in an experiment. Here we place an upper bound on the number of unique metabolites detected in Escherichia coli samples analyzed with one untargeted metabolomics method. We first group multiple features arising from the same analyte, which we call "degenerate features", using a context-driven annotation approach. Surprisingly, this analysis revealed thousands of previously unreported degeneracies that reduced the number of unique analytes to ∼2961. We then applied an orthogonal approach to remove nonbiological features from the data using the 13C-based credentialing technology. This further reduced the number of unique analytes to less than 1000. Our 90% reduction in data is 5-fold greater than previously published studies. On the basis of the results, we propose an alternative approach to untargeted metabolomics that relies on thoroughly annotated reference data sets. To this end, we introduce the creDBle database (http://creDBle.wustl.edu), which contains accurate mass, retention time, and MS/MS fragmentation data as well as annotations of all credentialed features.
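A sketch of the kind of degeneracy grouping described above: features that co-elute (similar retention time) and differ by a known mass offset, such as a 13C isotope spacing or an adduct difference, are candidates for merging into one analyte. The tolerances and offsets below are illustrative, not the paper's parameters:

```python
# Pairwise test for "degenerate features": co-eluting LC/MS features whose
# m/z values differ by a known isotope or adduct offset likely come from
# the same analyte. Tolerances and the offset list are illustrative.

ISOTOPE_C13 = 1.00336    # m/z spacing of a 13C isotopologue (charge 1)
NA_H_ADDUCT = 21.98194   # [M+Na]+ minus [M+H]+ mass difference
KNOWN_OFFSETS = (ISOTOPE_C13, NA_H_ADDUCT)

def degenerate(f1, f2, rt_tol=0.1, mz_tol=0.005):
    """f = (mz, rt); True if co-eluting and separated by a known offset."""
    mz1, rt1 = f1
    mz2, rt2 = f2
    if abs(rt1 - rt2) > rt_tol:
        return False
    dmz = abs(mz1 - mz2)
    return any(abs(dmz - off) < mz_tol for off in KNOWN_OFFSETS)

print(degenerate((180.0634, 5.32), (181.0667, 5.30)))   # 13C pair -> True
```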
Hydrochemical tracers in the middle Rio Grande Basin, USA: 1. Conceptualization of groundwater flow
Plummer, Niel; Bexfield, L.M.; Anderholm, S.K.; Sanford, W.E.; Busenberg, E.
2004-01-01
Chemical and isotopic data for groundwater from throughout the Middle Rio Grande Basin, central New Mexico, USA, were used to identify and map groundwater flow from 12 sources of water to the basin, evaluate radiocarbon ages, and refine the conceptual model of the Santa Fe Group aquifer system. Hydrochemical zones, representing groundwater flow over thousands to tens of thousands of years, can be traced over large distances through the primarily siliciclastic aquifer system. The locations of the hydrochemical zones mostly reflect the "modern" predevelopment hydraulic-head distribution, but are inconsistent with a trough in predevelopment water levels in the west-central part of the basin, indicating that this trough is a transient rather than a long-term feature of the aquifer system. Radiocarbon ages adjusted for geochemical reactions, mixing, and evapotranspiration/dilution processes in the aquifer system were nearly identical to the unadjusted radiocarbon ages, and ranged from modern to more than 30 ka. Age gradients from piezometer nests ranged from 0.1 to 2 year cm⁻¹ and indicate a recharge rate of about 3 cm year⁻¹ for recharge along the eastern mountain front and infiltration from the Rio Grande near Albuquerque. There has been appreciably less recharge along the eastern mountain front north and south of Albuquerque.
Michot, B.D.; Meselhe, E.A.; Krauss, Ken W.; Shrestha, Surendra; From, Andrew S.; Patino, Eduardo
2017-01-01
At the fringe of Everglades National Park in southwest Florida, United States, the Ten Thousand Islands National Wildlife Refuge (TTINWR) habitat has been heavily affected by the disruption of natural freshwater flow across the Tamiami Trail (U.S. Highway 41). As the Comprehensive Everglades Restoration Plan (CERP) proposes to restore the natural sheet flow from the Picayune Strand Restoration Project area north of the highway, the impact of planned measures on the hydrology in the refuge needs to be taken into account. The objective of this study was to develop a simple, computationally efficient mass balance model to simulate the spatial and temporal patterns of water level and salinity within the area of interest. This model could be used to assess the effects of the proposed management decisions on the surface water hydrological characteristics of the refuge. Surface water variations are critical to the maintenance of wetland processes. The model domain is divided into 10 compartments on the basis of their shared topography, vegetation, and hydrologic characteristics. A diversion of +10% of the discharge recorded during the modeling period was simulated in the primary canal draining the Picayune Strand forest north of the Tamiami Trail (Faka Union Canal) and this discharge was distributed as overland flow through the refuge area. Water depths were affected only modestly. However, in the northern part of the refuge, the hydroperiod, i.e., the duration of seasonal flooding, was increased by 21 days (from 115 to 136 days) for the simulation during the 2008 wet season, with an average water level rise of 0.06 m. The average salinity over a two-year period in the model area just south of Tamiami Trail was reduced by approximately 8 practical salinity units (psu) (from 18 to 10 psu), whereas the peak dry season average was reduced from 35 to 29 psu (by 17%). These salinity reductions were even larger with greater flow diversions (+20%). Naturally, the reduction in salinity diminished toward the open water areas where the daily flood tides mix in saline bay water. Partially restoring hydrologic flows to TTINWR will affect hydroperiod and salinity regimes within downslope wetlands, and perhaps serve as a management tool to reduce the speed of future encroachment of mangroves into marsh as sea levels rise.
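A compartment model of this kind reduces to coupled water- and salt-balance updates per compartment. A minimal single-compartment sketch (our illustration; the flow values and the psu ≈ kg/m³ shorthand are assumptions, not the study's calibration):

```python
# Minimal single-compartment mass-balance sketch: track water volume and
# salt mass, updated daily from inflows and outflows. All flow values are
# invented and psu is treated as ~kg/m^3 for simplicity.

def step_compartment(volume, salt, q_fresh, q_tide_in, q_out, s_bay, dt=1.0):
    """Advance one compartment by dt days.

    volume in m^3, salt in kg, flows in m^3/day, s_bay in psu.
    Fresh overland inflow carries no salt; the flood tide imports bay
    water; outflow leaves at the compartment's current salinity.
    """
    s = salt / volume                        # current salinity
    volume += dt * (q_fresh + q_tide_in - q_out)
    salt += dt * (q_tide_in * s_bay - q_out * s)
    return volume, salt

v, m = 1.0e6, 18.0e6                         # start at 18 psu
for _ in range(30):                          # one month of daily steps
    v, m = step_compartment(v, m, q_fresh=6.0e4, q_tide_in=2.0e4,
                            q_out=8.0e4, s_bay=35.0)
print(round(m / v, 1), "psu")                # freshens toward ~9-10 psu
```

Increasing q_fresh in such a model plays the role of the simulated Faka Union Canal diversion: more salt-free inflow dilutes the compartment and lengthens the hydroperiod downstream.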
ERIC Educational Resources Information Center
Moore, Michael G.
2016-01-01
A systems methodology was employed to design and deliver a highly successful demonstration of the effectiveness of distance education as a means of providing high quality training to tens of thousands of teachers in the most remote areas of Brazil. Key elements in the success of the program were significant funding, top political buy-in, and…
Focusing on function to mine cancer genome data | Center for Cancer Research
CCR scientists have devised a strategy to sift through the tens of thousands of mutations in cancer genome data to find mutations that actually drive the disease. They have used the method to discover that the JNK signaling pathway, which in different contexts can either spur cancerous growth or rein it in, acts as a tumor suppressor in gastric cancers.
A Very Small Astrometry Satellite, Nano-JASMINE: Its Telescope and Mission Goals
NASA Astrophysics Data System (ADS)
Hatsutori, Yoichi; Suganuma, Masahiro; Kobayashi, Yukiyasu; Gouda, Naoteru; Yano, Taihei; Yamada, Yoshiyuki; Yamauchi, Masahiro
This paper introduces a small astrometry satellite, Nano-JASMINE. Nano-JASMINE carries a telescope with a 5-cm effective aperture and aims to measure the positions of ten to twenty thousand stars of z ≤ 8 mag over the whole sky with an accuracy of a few milliarcseconds. The mission goals are clarified and the current status of development of the telescope is reported.
L.B. Brown; B. Allen-Diaz
2009-01-01
Sudden oak death (SOD), caused by the recently discovered non-native invasive pathogen, Phytophthora ramorum, has already killed tens of thousands of native coast live oak and tanoak trees in California. Little is known of potential short and long term impacts of this novel plant-pathogen interaction on forest structure and composition. Coast live...
Major Software Vendor Puts Students on Many Campuses at Risk of Identity Theft
ERIC Educational Resources Information Center
Foster, Andrea
2008-01-01
At least 18 colleges are scrambling to inform tens of thousands of students that they are at risk of having their identities stolen after SunGard, a leading software vendor, reported that a laptop owned by one of its consultants was stolen. The extent of the problem is still unknown, though many of the campuses that have been identified are in…
Risk Management and At-Risk Students: Pernicious Fantasies of Educator Omnipotence. The Cutting Edge
ERIC Educational Resources Information Center
Clabaugh, Gary K.
2004-01-01
For tens of thousands of years human beings relied on oracles, prophets, medicine men, and resignation to try to manage unknown risks. Then, in the transformative 200-year period from the mid-17th through the mid-19th centuries, a series of brilliant insights created groundbreaking tools for rational risk taking. Discoveries such as the theory of…
Deformation and Failure of Protein Materials in Physiologically Extreme Conditions and Disease
2009-03-01
resonance (NMR) spectroscopy and X-ray crystallography have advanced our ability to identify 3D protein structures. Site-specific studies using NMR, a...ray crystallography, providing structural and temporal information about mechanisms of deformation and assembly (for example in intermediate...tens of thousands of 3D atomistic protein structures, identifying the structure of numerous proteins from varying species sources. X-ray
ERIC Educational Resources Information Center
Granato, Mona; Krekel, Elisabeth M.; Ulrich, Joachim Gerd
2015-01-01
Every year, tens of thousands of young people in Germany fail to find access to dual vocational education and training (VET), because they cannot find a company to hire them as apprentices. This particularly affects persons with poor school leaving qualifications, socially deprived persons or people with a migrant background. In order to improve…
ERIC Educational Resources Information Center
Miech, Edward J.; Nave, Bill; Mosteller, Frederick
2005-01-01
This article describes what a structured abstract is and how a structured abstract can help researchers sort out information. Today over 1,000 education journals publish more than 20,000 articles in the English language each year. No systematic tool is available at present to get the research findings from these tens of thousands of articles to…
Registering the Human Terrain: A Valuation of Cadastre
2008-01-01
which is also an intelligence topic of increasing salience. Ethno-linguistic maps, such as Figure 1 depicting languages spoken or religions...Desert] to Congo, tens of thousands of people are at war. You might think these struggles are about religion, or ethnicity, or even political diff...Nazi pseudoscience responsible for 70 million deaths. Academia quickly distanced itself from environmental determinism, the theory behind Geopolitik
Bryan D. Watts; Dana S. Bradshaw
2005-01-01
Within the mid-Atlantic Coastal Plain, lands owned or controlled by government agencies and organizations within the Partners in Flight (PIF) program are highly fragmented. These lands represent tens of thousands of habitat patches that are managed by hundreds of individuals responding to a diversity of directives. Moving this patchwork of lands forward to achieve...
Shared Data Reveal the Invisible Achievement Gap of Students in Foster Care
ERIC Educational Resources Information Center
WestEd, 2014
2014-01-01
At any given time, tens of thousands of children and youth in the U.S. are in the foster care system. Many have been abused, neglected, or abandoned, and they face a challenging journey of uncertainty, often not knowing where they will live next, where they will go to school, or whether they will have contact with friends and relatives. Child…
Assessing Security of Supply: Three Methods Used in Finland
NASA Astrophysics Data System (ADS)
Sivonen, Hannu
Public Private Partnership (PPP) has an important role in securing supply in Finland. Three methods are used in assessing the level of security of supply. First, in national expert groups, a linear mathematical model based on interdependency estimates has been used. The model ranks societal functions or their more detailed components, such as items in the food supply chain, according to the effect and risk pertinent to the interdependencies. Second, the security of supply is assessed in industrial branch committees (clusters and pools) in the form of indicators. The level of security of supply is assessed against five generic factors (dimension 1) and tens of branch-specific business functions (dimension 2). Third, in two thousand individual critical companies, the maturity of operational continuity management is assessed using the Capability Maturity Model (CMM) in an extranet application; the pool committees and authorities obtain an anonymous summary. The assessments are used in allocating efforts for securing supply. The efforts may be new instructions, training, exercising and, in some cases, investment and regulation.
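The abstract describes the first method only at a high level. One plausible concrete form, which is our assumption rather than the Finnish model's actual specification, ranks functions by the dominant eigenvector of an expert-elicited interdependency matrix:

    import numpy as np

    # Hypothetical expert scores: W[i, j] = how strongly function i
    # depends on function j (0..1). All values are invented.
    functions = ["power grid", "fuel supply", "food chain", "telecom"]
    W = np.array([
        [0.0, 0.8, 0.1, 0.4],
        [0.3, 0.0, 0.1, 0.2],
        [0.6, 0.5, 0.0, 0.3],
        [0.7, 0.2, 0.0, 0.0],
    ])

    # The dominant left eigenvector weights a function by the recursive
    # dependence of already-important functions upon it.
    vals, vecs = np.linalg.eig(W.T)
    v = np.abs(vecs[:, np.argmax(vals.real)].real)
    for name, score in sorted(zip(functions, v / v.sum()), key=lambda t: -t[1]):
        print(f"{name:12s} {score:.2f}")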
A Numerical Study on the Streams of Star Debris after Tidal Disruption
NASA Astrophysics Data System (ADS)
Camacho Olachea, Priscila; Ramirez-Ruiz, Enrico; Law-Smith, Jamie
2017-01-01
Lurking at the centers of most galaxies are gigantic star- and gas-devouring monsters. These monsters are supermassive black holes (SMBHs), some of which are larger than our solar system and ten billion times as massive as our own Sun. The vast majority of stars in the universe live for tens of billions of years, eventually dying from old age as the nuclear reactions that power them become progressively less effective. But for every ten thousand stars that die peacefully, one star will be brutally torn apart by the extreme tidal forces present as it passes near an SMBH. My recent work has been to develop the computational tools necessary to study the fates of stars disrupted by SMBHs. In this research project I present the results of my numerical study aimed at understanding the streams of star debris that result after disruption.
Legal Regulation of Sodium Consumption to Reduce Chronic Conditions
Barraza, Leila F.
2016-01-01
In the United States, tens of thousands of Americans die each year of heart disease, stroke, or other chronic conditions tied to hypertension from long-term overconsumption of sodium compounds. Major strides to lower dietary sodium have been made over decades, but the goal of reducing Americans’ daily consumption is elusive. The Food and Drug Administration (FDA) has been urged to consider stronger regulatory limits on sodium, especially in processed and prepared foods. Still, FDA categorizes salt (and many other sodium compounds) as “generally recognized as safe,” meaning they may be added to foods on the presumption that they are safe when ingested in reasonable amounts. Legal reforms or actions at each level of government offer traditional and new routes to improving chronic disease outcomes. However, using law as a public health tool must be assessed carefully, given potential trade-offs and unproven efficacy. PMID:26890409
Genetic determinants of in vivo fitness and diet responsiveness in multiple human gut Bacteroides
Wu, Meng; McNulty, Nathan P.; Rodionov, Dmitry A.; Khoroshkin, Matvei S.; Griffin, Nicholas W.; Cheng, Jiye; Latreille, Phil; Kerstetter, Randall A.; Terrapon, Nicolas; Henrissat, Bernard; Osterman, Andrei L.; Gordon, Jeffrey I.
2015-01-01
Libraries of tens of thousands of transposon mutants generated from each of four human gut Bacteroides strains, two representing the same species, were introduced simultaneously into gnotobiotic mice together with 11 other wild-type strains to generate a 15-member artificial human gut microbiota. Mice received one of two distinct diets monotonously, or both in ordered sequence. Quantifying the abundance of mutants in different diet contexts allowed gene-level characterization of fitness determinants, niche, stability and resilience, and yielded a prebiotic (arabinoxylan) that allowed targeted manipulation of the community. The approach described is generalizable and should be useful for defining mechanisms critical for sustaining and/or approaches for deliberately reconfiguring the highly adaptive and durable relationship between the human gut microbiota and host in ways that promote wellness. PMID:26430127
Precise starshade stationkeeping and pointing with a Zernike wavefront sensor
NASA Astrophysics Data System (ADS)
Bottom, Michael; Martin, Stefan; Seubert, Carl; Cady, Eric; Zareh, Shannon Kian; Shaklan, Stuart
2017-09-01
Starshades, large occulters positioned tens of thousands of kilometers in front of space telescopes, offer one of the few paths to imaging and characterizing Earth-like extrasolar planets. However, for a starshade to generate a sufficiently dark shadow on the telescope, the two must be coaligned to just 1 meter laterally, even at these large separations. The principal challenge to achieving this level of control is in determining the position of the starshade with respect to the space telescope. In this paper, we present numerical simulations and laboratory results demonstrating that a Zernike wavefront sensor coupled to a WFIRST-type telescope is able to deliver the stationkeeping precision required, by measuring light outside of the science wavelengths. The sensor can determine the starshade lateral position to centimeter level in seconds of open shutter time for stars brighter than eighth magnitude, with a capture range of 10 meters. We discuss the potential for fast (ms) tip/tilt pointing control at the milli-arcsecond level by illuminating the sensor with a laser mounted on the starshade. Finally, we present early laboratory results.
Fibrillar Adhesive for Climbing Robots
NASA Technical Reports Server (NTRS)
Parness, Aaron; White, Victor E.
2013-01-01
A climbing robot needs to use its adhesive patches over and over again as it scales a slope. Replacing the adhesive at each step is generally impractical. If the adhesive or attachment mechanism cannot be used repeatedly, then the robot must carry an extra load of this adhesive to apply a fresh layer with each move. Common failure modes include tearing, contamination by dirt, plastic deformation of fibers, and damage from loading/unloading. A gecko-like fibrillar adhesive has been developed that has been shown useful for climbing robots, and may later prove useful for grasping, anchoring, and medical applications. The material consists of a hierarchical fibrillar structure that currently contains two levels, but may be extended to three or four levels in continuing work. The contacting level has tens of thousands of microscopic fibers made from a rubberlike material that bend over and create intimate contact with a surface to achieve maximum van der Waals forces. By maximizing the real area of contact that these fibers make and minimizing the bending energy necessary to achieve that contact, the net amount of adhesion has been improved dramatically.
Elom, Michael O; Eyo, Joseph E; Okafor, Fabian C; Nworie, Amos; Usanga, Victor U; Attamah, Gerald N; Igwe, Chibueze C
2017-02-01
One hundred and fifty-two malaria-infected pregnant women whose pregnancies had advanced to the 6th month were randomised into two study groups (supplemented and placebo) after their informed consent was obtained. Ten thousand international units of vitamin A soft gels were administered to the supplemented group three times per week. Vitamin A soft gels devoid of the active ingredient were administered thrice weekly to the placebo group. Two hundred thousand international units of vitamin A were administered to the supplemented group within 8 weeks postpartum. Placebo was given to the control group at the same time after delivery. The regimen was continued in the two groups at three-month intervals until 12 months. Quarterly, 3 ml of venous blood was collected from each infant in the two groups and used for the estimation of hemoglobin concentrations and determination of blood glucose levels. Hemoglobin concentrations were estimated using the hemiglobincyanide method, while blood glucose levels were determined with a glucometer. Analysis of variance, Fisher's least significant difference and the t-test were used for data analysis. Statistical significance was established at p < 0.05. Both hemoglobin concentrations and blood glucose levels were significantly (p < 0.05) higher in the supplemented group than in the placebo group. The malaria-mitigating effects of maternal vitamin A supplementation have been established in the present study and are supported by previous studies. Vitamin A supplementation, fortification of foods with vitamin A, and diversification of diets are advocated for the maintenance of good health and protection against some infectious diseases.
Hinkle, Stephen R; Böhlke, J K; Fisher, Lawrence H
2008-12-15
Septic tank systems are an important source of NO3- to many aquifers, yet characterization of N mass balance and isotope systematics following septic tank effluent discharge into unsaturated sediments has received limited attention. In this study, samples of septic tank effluent before and after transport through single-pass packed-bed filters (sand filters) were evaluated to elucidate mass balance and isotope effects associated with septic tank effluent discharge to unsaturated sediments. Chemical and isotopic data from five newly installed pairs and ten established pairs of septic tanks and packed-bed filters serving single homes in Oregon indicate that aqueous solute concentrations are affected by variations in recharge (precipitation, evapotranspiration), NH4+ sorption (primarily in immature systems), nitrification, and gaseous N loss via NH3 volatilization and/or N2 or N2O release during nitrification/denitrification. Substantial NH4+ sorption capacity was also observed in laboratory columns with synthetic effluent. Septic tank effluent δ15N-NH4+ values were almost constant and averaged +4.9‰ ± 0.4‰ (1σ). In contrast, δ15N values of NO3- leaving mature packed-bed filters were variable (+0.8 to +14.4‰) and averaged +7.2‰ ± 2.6‰. Net N loss in the two networks of packed-bed filters was indicated by average 10-30% decreases in Cl--normalized N concentrations and 2-3‰ increases in δ15N, consistent with fractionation accompanying gaseous N losses and corroborating established links between septic tank effluent and NO3- in a local, shallow aquifer. Values of δ18O-NO3- leaving mature packed-bed filters ranged from -10.2 to -2.3‰ (mean -6.4‰ ± 1.8‰) and were intermediate between a 2/3 H2O-O + 1/3 O2-O conceptualization and a 100% H2O-O conceptualization of δ18O-NO3- generation during nitrification.
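The reported pairing of 10-30% N loss with 2-3‰ δ15N enrichment can be sanity-checked against a simple Rayleigh model; the arithmetic below is our back-of-the-envelope reading, not a calculation taken from the paper.

    % Rayleigh fractionation of the residual N pool during gaseous loss:
    \[
      \delta^{15}\mathrm{N} \;\approx\; \delta^{15}\mathrm{N}_0 + \varepsilon \ln f
    \]
    % f = fraction of N remaining; \varepsilon < 0 (in per mil) for gaseous loss.
    % Taking a 20% loss (f = 0.8) and a shift of about +2.5 per mil gives
    \[
      \varepsilon \;\approx\; \frac{+2.5}{\ln 0.8} \;\approx\; -11\ \text{per mil},
    \]
    % a magnitude plausible for NH3 volatilization and/or denitrification.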
Zhang, Han; Wheeler, William; Hyland, Paula L; Yang, Yifan; Shi, Jianxin; Chatterjee, Nilanjan; Yu, Kai
2016-06-01
Meta-analysis of multiple genome-wide association studies (GWAS) has become an effective approach for detecting single nucleotide polymorphism (SNP) associations with complex traits. However, it is difficult to integrate the readily accessible SNP-level summary statistics from a meta-analysis into more powerful multi-marker testing procedures, which generally require individual-level genetic data. We developed a general procedure called Summary based Adaptive Rank Truncated Product (sARTP) for conducting gene and pathway meta-analysis that uses only SNP-level summary statistics in combination with genotype correlation estimated from a panel of individual-level genetic data. We demonstrated the validity and power advantage of sARTP through empirical and simulated data. We conducted a comprehensive pathway-based meta-analysis with sARTP on type 2 diabetes (T2D) by integrating SNP-level summary statistics from two large studies consisting of 19,809 T2D cases and 111,181 controls with European ancestry. Among 4,713 candidate pathways from which genes in neighborhoods of 170 GWAS-established T2D loci were excluded, we detected 43 T2D globally significant pathways (with Bonferroni corrected p-values < 0.05), which included the insulin signaling pathway and T2D pathway defined by KEGG, as well as the pathways defined according to specific gene expression patterns on pancreatic adenocarcinoma, hepatocellular carcinoma, and bladder carcinoma. Using summary data from 8 eastern Asian T2D GWAS with 6,952 cases and 11,865 controls, we showed that 7 of the 43 pathways identified in European populations remained significant in eastern Asians at a false discovery rate of 0.1. We created an R package and a web-based tool for sARTP with the capability to analyze pathways with thousands of genes and tens of thousands of SNPs.
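As a toy illustration of the adaptive rank truncated product idea (ours, not the sARTP package, whose null is built from reference-panel LD rather than the identity matrix used in the demo below): per-truncation-point statistics are formed from the smallest SNP p-values, and the adaptive minimum over truncation points is recalibrated against correlated null draws.

    import numpy as np
    from scipy.stats import norm, rankdata

    def rtp_stats(z, ks):
        """Sum of -log p over the k smallest p-values, for each k in ks."""
        p = np.sort(2.0 * norm.sf(np.abs(z)))
        nlp = -np.log(p)
        return np.array([nlp[:k].sum() for k in ks])

    def artp_pvalue(z_obs, R, ks=(1, 2, 5), n_null=2000, seed=1):
        """Null z-scores drawn from N(0, R) stand in for the LD-matched
        null that sARTP derives from individual-level reference data."""
        rng = np.random.default_rng(seed)
        null_z = rng.multivariate_normal(np.zeros(len(z_obs)), R, size=n_null)
        obs = rtp_stats(z_obs, ks)
        null = np.vstack([rtp_stats(z, ks) for z in null_z])
        # Per-truncation-point p-values for observed and null statistics
        p_obs = ((null >= obs).sum(axis=0) + 1) / (n_null + 1)
        p_null = 1.0 - (rankdata(null, axis=0) - 1.0) / n_null
        # Adaptive step: take the best truncation point, then recalibrate
        return ((p_null.min(axis=1) <= p_obs.min()).sum() + 1) / (n_null + 1)

    R = np.eye(8)                    # pretend 8 uncorrelated SNPs
    z = np.array([3.2, 2.8, 0.4, -0.1, 1.1, -0.6, 0.2, 2.5])
    print(artp_pvalue(z, R))         # small: the signal sits in the top SNPs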
Single-cell barcoding and sequencing using droplet microfluidics.
Zilionis, Rapolas; Nainys, Juozas; Veres, Adrian; Savova, Virginia; Zemmour, David; Klein, Allon M; Mazutis, Linas
2017-01-01
Single-cell RNA sequencing has recently emerged as a powerful tool for mapping cellular heterogeneity in diseased and healthy tissues, yet high-throughput methods are needed for capturing the unbiased diversity of cells. Droplet microfluidics is among the most promising candidates for capturing and processing thousands of individual cells for whole-transcriptome or genomic analysis in a massively parallel manner with minimal reagent use. We recently established a method called inDrops, which has the capability to index >15,000 cells in an hour. A suspension of cells is first encapsulated into nanoliter droplets with hydrogel beads (HBs) bearing barcoding DNA primers. Cells are then lysed and mRNA is barcoded (indexed) by a reverse transcription (RT) reaction. Here we provide details for (i) establishing an inDrops platform (1 d); (ii) performing hydrogel bead synthesis (4 d); (iii) encapsulating and barcoding cells (1 d); and (iv) RNA-seq library preparation (2 d). inDrops is a robust and scalable platform, and it is unique in its ability to capture and profile >75% of cells in even very small samples, on a scale of thousands or tens of thousands of cells.
NASA Astrophysics Data System (ADS)
Schulthess, Thomas C.
2013-03-01
The continued thousand-fold improvement in sustained application performance per decade on modern supercomputers keeps opening new opportunities for scientific simulations. But supercomputers have become very complex machines, built with thousands or tens of thousands of complex nodes consisting of multiple CPU cores or, most recently, a combination of CPU and GPU processors. Efficient simulations on such high-end computing systems require tailored algorithms that optimally map numerical methods to particular architectures. These intricacies will be illustrated with simulations of strongly correlated electron systems, where the development of quantum cluster methods, Monte Carlo techniques, as well as their optimal implementation by means of algorithms with improved data locality and high arithmetic density have gone hand in hand with evolving computer architectures. The present work would not have been possible without continued access to computing resources at the National Center for Computational Science of Oak Ridge National Laboratory, which is funded by the Facilities Division of the Office of Advanced Scientific Computing Research, and the Swiss National Supercomputing Center (CSCS) that is funded by ETH Zurich.
Roebroeks, Wil; Soressi, Marie
2016-01-01
The last decade has seen a significant growth of our knowledge of the Neandertals, a population of Pleistocene hunter-gatherers who lived in (western) Eurasia between ∼400,000 and 40,000 y ago. Starting from a source population deep in the Middle Pleistocene, the hundreds of thousands of years of relative separation between African and Eurasian groups led to the emergence of different phenotypes in Late Pleistocene Europe and Africa. Both recently obtained genetic evidence and archeological data show that the biological and cultural gaps between these populations were probably smaller than previously thought. These data, reviewed here, falsify inferences to the effect that, compared with their near-modern contemporaries in Africa, Neandertals were outliers in terms of behavioral complexity. It is only around 40,000 y ago, tens of thousands of years after anatomically modern humans first left Africa and thousands of years after documented interbreeding between modern humans, Neandertals and Denisovans, that we see major changes in the archeological record, from western Eurasia to Southeast Asia, e.g., the emergence of representational imagery and the colonization of arctic areas and of greater Australia (Sahul). PMID:27274044
Dunes on Titan observed by Cassini Radar
Radebaugh, J.; Lorenz, R.D.; Lunine, J.I.; Wall, S.D.; Boubin, G.; Reffet, E.; Kirk, R.L.; Lopes, R.M.; Stofan, E.R.; Soderblom, L.; Allison, M.; Janssen, M.; Paillou, P.; Callahan, P.; Spencer, C.; ,
2008-01-01
Thousands of longitudinal dunes have recently been discovered by the Titan Radar Mapper on the surface of Titan. These are found mainly within ±30° of the equator in optically-, near-infrared-, and radar-dark regions, indicating a strong proportion of organics, and cover well over 5% of Titan's surface. Their longitudinal duneform, interactions with topography, and correlation with other aeolian forms indicate a single, dominant wind direction aligned with the dune axis plus lesser, off-axis or seasonally alternating winds. Global compilations of dune orientations reveal the mean wind direction is dominantly eastwards, with regional and local variations where winds are diverted around topographically high features, such as mountain blocks or broad landforms. Global winds may carry sediments from high-latitude regions to equatorial regions, where relatively drier conditions prevail, and the particles are reworked into dunes, perhaps on timescales of thousands to tens of thousands of years. On Titan, adequate sediment supply, sufficient wind, and the absence of sediment carriage and trapping by fluids are the dominant factors in the presence of dunes. © 2007 Elsevier Inc. All rights reserved.
Inexpensive Cable Space Launcher of High Capability
NASA Technical Reports Server (NTRS)
Bolonkin, Alexander
2002-01-01
This paper proposes a new method and transportation system to fly into space, to the Moon, Mars, and other planets. This transportation system uses mechanical energy transfer and requires only minimal energy, so that it provides a 'free trip' into space. The method uses the rotary and kinetic energy of planets, asteroids, moons, satellites and other natural space bodies. Computations are presented for the following projects: 1. A non-rocket method for free launch of payloads into space and to other planets; the low-cost project will accommodate one hundred thousand tourists annually. 2. Free trips to Mars for two thousand people annually. 3. Free trips to the Moon for ten thousand people annually. The projects use artificial materials like nanotubes and whiskers that have a ratio of tensile strength to density equal to 4 million meters. In the future, nanotubes will be produced that can reach a specific stress of up to 100 million meters, which will significantly improve the parameters of the suggested projects. The author is prepared to discuss the problems with serious organizations that want to research and develop these inventions.
Mining for metals in society's waste
Smith, Kathleen S.; Plumlee, Geoffrey S.; Hageman, Philip L.
2015-01-01
Metals and minerals are natural resources that human beings have been mining for thousands of years. Contemporary metal mining is dominated by iron ore, copper and gold, with 2 billion tons of iron ore, nearly 20 million tons of copper and 2,000 tons of gold produced every year. Tens to hundreds of tons of other metals that are essential components for electronics, green energy production, and high-technology products are produced annually.
Ph.D.'s Spend Big Bucks Hunting for Academic Jobs, with No Guaranteed Results
ERIC Educational Resources Information Center
Patton, Stacey
2013-01-01
Ph.D.'s are used to shelling out tens of thousands of dollars in the name of education. But earning the top graduate degree doesn't mean their spending has come to an end. An industry designed to help aspiring academics manage the job-application process and land tenure-track jobs is growing, and reaping the benefits of a tight market in many…
UCLA High Speed, High Volume Laboratory Network for Infectious Diseases. Addendum
2009-08-01
Design: Because of current public health and national security threats, influenza surveillance and analysis will be the initial focus. In the upcoming...throughput and automated systems will enable processing of tens of thousands of samples and provide critical laboratory capacity. Its overall design and
Strategic Studies Quarterly. Volume 4, Number 1, Spring 2010
2010-01-01
scientists to produce more infectious pathogens through the use of genetic manipulation. Indeed, the reproductive capacity of bacteria and viruses...eries will give potential bioterrorists the ability to genetically engineer and produce new biological weapons for only tens of thousands of dollars...United States, because of its dominant economy and political clout, was able to levy neo-liberal policy prescriptions under the rubric of the
Burma: Assessing Options for U.S. Engagement
2009-06-01
2009). 28 Christina Fink, Living Silence: Burma Under Military Rule (Bangkok: White Lotus Company Ltd. 2001) 125. 29 John F. Cady, The United...death.' Deprived of food, shelter, and medical treatment, tens of thousands died laying tracks through the fever-ridden mountainous jungle. The...hamlet' operations in Vietnam, Ne Win implemented a policy called "Four Cuts" which was intended to cut all links to food, funds, intelligence, and
Removing the Stigma: For God and Country
2013-03-01
Virginians to the Puritans. George Washington exemplified the sentiments of our founding fathers in his response to the address from the Hebrew Congregation...distribution of Bibles to the Japanese people. He declared, "We must have ten thousand Christian missionaries and a million bibles to complete the...crosses and Christian messages were painted on military vehicles driving through Iraq; images of U.S. soldiers holding rifles and bibles were posted on
NASA Technical Reports Server (NTRS)
Kalelkar, A. S.; Fiksel, J.; Rosenfield, D.; Richardson, D. L.; Hagopian, J.
1980-01-01
The risks associated with electrical effects arising from carbon fibers released from commercial aviation aircraft fires were estimated for 1993. The expected annual losses were estimated to be about $470 (1977 dollars) in 1993. The chances of total losses from electrical effects exceeding $100,000 (1977 dollars) in 1993 were established to be about one in ten thousand.
ERIC Educational Resources Information Center
Udell, Monique A. R.; Wynne, C. D. L.
2008-01-01
Dogs likely were the first animals to be domesticated and as such have shared a common environment with humans for over ten thousand years. Only recently, however, has this species' behavior been subject to scientific scrutiny. Most of this work has been inspired by research in human cognitive psychology and suggests that in many ways dogs are…
Pouria Bahmani; John van de Lindt; Asif Iqbal; Douglas Rammer
2017-01-01
Soft-story wood-frame buildings have been recognized as a disaster preparedness problem for decades. There are tens of thousands of these multi-family three- and four-story structures throughout California and the United States. The majority were constructed between 1920 and 1970, with many being prevalent in the San Francisco Bay Area in California. The NEES Soft...
ERIC Educational Resources Information Center
Joo, Hee-Jung Serenity
2015-01-01
In the last two decades, the issue of comfort women--the women and girls who were forced into sex slavery for the Japanese army before and during WWII--has risen to global attention. Tens of thousands of comfort women (the average estimate is anywhere between 80,000 and 200,000) were confined at comfort stations managed by the Japanese Imperial…
ERIC Educational Resources Information Center
Shultz, Ginger V.; Gottfried, Amy C.; Winschel, Grace A.
2015-01-01
General chemistry is a gateway course that impacts the STEM trajectory of tens of thousands of students each year, and its role in the introductory curriculum as well as its pedagogical design are at the center of an ongoing debate. To investigate the role of general chemistry in the curriculum, we report the results of a post hoc analysis of 10 years…
Long-term evolution of an Oligocene/Miocene maar lake from Otago, New Zealand
NASA Astrophysics Data System (ADS)
Fox, B. R. S.; Wartho, J.; Wilson, G. S.; Lee, D. E.; Nelson, F. E.; Kaulfuss, U.
2015-01-01
Foulden Maar is a highly resolved maar lake deposit from the South Island of New Zealand comprising laminated diatomite punctuated by numerous diatomaceous turbidites. Basaltic clasts found in debris flow deposits near the base of the cored sedimentary sequence yielded two new 40Ar/39Ar dates of 24.51 ± 0.24 and 23.38 ± 0.24 Ma (2σ). The younger date agrees within error with a previously published 40Ar/39Ar date of 23.17 ± 0.19 Ma from a basaltic dyke adjacent to the maar crater. The diatomite is inferred to have been deposited over several tens of thousands of years in the latest Oligocene/earliest Miocene, and may have been coeval with the period of rapid glaciation and subsequent deglaciation of Antarctica known as the Mi-1 event. Sediment magnetic properties and SEM measurements indicate that the magnetic signal is dominated by pseudo-single domain pyrrhotite. The most likely source of detrital pyrrhotite is schist country rock fragments from the inferred tephra ring created by the phreatomagmatic eruption that formed the maar. Variations in magnetic mineral concentration indicate a decrease in erosional input throughout the depositional period, suggesting long-term (tens of thousands of years) environmental change in New Zealand in the latest Oligocene/earliest Miocene.
The Galaxy platform for accessible, reproducible and collaborative biomedical analyses: 2018 update.
Afgan, Enis; Baker, Dannon; Batut, Bérénice; van den Beek, Marius; Bouvier, Dave; Cech, Martin; Chilton, John; Clements, Dave; Coraor, Nate; Grüning, Björn A; Guerler, Aysam; Hillman-Jackson, Jennifer; Hiltemann, Saskia; Jalili, Vahid; Rasche, Helena; Soranzo, Nicola; Goecks, Jeremy; Taylor, James; Nekrutenko, Anton; Blankenberg, Daniel
2018-05-22
Galaxy (homepage: https://galaxyproject.org, main public server: https://usegalaxy.org) is a web-based scientific analysis platform used by tens of thousands of scientists across the world to analyze large biomedical datasets such as those found in genomics, proteomics, metabolomics and imaging. Started in 2005, Galaxy continues to focus on three key challenges of data-driven biomedical science: making analyses accessible to all researchers, ensuring analyses are completely reproducible, and making it simple to communicate analyses so that they can be reused and extended. During the last two years, the Galaxy team and the open-source community around Galaxy have made substantial improvements to Galaxy's core framework, user interface, tools, and training materials. Framework and user interface improvements now enable Galaxy to be used for analyzing tens of thousands of datasets, and >5500 tools are now available from the Galaxy ToolShed. The Galaxy community has led an effort to create numerous high-quality tutorials focused on common types of genomic analyses. The Galaxy developer and user communities continue to grow and be integral to Galaxy's development. The number of Galaxy public servers, developers contributing to the Galaxy framework and its tools, and users of the main Galaxy server have all increased substantially.
Reconciling short recurrence intervals with minor deformation in the New Madrid seismic zone
Schweig, E.S.; Ellis, M.A.
1994-01-01
At least three great earthquakes occurred in the New Madrid seismic zone in 1811 and 1812. Estimates of present-day strain rates suggest that such events may have a repeat time of 1000 years or less. Paleoseismological data also indicate that earthquakes large enough to cause soil liquefaction have occurred several times in the past 5000 years. However, pervasive crustal deformation expected from such a high frequency of large earthquakes is not observed. This suggests that the seismic zone is a young feature, possibly as young as several tens of thousands of years old and no more than a few million years old.
Bhaskar, Anand; Wang, Y X Rachel; Song, Yun S
2015-02-01
With the recent increase in study sample sizes in human genetics, there has been growing interest in inferring historical population demography from genomic variation data. Here, we present an efficient inference method that can scale up to very large samples, with tens or hundreds of thousands of individuals. Specifically, by utilizing analytic results on the expected frequency spectrum under the coalescent and by leveraging the technique of automatic differentiation, which allows us to compute gradients exactly, we develop a very efficient algorithm to infer piecewise-exponential models of the historical effective population size from the distribution of sample allele frequencies. Our method is orders of magnitude faster than previous demographic inference methods based on the frequency spectrum. In addition to inferring demography, our method can also accurately estimate locus-specific mutation rates. We perform extensive validation of our method on simulated data and show that it can accurately infer multiple recent epochs of rapid exponential growth, a signal that is difficult to pick up with small sample sizes. Lastly, we use our method to analyze data from recent sequencing studies, including a large-sample exome-sequencing data set of tens of thousands of individuals assayed at a few hundred genic regions. © 2015 Bhaskar et al.; Published by Cold Spring Harbor Laboratory Press.
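A drastically simplified sketch of the computational trick described here, automatic differentiation of a frequency-spectrum likelihood: it fits only a constant-size θ where the paper fits piecewise-exponential histories, and the counts below are invented.

    import jax
    import jax.numpy as jnp

    def neg_log_lik(theta, sfs):
        """Poisson composite likelihood of the site frequency spectrum.
        Constant-size coalescent expectation: E[xi_i] = theta / i."""
        i = jnp.arange(1, sfs.shape[0] + 1)
        lam = theta / i
        return -jnp.sum(sfs * jnp.log(lam) - lam)

    grad = jax.grad(neg_log_lik)           # exact gradient via autodiff

    sfs = jnp.array([120.0, 61.0, 40.0, 28.0, 25.0])  # hypothetical counts
    theta = 10.0
    for _ in range(300):
        theta -= 5.0 * grad(theta, sfs)    # plain gradient descent
    print(theta)  # approaches the closed-form MLE, sum(sfs)/sum(1/i) ~ 120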
HD-MTL: Hierarchical Deep Multi-Task Learning for Large-Scale Visual Recognition.
Fan, Jianping; Zhao, Tianyi; Kuang, Zhenzhong; Zheng, Yu; Zhang, Ji; Yu, Jun; Peng, Jinye
2017-02-09
In this paper, a hierarchical deep multi-task learning (HD-MTL) algorithm is developed to support large-scale visual recognition (e.g., recognizing thousands or even tens of thousands of atomic object classes automatically). First, multiple sets of multi-level deep features are extracted from different layers of deep convolutional neural networks (deep CNNs), and they are used to achieve more effective accomplishment of the coarse-to-fine tasks for hierarchical visual recognition. A visual tree is then learned by assigning the visually-similar atomic object classes with similar learning complexities into the same group, which can provide a good environment for determining the interrelated learning tasks automatically. By leveraging the inter-task relatedness (inter-class similarities) to learn more discriminative group-specific deep representations, our deep multi-task learning algorithm can train more discriminative node classifiers for distinguishing the visually-similar atomic object classes effectively. Our hierarchical deep multi-task learning (HD-MTL) algorithm can integrate two discriminative regularization terms to control the inter-level error propagation effectively, and it can provide an end-to-end approach for jointly learning more representative deep CNNs (for image representation) and more discriminative tree classifier (for large-scale visual recognition) and updating them simultaneously. Our incremental deep learning algorithms can effectively adapt both the deep CNNs and the tree classifier to the new training images and the new object classes. Our experimental results have demonstrated that our HD-MTL algorithm can achieve very competitive results on improving the accuracy rates for large-scale visual recognition.
NASA Astrophysics Data System (ADS)
Ohdachi, Satoshi; Watanabe, Kiyomasa; Sakakibara, Satoru; Suzuki, Yasuhiro; Tsuchiya, Hayato; Ming, Tingfeng; Du, Xiaodi; LHD Expriment Group Team
2014-10-01
In the Large Helical Device (LHD), the plasma is surrounded by the so-called magnetic stochastic region, where the Kolmogorov length of the magnetic field lines is very short, ranging from several tens of meters to thousands of meters. Finite pressure gradients form in this region, and MHD instabilities localized there are observed, since the edge region of the LHD is always unstable against the pressure-driven mode. The saturation level of the instabilities is therefore the key issue in evaluating the risk posed by this kind of MHD instability. The saturation level depends on the pressure gradient and on the magnetic Reynolds number; these results are similar to those for MHD modes in the closed-magnetic-surface region. The saturation level in the stochastic region is also affected by the stochasticity itself. The parameter dependence of the saturation level of the MHD activity in this region is discussed in detail. This work is supported by NIFS budget code ULPP021, 028 and is also partially supported by the Ministry of Education, Science, Sports and Culture, Grant-in-Aid for Scientific Research 26249144, by the JSPS-NRF-NSFC A3 Foresight Program NSFC: No. 11261140328.
Predicting the impact of tsunami in California under rising sea level
NASA Astrophysics Data System (ADS)
Dura, T.; Garner, A. J.; Weiss, R.; Kopp, R. E.; Horton, B.
2017-12-01
The flood hazard for the California coast depends not only on the magnitude, location, and rupture length of Alaska-Aleutian subduction zone earthquakes and their resultant tsunamis, but also on rising sea levels, which combine with tsunamis to produce overall flood levels. The magnitude of future sea-level rise remains uncertain even on the decadal scale, with future sea-level projections becoming even more uncertain at timeframes of a century or more. Earthquake statistics indicate that timeframes of ten thousand to one hundred thousand years are needed to capture rare, very large earthquakes. Because of the different timescales between reliable sea-level projections and earthquake distributions, simply combining the different probabilities in the context of a tsunami hazard assessment may be flawed. Here, we considered 15 earthquakes between Mw 8 and Mw 9.4 bounded by 171°W and 140°W on the Alaska-Aleutian subduction zone. We employed 24 realizations at each magnitude with random epicenter locations and different fault length-to-width ratios, and simulated the tsunami evolution from these 360 earthquakes at each decade from the years 2000 to 2200. These simulations were then carried out for different sea-level-rise projections to analyze the future flood hazard for California. Looking at the flood levels at tide gauges, we found that the flood level simulated at, for example, the year 2100 (including the respective sea-level change) differs from the flood level calculated by adding the flood for the year 2000 to the sea-level change prediction for the year 2100. This holds for all sea-level rise scenarios, and the difference in flood levels ranges between 5% and 12% for the larger half of the given magnitude interval. Focusing on flood levels at the tide gauge in the Port of Los Angeles, the most probable flood level (including all earthquake magnitudes) in the year 2000 was 5 cm. Depending on the sea-level predictions, by the year 2050 the most probable flood level could rise to 20 to 30 cm, and from 2100 to 2200 it increases significantly, to between 0.5 m and 2.5 m. Aside from the significant increase in flood level, the range over which potential most probable flood levels can vary is itself large, posing a tremendous challenge for long-term planning of hazard-mitigation measures.
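Why flood levels do not simply add can be seen with a Green's-law shoaling toy (our own illustration, not the authors' hydrodynamic model): raising sea level deepens the nearshore column and slightly weakens tsunami amplification, so the combined level differs from the year-2000 flood plus the sea-level increment.

    import numpy as np

    def flood_level(a_deep, h_deep, h_coast, slr):
        """Green's law: shoaled amplitude ~ a_deep*(h_deep/h_coast)^(1/4).
        Sea-level rise (slr) deepens the coastal column, damping shoaling."""
        return slr + a_deep * (h_deep / (h_coast + slr)) ** 0.25

    a, hd, hc = 0.4, 4000.0, 5.0   # hypothetical offshore amplitude/depths, m
    for slr in (0.0, 0.5, 2.0):
        combined = flood_level(a, hd, hc, slr)
        naive = slr + flood_level(a, hd, hc, 0.0)
        print(f"SLR={slr:3.1f} m  combined={combined:.2f} m  naive={naive:.2f} m")
    # The naive sum differs by a few percent in this toy, qualitatively
    # echoing the 5-12% differences reported in the abstract.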
NASA Astrophysics Data System (ADS)
Shaw, Glenn E.
1988-02-01
Tropospheric aerosols with diameters of around half a micron reside in the atmosphere for tens of days and teleconnect Antarctica with other regions by transport that reaches planetary scales of distance; thus, the aerosol on the Antarctic ice represents 'memory modules' of events that took place at regions separated from Antarctica by tens of thousands of kilometers. In terms of aerosol mass, the aerosol species include insoluble crustal products (less than 5 percent), transported sea-salt residues (highly variable but averaging about 10 percent), Ni-rich meteoric material, and anomalously enriched material of unknown origin. Most (70-90 percent by mass) of the aerosol over the Antarctic ice shield, however, is the 'natural acid sulfate aerosol', apparently deriving from biological processes taking place in the surrounding oceans.
The Prophylactic Extraction of Third Molars: A Public Health Hazard
Friedman, Jay W.
2007-01-01
Ten million third molars (wisdom teeth) are extracted from approximately 5 million people in the United States each year at an annual cost of over $3 billion. In addition, more than 11 million patient days of “standard discomfort or disability”—pain, swelling, bruising, and malaise—result postoperatively, and more than 11,000 people suffer permanent paresthesia—numbness of the lip, tongue, and cheek—as a consequence of nerve injury during the surgery. At least two thirds of these extractions, associated costs, and injuries are unnecessary, constituting a silent epidemic of iatrogenic injury that afflicts tens of thousands of people with lifelong discomfort and disability. Avoidance of prophylactic extraction of third molars can prevent this public health hazard. PMID:17666691
NASA Astrophysics Data System (ADS)
Townsley, Leisa
2016-09-01
Massive star-forming regions (MSFRs) are engines of change across the Galaxy, providing its ionization, fueling the hot ISM, and seeding spiral arms with tens of thousands of new stars. Galactic MSFRs are springboards for understanding their extragalactic counterparts, which provide the basis for star formation rate calibrations and form the building blocks of starburst galaxies. This archive program will extend Chandra's lexicon of the Galaxy's MSFRs with in-depth analysis of 16 complexes, studying star formation and evolution on scales of tenths to tens of parsecs, distances <1 to >10 kpc, and ages <1 to >15 Myr. It fuses a "Physics of the Cosmos" mission with "Cosmic Origins" science, bringing new insight into star formation and feedback through Chandra's unique X-ray perspective.
More MAGiX in the Chandra Archive
NASA Astrophysics Data System (ADS)
Townsley, Leisa
2017-09-01
Massive star-forming regions (MSFRs) are engines of change across the Galaxy, providing its ionization, fueling the hot ISM, and seeding spiral arms with tens of thousands of new stars. Resolvable MSFRs are microscopes for understanding their more distant extragalactic counterparts, which provide the basis for star formation rate calibrations and form the building blocks of starburst galaxies. This archive program will extend Chandra's lexicon of MSFRs with in-depth analysis of 16 complexes, studying star formation and evolution on scales of tenths to tens of parsecs, distances <1 to >50 kpc, and ages <1 to 25 Myr. It fuses a "Physics of the Cosmos" mission with "Cosmic Origins" science, bringing new insight into star formation and feedback through Chandra's unique X-ray perspective.
Brownian relaxation of an inelastic sphere in air
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bird, G. A., E-mail: gab@gab.com.au
2016-06-15
The procedures that are used to calculate the forces and moments on an aerodynamic body in the rarefied gas of the upper atmosphere are applied to a small sphere of the size of an aerosol particle at sea level. While the gas-surface interaction model that provides accurate results for macroscopic bodies may not be appropriate for bodies that are comprised of only about a thousand atoms, it provides a limiting case that is more realistic than the elastic model. The paper concentrates on the transfer of energy from the air to an initially stationary sphere as it acquires Brownian motion. Individual particle trajectories vary wildly, but a clear relaxation process emerges from an ensemble average over tens of thousands of trajectories. The translational and rotational energies in equilibrium Brownian motion are determined. Empirical relationships are obtained for the mean translational and rotational relaxation times, the mean initial power input to the particle, the mean rates of energy transfer between the particle and air, and the diffusivity. These relationships are functions of the ratio of the particle mass to an average air molecule mass and the Knudsen number, which is the ratio of the mean free path in the air to the particle diameter. The ratio of the molecular radius to the particle radius also enters as a correction factor. The implications of Brownian relaxation for the second law of thermodynamics are discussed.
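The ensemble-average behavior described above can be mimicked with a textbook Langevin model (our sketch; the paper uses a molecular gas-surface treatment, not this continuum approximation): individual trajectories are erratic, but averaging many shows the translational energy relaxing to the equipartition value (3/2)kT.

    import numpy as np

    kT = 4.1e-21       # J, room temperature
    m = 4.0e-18        # kg, ~0.1 um particle (hypothetical)
    gamma = 2.0e7      # 1/s, drag rate (hypothetical, Knudsen-dependent)
    dt = 1.0e-9        # s, time step
    n_traj, n_steps = 10000, 2000

    v = np.zeros((n_traj, 3))                    # spheres start at rest
    kick = np.sqrt(2.0 * gamma * kT / m * dt)    # fluctuation strength
    for _ in range(n_steps):                     # Euler-Maruyama updates
        v += -gamma * v * dt + kick * np.random.standard_normal(v.shape)
    ke = 0.5 * m * (v ** 2).sum(axis=1)
    print(ke.mean() / kT)  # -> ~1.5: equipartition reached after ~1/gamma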
Modeling the chemistry of complex petroleum mixtures.
Quann, R J
1998-01-01
Determining the complete molecular composition of petroleum and its refined products is not feasible with current analytical techniques because of the astronomical number of molecular components. Modeling the composition and behavior of such complex mixtures in refinery processes has accordingly evolved along a simplifying concept called lumping. Lumping reduces the complexity of the problem to a manageable form by grouping the entire set of molecular components into a handful of lumps. This traditional approach does not have a molecular basis and therefore excludes important aspects of process chemistry and molecular property fundamentals from the model's formulation. A new approach called structure-oriented lumping has been developed to model the composition and chemistry of complex mixtures at a molecular level. The central concept is to represent an individual molecule or a set of closely related isomers as a mathematical construct of certain specific and repeating structural groups. A complex mixture such as petroleum can then be represented as thousands of distinct molecular components, each having a mathematical identity. This enables the automated construction of large complex reaction networks with tens of thousands of specific reactions for simulating the chemistry of complex mixtures. Further, the method provides a convenient framework for incorporating molecular physical property correlations, existing group contribution methods, molecular thermodynamic properties, and the structure-activity relationships of chemical kinetics in the development of models. PMID:9860903
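The central construct can be sketched in a few lines (ours; the group names follow the paper's flavor but the coefficients and the rule are invented): each molecule, or set of isomers, becomes a count vector over structural increments, and properties and reactions become operations on those vectors.

    import numpy as np

    GROUPS = ["A6", "N6", "R", "me", "S"]  # aromatic ring, naphthenic ring,
                                           # chain carbon, methyl, sulfur
    def mol(**counts):
        return np.array([counts.get(g, 0) for g in GROUPS], dtype=float)

    toluene = mol(A6=1, me=1)

    # A property as a linear group contribution (coefficients invented):
    MW_PER_GROUP = np.array([78.0, 84.0, 14.0, 14.0, 32.0])
    print(toluene @ MW_PER_GROUP)          # 92.0, toluene's molecular weight

    # A reaction rule is a vector transform applied wherever it matches:
    def saturate_ring(m):
        """Hydrogenate one aromatic ring: A6 -> N6 when an A6 is present."""
        return m + mol(A6=-1, N6=1) if m[GROUPS.index("A6")] >= 1 else m

    print(saturate_ring(toluene))          # -> methylcyclohexane's vector

Because every species shares the same vector form, a rule like saturate_ring can be applied mechanically across the whole mixture, which is what makes the automated construction of tens of thousands of reactions tractable.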
[Molecular regulation of microbial secondary metabolites--a review].
Wang, Linqi; Tan, Huarong
2009-04-01
Microbial secondary metabolites play an important role in industry, agriculture, medicine and human health. The molecular regulation of secondary metabolites is gradually becoming noticeable and intriguing. In recent years, many studies have demonstrated that secondary metabolite biosynthesis is tightly linked to the physiological and developmental status of the producer, suggesting that the biosynthesis of secondary metabolites is a complex process under multi-level regulation. Here we review recent research progress on the molecular regulation of secondary metabolites in microorganisms. Of the roughly ten thousand known natural secondary metabolites, most (about 60%) are produced by Streptomycetes; the regulation of secondary metabolites in Streptomyces is therefore the main thread of this review. Additionally, several well-studied antibiotics are examined as representative examples. Finally, some suggestions in response to present issues are offered.
A Blind Survey for AGN in the Kepler Field through Optical Variability
NASA Astrophysics Data System (ADS)
Olling, Robert; Shaya, E. J.; Mushotzky, R.
2013-01-01
We present an initial analysis of three quarters of Kepler LLC time series of 400 small galaxies. The Kepler LLC data are sampled about twice per hour and allow us to investigate variability on time scales between about one day and one month. The calibrated Kepler LLC light curves still contain many instrumental effects that cannot be removed in a robust manner. Instead, our analysis relies on the similarity of variability measures in the three independent quarters to decide whether a galaxy shows variability or not. We estimate that roughly 15% of our small galaxies show variability at levels exceeding several parts per thousand (mmag) on timescales of days to weeks. However, this estimate is probably uncertain by a factor of two. Our data are more sensitive than extant data sets by several factors of ten.
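A bare-bones version of the cross-quarter consistency test sketched here (our construction; the threshold, cadence, and signal are illustrative): compute an excess-variance statistic per quarter and call a galaxy variable only when all three independent quarters agree.

    import numpy as np

    def excess_variance(flux, flux_err):
        """Variance beyond the formal errors, normalized by the mean flux."""
        return (np.var(flux, ddof=1) - np.mean(flux_err**2)) / np.mean(flux)**2

    def is_variable(quarters, threshold=1e-6):
        """quarters: list of (flux, flux_err) arrays, one per Kepler quarter.
        Demanding that every quarter exceeds the threshold suppresses
        instrumental effects that do not repeat across quarters."""
        return all(excess_variance(f, e) > threshold for f, e in quarters)

    rng = np.random.default_rng(0)
    q = [(1.0 + 0.003 * np.sin(np.arange(1500) / 240.0)   # mmag-level wiggle
          + 1e-4 * rng.standard_normal(1500),
          np.full(1500, 1e-4)) for _ in range(3)]
    print(is_variable(q))  # True: coherent variability in all three quarters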
The electronics readout and data acquisition system of the KM3NeT neutrino telescope node
DOE Office of Scientific and Technical Information (OSTI.GOV)
Real, Diego; Collaboration: KM3NeT Collaboration
2014-11-18
The KM3NeT neutrino telescope will be composed of tens of thousands of glass spheres, called Digital Optical Modules (DOMs), each containing 31 PMTs of small (3-inch) photocathode area. The readout and data acquisition system of KM3NeT has to collect, process and send to shore, in an economical way, the enormous amount of data produced by the photomultipliers, and at the same time provide time synchronization between the DOMs at the level of 1 ns. This article describes the Central Logic Board, which integrates the Time to Digital Converters and the White Rabbit protocol used for DOM synchronization in a transparent way; the Power Board used in the DOM; the PMT base used to read out the photomultipliers; and the corresponding collecting boards, the so-called Octopus boards.
Active Vertex Model for cell-resolution description of epithelial tissue mechanics.
Barton, Daniel L; Henkes, Silke; Weijer, Cornelis J; Sknepnek, Rastko
2017-06-01
We introduce an Active Vertex Model (AVM) for cell-resolution studies of the mechanics of confluent epithelial tissues consisting of tens of thousands of cells, with a level of detail inaccessible to similar methods. The AVM combines the Vertex Model for confluent epithelial tissues with active matter dynamics. This introduces a natural description of the cell motion and accounts for motion patterns observed on multiple scales. Furthermore, cell contacts are generated dynamically from positions of cell centres. This not only enables efficient numerical implementation, but provides a natural description of the T1 transition events responsible for local tissue rearrangements. The AVM also includes cell alignment, cell-specific mechanical properties, cell growth, division and apoptosis. In addition, the AVM introduces a flexible, dynamically changing boundary of the epithelial sheet allowing for studies of phenomena such as the fingering instability or wound healing. We illustrate these capabilities with a number of case studies.
Operation and performance of the mars exploration rover imaging system on the martian surface
Maki, J.N.; Litwin, T.; Schwochert, M.; Herkenhoff, K.
2005-01-01
The Imaging System on the Mars Exploration Rovers has successfully operated on the surface of Mars for over one Earth year. The acquisition of hundreds of panoramas and tens of thousands of stereo pairs has enabled the rovers to explore Mars at a level of detail unprecedented in the history of space exploration. In addition to providing scientific value, the images also play a key role in the daily tactical operation of the rovers. The mobile nature of the MER surface mission requires extensive use of the imaging system for traverse planning, rover localization, remote sensing instrument targeting, and robotic arm placement. Each of these activity types requires a different set of data compression rates, surface coverage, and image acquisition strategies. An overview of the surface imaging activities is provided, along with a summary of the image data acquired to date. © 2005 IEEE.
M-OTDR sensing system based on 3D encoded microstructures
Sun, Qizhen; Ai, Fan; Liu, Deming; Cheng, Jianwei; Luo, Hongbo; Peng, Kuan; Luo, Yiyang; Yan, Zhijun; Shum, Perry Ping
2017-01-01
In this work, a quasi-distributed sensing scheme named microstructured OTDR (M-OTDR), realized by introducing ultra-weak microstructures along the fiber, is proposed. Owing to the relatively high reflectivity of the microstructures compared with the fiber's backscatter coefficient, and to their three-dimensional (3D), i.e. wavelength/frequency/time, encoding, the M-OTDR system offers high signal-to-noise ratio (SNR), millimeter-level spatial resolution, and a theoretical multiplexing capacity of several tens of thousands of sensing units. A proof-of-concept system consisting of 64 sensing units is constructed to demonstrate the feasibility and sensing performance. With the help of a demodulation method based on 3D analysis and spectral reconstruction of the signal light, quasi-distributed temperature sensing with a spatial resolution of 20 cm and a measurement resolution of 0.1 °C is realized. PMID:28106132
Biofilm growth program and architecture revealed by single-cell live imaging
NASA Astrophysics Data System (ADS)
Yan, Jing; Sabass, Benedikt; Stone, Howard; Wingreen, Ned; Bassler, Bonnie
Biofilms are surface-associated bacterial communities. Little is known about biofilm structure at the level of individual cells. We image living, growing Vibrio cholerae biofilms from founder cells to ten thousand cells at single-cell resolution, and discover the forces underpinning the architectural evolution of the biofilm. Mutagenesis, matrix labeling, and simulations demonstrate that surface-adhesion-mediated compression causes V. cholerae biofilms to transition from a two-dimensional branched morphology to a dense, ordered three-dimensional cluster. We discover that directional proliferation of rod-shaped bacteria plays a dominant role in shaping the biofilm architecture, and this growth pattern is controlled by a single gene. Competition analyses reveal the advantages of the dense growth mode in providing the biofilm with superior mechanical properties. We will further present continuum theory to model the three-dimensional growth of biofilms at the solid-liquid interface as well as solid-air interface.
Predicting the Lifetimes of Nuclear Waste Containers
NASA Astrophysics Data System (ADS)
King, Fraser
2014-03-01
As for many aspects of the disposal of nuclear waste, the greatest challenge we have in the study of container materials is the prediction of the long-term performance over periods of tens to hundreds of thousands of years. Various methods have been used for predicting the lifetime of containers for the disposal of high-level waste or spent fuel in deep geological repositories. Both mechanical and corrosion-related failure mechanisms need to be considered, although until recently the interactions of mechanical and corrosion degradation modes have not been considered in detail. Failure from mechanical degradation modes has tended to be treated through suitable container design. In comparison, the inevitable loss of container integrity due to corrosion has been treated by developing specific corrosion models. The most important aspect, however, is to be able to justify the long-term predictions by demonstrating a mechanistic understanding of the various degradation modes.
Rechargeable nickel-3D zinc batteries: An energy-dense, safer alternative to lithium-ion.
Parker, Joseph F; Chervin, Christopher N; Pala, Irina R; Machler, Meinrad; Burz, Michael F; Long, Jeffrey W; Rolison, Debra R
2017-04-28
The next generation of high-performance batteries should include alternative chemistries that are inherently safer to operate than nonaqueous lithium-based batteries. Aqueous zinc-based batteries can answer that challenge because monolithic zinc sponge anodes can be cycled in nickel-zinc alkaline cells hundreds to thousands of times without undergoing passivation or macroscale dendrite formation. We demonstrate that the three-dimensional (3D) zinc form-factor elevates the performance of nickel-zinc alkaline cells in three fields of use: (i) >90% theoretical depth of discharge (DOD Zn ) in primary (single-use) cells, (ii) >100 high-rate cycles at 40% DOD Zn at lithium-ion-commensurate specific energy, and (iii) the tens of thousands of power-demanding duty cycles required for start-stop microhybrid vehicles. Copyright © 2017, American Association for the Advancement of Science.
Statistical Detection of Atypical Aircraft Flights
NASA Technical Reports Server (NTRS)
Statler, Irving; Chidester, Thomas; Shafto, Michael; Ferryman, Thomas; Amidan, Brett; Whitney, Paul; White, Amanda; Willse, Alan; Cooley, Scott; Jay, Joseph;
2006-01-01
A computational method and software to implement the method have been developed to sift through vast quantities of digital flight data to alert human analysts to aircraft flights that are statistically atypical in ways that signify that safety may be adversely affected. On a typical day, there are tens of thousands of flights in the United States and several times that number throughout the world. Depending on the specific aircraft design, the volume of data collected by sensors and flight recorders can range from a few dozen to several thousand parameters per second during a flight. Whereas these data have long been utilized in investigating crashes, the present method is oriented toward helping to prevent crashes by enabling routine monitoring of flight operations to identify portions of flights that may be of interest with respect to safety issues.
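As a rough illustration of the idea, the sketch below scores flights by the Mahalanobis distance of derived flight features from the fleet mean, flagging statistical outliers for analyst review. This is a generic stand-in, not the actual (and more sophisticated) statistics of the method described; all names and the synthetic data are illustrative.

```python
import numpy as np

def atypicality_scores(flights):
    """Score each flight (a row of summary features) by its Mahalanobis
    distance from the fleet mean; large scores flag statistically
    atypical flights for analyst review. A simplified stand-in for the
    method described, not NASA's actual algorithm."""
    X = np.asarray(flights, dtype=float)
    mu = X.mean(axis=0)
    inv = np.linalg.pinv(np.cov(X, rowvar=False))  # pseudo-inverse guards against singularity
    d = X - mu
    return np.sqrt(np.einsum('ij,jk,ik->i', d, inv, d))

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 5))   # 1000 flights, 5 derived features
X[0] += 6                        # plant one atypical flight
print(np.argmax(atypicality_scores(X)))  # -> 0
```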
Hydrogen-oxygen proton-exchange membrane fuel cells and electrolyzers
NASA Technical Reports Server (NTRS)
Baldwin, R.; Pham, M.; Leonida, A.; Mcelroy, J.; Nalette, T.
1989-01-01
Hydrogen-oxygen solid polymer electrolyte (SPE) fuel cells and SPE electrolyzers (products of Hamilton Standard) both use a Proton-Exchange Membrane (PEM) as the sole electrolyte. These solid electrolyte devices have been under continuous development for over 30 years. This experience has resulted in a demonstrated ten-year SPE cell life capability under load conditions. Ultimate life of PEM fuel cells and electrolyzers is primarily related to the chemical stability of the membrane. For perfluorocarbon proton exchange membranes an accurate measure of the membrane stability is the fluoride loss rate. Millions of cell hours have contributed to establishing a relationship between fluoride loss rates and average expected ultimate cell life. This relationship is shown. Several features have been introduced into SPE fuel cells and SPE electrolyzers such that applications requiring greater than or equal to 100,000 hours of life can be considered. Equally important as the ultimate life is the voltage stability of hydrogen-oxygen fuel cells and electrolyzers. Here again the features of SPE fuel cells and SPE electrolyzers have shown a cell voltage stability in the order of 1 microvolt per hour. That level of stability has been demonstrated for tens of thousands of hours in SPE fuel cells at up to 500 amps per square foot (ASF) current density.
Nuclear Energy in Southeast Asia: Pull Rods or Scram
2009-06-01
December 29, 2008); Seth Mydans, “Tens of thousands join Myanmar protest,” International Herald Tribune, September 24, 2007, http://www.iht.com...articles/2007/09/24/news/myanmar.php (accessed December 29, 2008); Seth Mydans, “Myanmar monk protest contained by Junta forces,” The New York Times ... “Nuclear Plant for Electricity.” Associated Press, September 26, 2008. http://www.ap.org (accessed October 20, 2008). Mydans, Seth. “Myanmar monk
Dental Calculus and the Evolution of the Human Oral Microbiome.
Warinner, Christina
2016-07-01
Characterizing the evolution of the oral microbiome is a challenging, but increasingly feasible, task. Recently, dental calculus has been shown to preserve ancient biomolecules from the oral microbiota, host tissues and diet for tens of thousands of years. As such, it provides a unique window into the ancestral oral microbiome. This article reviews recent advancements in ancient dental calculus research and emerging insights into the evolution and ecology of the human oral microbiome.
Pouria Bahmani; John W. van de Lindt; Mikhail Gershfeld; Gary L. Mochizuki; Steven E. Pryor; Douglas Rammer
2016-01-01
Soft-story wood-frame buildings have been recognized as a disaster preparedness problem for decades. There are tens of thousands of these multifamily three- and four-story structures throughout California and other parts of the United States. The majority were constructed between 1920 and 1970 and are prevalent in regions such as the San Francisco Bay Area in...
Sub-Saharan Africa Report, No. 2828
1983-08-03
EEC Food Aid ... TEN MILLION AFRICANS MAY STARVE THIS WINTER, Johannesburg THE STAR in English, 7 Jul 83, p 1, INTER-AFRICAN AFFAIRS [Text] BULAWAYO - At least 10 million people in five Southern African countries will need emergency food aid if they are to survive the winter. This ... about half the population of about one million are already receiving emergency food rations while thousands of cattle are being slaughtered
Probability of illness definition for the Skylab flight crew health stabilization program
NASA Technical Reports Server (NTRS)
1974-01-01
Management and analysis of crew and environmental microbiological data from SMEAT and Skylab are discussed. Samples were collected from ten different body sites on each SMEAT and Skylab crew-member on approximately 50 occasions and since several different organisms could be isolated from each sample, several thousand lab reports were generated. These lab reports were coded and entered in a computer file and from the file various tabular summaries were constructed.
Big Data Quality Case Study Preliminary Findings, U.S. Army MEDCOM MODS
2013-09-01
captured in electronic form is relatively small, on the order of hundreds of thousands of health profiles at say around 500K per profile, or in the ... in electronic form, then different language identification, handwriting recognition, and Natural Language Processing (NLP) techniques could be used ... and patterns" [15]. Volume - The free text fields vary in length from say ten characters to several hundred characters. Other materials can be much
Israel: Possible Military Strike Against Iran’s Nuclear Facilities
2012-03-27
centrifuge facility and a larger commercial facility located at this site. The commercial facility is reportedly hardened by steel-reinforced concrete, buried ... prime minister has had to contemplate. A strike against Iran’s nuclear facilities could lead to regional conflagration, tens of thousands of ... high explosives, and can penetrate more than 6 feet of reinforced concrete. The GBU-28 5000-lb class weapon penetrates at least 20 feet of concrete
Reclamation of Wood Materials Coated with Lead-Based Paint
2008-05-01
Tens of thousands of temporary wooden buildings from the World War II (WWII) era, consisting of more than 50 million sf of floor area , await...Camp Roberts. The amount of heartwood and sapwood is an important quality characteristic in the redwood species, with a higher content of...landfilling that material as C&D debris. There are two general areas of opportunity to improve efficiency and reduce costs in a deconstruction and
2016-04-01
AFRL-AFOSR-VA-TR-2016-0145, Quasi-continuum reduction of field theories: A route to seamlessly bridge quantum and atomistic length-scales with continuum. Principal Investigator: Vikram Gavini, Department of ... calculations on tens of thousands of atoms, and enable continuing efforts towards a seamless bridging of the quantum and continuum length-scales
Dai, Mingwei; Ming, Jingsi; Cai, Mingxuan; Liu, Jin; Yang, Can; Wan, Xiang; Xu, Zongben
2017-09-15
Results from genome-wide association studies (GWAS) suggest that a complex phenotype is often affected by many variants with small effects, known as 'polygenicity'. Tens of thousands of samples are often required to ensure statistical power to identify these variants with small effects. However, it is often the case that a research group can only get approval for access to individual-level genotype data with a limited sample size (e.g. a few hundred or a few thousand). Meanwhile, summary statistics generated using single-variant-based analysis are becoming publicly available, and the sample sizes associated with these summary statistics datasets are usually quite large. How to make the most efficient use of existing abundant data resources largely remains an open question. In this study, we propose a statistical approach, IGESS, to increase the statistical power of identifying risk variants and improve the accuracy of risk prediction by integrating individual-level genotype data and summary statistics. An efficient algorithm based on variational inference is developed to handle the genome-wide analysis. Through comprehensive simulation studies, we demonstrated the advantages of IGESS over methods which take either individual-level data or summary-statistics data as input. We applied IGESS to perform an integrative analysis of Crohn's Disease from WTCCC and summary statistics from other studies. IGESS was able to significantly increase the statistical power of identifying risk variants and to improve the risk prediction accuracy from 63.2% (±0.4%) to 69.4% (±0.1%) using about 240,000 variants. The IGESS software is available at https://github.com/daviddaigithub/IGESS. Contact: zbxu@xjtu.edu.cn or xwan@comp.hkbu.edu.hk or eeyang@hkbu.edu.hk. Supplementary data are available at Bioinformatics online. © The Author (2017). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com
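The statistical intuition for why summary statistics help can be seen in a much simpler setting: inverse-variance (fixed-effect) pooling of a per-variant effect estimated from a small individual-level cohort with one from a large published study. The sketch below illustrates only this pooling arithmetic; IGESS itself fits a variational Bayesian model, not this formula, and the numbers are made up.

```python
import math

def pool_effects(beta_ind, se_ind, beta_sum, se_sum):
    """Fixed-effect (inverse-variance) pooling of a per-variant effect
    estimated from individual-level data with one from published summary
    statistics. A deliberately simple illustration of why large summary
    datasets add power; not the IGESS model itself."""
    w1, w2 = 1.0 / se_ind ** 2, 1.0 / se_sum ** 2
    beta = (w1 * beta_ind + w2 * beta_sum) / (w1 + w2)
    return beta, math.sqrt(1.0 / (w1 + w2))  # pooled estimate and its SE

# Small cohort (large standard error) pooled with a big published GWAS
print(pool_effects(beta_ind=0.08, se_ind=0.05, beta_sum=0.05, se_sum=0.01))
```

The pooled standard error is dominated by the larger dataset, which is the abstract's point about abundant public summary data.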
Developing seismogenic source models based on geologic fault data
Haller, Kathleen M.; Basili, Roberto
2011-01-01
Calculating seismic hazard usually requires input that includes seismicity associated with known faults, historical earthquake catalogs, geodesy, and models of ground shaking. This paper will address the input generally derived from geologic studies that augment the short historical catalog to predict ground shaking at time scales of tens, hundreds, or thousands of years (e.g., SSHAC 1997). A seismogenic source model, terminology we adopt here for a fault source model, includes explicit three-dimensional faults deemed capable of generating ground motions of engineering significance within a specified time frame of interest. In tectonically active regions of the world, such as near plate boundaries, multiple seismic cycles span a few hundred to a few thousand years. In contrast, in less active regions hundreds of kilometers from the nearest plate boundary, seismic cycles generally are thousands to tens of thousands of years long. Therefore, one should include sources having both longer recurrence intervals and possibly older times of most recent rupture in less active regions of the world rather than restricting the model to include only Holocene faults (i.e., those with evidence of large-magnitude earthquakes in the past 11,500 years) as is the practice in tectonically active regions with high deformation rates. During the past 15 years, our institutions independently developed databases to characterize seismogenic sources based on geologic data at a national scale. Our goal here is to compare the content of these two publicly available seismogenic source models compiled for the primary purpose of supporting seismic hazard calculations by the Istituto Nazionale di Geofisica e Vulcanologia (INGV) and the U.S. Geological Survey (USGS); hereinafter we refer to the two seismogenic source models as INGV and USGS, respectively. This comparison is timely because new initiatives are emerging to characterize seismogenic sources at the continental scale (e.g., SHARE in the Euro-Mediterranean, http://www.share-eu.org/; EMME in the Middle East, http://www.emme-gem.org/) and global scale (e.g., GEM, http://www.globalquakemodel.org/; Anonymous 2008). To some extent, each of these efforts is still trying to resolve the level of optimal detail required for this type of compilation. The comparison we provide defines a common standard for consideration by the international community for future regional and global seismogenic source models by identifying the necessary parameters that capture the essence of geological fault data in order to characterize seismogenic sources. In addition, we inform potential users of differences in our usage of common geological/seismological terms to avoid inappropriate use of the data in our models and provide guidance to convert the data from one model to the other (for detailed instructions, see the electronic supplement to this article). Applying our recommendations will permit probabilistic seismic hazard assessment codes to run seamlessly using either seismogenic source input. The USGS and INGV database schema compare well at a first-level inspection. Both databases contain a set of fields representing generalized fault three-dimensional geometry and additional fields that capture the essence of past earthquake occurrences. Nevertheless, there are important differences. When we further analyze supposedly comparable fields, many are defined differently. These differences would cause anomalous results in hazard prediction if one assumes the values are similarly defined. 
The data, however, can be made fully compatible using simple transformations.
Track following of Ξ-hyperons in nuclear emulsion for the E07 experiment
NASA Astrophysics Data System (ADS)
Mishina, Akihiro; Nakazawa, Kazuma; Hoshino, Kaoru; Itonaga, Kazunori; Yoshida, Junya; Than Tint, Khin; Kyaw Soe, Myint; Kinbara, Shinji; Itoh, Hiroki; Endo, Yoko; Kobayashi, Hidetaka; Umehara, Kaori; Yokoyama, Hiroyuki; Nakashima, Daisuke; J-PARC E07 Collaboration
2014-09-01
Double-Λ and twin single-Λ hypernuclear events are very important for understanding the Λ-Λ and Ξ⁻-N interactions. We planned the E07 experiment to determine their nuclear mass dependences with ten times higher statistics than before. In the experiment, the number of Ξ⁻ hyperons stopping at rest is about ten thousand, ten times larger than before. Such a number of tracks for Ξ⁻ hyperon candidates must be followed in the nuclear emulsion plates up to their stopping points. To complete this job within one year, an automated track-following system had to be developed. The key issue in track following is track connection from plate to plate. To address it, we devised image processing methods; in particular, we applied pattern matching against the K⁻ beam tracks. The position accuracy of this method was 1.4 ± 0.8 μm. If each track can be followed through a plate in about one minute, all track following can be finished in one year.
Buttigieg, Pier Luigi; Ramette, Alban
2015-01-01
Marine bacteria colonizing deep-sea sediments beneath the Arctic ocean, a rapidly changing ecosystem, have been shown to exhibit significant biogeographic patterns along transects spanning tens of kilometers and across water depths of several thousand meters (Jacob et al., 2013). Jacob et al. (2013) adopted what has become a classical view of microbial diversity – based on operational taxonomic units clustered at the 97% sequence identity level of the 16S rRNA gene – and observed a very large microbial community replacement at the HAUSGARTEN Long Term Ecological Research station (Eastern Fram Strait). Here, we revisited these data using the oligotyping approach and aimed to obtain new insight into ecological and biogeographic patterns associated with bacterial microdiversity in marine sediments. We also assessed the level of concordance of these insights with previously obtained results. Variation in oligotype dispersal range, relative abundance, co-occurrence, and taxonomic identity were related to environmental parameters such as water depth, biomass, and sedimentary pigment concentration. This study assesses ecological implications of the new microdiversity-based technique using a well-characterized dataset of high relevance for global change biology. PMID:25601856
Molecular-level dynamics of refractory dissolved organic matter
NASA Astrophysics Data System (ADS)
Niggemann, J.; Gerdts, G.; Dittmar, T.
2012-04-01
Refractory dissolved organic matter (DOM) accounts for most of the global oceanic organic carbon inventory. Processes leading to its formation and factors determining its stability are still largely unknown. We hypothesize that refractory DOM carries a universal molecular signature. Characterizing spatial and temporal variability in this universal signature is a key to understanding dynamics of refractory DOM. We present results from a long-term study of the DOM geo-metabolome in the open North Sea. Geo-metabolomics considers the entity of DOM as a population of compounds, each characterized by a specific function and reactivity in the cycling of energy and elements. Ten-thousands of molecular formulae were identified in DOM by ultrahigh resolution mass spectrometry analysis (FT-ICR-MS, Fourier-Transform Ion Cyclotron Resonance Mass Spectrometry). The DOM pool in the North Sea was influenced by a complex interplay of processes that produced, transformed and degraded dissolved molecules. We identified a stable fraction in North Sea DOM with a molecular composition similar to deep ocean DOM. Molecular-level changes in this stable fraction provide novel information on dynamics and interactions of refractory DOM.
The power dynamics perpetuating unsafe abortion in Africa: a feminist perspective.
Braam, Tamara; Hessini, Leila
2004-04-01
Tens of thousands of African women die every year because societies and governments either ignore the issue of unsafe abortion or actively refuse to address it. This paper explores the issue of abortion from a feminist perspective, centrally arguing that finding appropriate strategies to reclaim women's power at an individual and social level is a central lever for developing effective strategies to increase women's access to safe abortion services. The paper emphasises the central role of patriarchy in shaping the ways power plays itself out in individual relationships, and at social, economic and political levels. The ideology of male superiority denies abortion as an important issue of status and frames the morality, legality and socio-cultural attitudes towards abortion. Patriarchy sculpts unequal gender power relationships and takes power away from women in making decisions about their bodies. Other forms of power such as economic inequality, discourse and power within relationships are also explored. Recommended solutions to shifting the power dynamics around the issue include a combination of public health, rights-based, legal reform and social justice approaches.
Interplay of ICP and IXP over the Internet with power-law features
NASA Astrophysics Data System (ADS)
Fan, Zhongyan; Tang, Wallace Kit-Sang
The Internet is the largest artificial network, consisting of billions of IP devices managed by tens of thousands of autonomous systems (ASes). Due to its importance, the Internet has received much attention, and its topological features, mainly at the AS level, have been widely explored from the complex network perspective. However, most previous studies assume a homogeneous model in which nodes are indistinguishable in nature. That may be adequate for a general study of topological structure, but it fails to reflect functionality. The Internet ecology is in fact heterogeneous and highly complex. It consists of various elements, such as Internet Exchange Points (IXPs), Internet Content Providers (ICPs), and ordinary ASes, each realizing a different role in the Internet. In this paper, we propose level-structured network models for investigating how an ICP performs on an AS topology with power-law features and how IXPs enhance its performance, from a complex network perspective. Based on real data, our results reveal that the power-law nature of the Internet facilitates content delivery not only in efficiency but also in path redundancy. Moreover, the proposed multi-level framework clearly illustrates the significant benefits gained by ICPs from IXP peerings.
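A toy experiment along these lines can be run with a preferential-attachment graph as a stand-in for the AS topology. The networkx library and all parameter choices below are assumptions for illustration, not the paper's level-structured model.

```python
import networkx as nx  # library assumed available

# Scale-free AS-like topology via preferential attachment: a toy
# stand-in for the paper's level-structured Internet model.
G = nx.barabasi_albert_graph(n=2_000, m=2, seed=42)

# Efficiency: hop counts from the best-connected node (an "ICP" hub)
icp = max(G.degree, key=lambda kv: kv[1])[0]
hops = nx.single_source_shortest_path_length(G, icp)
print("mean hops from hub:", sum(hops.values()) / len(hops))

# Redundancy: node-disjoint paths between two arbitrary nodes
print("node-disjoint paths:", nx.node_connectivity(G, 0, 1_000))
```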
NASA Technical Reports Server (NTRS)
Levinton, Douglas B.; Cash, Webster C.; Gleason, Brian; Kaiser, Michael J.; Levine, Sara A.; Lo, Amy S.; Schindhelm, Eric; Shipley, Ann F.
2007-01-01
A new mission concept for the direct imaging of exo-solar planets called the New Worlds Observer (NWO) has been proposed. The concept involves flying a meter-class space telescope in formation with a newly-conceived, specially-shaped, deployable star-occulting shade several meters across at a separation of some tens of thousands of kilometers. The telescope would make its observations from behind the starshade in a volume of high suppression of incident irradiance from the star around which planets orbit. The required level of irradiance suppression created by the starshade for an efficacious mission is of order 0.1 to 10 parts per billion in broadband light. This paper discusses the experimental setup developed to accurately measure the suppression ratio of irradiance produced at the null position behind candidate starshade forms to these levels. It also presents results of broadband measurements which demonstrated suppression levels of just under 100 parts per billion in air using the Sun as a light source. Analytical modeling of spatial irradiance distributions surrounding the null are presented and compared with photographs of irradiance captured in situ behind candidate starshades.
High Speed White Dwarf Asteroseismology with the Herty Hall Cluster
NASA Astrophysics Data System (ADS)
Gray, Aaron; Kim, A.
2012-01-01
Asteroseismology is the process of using observed oscillations of stars to infer their interior structure. In high-speed asteroseismology, we accomplish this by quickly computing hundreds of thousands of models to match the observed period spectra. Each model takes five to ten seconds to run on a single processor. Therefore, we use a cluster of sixteen Dell workstations with dual-core processors. The computers use the Ubuntu operating system and Apache Hadoop software to manage workloads.
ERIC Educational Resources Information Center
Congress of the U.S., Washington, DC. House Committee on the Budget.
According to Congressman Charles E. Schumer in his opening statement, the decrease in Federal housing funds is inextricably linked to the increase in homelessness. Since 1981 the Reagan Administration has been systematically dismantling the nation's housing programs, leaving tens of thousands of low-income people homeless. In 1982 there were 1,088…
Vortex Advisory System. Volume I. Effectiveness for Selected Airports.
1980-05-01
analysis of tens of thousands of vortex tracks. Wind velocity was found to be the primary determinant of vortex behavior. The VAS uses wind-velocity ... and the correlation of vortex behavior with the ambient winds. Analysis showed that a wind-rose criterion could be used to determine when interarrival ... Washington DC. 2. Hallock, J.N., "Vortex Advisory System Safety Analysis, Vol. I: Analytical Model," FAA-RD-78-68,1, Sep. 1978, DOT/Transportation
A Public Trust: An Executive Summary of GREAT I,
1980-09-01
reconstruction and/or expansion of locks and dam 26 ... cargo in 1975. Commodities such as grains, fertilizer ... tens of thousands of species of plants ... open backwater areas to marshland ... our residential, commercial, and industrial ... comparison of an 1895 sounding of ... recreational use of the river ... based on the GREAT I site-specific recommendations
Assessing the Impact of Social Media on the 25 January 2011 Egyptian Revolution
2012-03-01
Ahmed Maher to support workers of Mahalla al Kubra. The group used the Internet (social media and blogs), mobile telephones, and word of mouth to ... especially the poor, were united by their collective struggle. As a result, the word of mouth went viral and brought tens of thousands of Egyptians ... In the light of the dramatic events of the 25 January 2011
Genome-scale engineering of Saccharomyces cerevisiae with single-nucleotide precision.
Bao, Zehua; HamediRad, Mohammad; Xue, Pu; Xiao, Han; Tasan, Ipek; Chao, Ran; Liang, Jing; Zhao, Huimin
2018-07-01
We developed a CRISPR-Cas9- and homology-directed-repair-assisted genome-scale engineering method named CHAnGE that can rapidly output tens of thousands of specific genetic variants in yeast. More than 98% of target sequences were efficiently edited with an average frequency of 82%. We validate the single-nucleotide resolution genome-editing capability of this technology by creating a genome-wide gene disruption collection and apply our method to improve tolerance to growth inhibitors.
Development of mini VSAT system
NASA Astrophysics Data System (ADS)
Lu, Shyue-Ching; Chiu, Wu-Jhy; Lin, Hen-Dao; Shih, Mu-Piao
1992-03-01
This paper presents the mini VSAT (very small aperture terminal) system, a low-cost networking system providing economical alternatives for the business world's datacom needs. The system is designed to achieve the highest possible performance/price ratio for private VSAT networks that range from a few tens of remote terminals to large networks of several thousand remote terminals. The paper describes the system architecture, major features, hardware and software structure, access protocol, and performance of the developed system.
Dinov, Ivo D.; Petrosyan, Petros; Liu, Zhizhong; Eggert, Paul; Zamanyan, Alen; Torri, Federica; Macciardi, Fabio; Hobel, Sam; Moon, Seok Woo; Sung, Young Hee; Jiang, Zhiguo; Labus, Jennifer; Kurth, Florian; Ashe-McNalley, Cody; Mayer, Emeran; Vespa, Paul M.; Van Horn, John D.; Toga, Arthur W.
2013-01-01
The volume, diversity and velocity of biomedical data are exponentially increasing providing petabytes of new neuroimaging and genetics data every year. At the same time, tens-of-thousands of computational algorithms are developed and reported in the literature along with thousands of software tools and services. Users demand intuitive, quick and platform-agnostic access to data, software tools, and infrastructure from millions of hardware devices. This explosion of information, scientific techniques, computational models, and technological advances leads to enormous challenges in data analysis, evidence-based biomedical inference and reproducibility of findings. The Pipeline workflow environment provides a crowd-based distributed solution for consistent management of these heterogeneous resources. The Pipeline allows multiple (local) clients and (remote) servers to connect, exchange protocols, control the execution, monitor the states of different tools or hardware, and share complete protocols as portable XML workflows. In this paper, we demonstrate several advanced computational neuroimaging and genetics case-studies, and end-to-end pipeline solutions. These are implemented as graphical workflow protocols in the context of analyzing imaging (sMRI, fMRI, DTI), phenotypic (demographic, clinical), and genetic (SNP) data. PMID:23975276
Friend, Milton; Franson, J. Christian
1987-01-01
Individual disease outbreaks have killed many thousands of animals on numerous occasions. Tens of thousands of migratory birds have died in single die-offs, with as many as 1,000 birds succumbing in 1 day. In mammals, individual disease outbreaks have killed hundreds to thousands of animals with, for example, hemorrhagic disease in white-tailed deer, distemper in raccoon, Errington's disease in muskrat, and sylvatic plague in wild rodents. The ability to successfully combat such explosive situations is highly dependent on the readiness of field personnel to deal with them. Because many disease agents can spread through wildlife populations very fast, advance preparation is essential in preventing infected animals from spreading disease to additional species and locations. Carefully thought-out disease contingency plans should be developed as practical working documents for field personnel and updated as necessary. Such well-designed plans can prove invaluable in minimizing wildlife losses and the costs associated with disease control activities. Although requirements for disease control operations vary and must be tailored to each situation, all disease contingency planning involves general concepts and basic biological information. This chapter, intended as a practical guide, identifies the major activities and needs of disease control operations and relates them to disease contingency planning.
Miller, C. Dan; Sushyar, R.; ,; Hamidi, S.
1983-01-01
The Dieng Mountains region consists of a complex of late Quaternary to recent volcanic stratocones, parasitic vents, and explosion craters. Six age groups of volcanic centers, eruptive products, and explosion craters are recognized in the region based on their morphology, degree of dissection, stratigraphic relationships, and degree of weathering. These features range in age from tens of thousands of years to events that have occurred this century. No magmatic eruptions have occurred in the Dieng Mountains region for at least several thousand years; volcanic activity during this time interval has consisted of phreatic eruptions and non-explosive hydrothermal activity. If future volcanic events are similar to those of the last few thousand years, they will consist of phreatic eruptions, associated small hot mudflows, emission of suffocating gases, and hydrothermal activity. Future phreatic eruptions may follow, or accompany, periods of increased earthquake activity; the epicenters for the seismicity may suggest where eruptive activity will occur. Under such circumstances, the populace within several kilometers of a potential eruption site should be warned of a possible eruption, given instructions about what to do in the event of an eruption, or temporarily evacuated to a safer location.
NASA Astrophysics Data System (ADS)
Atubga, David; Wu, Huijuan; Lu, Lidong; Sun, Xiaoyan
2017-02-01
Typical fully distributed optical fiber sensors (DOFS) spanning dozens of kilometers are equivalent to tens of thousands of point sensors along the whole monitoring line, which means tens of thousands of data points are generated for each pulse-launching period. In all-day nonstop monitoring, large volumes of data therefore accumulate, triggering demands for large storage space and high-speed data transmission; when the monitoring length and channel numbers increase, the data grow further still. Mitigating this accumulation of data and the associated storage and transmission requirements is the aim of this paper. To demonstrate our idea, we carried out a comparative study of two lossless methods, Huffman coding and Lempel-Ziv-Welch (LZW), and a lossy data compression algorithm, the fast wavelet transform (FWT), on three distinctive kinds of DOFS sensing data: Φ-OTDR, P-OTDR, and B-OTDA. Our results demonstrated that FWT yielded the best compression ratio with good processing time, at the cost of some error in the reconstructed signals for all three DOFS datasets. These outcomes indicate the promising potential of FWT, making it suitable, reliable, and convenient for real-time compression of DOFS data. Finally, we observed that differences in DOFS data structure have some influence on both the compression ratio and the computational cost.
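As a rough sketch of this kind of comparison, the code below pits a lossless DEFLATE pass (LZ77 plus Huffman coding, via Python's zlib) against a lossy one-level Haar wavelet transform with coefficient thresholding on a synthetic smooth trace. The storage-cost accounting for the wavelet path is an assumption noted in the comments, and none of this reproduces the paper's exact FWT pipeline.

```python
import numpy as np, zlib

def haar_compress_ratio(signal, keep=0.05):
    """One-level orthonormal Haar transform; keep only the largest
    fraction `keep` of coefficients (lossy), a stand-in for the paper's
    FWT scheme."""
    a = (signal[0::2] + signal[1::2]) / np.sqrt(2)   # approximation coefficients
    d = (signal[0::2] - signal[1::2]) / np.sqrt(2)   # detail coefficients
    coeffs = np.concatenate([a, d])
    k = max(1, int(keep * coeffs.size))
    thresh = np.sort(np.abs(coeffs))[-k]
    kept = np.abs(coeffs) >= thresh
    # Assumed payload: a 4-byte index plus an 8-byte value per kept coefficient
    return signal.nbytes / (kept.sum() * (4 + 8))

rng = np.random.default_rng(1)
trace = np.cumsum(rng.normal(0, 1e-3, 2**16)) + 1.0   # smooth OTDR-like trace
raw = trace.tobytes()
print("zlib (LZ77+Huffman, lossless):", len(raw) / len(zlib.compress(raw)))
print("Haar threshold (lossy):      ", haar_compress_ratio(trace))
```

On smooth sensor-like traces the lossy wavelet path wins by a wide margin, which mirrors the trade-off the abstract reports.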
NASA Astrophysics Data System (ADS)
Barboni, Mélanie; Boehnke, Patrick; Schmitt, Axel K.; Harrison, T. Mark; Shane, Phil; Bouvier, Anne-Sophie; Baumgartner, Lukas
2016-12-01
Felsic magmatic systems represent the vast majority of volcanic activity that poses a threat to human life. The tempo and magnitude of these eruptions depends on the physical conditions under which magmas are retained within the crust. Recently the case has been made that volcanic reservoirs are rarely molten and only capable of eruption for durations as brief as 1,000 years following magma recharge. If the “cold storage” model is generally applicable, then geophysical detection of melt beneath volcanoes is likely a sign of imminent eruption. However, some arc volcanic centers have been active for tens of thousands of years and show evidence for the continual presence of melt. To address this seeming paradox, zircon geochronology and geochemistry from both the frozen lava and the cogenetic enclaves they host from the Soufrière Volcanic Center (SVC), a long-lived volcanic complex in the Lesser Antilles arc, were integrated to track the preeruptive thermal and chemical history of the magma reservoir. Our results show that the SVC reservoir was likely eruptible for periods of several tens of thousands of years or more with punctuated eruptions during these periods. These conclusions are consistent with results from other arc volcanic reservoirs and suggest that arc magmas are generally stored warm. Thus, the presence of intracrustal melt alone is insufficient as an indicator of imminent eruption, but instead represents the normal state of magma storage underneath dormant volcanoes.
Microarray-based cancer prediction using soft computing approach.
Wang, Xiaosheng; Gotoh, Osamu
2009-05-26
One of the difficulties in using gene expression profiles to predict cancer is how to effectively select a few informative genes, out of thousands or tens of thousands, to construct accurate prediction models. We screen highly discriminative genes and gene pairs to create simple prediction models involving single genes or gene pairs, on the basis of a soft computing approach and rough set theory. Accurate cancer prediction is obtained when we apply the simple prediction models to four cancer gene expression datasets: CNS tumor, colon tumor, lung cancer and DLBCL. Some genes closely correlated with the pathogenesis of specific or general cancers are identified. In contrast with other models, our models are simple, effective and robust; meanwhile, they are interpretable, as they are based on decision rules. Our results demonstrate that very simple models may perform well on molecular cancer prediction, and that important gene markers of cancer can be detected if the gene selection approach is chosen reasonably.
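A minimal flavor of informative-gene screening can be given with a simple signal-to-noise filter, shown below. The paper's actual selection combines soft computing with rough-set theory, so this is only an illustrative stand-in with synthetic data.

```python
import numpy as np

def top_genes(X, y, k=10):
    """Rank genes by the signal-to-noise statistic
    |mu1 - mu0| / (sd1 + sd0) between two classes and return the k most
    discriminative ones. A common simple filter, not the paper's
    rough-set method."""
    X0, X1 = X[y == 0], X[y == 1]
    snr = np.abs(X1.mean(0) - X0.mean(0)) / (X1.std(0) + X0.std(0) + 1e-12)
    return np.argsort(snr)[::-1][:k]

rng = np.random.default_rng(2)
X = rng.normal(size=(60, 5000))   # 60 samples x 5000 genes
y = np.repeat([0, 1], 30)
X[y == 1, 42] += 2.0              # make gene 42 informative
print(top_genes(X, y, k=3))       # gene 42 should rank first
```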
NASA Astrophysics Data System (ADS)
Jia, Weile; Wang, Jue; Chi, Xuebin; Wang, Lin-Wang
2017-02-01
LS3DF, the linear-scaling three-dimensional fragment method, is an efficient linear-scaling ab initio total-energy electronic structure calculation code based on a divide-and-conquer strategy. In this paper, we present our GPU implementation of the LS3DF code. Our test results show that the GPU code can calculate systems of about ten thousand atoms fully self-consistently on the order of 10 minutes using thousands of computing nodes, making electronic structure calculations of 10,000-atom nanosystems routine work. This is 4.5-6 times faster than CPU calculations using the same number of nodes on the Titan machine at the Oak Ridge Leadership Computing Facility (OLCF). The speedup is achieved by (a) careful redesign of the computationally heavy kernels and (b) redesign of the communication pattern for heterogeneous supercomputers.
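The divide-and-conquer skeleton behind such linear-scaling methods is simple to state: split the system into fragments, solve each independently (the step the GPU port accelerates), and assemble the total. The sketch below shows only that skeleton with a toy per-fragment solver; the real LS3DF also applies boundary and patching corrections between fragments, omitted here.

```python
def total_energy(atoms, solve, fragment_size=100):
    """Divide-and-conquer skeleton in the spirit of LS3DF: partition the
    system into fragments, solve each independently (embarrassingly
    parallel), and sum the results. The real method adds boundary and
    overlap corrections between fragments, omitted in this sketch."""
    fragments = [atoms[i:i + fragment_size]
                 for i in range(0, len(atoms), fragment_size)]
    return sum(solve(fragment) for fragment in fragments)

# Toy stand-in for a per-fragment DFT solve: energy ~ fragment size
print(total_energy(list(range(10_000)), solve=lambda frag: -1.0 * len(frag)))
```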
A Compact, Flexible, High Channel Count DAQ Built From Off-the-Shelf Components
Heffner, M.; Riot, V.; Fabris, L.
2013-06-01
Medium to large channel count detectors are usually faced with a few unattractive options for data acquisition (DAQ). Small to medium sized TPC experiments, for example, can be too small to justify the high expense and long development time of application-specific integrated circuit (ASIC) development. In some cases an experiment can piggy-back on a larger experiment and the associated ASIC development, but this puts the timeline of development out of the hands of the smaller experiment. Another option is to run perhaps thousands of cables to rack-mounted equipment, which is clearly undesirable. The development of commercial high-speed, high-density FPGAs and ADCs, combined with small discrete components and robotic assembly, opens a new option built from off-the-shelf components that scales to tens of thousands of channels and is only slightly larger than ASICs.
James, John S
2004-01-01
India changed its pharmaceutical patent law to conform to the U.S.-European system, just ahead of a Jan. 1 World Trade Organization deadline--meaning that most new medicines (patentable in 1995 or later) will be priced out of reach of the great majority of people in India--and in Africa and other poor regions as well. "The real issue for the multinational corporations is not the poor-country markets, which are financially small and unattractive, but the poor-country examples. How would thousands of people in rich countries, especially the U.S., be persuaded to accept death from cancer and other diseases because they cannot pay tens of thousands of dollars a year for a new generation of treatments that could save their lives--if companies in India could manufacture and sell the same treatments for a small fraction of the price?"
Zircon from historic eruptions in Iceland: reconstructing storage and evolution of silicic magmas
NASA Astrophysics Data System (ADS)
Carley, Tamara L.; Miller, Calvin F.; Wooden, Joseph L.; Bindeman, Ilya N.; Barth, Andrew P.
2011-10-01
Zoning patterns, U-Th disequilibria ages, and elemental compositions of zircon from eruptions of Askja (1875 AD), Hekla (1158 AD), Öræfajökull (1362 AD) and Torfajökull (1477 AD, 871 AD, 3100 BP, 7500 BP) provide insights into the complex, extended, histories of silicic magmatic systems in Iceland. Zircon compositions, which are correlated with proximity to the main axial rift, are distinct from those of mid-ocean ridge environments and fall at the low-Hf edge of the range of continental zircon. Morphology, zoning patterns, compositions, and U-Th ages all indicate growth and storage in subvolcanic silicic mushes or recently solidified rock at temperatures above the solidus but lower than that of the erupting magma. The eruptive products were likely ascending magmas that entrained a zircon "cargo" that formed thousands to tens of thousands of years prior to the eruptions.
Climate Change: Past, Present, and Future
NASA Astrophysics Data System (ADS)
Chapman, David S.; Davis, Michael G.
2010-09-01
Questions about global warming concern climate scientists and the general public alike. Specifically, what are the reliable surface temperature reconstructions over the past few centuries? And what are the best predictions of global temperature change the Earth might expect for the next century? Recent publications [National Research Council (NRC), 2006; Intergovernmental Panel on Climate Change (IPCC), 2007] permit these questions to be answered in a single informative illustration by assembling temperature reconstructions of the past thousand years with predictions for the next century. The result, shown in Figure 1, illustrates present and future warming in the context of natural variations in the past [see also Oldfield and Alverson, 2003]. To quote a Chinese proverb, “A picture's meaning can express ten thousand words.” Because it succinctly captures past inferences and future projections of climate, the illustration should be of interest to scientists, educators, policy makers, and the public.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shyer, E.B.
The New York State Department of Environmental Conservation's Division of Mineral Resources is responsible for regulating the oil and gas industry and receiving operators' annual well production reports. In production year 1970, New York State operators reported 627 active gas wells with production of 3 billion cubic feet. Ten years later, in 1980, production had more than tripled to 15.5 billion cubic feet and reported active gas wells increased to 1,966. During 1990, reported gas production was 25 billion cubic feet from 5,536 active gas wells. The average production per gas well in 1970 was 4,773 thousand cubic feet. Average gas production per well peaked in 1978, when 1,431 active gas wells reported production of 14 billion cubic feet, an average of 9,821 thousand cubic feet per well. By 1994 the average production per well had decreased to 3,800 thousand cubic feet, a decrease of approximately 60%. The decrease in average well production is more a reflection of the majority of older wells reaching the lower end of their decline curves than of a decrease in overall per-well productivity. The number of completed gas wells increased following the rising price of gas. In 1970 gas was $0.30 per thousand cubic feet. By 1984 the price per thousand cubic feet had peaked at $4. After 1984 the price of gas started to decline while the number of active gas wells continued to increase. Sharp increases in gas production for certain counties, such as Steuben in 1972 and 1973 and Chautauqua in 1980-83, reflect the discoveries of new fields such as the Adrian Reef and Bass Island fields, respectively. The Stagecoach Field, discovered in 1989 in Tioga County, is the newest high-producing field in New York State.
The cost of large numbers of hypothesis tests on power, effect size and sample size.
Lazzeroni, L C; Ray, A
2012-01-01
Advances in high-throughput biology and computer science are driving an exponential increase in the number of hypothesis tests in genomics and other scientific disciplines. Studies using current genotyping platforms frequently include a million or more tests. In addition to the monetary cost, this increase imposes a statistical cost owing to the multiple testing corrections needed to avoid large numbers of false-positive results. To safeguard against the resulting loss of power, some have suggested sample sizes on the order of tens of thousands that can be impractical for many diseases or may lower the quality of phenotypic measurements. This study examines the relationship between the number of tests on the one hand and power, detectable effect size or required sample size on the other. We show that once the number of tests is large, power can be maintained at a constant level, with comparatively small increases in the effect size or sample size. For example at the 0.05 significance level, a 13% increase in sample size is needed to maintain 80% power for ten million tests compared with one million tests, whereas a 70% increase in sample size is needed for 10 tests compared with a single test. Relative costs are less when measured by increases in the detectable effect size. We provide an interactive Excel calculator to compute power, effect size or sample size when comparing study designs or genome platforms involving different numbers of hypothesis tests. The results are reassuring in an era of extreme multiple testing.
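The quoted 13% and 70% figures follow from a normal-approximation power calculation at a Bonferroni-corrected significance level, which the short sketch below reproduces; the authors' interactive calculator may differ in detail, and the function name and effect size here are illustrative.

```python
from statistics import NormalDist

def sample_size(m_tests, effect=0.1, alpha=0.05, power=0.80):
    """One-sample normal approximation: n such that a two-sided z-test at
    the Bonferroni-corrected level alpha/m detects a standardized effect
    `effect` with the requested power. Reproduces the abstract's
    arithmetic; not necessarily the authors' exact calculator."""
    z = NormalDist().inv_cdf
    z_crit = z(1 - alpha / (2 * m_tests))   # per-test critical value
    z_pow = z(power)
    return ((z_crit + z_pow) / effect) ** 2

for m1, m2 in [(1, 10), (10**6, 10**7)]:
    print(f"{m2} vs {m1} tests: "
          f"{sample_size(m2) / sample_size(m1) - 1:+.0%} sample size")
# -> roughly +70% and +13%, matching the figures quoted above
```

Note that the effect size cancels in the ratio, so the relative sample-size increase depends only on the number of tests and the target power.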
Cheng, Winton; Juang, Feng-Ming; Chen, Jiann-Chu
2004-03-01
Addition of NaCl at 2.5% to 3.5% to tryptic soy broth (TSB) significantly increased the growth of Vibrio parahaemolyticus. Taiwan abalone Haliotis diversicolor supertexta held in 30 per thousand seawater were injected with V. parahaemolyticus grown in TSB containing NaCl at 0.5, 1.5, 2.5, 3.5 and 4.5% at a dose of 1.6 x 10(5)colony-forming units (cfu) abalone(-1). After 48 h, the cumulative mortality was significantly higher for the abalone challenged with V. parahaemolyticus grown in 2.5% than those grown in 0.5 and 1.5% NaCl. In other experiments, abalones held in 30 per thousand seawater were injected with TSB-grown V. parahaemolyticus (1.6 x 10(5)cfu abalone(-1)), and then transferred to 20, 25, 30 and 35 per thousand seawater. All abalones held in 20 per thousand were killed in 48 h. The mortality of V. parahaemolyticus-injected abalone held in 30 per thousand was significantly lower over 24-120 h. Abalone held in 30 per thousand seawater and then transferred to 20, 25, 30 and 35 per thousand were examined for THC (total haemocyte count), phenoloxidase activity, respiratory burst, phagocytic activity and clearance efficiency of V. parahemolyticus after 24 and 72 h. The THC increased directly related with salinity levels. Phenoloxidase activity, respiratory burst, phagocytic activity and clearance efficiency of V. parahaemolyticus decreased significantly for the abalone in 20, 25 and 35 per thousand. It is concluded that the abalone transferred from 30 per thousand to 20, 25 and 35 per thousand had reduced immune ability and decreased resistance against V. parahaemolyticus infection.
A Hardware Model Validation Tool for Use in Complex Space Systems
NASA Technical Reports Server (NTRS)
Davies, Misty Dawn; Gundy-Burlet, Karen L.; Limes, Gregory L.
2010-01-01
One of the many technological hurdles that must be overcome in future missions is the challenge of validating as-built systems against the models used for design. We propose a technique composed of intelligent parameter exploration in concert with automated failure analysis as a scalable method for the validation of complex space systems. The technique is impervious to discontinuities and linear dependencies in the data, and can handle dimensionalities consisting of hundreds of variables over tens of thousands of experiments.
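In outline, the technique amounts to sampling the parameter box, running the model, and collecting the settings that violate a requirement. The sketch below shows that loop in generic form; all names and the toy model are illustrative, not the proposed tool, which uses intelligent rather than purely random exploration.

```python
import numpy as np

def explore(model, bounds, n=10_000, seed=0):
    """Random parameter exploration with automated failure flagging: draw
    parameter vectors across the box `bounds`, run the model, and collect
    the settings whose output violates a requirement. A generic sketch of
    the approach, not NASA's tool."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds).T
    samples = rng.uniform(lo, hi, size=(n, len(bounds)))
    failures = [p for p in samples if not model(p)]
    return np.array(failures)

# Toy model: "fails" when the combined gain drifts out of tolerance
ok = lambda p: abs(p[0] * p[1] - 1.0) < 0.5
fails = explore(ok, bounds=[(0.5, 2.0), (0.5, 2.0)])
print(f"{len(fails)} failing settings out of 10000")
```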
NASA Astrophysics Data System (ADS)
Caballero, J. A.
2012-05-01
In the last few years, there have been several projects involving astronomy and classical music. But has a rock band ever appeared at a science conference, or an astronomer at a rock concert? We present a project, Multiverso, in which we mix rock and astronomy, together with poetry and video art (Caballero, 2010). The project started in late 2009 and has already reached tens of thousands of people in Spain through the release of an album, several concert-talks, television, radio, newspapers and the internet.
A structural perspective of the flavivirus life cycle.
Mukhopadhyay, Suchetana; Kuhn, Richard J; Rossmann, Michael G
2005-01-01
Dengue, Japanese encephalitis, West Nile and yellow fever belong to the Flavivirus genus, which is a member of the Flaviviridae family. They are human pathogens that cause large epidemics and tens of thousands of deaths annually in many parts of the world. The structural organization of these viruses and their associated structural proteins has provided insight into the molecular transitions that occur during the viral life cycle, such as assembly, budding, maturation and fusion. This review focuses mainly on structural studies of dengue virus.
Model based verification of the Secure Socket Layer (SSL) Protocol for NASA systems
NASA Technical Reports Server (NTRS)
Powell, John D.; Gilliam, David
2004-01-01
The National Aeronautics and Space Administration (NASA) has tens of thousands of networked computer systems and applications. Software security vulnerabilities present risks such as lost or corrupted data, information theft, and unavailability of critical systems. These risks represent potentially enormous costs to NASA. The NASA Code Q research initiative 'Reducing Software Security Risk (RSSR) Through an Integrated Approach' offers formal verification of information technology (IT), through the creation of a Software Security Assessment Instrument (SSAI), to address software security risks.
A Website for Astronomy Education and Outreach
NASA Astrophysics Data System (ADS)
Impey, C.; Danehy, A.
2017-09-01
Teach Astronomy is a free, open access website designed for formal and informal learners of astronomy. The site features: an online textbook complete with quiz questions and a glossary; over ten thousand images; a curated collection of the astronomy articles in Wikipedia; a complete video lecture course; a video Frequently Asked Questions tool; and other materials provided by content partners. Clustering algorithms and an interactive visual interface allow users to browse related content. This article reviews the features of the website and how it can be used.
Silica waveguide devices and their applications
NASA Astrophysics Data System (ADS)
Sun, C. J.; Schmidt, Kevin M.; Lin, Wenhua
2005-03-01
Silica waveguide technology transitioned from laboratories to commercial use in the early 1990s. Since then, various applications have been exploited based on this technology. Tens of thousands of arrayed waveguide grating (AWG) devices have been installed worldwide for DWDM mux and demux. The recent FTTH push in Japan has renewed the significance of this technology for passive optical network (PON) applications. This paper reviews the past development of this technology, compares it with competing technologies, and outlines its future role in evolving optical communications.
North Polar Cap Layers and Ledges
2016-08-24
At the edge of Mars' permanent North Polar cap, we see an exposure of the internal layers, each with a different mix of water ice, dust and dirt. These layers are believed to correspond to different climate conditions over the past tens of thousands of years. When we zoom in closer, we see that the distinct layers erode differently. Some are stronger and more resistant to erosion, others only weakly cemented. The strong layers form ledges. http://photojournal.jpl.nasa.gov/catalog/PIA21022
NASA Astrophysics Data System (ADS)
Barnett, R. Michael
2013-02-01
After half a century of waiting, the drama was intense. Physicists slept overnight outside the auditorium to get seats for the seminar at the CERN lab in Geneva, Switzerland. Ten thousand miles away on the other side of the planet, at the world's most prestigious international particle physics conference, hundreds of physicists from every corner of the globe lined up to hear the seminar streamed live from Geneva (see Fig. 1). And in universities from North America to Asia, physicists and students gathered to watch the streaming talks.
Management of Ebola Virus Disease in Children.
Trehan, Indi; De Silva, Stephanie C
2018-03-01
The West African outbreak of 2013 to 2016 was the largest Ebola epidemic in history. With tens of thousands of patients treated during this outbreak, much was learned about how to optimize clinical care for children with Ebola. In anticipation of inevitable future outbreaks, a firsthand summary of the major aspects of pediatric Ebola case management in austere settings is presented. Emphasis is on early and aggressive critical care, including fluid resuscitation, electrolyte repletion, antimicrobial therapy, and nutritional supplementation. Copyright © 2017 Elsevier Inc. All rights reserved.
Medical device problem reporting for the betterment of healthcare.
1998-08-01
Given that there are nearly 5,000 individual classes of medical devices, tens of thousands of medical device suppliers, and millions of healthcare providers around the world, device-related problems are bound to happen. But effective problem reporting can help reduce or eliminate many of these problems--not only within an institution, but also potentially around the world. In this article, we trace the problem reporting process from its beginnings in the hospital to its global impact in making critical information available throughout the healthcare community.
Darnaude, Audrey M; Salen-Picard, Chantal; Polunin, Nicholas V C; Harmelin-Vivien, Mireille L
2004-02-01
The link between climate-driven river runoff and sole fishery yields observed in the Gulf of Lions (NW Mediterranean) was analysed using carbon and nitrogen stable isotopes along the flatfish food webs. Off the Rhone River, the main terrestrial (river POM) and marine (seawater POM) sources of carbon differed in δ13C (−26.11‰ and −22.36‰, respectively). Surface sediment and suspended POM in plume water exhibited low δ13C (−24.38‰ and −24.70‰, respectively) that differed more from the seawater POM than from river POM, demonstrating the dominance of terrestrial material in those carbon pools. Benthic invertebrates showed a wide range in δ15N (mean 4.30‰ to 9.77‰) and δ13C (mean −23.81‰ to −18.47‰), suggesting different trophic levels, diets and organic sources. Among the macroinvertebrates, the surface (mean δ13C −23.71‰) and subsurface (mean δ13C −23.81‰) deposit-feeding polychaetes were particularly 13C-depleted, indicating that their carbon was mainly derived from terrestrial material. In flatfish, δ15N (mean 9.42‰ to 10.93‰) and δ13C (mean −19.95‰ to −17.69‰) varied among species, indicating differences in food source and terrestrial POM use. A significant negative correlation was observed between the percentage by weight of polychaetes in the diet and the δ13C of flatfish white muscle. Solea solea (the main polychaete feeder) had the lowest mean δ13C, Arnoglossus laterna and Buglossidium luteum (crustacean, mollusc and polychaete feeders) had intermediate values, and Solea impar (mollusc feeder) and Citharus linguatula (crustacean and fish feeder) exhibited the highest δ13C. Two different benthic food webs were thus identified off the Rhone River, one based on marine planktonic carbon and the other on the terrestrial POM carried by the river. Deposit-feeding polychaetes were responsible for the main transfer of terrestrial POM to upper trophic levels, linking sole population dynamics to river runoff fluctuations.
NASA Astrophysics Data System (ADS)
Gold, Anne; Gordon, Eric
2016-04-01
Over the last decade, Massive Open Online Courses (MOOCs) have rapidly gained traction as a way to provide virtually anyone with an internet connection free access to a broad variety of high-quality college-level courses. That means Earth science instructors can now teach courses that reach tens of thousands of students--an incredible opportunity, but one that also poses many novel challenges. In April 2015, we used the Coursera platform to run a MOOC entitled "Water in the Western United States," to deliver a survey course of broad interest and partly as a venue to make research efforts accessible to a wide audience. Leveraging a previous online course run on a smaller MOOC platform (Canvas), we created a course largely based on short expert video lectures tied together by various types of assessments. Over a dozen experts provided short lectures offering a survey course that touches on the social, legal, natural, and societal aspects of the topic. This style of MOOC, in which the content is not delivered by one expert but by many, helped us showcase the breadth of available expertise both at the University of Colorado and elsewhere. In this presentation we will discuss the challenges that arose from planning a MOOC with no information about the characteristics of the student body, teaching thousands of unidentified students, and understanding the nature of online learning in an increasingly mobile-dominated world. We will also discuss the opportunities a MOOC offers for changes in undergraduate education, sharing across campuses or even across levels, and promoting flipped classroom-style learning. Finally, we will describe the general characteristics of our MOOC student body and describe lessons learned from our experience while aiming to place the MOOC experience into a larger conversation about the future of education at multiple levels.
Glynn, Pierre
2008-01-01
Provost et al. (1998) and Glynn and Voss (1999; also published in Glynn et al., 1999) considered the possibility that during future glaciations, oxygenated glacial meltwaters from two- to three-kilometer thick ice sheets could potentially intrude to the 500 m depth of planned nuclear-waste repositories. This possibility has been of concern because of potential negative effects on the stability of the repository engineered environment, and because of the potential mobilization of radionuclides should the oxygenated waters come into contact with the radioactive waste. The above reports argued that given the current state of knowledge, it was hard to discount the possibility that oxygenated waters could penetrate to repository-level depth. The reports also suggested that oxidizing conditions might be present in the fractured rock environment for significant amounts of time, on the order of thousands to tens of thousands of years. In some earlier reports, Swedish and Finnish governmental agencies in charge of nuclear-waste disposal had considered the possibility that oxygenated meltwaters might intrude to the repository depth (SKI, 1992; Martinerie et al., 1992; Ahonen and Vieno, 1994). Subsequent to the publication of Provost et al. (1998), Glynn et al. (1999) and Glynn and Voss (1999), the Swedish Nuclear Fuel and Waste Handling Company (SKB) commissioned efforts to examine more thoroughly the possibilities that oxygenated meltwaters might occur under ice-sheet conditions and intrude to the repository depth.
Detecting differentially expressed genes in heterogeneous diseases using half Student's t-test.
Hsu, Chun-Lun; Lee, Wen-Chung
2010-12-01
Microarray technology provides information about hundreds to thousands of gene-expression measurements in a single experiment. To search for disease-related genes, researchers test for those genes that are differentially expressed between the case subjects and the control subjects. The authors propose a new test, the 'half Student's t-test', specifically for detecting differentially expressed genes in heterogeneous diseases. Monte-Carlo simulation shows that the test maintains the nominal α level quite well for both normal and non-normal distributions. Power of the half Student's t is higher than that of the conventional 'pooled' Student's t when there is heterogeneity in the disease under study. The power gain by using the half Student's t can reach ∼10% when the standard deviation of the case group is 50% larger than that of the control group. Application to a colon cancer data set reveals that when the false discovery rate (FDR) is controlled at 0.05, the half Student's t can detect 344 differentially expressed genes, whereas the pooled Student's t can detect only 65. Alternatively, if only 50 genes are to be selected, the FDR for the pooled Student's t has to be set at 0.0320 (false positive rate of ∼3%), but for the half Student's t it can be as low as 0.0001 (false positive rate of about one per ten thousand). The half Student's t-test is to be recommended for the detection of differentially expressed genes in heterogeneous diseases.
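As a rough illustration of the idea, here is a minimal sketch comparing a pooled t statistic with a "half" variant whose denominator uses only the control-group variance. This is our reading of the abstract; the paper's exact statistic and degrees of freedom should be checked against Hsu and Lee (2010). All data are simulated.

```python
import numpy as np

def pooled_t(case, ctrl):
    # Conventional two-sample t with pooled variance.
    n1, n2 = len(case), len(ctrl)
    sp2 = ((n1 - 1) * case.var(ddof=1) + (n2 - 1) * ctrl.var(ddof=1)) / (n1 + n2 - 2)
    return (case.mean() - ctrl.mean()) / np.sqrt(sp2 * (1 / n1 + 1 / n2))

def half_t(case, ctrl):
    # "Half" variant (our reading): the denominator is built from the
    # control group's variance only, so extra spread among heterogeneous
    # cases does not dilute the signal.
    n1, n2 = len(case), len(ctrl)
    return (case.mean() - ctrl.mean()) / np.sqrt(ctrl.var(ddof=1) * (1 / n1 + 1 / n2))

rng = np.random.default_rng(0)
ctrl = rng.normal(0.0, 1.0, 30)
# Heterogeneous disease: only about half of the cases over-express the gene.
case = np.where(rng.random(30) < 0.5,
                rng.normal(1.5, 1.0, 30), rng.normal(0.0, 1.0, 30))
print(pooled_t(case, ctrl), half_t(case, ctrl))
```

On such heterogeneous data the half statistic is typically larger in magnitude than the pooled one, which is the power gain the abstract describes.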
Hauer, G; Rogerson, A; Anderson, O R
2001-01-01
A new species of naked amoeba, Platyamoeba pseudovannellida n. sp., is described on the basis of light microscopic and fine structural features. The amoeba was isolated from the Salton Sea, California, from water at a salinity of ca. 44‰. Locomotive amoebae occasionally had a spatulate outline and floating cells had radiating pseudopodia, sometimes with pointed tips. Both these features are reminiscent of the genus Vannella. However, the surface coat (glycocalyx) as revealed by TEM indicates that this is a species of Platyamoeba. Although salinity was not used as a diagnostic feature, this species was found to have remarkable tolerance to fluctuating salinity levels, even when changes were rapid. Amoebae survived over the range 0‰ to 150‰ salt and grew within the range 0‰ to 138‰ salt. The generation time of cells averaged 29 h and was not markedly affected by salt concentration. This is longer than expected for an amoeba of this size and suggests a high energetic cost of coping with salinity changes. The morphology of cells changed with increasing salinity: at 0‰ cells were flattened and active, and at the other extreme (138‰) amoebae were wrinkled and domed and cell movement was very slow. At the ultrastructural level, the cytoplasm of cells grown at high salinity (98‰) was considerably denser than that of cells reared at 0‰.
NASA Astrophysics Data System (ADS)
Pordes, Ruth; OSG Consortium; Petravick, Don; Kramer, Bill; Olson, Doug; Livny, Miron; Roy, Alain; Avery, Paul; Blackburn, Kent; Wenaus, Torre; Würthwein, Frank; Foster, Ian; Gardner, Rob; Wilde, Mike; Blatecky, Alan; McGee, John; Quick, Rob
2007-07-01
The Open Science Grid (OSG) provides a distributed facility where the Consortium members provide guaranteed and opportunistic access to shared computing and storage resources. OSG provides support for and evolution of the infrastructure through activities that cover operations, security, software, troubleshooting, addition of new capabilities, and support for existing and engagement with new communities. The OSG SciDAC-2 project provides specific activities to manage and evolve the distributed infrastructure and support its use. The innovative aspects of the project are the maintenance and performance of a collaborative (shared & common) petascale national facility over tens of autonomous computing sites, for many hundreds of users, transferring terabytes of data a day, executing tens of thousands of jobs a day, and providing robust and usable resources for scientific groups of all types and sizes. More information can be found at the OSG web site: www.opensciencegrid.org.
NASA Astrophysics Data System (ADS)
Bossu, R.; Roussel, F.; Mazet-Roux, G.; Steed, R.; Frobert, L.
2015-12-01
LastQuake is a smartphone app, browser add-on, and the most sophisticated Twitter robot (quakebot) for earthquakes currently in operation. It fulfills eyewitnesses' needs by offering information on felt earthquakes and their effects within tens of seconds of their occurrence. Associated with an active presence on Facebook, Pinterest and on websites, this proves a very efficient engagement strategy. For example, the app was installed thousands of times after the Ghorka earthquake in Nepal. Language barriers have been erased by using visual communication; for example, felt reports are collected through a set of cartoons representing different shaking levels. Within 3 weeks of the magnitude 7.8 Ghorka earthquake, 7,000 felt reports with thousands of comments were collected for the mainshock and tens of its aftershocks, as well as 100 informative geo-located pictures. The QuakeBot was essential in allowing us to be identified so well and interact with those affected. LastQuake is also a seismic risk reduction tool, because rapid information acts like prevention: when no information is available after a felt earthquake, the public blocks emergency lines trying to find out the cause of the shaking, crowds form, potentially leading to unpredictable crowd movement, and rumors spread. In its next release, LastQuake will also provide people with guidance immediately after a shaking through a number of pop-up cartoons illustrating "do/don't do" items (go to open places, do not phone emergency services except if people are injured…). LastQuake's app design is simple and intuitive and has a global audience. It benefited from a crowdfunding campaign (and the support of the Fondation MAIF), and more improvements have been planned after an online feedback campaign organized in early June with the Ghorka earthquake eyewitnesses.
The awakening of a classical nova from hibernation.
Mróz, Przemek; Udalski, Andrzej; Pietrukowicz, Paweł; Szymański, Michał K; Soszyński, Igor; Wyrzykowski, Łukasz; Poleski, Radosław; Kozłowski, Szymon; Skowron, Jan; Ulaczyk, Krzysztof; Skowron, Dorota; Pawlak, Michał
2016-09-29
Cataclysmic variable stars (novae, dwarf novae, and nova-likes) are close binary systems consisting of a white dwarf star (the primary) that is accreting matter from a low-mass companion star (the secondary). From time to time such systems undergo large-amplitude brightenings. The most spectacular eruptions, with a ten-thousandfold increase in brightness, occur in classical novae and are caused by a thermonuclear runaway on the surface of the white dwarf. Such eruptions are thought to recur on timescales of ten thousand to a million years. In between, the system's properties depend primarily on the mass-transfer rate: if it is lower than a billionth of a solar mass per year, the accretion becomes unstable and the matter is dumped onto the white dwarf during quasi-periodic dwarf nova outbursts. The hibernation hypothesis predicts that nova eruptions strongly affect the mass-transfer rate in the binary, keeping it high for centuries after the event. Subsequently, the mass-transfer rate should significantly decrease for a thousand to a million years, starting the hibernation phase. After that the nova awakes again, with accretion returning to the pre-eruption level and leading to a new nova explosion. The hibernation model predicts cyclical evolution of cataclysmic variables through phases of high and low mass-transfer. The theory gained some support from the discovery of ancient nova shells around the dwarf novae Z Camelopardalis and AT Cancri, but direct evidence for considerable mass-transfer changes prior to, during, and after nova eruptions has not hitherto been found. Here we report long-term observations of the classical nova V1213 Cen (Nova Centauri 2009) covering its pre- and post-eruption phases and precisely documenting its evolution. Within the six years before the explosion, the system revealed dwarf nova outbursts indicative of a low mass-transfer rate. The post-nova is two orders of magnitude brighter than the pre-nova at minimum light with no trace of dwarf nova behaviour, implying that the mass-transfer rate increased considerably as a result of the nova explosion.
Quantifying causal emergence shows that macro can beat micro.
Hoel, Erik P; Albantakis, Larissa; Tononi, Giulio
2013-12-03
Causal interactions within complex systems can be analyzed at multiple spatial and temporal scales. For example, the brain can be analyzed at the level of neurons, neuronal groups, and areas, over tens, hundreds, or thousands of milliseconds. It is widely assumed that, once a micro level is fixed, macro levels are fixed too, a relation called supervenience. It is also assumed that, although macro descriptions may be convenient, only the micro level is causally complete, because it includes every detail, thus leaving no room for causation at the macro level. However, this assumption can only be evaluated under a proper measure of causation. Here, we use a measure [effective information (EI)] that depends on both the effectiveness of a system's mechanisms and the size of its state space: EI is higher the more the mechanisms constrain the system's possible past and future states. By measuring EI at micro and macro levels in simple systems whose micro mechanisms are fixed, we show that for certain causal architectures EI can peak at a macro level in space and/or time. This happens when coarse-grained macro mechanisms are more effective (more deterministic and/or less degenerate) than the underlying micro mechanisms, to an extent that overcomes the smaller state space. Thus, although the macro level supervenes upon the micro, it can supersede it causally, leading to genuine causal emergence--the gain in EI when moving from a micro to a macro level of analysis.
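The effective information measure can be made concrete for a discrete system given as a transition probability matrix (TPM). The sketch below is a minimal reading of the EI definition (mutual information under a uniform, maximum-entropy intervention distribution); the example system is ours, not taken from the paper, and is built so that a noisy, degenerate micro level loses to its deterministic two-state coarse-graining.

```python
import numpy as np

def effective_information(tpm):
    # EI in bits: average KL divergence of each row p(next | current)
    # from the mean row, i.e. mutual information between a uniform
    # intervention on the current state and the resulting next state.
    tpm = np.asarray(tpm, dtype=float)
    ed = tpm.mean(axis=0)  # effect distribution under uniform interventions
    with np.errstate(divide="ignore", invalid="ignore"):
        kl = np.where(tpm > 0, tpm * np.log2(tpm / ed), 0.0)
    return kl.sum(axis=1).mean()

# Micro level: states 1-3 wander uniformly among themselves, state 4 is fixed.
micro = np.array([[1/3, 1/3, 1/3, 0.0],
                  [1/3, 1/3, 1/3, 0.0],
                  [1/3, 1/3, 1/3, 0.0],
                  [0.0, 0.0, 0.0, 1.0]])
# Macro level: group {1,2,3} -> A and {4} -> B; the macro TPM is deterministic.
macro = np.array([[1.0, 0.0],
                  [0.0, 1.0]])
print(effective_information(micro), effective_information(macro))  # ~0.81 vs 1.0
```

The macro EI (1 bit) exceeds the micro EI (about 0.81 bits): a toy instance of the causal emergence the paper quantifies.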
Ojima, T; Saito, E; Kanagawa, K; Sakata, K; Yanagawa, H
1997-04-01
The purpose of this study was to estimate the manpower required for home health care and nursing services for the aged. For prefectural health care and welfare planning for the aged, data such as the proportion of the aged who need help, service demand, and required frequency of services were obtained. Means and "mean +/- 2 x standard deviation" values were calculated to obtain the various parameters, giving figures that should be attainable with some effort. The results are as follows (middle-level estimation, with the low-to-high range in parentheses): requirements per 10,000 population are 1.9 (0.61-5.7) public health nurses, 2.6 (0.63-14) visiting nurses, 0.20 (0.084-0.42) dental hygienists, 0.35 (0.17-0.66) dietitians, and 0.25 (0.014-1.27) physical and occupational therapists. For the national total, requirements are 23 (7.3-67) thousand public health nurses, 31 (7.5-160) thousand visiting nurses, 2.4 (1.0-5.0) thousand dental hygienists, 3.9 (2.0-7.8) thousand dietitians, and 3.0 (0.17-15) thousand physical and occupational therapists. By population size, for example, municipalities with 10-30 thousand people require 4.2 (1.7-11) public health nurses, 5.3 (1.3-27) visiting nurses, 0.4 (0.2-0.8) dental hygienists, 0.5 (0.3-0.9) dietitians, and 0.5 (0.0-2.5) physical and occupational therapists. Comparison of present staffing with these estimates shows that the present number of public health personnel is close to the low-level estimation, while the present numbers for the other professions fall below it. Considering other services such as maternal and child health, municipalities with populations above 10 thousand should employ full-time dietitians and dental hygienists. For policy making in a municipality, the policies of other municipalities should be considered. Because it is based on means across municipalities, the results of this study should be useful for application by other municipalities.
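A quick arithmetic check of how the per-10,000 rates scale to national totals, assuming a population of roughly 125 million (our assumption for mid-1990s Japan); the small residual differences from the reported totals presumably reflect the population base actually used in the paper.

```python
# Scale per-10,000 staffing rates to a national total, assuming a
# population of ~125 million (our assumption, not stated in the abstract).
population = 125_000_000
rates_per_10k = {"public health nurses": 1.9, "visiting nurses": 2.6,
                 "dental hygienists": 0.20, "dietitians": 0.35,
                 "physical/occupational therapists": 0.25}
for job, rate in rates_per_10k.items():
    print(job, round(rate * population / 10_000 / 1_000, 1), "thousand")
# e.g. 1.9 per 10,000 x 125M ≈ 23.8 thousand, close to the reported 23 thousand.
```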
Enhanced Detection of Sea-Disposed Man-Made Objects in Backscatter Data
NASA Astrophysics Data System (ADS)
Edwards, M.; Davis, R. B.
2016-12-01
The Hawai'i Undersea Military Munitions Assessment (HUMMA) project developed software to increase data visualization capabilities applicable to seafloor reflectivity datasets acquired by a variety of bottom-mapping sonar systems. The purpose of these improvements is to detect different intensity values within an arbitrary amplitude range that may be associated with relative target reflectivity as well as extend the overall amplitude range across which detailed dynamic contrast may be effectively displayed. The backscatter dataset used to develop this software imaged tens of thousands of reflective targets resting on the seabed that were systematically sea disposed south of Oahu, Hawaii, around the end of World War II in waters ranging from 300-600 meters depth. Human-occupied and remotely operated vehicles conducted ground-truth video and photographic reconnaissance of thousands of these reflective targets, documenting and geo-referencing long curvilinear trails of items including munitions, paint cans, airplane parts, scuttled ships, cars and bundled anti-submarine nets. Edwards et al. [2012] determined that most individual trails consist of objects of one particular type. The software described in this presentation, in combination with the ground-truth images, was developed to help recognize different types of objects based on reflectivity, size, and shape from altitudes of tens of meters above the seabed. The fundamental goal of the software is to facilitate rapid underway detection and geo-location of specific sea-disposed objects so their impact on the environment can be assessed.
Schweiger, Regev; Fisher, Eyal; Rahmani, Elior; Shenhav, Liat; Rosset, Saharon; Halperin, Eran
2018-06-22
Estimation of heritability is an important task in genetics. The use of linear mixed models (LMMs) to determine narrow-sense single-nucleotide polymorphism (SNP)-heritability and related quantities has received much recent attention, due to its ability to account for variants with small effect sizes. Typically, heritability estimation under LMMs uses the restricted maximum likelihood (REML) approach. The common way to report the uncertainty in REML estimation uses standard errors (SEs), which rely on asymptotic properties. However, these assumptions are often violated because of the bounded parameter space, statistical dependencies, and limited sample size, leading to biased estimates and inflated or deflated confidence intervals (CIs). In addition, for larger data sets (e.g., tens of thousands of individuals), the construction of SEs itself may require considerable time, as it requires expensive matrix inversions and multiplications. Here, we present FIESTA (Fast confidence IntErvals using STochastic Approximation), a method for constructing accurate CIs. FIESTA is based on parametric bootstrap sampling and therefore avoids unjustified assumptions on the distribution of the heritability estimator. FIESTA uses stochastic approximation techniques, which accelerate the construction of CIs by several orders of magnitude compared with previous approaches, as well as with the analytical approximation used by SEs. FIESTA builds accurate CIs rapidly, for example, requiring only several seconds for data sets of tens of thousands of individuals, making FIESTA a very fast solution to the problem of building accurate CIs for heritability for all data set sizes.
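FIESTA itself combines parametric bootstrap sampling with stochastic approximation; the sketch below shows only the underlying parametric-bootstrap idea on a toy variance-components model with a known kinship spectrum. The grid-search estimator and all names are ours, not FIESTA's implementation.

```python
import numpy as np

rng = np.random.default_rng(1)

def fit_h2(y_rot, eigvals, grid=np.linspace(0.0, 0.99, 199)):
    # Grid-search MLE of h2 in y ~ N(0, h2*K + (1-h2)*I), with y already
    # rotated into K's eigenbasis so the covariance is diagonal.
    var = grid[:, None] * eigvals + (1.0 - grid[:, None])
    ll = -0.5 * (np.log(var) + y_rot ** 2 / var).sum(axis=1)
    return grid[np.argmax(ll)]

def sample_y(h2, eigvals):
    # Draw a phenotype vector (in the eigenbasis) from the fitted model.
    return rng.normal(size=eigvals.size) * np.sqrt(h2 * eigvals + 1.0 - h2)

n = 500
G = rng.normal(size=(n, 200))
eigvals = np.linalg.eigvalsh(G @ G.T / 200)   # toy kinship spectrum
y_rot = sample_y(0.5, eigvals)                # simulate at true h2 = 0.5
h2_hat = fit_h2(y_rot, eigvals)

# Parametric bootstrap: re-simulate at the estimate, re-fit, take percentiles.
boots = [fit_h2(sample_y(h2_hat, eigvals), eigvals) for _ in range(200)]
lo, hi = np.percentile(boots, [2.5, 97.5])
print(h2_hat, (lo, hi))
```

Because each bootstrap replicate respects the bounded parameter space (h2 in [0, 1)), the resulting interval avoids the pathologies of asymptotic SEs near the boundary, which is the motivation the abstract gives.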
Hollis, Geoff
2018-04-01
Best-worst scaling is a judgment format in which participants are presented with a set of items and have to choose the superior and inferior items in the set. Best-worst scaling generates a large quantity of information per judgment because each judgment allows for inferences about the rank value of all unjudged items. This property of best-worst scaling makes it a promising judgment format for research in psychology and natural language processing concerned with estimating the semantic properties of tens of thousands of words. A variety of different scoring algorithms have been devised in the previous literature on best-worst scaling. However, due to problems of computational efficiency, these scoring algorithms cannot be applied efficiently to cases in which thousands of items need to be scored. New algorithms are presented here for converting responses from best-worst scaling into item scores for thousands of items (many-item scoring problems). These scoring algorithms are validated through simulation and empirical experiments, and considerations related to noise, the underlying distribution of true values, and trial design are identified that can affect the relative quality of the derived item scores. The newly introduced scoring algorithms consistently outperformed scoring algorithms used in the previous literature on scoring many-item best-worst data.
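As a minimal illustration of best-worst data, here is the classic counting baseline (times chosen best minus times chosen worst, divided by times shown); the paper's many-item scoring algorithms are more sophisticated, and this sketch is only meant to show the shape of the input and output. Items and judgments are invented.

```python
from collections import defaultdict

def best_worst_scores(trials):
    # Simple count-based scoring: (times best - times worst) / times shown.
    # `trials` is a list of (items_shown, best_item, worst_item) tuples.
    best, worst, shown = defaultdict(int), defaultdict(int), defaultdict(int)
    for items, b, w in trials:
        for it in items:
            shown[it] += 1
        best[b] += 1
        worst[w] += 1
    return {it: (best[it] - worst[it]) / shown[it] for it in shown}

# e.g. judgments of word valence on four-item sets
trials = [(("calm", "panic", "chair", "joy"), "joy", "panic"),
          (("calm", "dread", "joy", "table"), "joy", "dread"),
          (("panic", "chair", "table", "calm"), "calm", "panic")]
print(best_worst_scores(trials))
```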
Zircon reveals protracted magma storage and recycling beneath Mount St. Helens
Claiborne, L.L.; Miller, C.F.; Flanagan, D.M.; Clynne, M.A.; Wooden, J.L.
2010-01-01
Current data and models for Mount St. Helens volcano (Washington, United States) suggest relatively rapid transport from magma genesis to eruption, with no evidence for protracted storage or recycling of magmas. However, we show here that complex zircon age populations extending back hundreds of thousands of years from eruption age indicate that magmas regularly stall in the crust, cool and crystallize beneath the volcano, and are then rejuvenated and incorporated by hotter, young magmas on their way to the surface. Estimated dissolution times suggest that entrained zircon generally resided in rejuvenating magmas for no more than about a century. Zircon elemental compositions reflect the increasing influence of mafic input into the system through time, recording growth from hotter, less evolved magmas tens of thousands of years prior to the appearance of mafic magmas at the surface, or changes in whole-rock geochemistry and petrology, and providing a new, time-correlated record of this evolution independent of the eruption history. Zircon data thus reveal the history of the hidden, long-lived intrusive portion of the Mount St. Helens system, where melt and crystals are stored for as long as hundreds of thousands of years and interact with fresh influxes of magmas that traverse the intrusive reservoir before erupting. © 2010 Geological Society of America.
Shrinkage covariance matrix approach based on robust trimmed mean in gene sets detection
NASA Astrophysics Data System (ADS)
Karjanto, Suryaefiza; Ramli, Norazan Mohamed; Ghani, Nor Azura Md; Aripin, Rasimah; Yusop, Noorezatty Mohd
2015-02-01
Microarray technology involves placing an orderly arrangement of thousands of gene sequences in a grid on a suitable surface. The technology has enabled novel discoveries since its development and has attracted increasing attention among researchers. The widespread adoption of microarray technology is largely due to its ability to perform simultaneous analysis of thousands of genes in a massively parallel manner in one experiment; hence, it provides valuable knowledge on gene interaction and function. A microarray data set typically consists of tens of thousands of genes (variables) from just dozens of samples, due to various constraints. Therefore, the sample covariance matrix in Hotelling's T2 statistic is not positive definite and becomes singular, so it cannot be inverted. In this research, the Hotelling's T2 statistic is combined with a shrinkage approach as an alternative estimation of the covariance matrix to detect significant gene sets. The use of a shrinkage covariance matrix overcomes the singularity problem by converting an unbiased estimator of the covariance matrix into an improved biased one. A robust trimmed mean is integrated into the shrinkage matrix to reduce the influence of outliers and consequently increase its efficiency. The performance of the proposed method is measured using several simulation designs. The results are expected to outperform existing techniques in many tested conditions.
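A minimal sketch of the ingredients described: a covariance matrix shrunk toward a diagonal target so it stays invertible when genes outnumber samples, and a Hotelling's T2 statistic built on trimmed means. The fixed shrinkage weight and the exact way the trimmed mean enters are our simplifications; the paper estimates these quantities differently.

```python
import numpy as np
from scipy.stats import trim_mean

def shrinkage_cov(X, lam=0.2):
    # Shrink the sample covariance toward its diagonal so the result is
    # positive definite even when genes (p) far outnumber samples (n).
    S = np.cov(X, rowvar=False)
    return lam * np.diag(np.diag(S)) + (1.0 - lam) * S

def hotelling_t2(X1, X2, trim=0.1, lam=0.2):
    # Robust trimmed means replace ordinary group means (our reading of
    # the abstract); lam is fixed here for simplicity.
    n1, n2 = len(X1), len(X2)
    d = trim_mean(X1, trim, axis=0) - trim_mean(X2, trim, axis=0)
    S = ((n1 - 1) * shrinkage_cov(X1, lam)
         + (n2 - 1) * shrinkage_cov(X2, lam)) / (n1 + n2 - 2)
    return (n1 * n2 / (n1 + n2)) * d @ np.linalg.solve(S, d)

rng = np.random.default_rng(0)
X1 = rng.normal(0.0, 1.0, (15, 40))   # 15 samples, 40 genes in the set
X2 = rng.normal(0.3, 1.0, (15, 40))   # shifted group: p > n, yet T2 is computable
print(hotelling_t2(X1, X2))
```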
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ding, Xuheng; Biesiada, Marek; Zhu, Zong-Hong, E-mail: dingxuheng@mail.bnu.edu.cn, E-mail: marek.biesiada@us.edu.pl, E-mail: zhuzh@bnu.edu.cn
With a sensitivity improving significantly over the advanced GW detectors, the Einstein Telescope (ET) will be able to observe hundreds of thousands of inspiralling double compact objects (DCOs) per year. By virtue of the gravitational lensing effect, intrinsically unobservable faint sources can be observed by ET due to magnification by intervening galaxies. We explore the possibility of observing such faint sources amplified by strong gravitational lensing. Following our previous work, we use the merger rates of DCOs (NS-NS, BH-NS, BH-BH systems) as calculated by Dominik et al. (2013). It turns out that tens to hundreds of such (lensed) extra events will be registered by ET. This will strongly broaden the ET's distance reach for signals from such coalescences to the redshift range z = 2-8. However, with respect to the full inspiral event catalog this magnification bias is at the level of 0.001 and should not affect cosmological inferences much.
Analysis of DNA-chip and antigen-chip data: studies of cancer, stem cells and autoimmune diseases
NASA Astrophysics Data System (ADS)
Domany, Eytan
2005-07-01
Biology has undergone a revolution during the past decade. Deciphering the human genome has opened new horizons, among which the advent of DNA microarrays has been perhaps the most significant. These miniature measuring devices report the levels at which tens of thousands of genes are expressed in a collection of cells of interest (such as tissue from a tumor). I describe here briefly this technology and present an example of how analysis of data obtained from such high throughput experiments provides insights of possible clinical and therapeutic relevance for Acute Lymphoblastic Leukemia. Next, I describe how gene expression data is used to deduce a new design principle, "Just In Case", used by stem cells. Finally, I briefly review a different novel technology, of antigen chips, which provide a fingerprint of a subject's immune system and may become a predictive clinical tool. The work reviewed here was done in collaboration with numerous colleagues and students.
Trade-off between Transcriptome Plasticity and Genome Evolution in Cephalopods.
Liscovitch-Brauer, Noa; Alon, Shahar; Porath, Hagit T; Elstein, Boaz; Unger, Ron; Ziv, Tamar; Admon, Arie; Levanon, Erez Y; Rosenthal, Joshua J C; Eisenberg, Eli
2017-04-06
RNA editing, a post-transcriptional process, allows the diversification of proteomes beyond the genomic blueprint; however, it is infrequently used among animals for this purpose. Recent reports suggesting increased levels of RNA editing in squids thus raise the question of the nature and effects of these events. We here show that RNA editing is particularly common in behaviorally sophisticated coleoid cephalopods, with tens of thousands of evolutionarily conserved sites. Editing is enriched in the nervous system, affecting molecules pertinent for excitability and neuronal morphology. The genomic sequence flanking editing sites is highly conserved, suggesting that the process confers a selective advantage. Due to the large number of sites, the surrounding conservation greatly reduces the number of mutations and genomic polymorphisms in protein-coding regions. This trade-off between genome evolution and transcriptome plasticity highlights the importance of RNA recoding as a strategy for diversifying proteins, particularly those associated with neural function. Copyright © 2017 Elsevier Inc. All rights reserved.
Towards mechanism-based simulation of impact damage using exascale computing
NASA Astrophysics Data System (ADS)
Shterenlikht, Anton; Margetts, Lee; McDonald, Samuel; Bourne, Neil K.
2017-01-01
Over the past 60 years, the finite element method has been very successful in modelling deformation in engineering structures. However, the method requires the definition of constitutive models that represent the response of the material to applied loads. There are two issues. Firstly, the models are often difficult to define. Secondly, there is often no physical connection between the models and the mechanisms that accommodate deformation. In this paper, we present a potentially disruptive two-level strategy which couples the finite element method at the macroscale with cellular automata at the mesoscale. The cellular automata are used to simulate mechanisms, such as crack propagation. The stress-strain relationship emerges as a continuum mechanics scale interpretation of changes at the micro- and meso-scales. Iterative two-way updating between the cellular automata and finite elements drives the simulation forward as the material undergoes progressive damage at high strain rates. The strategy is particularly attractive on large-scale computing platforms as both methods scale well on tens of thousands of CPUs.
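A toy illustration of the two-level coupling loop described (not the paper's models): a 1-D bar of elements, each carrying a small cellular automaton whose broken-cell fraction degrades the element and feeds back into the local stress. All constants are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)

# Macro level: 10 elements in series; micro level: 100 CA cells per element.
n_elem, n_cells, sigma_c = 10, 100, 1.0
broken = np.zeros((n_elem, n_cells), dtype=bool)

for step in range(50):
    damage = broken.mean(axis=1)          # per-element damage from the CA
    if damage.max() > 0.99:
        print("element failed at step", step)
        break
    # "FE" level: local stress rises as the load-bearing area shrinks.
    stress = 0.4 / (1.0 - damage)
    # "CA" level: cells break stochastically, helped by broken neighbours
    # (a crude stand-in for crack propagation).
    neighbours = np.roll(broken, 1, axis=1) | np.roll(broken, -1, axis=1)
    p = 0.01 * (stress / sigma_c)[:, None] * (1 + 4 * neighbours)
    broken |= rng.random(broken.shape) < p

print("final per-element damage:", np.round(broken.mean(axis=1), 2))
```

The two-way update (damage softens the element; the softened element raises local stress, accelerating the CA) is the structural point; a real implementation would solve an actual finite element problem at each step.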
Development of performance assessment methodology for nuclear waste isolation in geologic media
NASA Astrophysics Data System (ADS)
Bonano, E. J.; Chu, M. S. Y.; Cranwell, R. M.; Davis, P. A.
The burial of nuclear wastes in deep geologic formations as a means for their disposal is an issue of significant technical and social impact. The analysis of the processes involved can be performed only with reliable mathematical models and computer codes, as opposed to conducting experiments, because the associated time scales are on the order of tens of thousands of years. These analyses are concerned primarily with the migration of radioactive contaminants from the repository to the environment accessible to humans. Modeling of this phenomenon depends on a large number of other phenomena taking place in the geologic porous and/or fractured medium. These are ground-water flow, physicochemical interactions of the contaminants with the rock, heat transfer, and mass transport. Once the radionuclides have reached the accessible environment, the pathways to humans and health effects are estimated. A performance assessment methodology for a potential high-level waste repository emplaced in a basalt formation has been developed for the U.S. Nuclear Regulatory Commission.
Matches, Mismatches, and Methods: Multiple-View Workflows for Energy Portfolio Analysis.
Brehmer, Matthew; Ng, Jocelyn; Tate, Kevin; Munzner, Tamara
2016-01-01
The energy performance of large building portfolios is challenging to analyze and monitor, as current analysis tools are not scalable or they present derived and aggregated data at too coarse of a level. We conducted a visualization design study, beginning with a thorough work domain analysis and a characterization of data and task abstractions. We describe generalizable visual encoding design choices for time-oriented data framed in terms of matches and mismatches, as well as considerations for workflow design. Our designs address several research questions pertaining to scalability, view coordination, and the inappropriateness of line charts for derived and aggregated data due to a combination of data semantics and domain convention. We also present guidelines relating to familiarity and trust, as well as methodological considerations for visualization design studies. Our designs were adopted by our collaborators and incorporated into the design of an energy analysis software application that will be deployed to tens of thousands of energy workers in their client base.
Reliability of high-power QCW arrays
NASA Astrophysics Data System (ADS)
Feeler, Ryan; Junghans, Jeremy; Remley, Jennifer; Schnurbusch, Don; Stephens, Ed
2010-02-01
Northrop Grumman Cutting Edge Optronics has developed a family of arrays for high-power QCW operation. These arrays are built using CTE-matched heat sinks and hard solder in order to maximize the reliability of the devices. A summary of a recent life test is presented in order to quantify the reliability of QCW arrays and associated laser gain modules. A statistical analysis of the raw lifetime data is presented in a form that is useful for laser system designers. The life tests demonstrate the high level of reliability of these arrays in a number of operating regimes. For single-bar arrays, a MTTF of 19.8 billion shots is predicted. For four-bar samples, a MTTF of 14.6 billion shots is predicted. In addition, data representing a large pump source is analyzed and shown to have an expected lifetime of 13.5 billion shots. This corresponds to an expected operational lifetime of greater than ten thousand hours at repetition rates below 370 Hz.
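The quoted lifetime follows from simple arithmetic on the shot count and repetition rate:

```python
# Convert the quoted shot count into operating hours at the stated
# repetition rate (numbers taken directly from the abstract).
shots, rep_rate_hz = 13.5e9, 370
hours = shots / rep_rate_hz / 3600
print(round(hours))   # ≈ 10,135 h, i.e. "greater than ten thousand hours"
```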
Adaptive harvest management for the Svalbard population of Pink-Footed Geese: 2014 progress summary
Johnson, Fred A.; Madsen, J.
2015-01-01
During the summer of 2013 we computed an optimal harvest strategy for the 3-year period 2013 – 2015. The strategy suggested that the appropriate annual harvest quota is 15 thousand. The 1-year harvest strategy calculated to determine whether an emergency closure of the hunting season is required this year suggested an allowable harvest of 25.0 thousand; thus, a hunting-season closure is not warranted. If the harvest quota of 15 thousand were met in the coming hunting season, the next population count would be expected to be 71.0 thousand. If only the most recent 4-year mean harvest were realized (11.3 thousand), a population size of 74.8 thousand would be expected. Simulations suggest that it will take approximately seven years at current harvest levels to reduce population size to the goal of 60 thousand. However, it is possible that the extension of the forthcoming hunting season in Denmark could result in a total harvest approaching 15 thousand; in this case, simulations suggest it would only take about three years to reach the goal.
Search strategies to identify information on adverse effects: a systematic review
Golder, Su; Loke, Yoon
2009-01-01
Objectives: The review evaluated studies of electronic database search strategies designed to retrieve adverse effects data for systematic reviews. Methods: Studies of adverse effects were located in ten databases as well as by checking references, hand-searching, searching citations, and contacting experts. Two reviewers screened the retrieved records for potentially relevant papers. Results: Five thousand three hundred thirteen citations were retrieved, yielding 19 studies designed to develop or evaluate adverse effect filters, of which 3 met the inclusion criteria. All 3 studies identified highly sensitive search strategies capable of retrieving over 95% of relevant records. However, 1 study did not evaluate precision, while the level of precision in the other 2 studies ranged from 0.8% to 2.8%. Methodological issues in these papers included the relatively small number of records, absence of a validation set of records for testing, and limited evaluation of precision. Conclusions: The results indicate the difficulty of achieving highly sensitive searches for information on adverse effects with a reasonable level of precision. Researchers who intend to locate studies on adverse effects should allow for the amount of resources and time required to conduct a highly sensitive search. PMID:19404498
Carbon Monoxide in Mid-Troposphere over Indonesia Fires, October 2015
2015-10-30
Widespread forest fires across Indonesia have burned tens of thousands of acres over three months, causing severe pollution, loss of life, and billions of dollars in costs to the Indonesian government. It is estimated that more than 43 million people have been inhaling toxic fumes, and large parts of Indonesia have been placed in a state of emergency. Most of the fires are believed to have been set to clear farmland during the dry season, but a long-term drought enhanced by El Niño conditions has allowed the fires to burn unchecked for lack of rain. These images, made with data acquired by AIRS, the Atmospheric Infrared Sounder on NASA's Aqua satellite, show the global concentration of carbon monoxide at the 500 hPa pressure level, or approximately 18,000 feet (5,500 meters) altitude. The data are an average of measurements taken over three days, from October 14 through 16 and October 26 through 28, and the high concentration and large extent of the fires over Indonesia are quite apparent. While the scale for this image extends to 400 parts per billion, local values of carbon monoxide can be significantly higher. http://photojournal.jpl.nasa.gov/catalog/PIA20042
The 'Out of Africa' Hypothesis, Human Genetic Diversity, and Comparative Economic Development
Ashraf, Quamrul; Galor, Oded
2013-01-01
This research argues that deep-rooted factors, determined tens of thousands of years ago, had a significant effect on the course of economic development from the dawn of human civilization to the contemporary era. It advances and empirically establishes the hypothesis that, in the course of the exodus of Homo sapiens out of Africa, variation in migratory distance from the cradle of humankind to various settlements across the globe affected genetic diversity and has had a long-lasting effect on the pattern of comparative economic development that is not captured by geographical, institutional, and cultural factors. In particular, the level of genetic diversity within a society is found to have a hump-shaped effect on development outcomes in both the pre-colonial and the modern era, reflecting the trade-off between the beneficial and the detrimental effects of diversity on productivity. While the intermediate level of genetic diversity prevalent among Asian and European populations has been conducive for development, the high degree of diversity among African populations and the low degree of diversity among Native American populations have been a detrimental force in the development of these regions. PMID:25506083
NASA Astrophysics Data System (ADS)
Yang, HongJiang; Wang, Enliang; Dong, WenXiu; Gong, Maomao; Shen, Zhenjie; Tang, Yaguo; Shan, Xu; Chen, Xiangjun
2018-05-01
Ab initio molecular dynamics (MD) simulations using an atom-centered density matrix propagation method have been carried out to investigate the fragmentation of ground-state triply charged carbon dioxide, CO2^(3+) → C^+ + O_a^+ + O_b^+. Ten thousand trajectories were simulated. By analyzing the momentum correlation of the final fragments, it is demonstrated that sequential fragmentation dominates the three-body dissociation, consistent with our experimental observations performed by electron collision at an impact energy of 1500 eV. Furthermore, the MD simulations allow detailed insight into the ultrafast evolution of the molecular bond breakage at a very early stage, within several tens of femtoseconds, and the result shows that the initial nuclear vibrational mode plays a decisive role in switching the dissociation pathways.
NCBI GEO: mining tens of millions of expression profiles--database and tools update.
Barrett, Tanya; Troup, Dennis B; Wilhite, Stephen E; Ledoux, Pierre; Rudnev, Dmitry; Evangelista, Carlos; Kim, Irene F; Soboleva, Alexandra; Tomashevsky, Maxim; Edgar, Ron
2007-01-01
The Gene Expression Omnibus (GEO) repository at the National Center for Biotechnology Information (NCBI) archives and freely disseminates microarray and other forms of high-throughput data generated by the scientific community. The database has a minimum information about a microarray experiment (MIAME)-compliant infrastructure that captures fully annotated raw and processed data. Several data deposit options and formats are supported, including web forms, spreadsheets, XML and Simple Omnibus Format in Text (SOFT). In addition to data storage, a collection of user-friendly web-based interfaces and applications are available to help users effectively explore, visualize and download the thousands of experiments and tens of millions of gene expression patterns stored in GEO. This paper provides a summary of the GEO database structure and user facilities, and describes recent enhancements to database design, performance, submission format options, data query and retrieval utilities. GEO is accessible at http://www.ncbi.nlm.nih.gov/geo/
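For readers who want programmatic rather than web access, GEO records can also be queried through the standard NCBI E-utilities; the endpoint and parameters below are ordinary E-utilities usage rather than anything described in this paper, and the search term is an invented example.

```python
# Query the GEO DataSets database (db=gds) via NCBI E-utilities esearch.
import urllib.parse
import urllib.request

base = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"
params = {"db": "gds",
          "term": "breast cancer[title] AND Homo sapiens[organism]",
          "retmax": "5",
          "retmode": "json"}
with urllib.request.urlopen(base + "?" + urllib.parse.urlencode(params)) as r:
    print(r.read().decode()[:500])   # JSON listing matching GEO DataSet IDs
```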
Assessing, Modeling, and Monitoring the Impacts of Extreme Climate Events
NASA Astrophysics Data System (ADS)
Murnane, Richard J.; Diaz, Henry F.
2006-01-01
Extreme weather and climate events provide dramatic content for the news media, and the past few years have supplied plenty of material. The 2004 and 2005 Atlantic hurricane seasons were very active; the United States was struck repeatedly by landfalling major hurricanes. A five-year drought in the southwestern United States was punctuated in 2003 by wildfires in southern California that caused billions of dollars in losses. Ten cyclones of at least tropical storm strength struck Japan in 2004, easily breaking the 1990 and 1993 records of six cyclones each year. Hurricane Catarina was the first recorded hurricane in the South Atlantic. Europe's summer of 2003 saw record-breaking heat that caused tens of thousands of deaths. These events have all been widely publicized, and they naturally raise several questions: Is climate changing, and if so, why? What can we expect in the future? How can we better respond to climate variability regardless of its source?
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhao, Luning; Neuscamman, Eric
We present a modification to variational Monte Carlo's linear method optimization scheme that addresses a critical memory bottleneck while maintaining compatibility with both the traditional ground state variational principle and our recently introduced variational principle for excited states. For wave function ansatzes with tens of thousands of variables, our modification reduces the required memory per parallel process from tens of gigabytes to hundreds of megabytes, making the methodology a much better fit for modern supercomputer architectures in which data communication and per-process memory consumption are primary concerns. We verify the efficacy of the new optimization scheme in small molecule tests involving both the Hilbert space Jastrow antisymmetric geminal power ansatz and real space multi-Slater Jastrow expansions. Satisfied with its performance, we have added the optimizer to the QMCPACK software package, with which we demonstrate on a hydrogen ring a prototype approach for making systematically convergent, non-perturbative predictions of Mott insulators' optical band gaps.
A Molecular Dynamic Modeling of Hemoglobin-Hemoglobin Interactions
NASA Astrophysics Data System (ADS)
Wu, Tao; Yang, Ye; Sheldon Wang, X.; Cohen, Barry; Ge, Hongya
2010-05-01
In this paper, we present a study of hemoglobin-hemoglobin interactions with model reduction methods. We begin with a simple spring-mass system with given parameters (mass and stiffness). With this known system, we compare the mode superposition method with Singular Value Decomposition (SVD) based Principal Component Analysis (PCA). Through PCA we are able to recover the principal direction of this system, namely the modal direction, which is matched against the eigenvector derived from the mode superposition analysis. The same technique is then implemented in a much more complicated hemoglobin-hemoglobin molecule interaction model, in which thousands of atoms in hemoglobin molecules are coupled with tens of thousands of T3 water molecule models. In this model, complex inter-atomic and inter-molecular potentials are replaced by nonlinear springs. We employ the same method to obtain the most significant modes and their frequencies of this complex dynamical system. More complex physical phenomena can then be further studied with these coarse-grained models.
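A minimal sketch of the known-system test described: build a spring-mass chain, get its modes from the generalized eigenproblem, simulate a trajectory dominated by the lowest mode, and check that the first SVD/PCA principal direction recovers that mode. The chain size, parameters, and noise level are our choices, not the paper's.

```python
import numpy as np
from scipy.linalg import eigh

# Fixed-fixed spring-mass chain: n masses m, springs of stiffness k.
n, k, m = 5, 100.0, 1.0
K = k * (2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1))
M = m * np.eye(n)
w2, modes = eigh(K, M)                    # mode superposition: K v = w^2 M v

# Simulate free vibration dominated by the lowest mode, plus weak noise.
t = np.linspace(0, 10, 2000)
rng = np.random.default_rng(0)
X = (np.outer(np.cos(np.sqrt(w2[0]) * t), modes[:, 0])
     + 0.01 * rng.normal(size=(t.size, n)))

# PCA via SVD of the centered trajectory data.
U, s, Vt = np.linalg.svd(X - X.mean(axis=0), full_matrices=False)
pc1 = Vt[0]
alignment = abs(pc1 @ modes[:, 0]) / (np.linalg.norm(pc1) * np.linalg.norm(modes[:, 0]))
print(alignment)   # ≈ 1: the first principal component matches the lowest mode
```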
Droplet microfluidics--a tool for single-cell analysis.
Joensson, Haakan N; Andersson Svahn, Helene
2012-12-03
Droplet microfluidics allows the isolation of single cells and reagents in monodisperse picoliter liquid capsules and manipulations at a throughput of thousands of droplets per second. These qualities allow many of the challenges in single-cell analysis to be overcome. Monodispersity enables quantitative control of solute concentrations, while encapsulation in droplets provides an isolated compartment for the single cell and its immediate environment. The high throughput allows the processing and analysis of the tens of thousands to millions of cells that must be analyzed to accurately describe a heterogeneous cell population so as to find rare cell types or access sufficient biological space to find hits in a directed evolution experiment. The low volumes of the droplets make very large screens economically viable. This Review gives an overview of the current state of single-cell analysis involving droplet microfluidics and offers examples where droplet microfluidics can further biological understanding. Copyright © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
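Rough arithmetic implied by these throughput and volume figures (the rate and droplet size below are assumed, typical-order values, not numbers from the review):

```python
# At thousands of droplets per second, a million-cell screen takes minutes,
# and the total reagent volume stays tiny because droplets are picoliter-scale.
droplets_per_s, n_cells, droplet_pl = 2000, 1_000_000, 20   # assumed values
print(n_cells / droplets_per_s / 60, "minutes")             # ≈ 8.3 min
print(n_cells * droplet_pl / 1e6, "microliters total")      # 20 µL of droplets
```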
Edge systems in the deep ocean
NASA Astrophysics Data System (ADS)
Coon, Andrew; Earp, Samuel L.
2010-04-01
DARPA has initiated a program to explore persistent presence in the deep ocean. The deep ocean is difficult to access and presents a hostile environment. Persistent operations in the deep ocean will require new technology for energy, communications and autonomous operations. Several fundamental characteristics of the deep ocean shape any potential system architecture. The deep sea presents acoustic sensing opportunities that may provide significantly enhanced sensing footprints relative to sensors deployed at traditional depths. Communication limitations drive solutions towards autonomous operation of the platforms and automation of data collection and processing. Access to the seabed presents an opportunity for fixed infrastructure with no important limitations on size and weight. Difficult access and persistence impose requirements for long-life energy sources and potentially energy harvesting. The ocean is immense, so there is a need to scale the system footprint for presence over tens of thousands and perhaps hundreds of thousands of square nautical miles. This paper focuses on the aspect of distributed sensing, and the engineering of networks of sensors to cover the required footprint.
Sriramoju, Manoj Kumar; Chen, Yen; Lee, Yun-Tzai Cloud; Hsu, Shang-Te Danny
2018-05-04
More than one thousand knotted protein structures have been identified so far, but the functional roles of these knots remain elusive. It has been postulated that backbone entanglement may provide additional mechanostability. Here, we employed a bacterial proteasome, ClpXP, to mechanically unfold 5₂-knotted human ubiquitin C-terminal hydrolase (UCH) paralogs from their C-termini, followed by processive translocation into the proteolytic chamber for degradation. Our results revealed unprecedentedly slow kinetics of ClpXP-mediated proteolysis for the proteasome-associated UCHL5: ten thousand times slower than that of a green fluorescent protein (GFP), which has a comparable size to the UCH domain but much higher chemical and thermal stabilities. The ClpXP-dependent mechanostability positively correlates with the intrinsic unfolding rates of the substrates, spanning several orders of magnitude for the UCHs. The broad range of mechanostability within the same protein family may be associated with the functional requirements for their differential malleabilities.
DataWarrior: an open-source program for chemistry aware data visualization and analysis.
Sander, Thomas; Freyss, Joel; von Korff, Modest; Rufener, Christian
2015-02-23
Drug discovery projects in the pharmaceutical industry accumulate thousands of chemical structures and tens of thousands of data points from a dozen or more biological and pharmacological assays. A sufficient interpretation of the data requires understanding which molecular families are present, which structural motifs correlate with measured properties, and which tiny structural changes cause large property changes. Data visualization and analysis software with sufficient chemical intelligence to support chemists in this task is rare. In an attempt to contribute to filling the gap, we released our in-house developed chemistry-aware data analysis program DataWarrior for free public use. This paper gives an overview of DataWarrior's functionality and architecture. As an example, a new unsupervised, 2-dimensional scaling algorithm is presented, which employs vector-based or nonvector-based descriptors to visualize the chemical or pharmacophore space of even large data sets. DataWarrior uses this method to interactively explore chemical space, activity landscapes, and activity cliffs.
Upadhyay, Amit A.; Fleetwood, Aaron D.; Adebali, Ogun; ...
2016-04-06
Cellular receptors usually contain a designated sensory domain that recognizes the signal. Per/Arnt/Sim (PAS) domains are ubiquitous sensors in thousands of species ranging from bacteria to humans. Although PAS domains were described as intracellular sensors, recent structural studies revealed PAS-like domains in extracytoplasmic regions in several transmembrane receptors. However, these structurally defined extracellular PAS-like domains do not match sequence-derived PAS domain models, and thus their distribution across the genomic landscape remains largely unknown. Here we show that structurally defined extracellular PAS-like domains belong to the Cache superfamily, which is homologous to, but distinct from, the PAS superfamily. Our newly built computational models enabled identification of Cache domains in tens of thousands of signal transduction proteins, including those from important pathogens and model organisms. Moreover, we show that Cache domains comprise the dominant mode of extracellular sensing in prokaryotes.
Friend, M.; Franson, J.C.
1999-01-01
Individual disease outbreaks have killed many thousands of animals on numerous occasions. Tens of thousands of migratory birds have died in single die-offs with as many as 1,000 birds succumbing in 1 day. The ability to successfully combat such explosive situations is highly dependent on the readiness of field personnel to deal with them. Because many disease agents can spread through wildlife populations very quickly, advance preparation is essential for preventing infected animals from spreading disease to additional species and locations. Carefully thought-out disease contingency plans should be developed as practical working documents for field personnel and updated as necessary. Well-designed plans can prove invaluable in minimizing wildlife losses and the costs associated with disease control activities.Although requirements for disease control operations vary and must be tailored to each situation, all disease contingency planning involves general concepts and basic biological information. This chapter, which is intended to be a practical guide, identifies the major activities and needs of disease control operations, and relates them to disease contingency planning.
Unveiling adaptation using high-resolution lineage tracking
NASA Astrophysics Data System (ADS)
Blundell, Jamie; Levy, Sasha; Fisher, Daniel; Petrov, Dmitri; Sherlock, Gavin
2013-03-01
Human diseases such as cancer and microbial infections are adaptive processes inside the human body with enormous population sizes: between 10^6 and 10^12 cells. In spite of this, our understanding of adaptation in large populations is limited. The key problem is the difficulty in identifying anything more than a handful of rare, large-effect beneficial mutations. The development and use of molecular barcodes allows us to uniquely tag hundreds of thousands of cells and to track tens of thousands of adaptive mutations in large yeast populations. We use this system to test some of the key theories on which our understanding of adaptation in large populations is based. We (i) measure the fitness distribution in an evolving population at different times, (ii) identify when an appreciable fraction of clones in the population have at most a single adaptive mutation and isolate a large number of clones with independent single adaptive mutations, and (iii) use this clone collection to determine the distribution of fitness effects of single beneficial mutations.
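A hedged sketch of the kind of estimate such barcode data enables (not the authors' pipeline): under a simple deterministic model, a lineage with selection coefficient s changes in log-frequency relative to a neutral reference at rate s, so s can be read off a regression slope.

```python
# Illustrative only: f_i(t) ∝ exp(s * t) relative to a neutral reference,
# so s is the slope of log(f_i / f_ref) against time in generations.
import numpy as np

def fitness_from_barcodes(counts, ref_counts, generations):
    """Estimate the selection coefficient s (per generation) from counts."""
    log_ratio = np.log(counts / ref_counts)
    s, _intercept = np.polyfit(generations, log_ratio, 1)
    return s

gens = np.array([0.0, 8.0, 16.0, 24.0])
clone = np.array([100.0, 180.0, 330.0, 600.0])    # mock barcode counts
neutral = np.array([100.0, 100.0, 100.0, 100.0])  # mock neutral reference
print(fitness_from_barcodes(clone, neutral, gens))  # ≈ 0.075 per generation
```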
Huang, Rao; Lo, Li-Ta; Wen, Yuhua; Voter, Arthur F; Perez, Danny
2017-10-21
Modern molecular-dynamics-based techniques are extremely powerful for investigating the dynamical evolution of materials. With the increase in sophistication of the simulation techniques and the ubiquity of massively parallel computing platforms, atomistic simulations now generate very large amounts of data, which have to be carefully analyzed in order to reveal key features of the underlying trajectories, including the nature and characteristics of the relevant reaction pathways. We show that clustering algorithms, such as the Perron Cluster Cluster Analysis, can provide reduced representations that greatly facilitate the interpretation of complex trajectories. To illustrate this point, clustering tools are used to identify the key kinetic steps in complex accelerated molecular dynamics trajectories exhibiting shape fluctuations in Pt nanoclusters. This analysis provides an easily interpretable coarse representation of the reaction pathways in terms of a handful of clusters, in contrast to the raw trajectory that contains thousands of unique states and tens of thousands of transitions.
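A minimal sketch of the spectral idea behind Perron Cluster Cluster Analysis (illustrative only; production PCCA+ implementations are considerably more careful): metastable groups of states appear as eigenvalues of the transition matrix close to 1, and for two basins the sign structure of the second eigenvector separates them.

```python
# Toy PCCA-style grouping of a discrete-state trajectory into two
# metastable sets, using only NumPy.
import numpy as np

def transition_matrix(traj, n_states):
    """Row-normalized transition-count matrix from a state sequence."""
    c = np.zeros((n_states, n_states))
    for a, b in zip(traj[:-1], traj[1:]):
        c[a, b] += 1.0
    return c / c.sum(axis=1, keepdims=True)

def two_metastable_sets(t):
    """Split states by the sign structure of the eigenvector belonging to
    the second-largest eigenvalue of T (the slowest relaxation mode)."""
    vals, vecs = np.linalg.eig(t)
    order = np.argsort(vals.real)[::-1]
    u2 = vecs[:, order[1]].real
    return u2 >= 0.0                      # boolean basin membership per state

# Mock trajectory hopping mostly within {0,1} and within {2,3}
rng = np.random.default_rng(1)
traj, state = [], 0
for _ in range(5000):
    if rng.random() < 0.02:               # rare switch between basins
        state = 2 if state in (0, 1) else 0
    elif rng.random() < 0.5:              # frequent moves inside a basin
        state = {0: 1, 1: 0, 2: 3, 3: 2}[state]
    traj.append(state)

t = transition_matrix(np.array(traj), 4)
print(two_metastable_sets(t))             # e.g. [ True  True False False]
```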
2D photoacoustic scanning imaging with a single pulsed laser diode excitation
NASA Astrophysics Data System (ADS)
Chen, Xuegang; Li, Changwei; Zeng, Lvming; Liu, Guodong; Huang, Zhen; Ren, Zhong
2012-03-01
A portable near-infrared photoacoustic scanning imaging system has been developed around a single pulsed laser diode, which is integrated with an optical lens system to directly boost the laser energy density for photoacoustic generation. The 905 nm laser diode provides a maximum pulse energy of 14 μJ within a 100 ns pulse duration, at a pulse repetition frequency of 0.8 kHz. As a possible alternative light source, the preliminary 2D photoacoustic results correspond well with test phantoms of a raised extravasated blood pool and a knotted blood vessel network. The photoacoustic SNR reaches 20.6 ± 1.2 dB while the number of averaged signals is reduced from thousands or tens of thousands to 128 pulses, the signal acquisition time drops below 0.2 s per A-scan, and the total radiation source occupies only 10 × 3 × 3 cm³. These results suggest that the pulsed semiconductor laser could be a candidate source for photoacoustic equipment in daily clinical applications.
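The averaging tradeoff quoted above follows from standard noise statistics (a back-of-envelope check, not the paper's calculation): coherent averaging of N pulses improves amplitude SNR by √N, i.e. 10·log10(N) dB, so dropping from thousands of averages to 128 trades roughly 9-19 dB of SNR for a proportional gain in acquisition speed.

```python
# Signal grows ~N under coherent averaging while uncorrelated noise grows
# ~sqrt(N), so amplitude SNR improves by sqrt(N) = 10*log10(N) in dB.
import math

for n in (128, 1000, 10000):
    gain_db = 10.0 * math.log10(n)
    print(f"averaging {n:>5d} pulses -> SNR gain ≈ {gain_db:.1f} dB")
# averaging   128 pulses -> SNR gain ≈ 21.1 dB
# averaging  1000 pulses -> SNR gain ≈ 30.0 dB
# averaging 10000 pulses -> SNR gain ≈ 40.0 dB
```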
Ultrapulse welding: A new joining technique. [for automotive industry
NASA Technical Reports Server (NTRS)
Anderson, D. G.
1972-01-01
The ultrapulse process is a resistance welding process that utilizes unidirectional current of high magnitude for a very short time with a precisely controlled dynamic force pulse. Peak currents of up to 220,000 amperes for two to ten milliseconds are used with synchronized force pulses of up to nine thousand pounds. The welding current passing through the relatively high resistance of the interface between the parts being joined results in highly localized heating. The ultrapulse welding (UPW) process is described as it applies to the automotive industry.
Promising application of dynamic nuclear polarization for in vivo (13)C MR imaging.
Yen, Yi-Fen; Nagasawa, Kiyoshi; Nakada, Tsutomu
2011-01-01
Use of hyperpolarized (13)C in magnetic resonance (MR) imaging is a new technique that enhances the signal tens of thousands-fold. Recent in vivo animal studies of metabolic imaging using hyperpolarized (13)C demonstrated its potential in many applications for disease indication, metabolic profiling, and treatment monitoring. We review the basic physics of dynamic nuclear polarization (DNP) and in vivo studies reported in prostate cancer research, hepatocellular carcinoma research, diabetes and cardiac applications, brain metabolism, and treatment response, as well as investigations of various DNP (13)C substrates.
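As background to the quoted enhancement factor, the standard spin physics is worth stating (a textbook relation, not specific to this review): thermal polarization of a spin-1/2 ensemble is tiny at clinical field strengths, DNP transfers the much larger electron polarization to the nuclei, and reading out at body temperature after polarizing near 1 K pushes the overall gain into the tens of thousands.

```latex
% Thermal spin-1/2 polarization and the nominal electron-to-13C DNP gain.
% The tens-of-thousands-fold in vivo figure arises because polarization is
% built up near ~1 K but observed near ~310 K.
\[
  P_{\text{thermal}} = \tanh\!\left(\frac{\hbar\,\gamma B_0}{2 k_B T}\right)
  \approx \frac{\hbar\,\gamma B_0}{2 k_B T},
  \qquad
  \varepsilon \lesssim \frac{\gamma_e}{\gamma_{^{13}\mathrm{C}}}
  \approx 2.6\times 10^{3}.
\]
```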
Using VizieR/Aladin to Measure Neglected Double Stars
NASA Astrophysics Data System (ADS)
Harshaw, Richard
2013-04-01
The VizieR service of the Centre de Données Astronomiques de Strasbourg (France) offers amateur astronomers a treasure trove of resources, including access to the most current version of the Washington Double Star Catalog (WDS) and links to tens of thousands of digitized sky survey plates via the Aladin Java applet. These plates allow the amateur to make accurate measurements of position angle and separation for many neglected pairs that fall within reasonable tolerances for the use of Aladin. This paper presents 428 measurements of 251 neglected pairs from the WDS.
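For readers wanting to reproduce such measurements, the underlying arithmetic is standard double-star astrometry (an illustrative sketch, not the author's software): separation ρ is the angular distance between the components, and position angle θ is measured from north through east, with the RA offset foreshortened by cos(Dec).

```python
# Position angle and separation of a companion relative to the primary,
# from small RA/Dec offsets in arcseconds.
import math

def rho_theta(d_ra_arcsec, d_dec_arcsec, dec_deg):
    """Return (separation in arcsec, position angle in deg E of N)."""
    dx = d_ra_arcsec * math.cos(math.radians(dec_deg))  # true angular RA offset
    rho = math.hypot(dx, d_dec_arcsec)
    theta = math.degrees(math.atan2(dx, d_dec_arcsec)) % 360.0
    return rho, theta

print(rho_theta(3.0, 4.0, 45.0))  # companion roughly NE of the primary
```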
Owen, Jesse; Imel, Zac E
2016-04-01
This article introduces the special section on utilizing large data sets to explore psychotherapy processes and outcomes. The increased use of technology has provided new opportunities for psychotherapy researchers. In particular, there is a rise in large databases of tens of thousands of clients. Additionally, there are new ways to pool valuable resources for meta-analytic processes. At the same time, these tools also come with limitations. These issues are introduced, as well as a brief overview of the articles. (c) 2016 APA, all rights reserved.
Holographic Characterization of Colloidal Fractal Aggregates
NASA Astrophysics Data System (ADS)
Wang, Chen; Cheong, Fook Chiong; Ruffner, David B.; Zhong, Xiao; Ward, Michael D.; Grier, David G.
In-line holographic microscopy images of micrometer-scale fractal aggregates can be interpreted with the Lorenz-Mie theory of light scattering and an effective-sphere model to obtain each aggregate's size and the population-averaged fractal dimension. We demonstrate this technique experimentally using model fractal clusters of polystyrene nanoparticles and fractal protein aggregates composed of bovine serum albumin and bovine pancreas insulin. This technique can characterize several thousand aggregates in ten minutes and naturally distinguishes aggregates from contaminants such as silicone oil droplets. Work supported by the SBIR program of the NSF.
1994-03-01
...in reality, the structure of even one individual aircraft consists of many batches, and the tens of thousands of cars of one type manufactured in even... Neural-network-generated power spectral densities of surface pressures are used to augment existing data and then load an elastic finite element... investigated for possible use in augmenting this information, which is required for fatigue life calculations, since empennage environments on fighter...
2006-11-01
...a small problem domain can require millions of solution variables solved repeatedly for tens of thousands of time steps. Finally, the... fields are expressed in terms of vector and scalar potentials, A and ψ respectively: E = -(∂A/∂t + ∇ψ) = E_rot + E_irr. Since the curl of a gradient is always zero, ∇ψ...
The Genetics of Canine Skull Shape Variation
Schoenebeck, Jeffrey J.; Ostrander, Elaine A.
2013-01-01
A dog’s craniofacial diversity is the result of continual human intervention in natural selection, a process that began tens of thousands of years ago. To date, we know little of the genetic underpinnings and developmental mechanisms that make dog skulls so morphologically plastic. In this Perspectives, we discuss the origins of dog skull shapes in terms of history and biology and highlight recent advances in understanding the genetics of canine skull shapes. Of particular interest are those molecular genetic changes that are associated with the development of distinct breeds. PMID:23396475
Twist-induced tuning in tapered fiber couplers.
Birks, T A
1989-10-01
The power-splitting ratio of fused tapered single-mode fiber couplers can be reversibly tuned by axial twisting without affecting loss. The twist-tuning behavior of a range of different tapered couplers is described. A simple expression for twist-tuning can be derived by representing the effects of twist by a change in the refractive index profile. Good agreement between this expression and experimental results is demonstrated. Repeated tuning over tens of thousands of cycles is found not to degrade coupler performance, and a number of practical applications, including a freely tunable tapered coupler, are described.
Bensaddek, Dalila; Narayan, Vikram; Nicolas, Armel; Murillo, Alejandro Brenes; Gartner, Anton; Kenyon, Cynthia J; Lamond, Angus I
2016-02-01
Proteomics studies typically analyze proteins at a population level, using extracts prepared from tens of thousands to millions of cells. The resulting measurements correspond to average values across the cell population and can mask considerable variation in protein expression and function between individual cells or organisms. Here, we report the development of micro-proteomics for the analysis of Caenorhabditis elegans, a eukaryote composed of 959 somatic cells and ∼1500 germ cells, measuring the worm proteome at a single organism level to a depth of ∼3000 proteins. This includes detection of proteins across a wide dynamic range of expression levels (>6 orders of magnitude), including many chromatin-associated factors involved in chromosome structure and gene regulation. We apply the micro-proteomics workflow to measure the global proteome response to heat-shock in individual nematodes. This shows variation between individual animals in the magnitude of proteome response following heat-shock, including variable induction of heat-shock proteins. The micro-proteomics pipeline thus facilitates the investigation of stochastic variation in protein expression between individuals within an isogenic population of C. elegans. All data described in this study are available online via the Encyclopedia of Proteome Dynamics (http://www.peptracker.com/epd), an open access, searchable database resource. © 2015 The Authors. PROTEOMICS Published by Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
Noisy Icebergs: Low Frequency Acoustic Noise Levels Observed off Palmyra Atoll
NASA Astrophysics Data System (ADS)
Matsumoto, H.; Wiggins, S. M.; Sirovic, A.; Tournadre, J.; Oleson, E.; Haxel, J. H.; Dziak, R. P.
2016-12-01
Annually, tens of thousands of icebergs from Antarctica drift into the open ocean. In late 2007, two unusually large icebergs, B15a and C19a, entered the Pacific region of the Southern Ocean and began rapidly disintegrating. Approximately 1.5 years later, in April 2009, both icebergs had completely fragmented. An unappreciated aspect of the destructive processes that occur while these large icebergs break apart is the high acoustic source levels that are generated and the contribution of those signals to the ocean soundscape throughout the southern hemisphere. Matsumoto et al. (2014) found evidence of B15a and C19a affecting low-frequency noise levels below 36 Hz at 8°N, 110°W in the eastern equatorial Pacific at a range of 7,500 km. Similar evidence for disintegrating icebergs affecting soundscapes at a similar range was observed in 2007-2009 High-frequency Acoustic Recording Package data collected by Scripps Institution of Oceanography near Palmyra Atoll in the central equatorial Pacific. Noise levels rose in 2007 as the icebergs entered the Pacific and decreased as the destructive processes declined and the icebergs disintegrated in 2009. This suggests that iceberg sounds are a significant natural noise source in the global ocean, and the area affected by the destructive processes during their decomposition can be as large as the entire southern hemisphere.
NASA Technical Reports Server (NTRS)
Mena-Werth, Jose
1998-01-01
The Vulcan Photometric Planet Search is the ground-based counterpart of the Kepler Mission Proposal. The Kepler Proposal calls for the launch of a telescope to look intently at a small patch of sky for four years. The mission is designed to look for extra-solar planets that transit sun-like stars. The Kepler Mission should be able to detect Earth-size planets. This goal requires an instrument and software capable of detecting photometric changes of several parts per hundred thousand in the flux of a star. The goal also requires the continuous monitoring of about a hundred thousand stars. The Kepler Mission is a NASA Discovery Class proposal similar in cost to the Lunar Prospector. The Vulcan Search is also a NASA project, but based at Lick Observatory. A small wide-field telescope monitors various star fields successively during the year. Dozens of images, each containing tens of thousands of stars, are taken any night that weather permits. The images are then monitored for photometric changes of the order of one part in a thousand. Such changes would reveal the transit of an inner-orbit Jupiter-size planet similar to those discovered recently in spectroscopic searches. To achieve photometric precision of one part in a thousand, even the choice of filter used in taking an exposure can be critical. The ultimate purpose of a filter is to increase the signal-to-noise ratio (S/N) of one's observation. Ideally, filters reduce the sky glow caused by street lights and thereby make the star images more distinct. The higher the S/N, the higher the chance of observing a transit signal that indicates the presence of a new planet. It is, therefore, important to select the filter that maximizes the S/N.
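The filter argument can be made concrete with a toy photon-statistics model (illustrative assumptions and numbers, not the Vulcan pipeline): for a faint star on a bright sky, cutting sky glow raises S/N even at some cost in starlight.

```python
# For an aperture of n_pix pixels, photon statistics give
# S/N = S / sqrt(S + n_pix * B), with S star counts and B sky counts/pixel.
import math

def snr(star_counts, sky_per_pixel, n_pix):
    return star_counts / math.sqrt(star_counts + n_pix * sky_per_pixel)

# Hypothetical filter that keeps 80% of starlight but only 20% of sky glow:
print(snr(1.0e4, 5000.0, 100))   # no filter  -> S/N ≈ 14.0
print(snr(0.8e4, 1000.0, 100))   # with filter -> S/N ≈ 24.3
```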
Wilber 3: A Python-Django Web Application For Acquiring Large-scale Event-oriented Seismic Data
NASA Astrophysics Data System (ADS)
Newman, R. L.; Clark, A.; Trabant, C. M.; Karstens, R.; Hutko, A. R.; Casey, R. E.; Ahern, T. K.
2013-12-01
Since 2001, the IRIS Data Management Center (DMC) WILBER II system has provided a convenient web-based interface for locating seismic data related to a particular event, and requesting a subset of that data for download. Since its launch, both the scale of available data and the technology of web-based applications have developed significantly. Wilber 3 is a ground-up redesign that leverages a number of public and open-source projects to provide an event-oriented data request interface with a high level of interactivity and scalability for multiple data types. Wilber 3 uses the IRIS/Federation of Digital Seismic Networks (FDSN) web services for event data, metadata, and time-series data. Combining a carefully optimized Google Map with the highly scalable SlickGrid data API, the Wilber 3 client-side interface can load tens of thousands of events or networks/stations in a single request, and provide instantly responsive browsing, sorting, and filtering of event and meta data in the web browser, without further reliance on the data service. The server-side of Wilber 3 is a Python-Django application, one of over a dozen developed in the last year at IRIS, whose common framework, components, and administrative overhead represent a massive savings in developer resources. Requests for assembled datasets, which may include thousands of data channels and gigabytes of data, are queued and executed using the Celery distributed Python task scheduler, giving Wilber 3 the ability to operate in parallel across a large number of nodes.
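The queueing pattern described above can be sketched as follows; this is a hypothetical illustration, not IRIS code, and the broker URL, task name, and helper functions (fetch_waveform, package_results) are all assumptions:

```python
# Minimal Celery sketch: the web view enqueues the request and returns
# immediately; workers on any node assemble the dataset asynchronously.
from celery import Celery

app = Celery("wilber3", broker="redis://localhost:6379/0")  # assumed broker

def fetch_waveform(event_id, channel):
    """Stub standing in for a real data-service call."""
    pass

def package_results(event_id):
    """Stub standing in for archive assembly."""
    return f"{event_id}.zip"

@app.task(bind=True, max_retries=3)
def assemble_dataset(self, event_id, channels):
    """Fetch and package time-series data for one event request."""
    try:
        for channel in channels:               # potentially thousands of channels
            fetch_waveform(event_id, channel)
        return package_results(event_id)
    except IOError as exc:                     # retry transient failures
        raise self.retry(exc=exc, countdown=60)

# From the Django view: assemble_dataset.delay("evt123", ["IU.ANMO.00.BHZ"])
```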
NASA Astrophysics Data System (ADS)
Lai, H.; Russell, C. T.; Wei, H.; Delzanno, G. L.; Connors, M. G.
2014-12-01
Near-Earth objects (NEOs) of tens of meters in diameter are difficult to detect by optical methods from the Earth, yet they produce the most impact damage per year. Many of these bodies are produced in non-destructive collisions with larger, well-characterized NEOs. After generation, the debris spreads forward and backward in a cocoon around the orbit of the parent body. Thereafter, scattering will occur due to gravitational perturbations when the debris stream passes near a planet, even when the parent body has no such close approaches. Therefore "safe" NEOs, which have no close encounters with the Earth for thousands of years, may be accompanied by potentially hazardous co-orbiting debris. We have developed a technique to identify co-orbiting debris by detecting the magnetic signature produced when some of the debris suffers destructive collisions with meteoroids, which are numerous and can be as small as tens of centimeters in diameter. Clouds of nanoscale dust/gas particles released in such collisions can interact coherently with the solar wind electromagnetically. The resultant magnetic perturbations are readily identified when they pass spacecraft equipped with magnetometers. We can use such observations to obtain the spatial and size distribution as well as the temporal variation of the debris streams. A test of this technique has been performed, and debris streams both leading and trailing asteroid 138175 have been identified. There is a finite spread across the original orbit, and most of the co-orbitals were tens of meters in diameter before the disruptive collisions. We estimate that there were tens of thousands of such co-orbiting objects, comprising only 1% of the original mass of the parent asteroid but greatly increasing the impact hazard. A loss of the co-orbitals since the 1970s has been inferred from observations, with a decay time consistent with that calculated from the existing collisional model [Grün et al., 1985]. Therefore disruptive collisions are the main loss mechanism of the co-orbiting debris associated with 138175. In summary, our technique helps us to identify which NEOs are accompanied by hazardous debris trails. Although our technique provides only statistical properties, it indicates where high-resolution optical surveys should be obtained in order to identify and track specific hazardous bodies.
[Economic burden of cancer in China during 1996-2014: a systematic review].
Shi, J F; Shi, C L; Yue, X P; Huang, H Y; Wang, L; Li, J; Lou, P A; Mao, A Y; Dai, M
2016-12-23
Objective: To explore the current status of research on the economic burden of cancer in China from 1996 to 2014. Methods: Key words including cancer, economic burden, expenditure, and cost were used to retrieve literature published during 1996-2014 in CNKI and Wanfang (the two most commonly used databases for literature in Chinese) and PubMed. A total of 91 studies were included after several exclusionary procedures. Information on subjects and data source, methodology, and main results was structurally abstracted. All expenditure data were discounted to year-2013 values using China's health care consumer price indices. Results: More than half of the included studies were published over the past 5 years; 32 of the studies were about lung cancer. Among the 83 individual-based surveys, 77 were hospital-based and obtained data via individual medical record abstraction, and most of these considered only direct medical expenditure. Expenditure per cancer patient and expenditure per diem were the most commonly used outcome indicators. The majority of the findings on expenditure per cancer patient ranged from 10,000 to 30,000 Chinese yuan (CNY), with larger disparity in lung and breast cancer (10,000 to 90,000 CNY), a narrower range in esophageal and stomach cancer (10,000 to 50,000 CNY), and the most stable trend in cervical cancer (almost all values below 20,000 CNY). Without exception, the expenditures per diem for all the common cancers increased over the period from 1996 to 2014 (a 3-7 fold increase). Only 8 population-level economic burden studies were included, and the reported expenditure on cancer at the national level ranged from 32.6 billion to 100.7 billion CNY. Conclusions: Evidence on the economic burden of cancer in China from 1996 to 2014 is limited and weakly comparable, particularly at the population level, and the reported expenditure per patient may be underestimated.
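The discounting step is simple ratio scaling by the consumer price index; a minimal sketch with made-up CPI values (the study used China's health-care CPI series, not these numbers):

```python
# Convert an expenditure observed in year Y into 2013 CNY by CPI ratio.
def to_2013_cny(amount, cpi_year, cpi_2013):
    return amount * cpi_2013 / cpi_year

print(to_2013_cny(20_000, cpi_year=80.0, cpi_2013=100.0))  # 25000.0 CNY
```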
Tian, Yue Yue; Zhang, Li Xia; Zhang, Zheng Qun; Qiao, Ming Ming; Fan, Yan Gen
2017-03-18
To identify a suitable shading model for 'Huangjinya' tea plants in Shandong Province, black or blue shading nets at 55%, 70%, or 85% shading rates were used to cover the tea garden in summer and autumn, and the micro-climate of the tea garden, leaf color, chlorophyll fluorescence parameters, growth status, and biochemical composition of tea shoots were investigated. The results showed that, compared with the control, the light intensity and air temperature in the tea garden and the leaf temperature of the tea plants decreased significantly in the shading treatments, while air humidity in the tea garden increased. The chlorophyll content of the tea leaves increased markedly with increasing shading rate, turning the leaf color green. The yellowing characteristics and biochemical quality of 'Huangjinya' tea plants were well maintained in the 55% shading treatments. In the 70% shading treatments, 'Huangjinya' tea plants grew better and gave higher yield, with no photo-inhibition. Compared with the blue shading treatments, the black shading treatments clearly promoted the growth of 'Huangjinya' tea plants, maintained the yellowing characteristics, and improved quality. Therefore, the 70% black shading treatment (daily PAR values of 12,000-35,000 lx) was appropriate for promoting the growth of 'Huangjinya' tea plants at the seedling stage. For mature tea plants, the 55% black shading treatment (daily PAR values of 18,000-55,000 lx) could be used to maintain the yellowing characteristics and to improve biochemical quality effectively, so as to give full play to the variety's characteristics and achieve the goal of high quality and high yield.
Slone, Daniel H.; Reid, James P.; Kenworthy, W. Judson
2013-01-01
Turbid water conditions make the delineation and characterization of benthic habitats difficult by traditional in situ and remote sensing methods. Here, we develop and validate modeling and sampling methodology for detecting and characterizing seagrass beds by analyzing GPS telemetry records from radio-tagged manatees. Between October 2002 and October 2005, 14 manatees were tracked in the Ten Thousand Islands (TTI) in southwest Florida (USA) using Global Positioning System (GPS) tags. High density manatee use areas were found to occur off each island facing the open, nearshore waters of the Gulf of Mexico. We implemented a spatially stratified random sampling plan and used a camera-based sampling technique to observe and record bottom observations of seagrass and macroalgae presence and abundance. Five species of seagrass were identified in our study area: Halodule wrightii, Thalassia testudinum, Syringodium filiforme, Halophila engelmannii, and Halophila decipiens. A Bayesian model was developed to choose and parameterize a spatial process function that would describe the observed patterns of seagrass and macroalgae. The seagrasses were found in depths <2 m and in the higher manatee use strata, whereas macroalgae was found at moderate densities at all sampled depths and manatee use strata. The manatee spatial data showed a strong association with seagrass beds, a relationship that increased seagrass sampling efficiency. Our camera-based field sampling proved to be effective for assessing seagrass density and spatial coverage under turbid water conditions, and would be an effective monitoring tool to detect changes in seagrass beds.
Grid site availability evaluation and monitoring at CMS
Lyons, Gaston; Maciulaitis, Rokas; Bagliesi, Giuseppe; ...
2017-10-01
The Compact Muon Solenoid (CMS) experiment at the Large Hadron Collider (LHC) uses distributed grid computing to store, process, and analyse the vast quantity of scientific data recorded every year. The computing resources are grouped into sites and organized in a tiered structure. Each site provides computing and storage to the CMS computing grid. Over a hundred sites worldwide contribute resources ranging from a hundred to well over ten thousand computing cores, and storage from tens of TBytes to tens of PBytes. In such a large computing setup, scheduled and unscheduled outages occur continually and are not allowed to significantly impact data handling, processing, and analysis. Unscheduled capacity and performance reductions need to be detected promptly and corrected. CMS developed a sophisticated site evaluation and monitoring system for Run 1 of the LHC based on tools of the Worldwide LHC Computing Grid. For Run 2 of the LHC, the site evaluation and monitoring system is being overhauled to enable faster detection of and reaction to failures and a more dynamic handling of computing resources. Furthermore, enhancements to better distinguish site from central service issues and to make evaluations more transparent and informative to site support staff are planned.
BlazeDEM3D-GPU A Large Scale DEM simulation code for GPUs
NASA Astrophysics Data System (ADS)
Govender, Nicolin; Wilke, Daniel; Pizette, Patrick; Khinast, Johannes
2017-06-01
Accurately predicting the dynamics of particulate materials is of importance to numerous scientific and industrial areas, with applications ranging across particle scales from powder flow to ore crushing. Computational discrete element simulations are a viable option to aid in the understanding of particulate dynamics and the design of devices such as mixers, silos, and ball mills, as laboratory-scale tests come at a significant cost. However, the computational time required for an industrial-scale simulation consisting of tens of millions of particles can reach months on large CPU clusters, making the Discrete Element Method (DEM) unfeasible for industrial applications. Simulations are therefore typically restricted to tens of thousands of particles with highly detailed particle shapes, or a few million particles with often oversimplified particle shapes. However, a number of applications require accurate representation of the particle shape to capture the macroscopic behaviour of the particulate system. In this paper we give an overview of the recent extensions to the open-source GPU-based DEM code, BlazeDEM3D-GPU, that can simulate millions of polyhedra and tens of millions of spheres on a desktop computer with a single or multiple GPUs.
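As a flavor of what a DEM time step computes, here is a minimal sphere-sphere normal-contact force in the common linear spring-dashpot model; this is a generic textbook ingredient, not BlazeDEM3D-GPU's GPU kernels or its polyhedral contact algorithms, and the stiffness and damping values are arbitrary:

```python
# Linear spring-dashpot normal contact between two spheres.
import numpy as np

def normal_contact_force(x1, x2, v1, v2, r1, r2, k=1.0e5, c=5.0):
    """Repulsive force on sphere 1 from sphere 2; zero if not touching."""
    d = x1 - x2
    dist = np.linalg.norm(d)
    overlap = (r1 + r2) - dist
    if overlap <= 0.0:
        return np.zeros(3)
    n = d / dist                          # unit normal pointing at sphere 1
    v_n = np.dot(v1 - v2, n)              # normal relative velocity
    return (k * overlap - c * v_n) * n    # elastic term minus damping

f = normal_contact_force(np.zeros(3), np.array([0.009, 0.0, 0.0]),
                         np.zeros(3), np.zeros(3), r1=0.005, r2=0.005)
print(f)  # ≈ [-100. 0. 0.]: sphere 1 is pushed away from sphere 2
```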
Hill, Jon; Davis, Katie E
2014-01-01
Building large supertrees involves the collection, storage, and processing of thousands of individual phylogenies to create large phylogenies with thousands to tens of thousands of taxa. Such large phylogenies are useful for macroevolutionary studies, comparative biology, and conservation and biodiversity. No easy-to-use, fully integrated software package currently exists to carry out this task. Here, we present a new Python-based software package that uses well-defined XML schema to manage both data and metadata. It builds on previous versions by 1) including new processing steps, such as Safe Taxonomic Reduction, 2) using a user-friendly GUI that guides the user to complete at least the minimum information required and includes context-sensitive documentation, and 3) using a revised storage format that integrates both tree- and meta-data into a single file. These data can then be manipulated according to a well-defined, but flexible, processing pipeline using either the GUI or a command-line based tool. Processing steps include standardising names, deleting or replacing taxa, ensuring adequate taxonomic overlap, ensuring data independence, and safe taxonomic reduction. This software has been successfully used to store and process data consisting of over 1000 trees ready for analyses using standard supertree methods. This software makes large supertree creation a much easier task and provides far greater flexibility for further work.
Guillebaud, Julia; Mahamadou, Aboubacar; Zamanka, Halima; Katzelma, Mariama; Arzika, Ibrahim; Ibrahim, Maman L; Eltahir, Elfatih Ab; Labbo, Rabiou; Druilhe, Pierre; Duchemin, Jean-Bernard; Fandeur, Thierry
2013-10-30
Few data are available about the malaria epidemiological situation in Niger. However, implementation of new strategies such as vaccination or seasonal treatment of a target population requires knowledge of the baseline epidemiological features of malaria. A population-based study was conducted to provide better characterization of malaria seasonal variations and of the population groups most at risk in this particular area. From July 2007 to December 2009, presumptive cases of malaria among a study population living in a typical Sahelian village of Niger were recorded and confirmed by microscopic examination. In parallel, asymptomatic carriers were actively detected at the end of each dry season in 2007, 2008 and 2009. Among the 965 presumptive malaria cases recorded, 29% were confirmed by microscopic examination. The incidence of malaria was found to decrease significantly with age (p < 0.01). The mean annual incidence was 0.254. The results show that the risk of malaria was higher in children under ten years (p < 0.0001). The number of malaria episodes generally followed the temporal pattern of changes in precipitation levels, with a peak of transmission in August and September. One thousand and ninety subjects underwent active detection of asymptomatic carriage, of whom 16% tested positive; asymptomatic carriage decreased with increasing age. A higher prevalence of gametocyte carriage among the asymptomatic population was recorded in children aged two to ten years, though it did not reach significance. In Southern Niger, malaria transmission mostly occurs from July to October. Children aged two to ten years are the most at risk of malaria, and may also represent the main reservoir for gametocytes. Strategies such as intermittent preventive treatment in children (IPTc) could be of interest in this area, where malaria transmission is highly seasonal. Based on these preliminary data, a pilot study could be implemented in Zindarou using IPTc targeting children aged two to ten years during the three months of malaria transmission, together with accurate monitoring of drug resistance.
Vargas, Carlos; Falchook, Aaron; Indelicato, Daniel; Yeung, Anamaria; Henderson, Randall; Olivier, Kenneth; Keole, Sameer; Williams, Christopher; Li, Zuofeng; Palta, Jatinder
2009-04-01
Whether the final prostate position can be held within a defined action-level threshold for image-guided proton therapy is unclear. Three thousand one hundred ten images for 20 consecutive patients treated in 1 of our 3 proton prostate protocols from February to May of 2007 were analyzed. Daily kV images and patient repositioning were performed employing an action-level threshold (ALT) of ≥ 2.5 mm for each beam. Isocentric orthogonal x-rays were obtained, and prostate position was defined via 3 gold markers for each patient in the 3 axes. To achieve and confirm our action-level threshold, an average of 2 x-ray sets (median 2; range, 0-4) was taken daily for each patient. Based on our ALT, we made no corrections in 8.7% (range, 0%-54%), 1 correction in 82% (41%-98%), and 2 to 3 corrections in 9% (0-27%). No patient needed 4 or more corrections. All patients were treated with a confirmed error of < 2.5 mm for every beam delivered. After all corrections, the means and standard deviations were: anterior-posterior (z): 0.003 ± 0.094 cm; superior-inferior (y): 0.028 ± 0.073 cm; and right-left (x): -0.013 ± 0.08 cm. It is feasible to limit all final prostate positions to less than 2.5 mm employing an action-level image-guided radiation therapy (IGRT) process. The residual errors after corrections were very small.
Perspectives on the Near-Earth Object Impact Hazard After Chelyabinsk
NASA Astrophysics Data System (ADS)
Chapman, C. R.
2013-12-01
Until this year, the NEO impact hazard had been regarded as a theoretical example of a very low probability high consequence natural disaster. There had been no confirmed examples of fatalities directly due to asteroid or meteoroid strikes. (There still aren't.) The several megaton Tunguska event in 1908 was in a remote, unpopulated place. So human beings have been witnessing only the tiniest analogs of asteroid strikes, the night-sky meteors and occasional bolides, which - on rare occasions - yield meteoritic fragments that puncture holes in roofs. Though the NEO impact hazard has occasionally been treated in the natural hazards literature, interest primarily remained in the planetary science and aerospace communities. The Chelyabinsk asteroid impact on 15 February 2013 was a real disaster, occurring near a city with a population exceeding a million. Well over a thousand people were injured, thousands of buildings suffered at least superficial damage (mainly to windows), schools and sports facilities were closed, and emergency responders swarmed across the city and surrounding rural areas. While the consequences were very small compared with larger natural disasters, which kill tens of thousands of people annually worldwide, this specific case - for the first time - has permitted a calibration of the consequences of the rare impacts asteroid astronomers have been predicting. There now are reasons to expect that impacts by bodies tens of meters in diameter are several times more frequent than had been thought and each impact is more damaging than previously estimated. The Chelyabinsk event, produced by a 20 meter diameter asteroid, specifically suggests that asteroids just 15 meters diameter, or even smaller, could be very dangerous and damaging; indeed, a more common steeper impact angle would have produced more consequential damage on the ground. This contrasts with estimates a decade earlier [NASA NEO Science Definition Team report, 2003] that asteroids smaller than 40 to 50 meters diameter would explode harmlessly in the upper atmosphere. Given the observed size-frequency relation for NEOs, this means that dangerous impacts could be many tens of times more frequent than had been thought. New observing campaigns (e.g. ATLAS) oriented towards finding roughly half of the frequent smaller impactors meters to tens of meters in size during their final days to weeks before impact will soon result in warnings every few years of a potentially dangerous impact, perhaps requiring evacuation or instructions to shelter-in-place, even though most will turn out to be essentially harmless events. Warnings may become even more frequent as prudent emergency managers take into account the large uncertainties in sizes and destructive potential of these 'final plungers.' So emergency management officials around the world should at least be aware of the potential for a NEO impact to produce a real, if generally minor and local, natural disaster. Fortunately, success of the Spaceguard search for civilization-threatening large NEOs (> 1 km diameter) over the last 15 years has nearly retired the risk of global calamity by impact. So attention turns to the much smaller impacts that are far less dangerous, but soon will be frequently predicted and so cannot be ignored.
Solar System science with the Large Synoptic Survey Telescope
NASA Astrophysics Data System (ADS)
Jones, Lynne; Brown, Mike; Ivezić, Zeljko; Jurić, Mario; Malhotra, Renu; Trilling, David
2015-11-01
The Large Synoptic Survey Telescope (LSST; http://lsst.org) will be a large-aperture, wide-field, ground-based telescope that will survey half the sky every few nights in six optical bands from 320 to 1050 nm. It will explore a wide range of astrophysical questions, ranging from performing a census of the Solar System to examining the nature of dark energy. It is currently under construction, slated for first light in 2019 and full operations by 2022. The LSST will survey over 20,000 square degrees with a rapid observational cadence, to typical limiting magnitudes of r~24.5 in each visit (9.6 square degree field of view). Automated software will link the individual detections into orbits; these orbits, as well as precisely calibrated astrometry (~50 mas) and photometry (~0.01-0.02 mag) in multiple bandpasses, will be available as LSST data products. The resulting data set will have tremendous potential for planetary astronomy: multi-color catalogs of hundreds of thousands of NEOs and Jupiter Trojans, millions of asteroids, tens of thousands of TNOs, as well as thousands of other objects such as comets and irregular satellites of the major planets. LSST catalogs will increase the sample size of objects with well-known orbits 10-100 times for small-body populations throughout the Solar System, enabling a major increase in the completeness level of the inventory of most dynamical classes of small bodies and generating new insights into planetary formation and evolution. Precision multi-color photometry will allow determination of lightcurves and colors, as well as spin state and shape modeling through sparse lightcurve inversion. LSST is currently investigating survey strategies to optimize science return across a broad range of goals. To aid in this investigation, we are making a series of realistic simulated survey pointing histories available together with a Python software package to model and evaluate survey detections for a user-defined input population. Preliminary metrics from these simulations are shown here; the community is invited to provide further input.
2017-01-01
Abstract Tens of thousands of women were coercively sterilized in Czechoslovakia and its successor states. Romani women were particularly targeted for these measures. These practices stopped only in 2004, as a result of international pressure. Although some measures have been taken to ensure that these practices are not repeated, to date neither the Czech Republic nor Slovakia have completed the work of providing effective remedy to victims, as is their right. This article focusses on efforts in the Czech Republic. It concludes that, inter alia, an administrative mechanism is needed to provide financial compensation to victims, since the road to remedy via courts is effectively blocked. PMID:29302159
Dynamics of playa lakes in the Texas High Plains
NASA Technical Reports Server (NTRS)
Reeves, C. C., Jr. (Principal Investigator)
1972-01-01
The author has identified the following significant results. Regional viewing of ERTS-1 imagery around the test sites shows that storm paths can be accurately traced and a count made of the number of intermittent lake basins filled by the storm. Therefore, during wet years ERTS-type imagery can be used to conduct a reliable count of the tens of thousands of natural lake basins on the southern High Plains which contain water. This type of regional overview of water filled basins in the normally arid southern High Plains is illustrated by bands 6 and 7, ERTS E-1078-16524.
Thermonuclear runaways in thick hydrogen rich envelopes of neutron stars
NASA Technical Reports Server (NTRS)
Starrfield, S. G.; Kenyon, S.; Truran, J. W.; Sparks, W. M.
1981-01-01
A Lagrangian, fully implicit, one-dimensional hydrodynamic computer code was used to evolve thermonuclear runaways in the accreted hydrogen-rich envelopes of 1.0 solar-mass neutron stars with radii of 10 km and 20 km. Simulations produce outbursts which last from about 750 seconds to about one week. Peak effective temperatures and luminosities were 26 million K and 80,000 solar luminosities for the 10 km study, and 5.3 million K and 600 solar luminosities for the 20 km study. Hydrodynamic expansion on the 10 km neutron star produced a precursor lasting about one ten-thousandth of a second.
Reliability, synchrony and noise
Ermentrout, G. Bard; Galán, Roberto F.; Urban, Nathaniel N.
2008-01-01
The brain is noisy. Neurons receive tens of thousands of highly fluctuating inputs and generate spike trains that appear highly irregular. Much of this activity is spontaneous—uncoupled to overt stimuli or motor outputs—leading to questions about the functional impact of this noise. Although noise is most often thought of as disrupting patterned activity and interfering with the encoding of stimuli, recent theoretical and experimental work has shown that noise can play a constructive role—leading to increased reliability or regularity of neuronal firing in single neurons and across populations. These results raise fundamental questions about how noise can influence neural function and computation. PMID:18603311
Precision Astrophysics Experiments with the Kepler Satellite
NASA Astrophysics Data System (ADS)
Jackiewicz, Jason
2012-10-01
Long photometric observations from space of tens of thousands of stars, such as those provided by Kepler, offer unique opportunities to carry out ensemble astrophysics as well as detailed studies of individual objects. One of the primary tools at our disposal for understanding pulsating stars is asteroseismology, which uses observed stellar oscillation frequencies to determine interior properties. This can provide very strict constraints on theories of stellar evolution, structure, and the population characteristics of stars in the Milky Way galaxy. This talk will focus on several of the exciting insights Kepler has enabled through asteroseismology of stars across the H-R diagram.
The role of the dentist in identifying missing and unidentified persons.
Riley, Amber D
2015-01-01
The longer a person is missing, the more profound the need for dental records becomes. In 2013, there were >84,000 missing persons and >8,000 unidentified persons registered in the National Crime Information Center (NCIC) database. Tens of thousands of families are left without answers or closure, always maintaining hope that their relative will be located. Law enforcement needs the cooperation of organized dentistry to procure dental records, translate their findings, and upload them into the NCIC database for cross-matching with unidentified person records created by medical examiner and coroner departments across the United States and Canada.
Vaccines and Immunization Practice.
Hogue, Michael D; Meador, Anna E
2016-03-01
Vaccines are among the most cost-effective public health strategies. Despite effective vaccines for many bacterial and viral illnesses, tens of thousands of adults and hundreds of children die each year in the United States from vaccine-preventable diseases. Underutilization of vaccines requires rethinking the approach to incorporating vaccines into practice. Arguably, immunizations could be a part of all health care encounters. Shared responsibility is paramount if deaths are to be reduced. This article reviews the vaccines available in the US market, as well as the practice recommendations of the Centers for Disease Control and Prevention's Advisory Committee on Immunization Practices. Copyright © 2016 Elsevier Inc. All rights reserved.
Algorithms for classification of astronomical object spectra
NASA Astrophysics Data System (ADS)
Wasiewicz, P.; Szuppe, J.; Hryniewicz, K.
2015-09-01
Obtaining interesting celestial objects from tens of thousands or even millions of recorded optical-ultraviolet spectra depends not only on the data quality but also on the accuracy of spectra decomposition. Additionally, rapidly growing data volumes demand higher computing power and/or more efficient algorithm implementations. In this paper we speed up the process of subtracting iron transitions and fitting Gaussian functions to emission peaks utilising C++ and OpenCL methods together with a NoSQL database. We also implemented typical astronomical peak-detection methods for comparison with our previous hybrid methods implemented in CUDA.
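The Gaussian fitting step the abstract refers to can be illustrated with a short SciPy sketch (the paper's implementation is C++/OpenCL; this Python version, with a mock emission line, only illustrates the model being fitted):

```python
# Fit a Gaussian plus a constant continuum to an emission feature.
import numpy as np
from scipy.optimize import curve_fit

def gaussian_line(x, amp, mu, sigma, continuum):
    return continuum + amp * np.exp(-0.5 * ((x - mu) / sigma) ** 2)

rng = np.random.default_rng(2)
wave = np.linspace(4800.0, 5200.0, 400)               # mock wavelength grid
flux = gaussian_line(wave, 5.0, 5007.0, 8.0, 1.0)     # mock [O III]-like peak
flux += rng.normal(0.0, 0.1, wave.size)               # add noise

popt, _ = curve_fit(gaussian_line, wave, flux, p0=(3.0, 5000.0, 10.0, 1.0))
print(popt)  # recovered (amp, mu, sigma, continuum) ≈ (5.0, 5007.0, 8.0, 1.0)
```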
ICPS Turnover GSDO Employee Event
2017-11-07
Kennedy Space Center Associate Director Kelvin Manning, right, speaks with a guest during a ceremony marking NASA's Spacecraft/Payload Integration and Evolution (SPIE) organization formally turning over processing of the Space Launch System (SLS) rocket's Interim Cryogenic Propulsion Stage (ICPS) to the center's Ground Systems Development and Operations (GSDO) Directorate. The ICPS is the first integrated piece of flight hardware to arrive in preparation for the uncrewed Exploration Mission-1. With the Orion attached, the ICPS sits atop the SLS rocket and will provide the spacecraft with the additional thrust needed to travel tens of thousands of miles beyond the Moon.
Interim Cryogenic Propulsion Stage (ICPS) Handover Signing
2017-10-26
Meeting in the Launch Control Center of NASA's Kennedy Space Center in Florida, officials of the agency's Spacecraft/Payload Integration and Evolution (SPIE) organization formally turn over processing of the Space Launch System (SLS) rocket's Interim Cryogenic Propulsion Stage (ICPS) to the center's Ground Systems Development and Operations (GSDO) directorate. The ICPS is the first integrated piece of flight hardware to arrive in preparation for the uncrewed Exploration Mission-1. With the Orion attached, the ICPS sits atop the SLS rocket and will provide the spacecraft with the additional thrust needed to travel tens of thousands of miles beyond the Moon.
Ibrahim, Khaled Z.; Madduri, Kamesh; Williams, Samuel; ...
2013-07-18
The Gyrokinetic Toroidal Code (GTC) uses the particle-in-cell method to efficiently simulate plasma microturbulence. This paper presents novel analysis and optimization techniques to enhance the performance of GTC on large-scale machines. We introduce cell access analysis to better manage locality vs. synchronization tradeoffs on CPU and GPU-based architectures. Finally, our optimized hybrid parallel implementation of GTC uses MPI, OpenMP, and NVIDIA CUDA, achieves up to a 2× speedup over the reference Fortran version on multiple parallel systems, and scales efficiently to tens of thousands of cores.
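The locality-versus-synchronization tradeoff arises in the particle-to-grid scatter step of any particle-in-cell code. A toy 1-D cloud-in-cell charge deposit (illustrative only, not GTC code) shows why: every particle updates two shared grid cells, so concurrent particles in the same cell contend for the same memory locations.

```python
# Serial 1-D cloud-in-cell charge deposition; parallel versions must
# synchronize or privatize these grid updates.
import numpy as np

def deposit_charge(positions, weights, n_cells, dx):
    grid = np.zeros(n_cells)
    for x, w in zip(positions, weights):
        i = int(x / dx) % n_cells             # left cell index
        frac = x / dx - int(x / dx)           # fractional offset in the cell
        grid[i] += w * (1.0 - frac)           # linear weighting to the two
        grid[(i + 1) % n_cells] += w * frac   # surrounding grid points
    return grid

rho = deposit_charge(np.array([0.4, 1.1, 1.9]), np.ones(3), n_cells=4, dx=1.0)
print(rho)  # [0.6 1.4 1.0 0.0]
```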
NASA Technical Reports Server (NTRS)
Phillips, Veronica J.
2017-01-01
This STI entry is a fact sheet on the Space Object Query Tool being created by the MDC. When planning launches, NASA must first factor in the tens of thousands of objects already in orbit around the Earth. The number of human-made objects, including nonfunctional spacecraft, abandoned launch vehicle stages, mission-related debris, and fragmentation debris orbiting Earth has grown steadily since Sputnik 1 was launched in 1957. Currently, the U.S. Department of Defense's Joint Space Operations Center, or JSpOC, tracks over 15,000 distinct objects and provides data for more than 40,000 objects via its Space-Track program, found at space-track.org.
Supersonic gas streams enhance the formation of massive black holes in the early universe
NASA Astrophysics Data System (ADS)
Hirano, Shingo; Hosokawa, Takashi; Yoshida, Naoki; Kuiper, Rolf
2017-09-01
Supermassive black holes existed less than a billion years after the Big Bang. Because black holes can grow at a maximum rate that depends on their current mass, it has been difficult to understand how such massive black holes could have formed so quickly. Hirano et al. developed simulations to show that streaming motions—velocity offsets between the gas and dark matter components—could have produced black holes with tens of thousands of solar masses in the early universe. That's big enough to grow into the supermassive black holes that we observe today.
Catching Cosmic Light with the Galileoscope
NASA Astrophysics Data System (ADS)
Fienberg, R. T.; Arion, D. N.
2015-09-01
Created for the 2009 International Year of Astronomy, the Galileoscope solved a long-standing problem: the lack of high-quality, low-cost telescope kits suitable for both optics education and celestial observation. Through an effort managed entirely by the volunteers who have authored this article, almost 240,000 Galileoscope kits have now been distributed in 106 countries across the globe for use in science teaching and public outreach. The Galileoscope outreach programme for the 2015 International Year of Light is now in full swing, giving tens of thousands of students, teachers and parents their first telescopic look at the Moon's craters and Saturn's rings.
MODEST - JPL GEODETIC AND ASTROMETRIC VLBI MODELING AND PARAMETER ESTIMATION PROGRAM
NASA Technical Reports Server (NTRS)
Sovers, O. J.
1994-01-01
Observations of extragalactic radio sources in the gigahertz region of the radio frequency spectrum by two or more antennas, separated by a baseline as long as the diameter of the Earth, can be reduced, by radio interferometry techniques, to yield time delays and their rates of change. The Very Long Baseline Interferometric (VLBI) observables can be processed by the MODEST software to yield geodetic and astrometric parameters of interest in areas such as geophysical satellite and spacecraft tracking applications and geodynamics. As the accuracy of radio interferometry has improved, increasingly complete models of the delay and delay rate observables have been developed. MODEST is a delay model (MOD) and parameter estimation (EST) program that takes into account delay effects such as geometry, clock, troposphere, and the ionosphere. MODEST includes all known effects at the centimeter level in modeling. As the field evolves and new effects are discovered, these can be included in the model. In general, the model includes contributions to the observables from Earth orientation, antenna motion, clock behavior, atmospheric effects, and radio source structure. Within each of these categories, a number of unknown parameters may be estimated from the observations. Since all parts of the time delay model contain nearly linear parameter terms, a square-root-information filter (SRIF) linear least-squares algorithm is employed in parameter estimation. Flexibility (via dynamic memory allocation) in the MODEST code ensures that the same executable can process a wide array of problems. These range from a few hundred observations on a single baseline, yielding estimates of tens of parameters, to global solutions estimating tens of thousands of parameters from hundreds of thousands of observations at antennas widely distributed over the Earth's surface. Depending on memory and disk storage availability, large problems may be subdivided into more tractable pieces that are processed sequentially. MODEST is written in FORTRAN 77, C-language, and VAX ASSEMBLER for DEC VAX series computers running VMS. It requires 6Mb of RAM for execution. The standard distribution medium for this package is a 1600 BPI 9-track magnetic tape in DEC VAX BACKUP format. It is also available on a TK50 tape cartridge in DEC VAX BACKUP format. Instructions for use and sample input and output data are available on the distribution media. This program was released in 1993 and is a copyrighted work with all copyright vested in NASA.
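The square-root-information approach mentioned above can be sketched with generic linear algebra (this is the standard QR formulation of SRIF least squares, not the MODEST implementation): rather than forming the ill-conditioned normal equations A^T A x = A^T b, one triangularizes the stacked system directly.

```python
# Least-squares estimation via QR of the stacked system [A | b]; the
# triangular factor plays the role of the square-root information matrix.
import numpy as np

def srif_solve(a, b):
    """x minimizing ||A x - b||, without forming A^T A."""
    m, n = a.shape
    r = np.linalg.qr(np.hstack([a, b[:, None]]), mode="r")
    r_a, z = r[:n, :n], r[:n, n]          # information square root and RHS
    return np.linalg.solve(r_a, z)

rng = np.random.default_rng(3)
a = rng.normal(size=(200, 5))             # 200 observations, 5 parameters
x_true = np.arange(1.0, 6.0)
b = a @ x_true + rng.normal(0.0, 0.01, 200)
print(srif_solve(a, b))                   # ≈ [1. 2. 3. 4. 5.]
```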
McNearney, Terry A; Sallam, Hanaa S; Hunnicutt, Sonya E; Doshi, Dipti; Chen, Jiande D Z
2013-01-01
We assessed the effects of transcutaneous electrical nerve stimulation (TENS) on neurogastric functioning in scleroderma (SSc) patients. Seventeen SSc patients underwent a 30 min TENS treatment at >10 Hz at GI acupuncture points PC6 and ST36, once (acute TENS) and then after two weeks of TENS sessions for 30 min twice daily (prolonged TENS). Data collected at Visits 1 and 2 included gastric myoelectrical activity (GMA) by surface electrogastrography (EGG), heart rate variability (HRV) by surface electrocardiography (EKG), GI-specific symptoms, and health-related SF-36 questionnaires. Plasma VIP, motilin, and IL-6 levels were determined. Statistical analyses were performed with Student's t-test and Spearman rank correlation; p-values <0.05 were considered significant. 1. Only after prolonged TENS were the percentages of normal slow waves and average slow wave coupling (especially channels 1 and 2, reflecting the gastric pacemaker and corpus regions) significantly increased; 2. the percentage of normal slow waves was significantly correlated with sympathovagal balance; 3. mean plasma VIP and motilin levels were significantly decreased after acute TENS (vs. baseline) and generally maintained at those levels over the prolonged TENS intervals. Compared to baseline, mean plasma IL-6 levels were significantly increased after acute TENS, but significantly decreased after prolonged TENS. 4. After prolonged TENS, the frequency of awakening due to abdominal pain and abdominal bloating were significantly and modestly decreased, respectively. In SSc patients, two weeks of daily TENS improved patient GMA scores, lowered plasma VIP, motilin, and IL-6 levels, and improved the association between GMA and sympathovagal balance. This supports the therapeutic potential of prolonged TENS to enhance gastric myoelectrical functioning in SSc.
Rediscovering the Concept of Asylum for Persons with Serious Mental Illness.
Lamb, H Richard; Weinberger, Linda E
2016-03-01
Treating persons with serious mental illness is a complex and challenging endeavor. One intervention that has received little attention in recent years is the need for asylum. Asylum means a sanctuary, a place that lowers levels of stress and provides protection, safety, security, and social support, as well as an array of treatment services. The concept of "asylum" may have lost favor because it was equated with the abysmal conditions found in the state psychiatric hospitals of the past. Among the reasons persons with serious mental illness have been arrested and incarcerated is society's failure to provide adequate levels of asylum. With the release of tens of thousands of mentally ill inmates from state and federal jails and prisons, it is time to revisit this concept, not only for these persons but for those who have not been criminalized. Asylum can be found in various settings, including with family in the patient's home, in a board-and-care facility, or in a psychiatric hospital if necessary. Not all persons with a major mental illness are capable of achieving high levels of social and vocational functioning; however, living in a place that provides asylum can promote a higher quality of life. The value of asylum for many persons with serious mental illness should not be underestimated. © 2016 American Academy of Psychiatry and the Law.
MEGALEX: A megastudy of visual and auditory word recognition.
Ferrand, Ludovic; Méot, Alain; Spinelli, Elsa; New, Boris; Pallier, Christophe; Bonin, Patrick; Dufau, Stéphane; Mathôt, Sebastiaan; Grainger, Jonathan
2018-06-01
Using the megastudy approach, we report a new database (MEGALEX) of visual and auditory lexical decision times and accuracy rates for tens of thousands of words. We collected visual lexical decision data for 28,466 French words and the same number of pseudowords, and auditory lexical decision data for 17,876 French words and the same number of pseudowords (synthesized tokens were used for the auditory modality). This constitutes the first large-scale database for auditory lexical decision, and the first database to enable a direct comparison of word recognition in different modalities. Different regression analyses were conducted to illustrate potential ways to exploit this megastudy database. First, we compared the proportions of variance accounted for by five word frequency measures. Second, we conducted item-level regression analyses to examine the relative importance of the lexical variables influencing performance in the different modalities (visual and auditory). Finally, we compared the similarities and differences between the two modalities. All data are freely available on our website (https://sedufau.shinyapps.io/megalex/) and are searchable at www.lexique.org, inside the Open Lexique search engine.
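To make the item-level regression idea concrete, here is a minimal sketch of what such an analysis might look like, assuming pandas and statsmodels are installed; the column names (log_freq, length, rt) and the data are synthetic stand-ins, not the actual MEGALEX variables.

```python
# Illustrative item-level regression in the spirit of the abstract's
# analyses; the data frame is synthetic, not MEGALEX.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 1000
df = pd.DataFrame({
    "log_freq": rng.normal(2.0, 1.0, n),   # log word frequency
    "length": rng.integers(3, 12, n),      # word length in letters
})
# Synthetic lexical decision times: faster for frequent, shorter words.
df["rt"] = 700 - 40 * df["log_freq"] + 8 * df["length"] \
           + rng.normal(0, 50, n)

model = smf.ols("rt ~ log_freq + length", data=df).fit()
print(model.params)     # slope signs recover the simulated effects
print(model.rsquared)   # proportion of variance accounted for
```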
Public Housing: A Tailored Approach to Energy Retrofits
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dentz, Jordan; Conlin, Francis; Podorson, David
2014-06-01
More than 1 million HUD-supported public housing units provide rental housing for eligible low-income families across the country. A survey of over 100 public housing authorities (PHAs) indicated a high level of interest in developing low-cost solutions that improve energy efficiency and can be seamlessly included in the refurbishment process. Further, PHAs have incentives (both internal and external) to reduce utility bills. ARIES worked with two PHAs to develop packages of energy efficiency retrofit measures that the PHAs can cost-effectively implement with their own staffs in the normal course of housing operations when units are refurbished between occupancies. The energy efficiency turnover protocols emphasized air infiltration reduction, duct sealing, and measures that improve equipment efficiency. ARIES documented implementation in ten housing units. Total source energy savings were estimated at 6%-10% based on BEopt modeling, with a simple payback of 1.7 to 2.2 years. At typical housing unit turnover rates, these measures could impact hundreds of thousands of units per year nationally.
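The simple-payback figure is plain arithmetic (measure cost divided by annual utility savings); a toy check with made-up dollar values, not figures from the study:

```python
# Simple-payback arithmetic implied by the reported range; the dollar
# amounts below are hypothetical placeholders, not values from the study.
measure_cost = 220.0      # per-unit retrofit cost ($), assumed
annual_savings = 110.0    # annual utility savings ($/yr), assumed
payback_years = measure_cost / annual_savings
print(f"simple payback: {payback_years:.1f} years")  # 2.0, within 1.7-2.2
```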
Numbers and distribution of Double-crested Cormorants on the upper Mississippi river
Kirsch, E.M.
1997-01-01
Historic records indicate that Double-crested Cormorants (Phalacrocorax auritus) were common breeders and abundant during migration on the Upper Mississippi River from St. Paul, Minnesota, to St. Louis, Missouri, during the 1940s and 1950s. Their numbers declined in the mid- to late-1950s, remained low through the 1970s, and began to increase somewhat in the late 1980s. Aerial surveys of migrating cormorants and ground surveys at cormorant colonies during 1991-1993 indicate that numbers have not returned to historic levels. Only 500-2,000 cormorants were seen during spring migration in 1992-1993, and 5,000-7,000 during fall migration in 1991-1992, whereas tens of thousands were reported in the 1940s and 1950s. Four hundred ninety-six nests were counted at 4 colonies in 1992, and 545 nests were counted at 9 colonies in 1993; during the 1940s and 1950s, about 2,500 birds were reported nesting at 4 locations. Pools 6 and 13 have always attracted breeding and migrating cormorants, currently attract the largest numbers of cormorants during migration, and still support breeding colonies.
Challenges in scaling NLO generators to leadership computers
NASA Astrophysics Data System (ADS)
Benjamin, D.; Childers, JT; Hoeche, S.; LeCompte, T.; Uram, T.
2017-10-01
Exascale computing resources are roughly a decade away and will be capable of 100 times more computing than current supercomputers. In the last year, Energy Frontier experiments crossed a milestone of 100 million core-hours used at the Argonne Leadership Computing Facility, Oak Ridge Leadership Computing Facility, and NERSC. The Fortran-based leading-order parton generator called Alpgen was successfully scaled to millions of threads to achieve this level of usage on Mira. Sherpa and MadGraph are next-to-leading order generators used heavily by LHC experiments for simulation. Integration times for high-multiplicity or rare processes can take a week or more on standard Grid machines, even when using all 16 cores. We will describe our ongoing work to scale the Sherpa generator to thousands of threads on leadership-class machines and reduce run times to less than a day. This work allows the experiments to leverage large-scale parallel supercomputers for event generation today, freeing tens of millions of grid hours for other work, and paving the way for future applications (simulation, reconstruction) on these and future supercomputers.
Costa, Marta; Manton, James D; Ostrovsky, Aaron D; Prohaska, Steffen; Jefferis, Gregory S X E
2016-07-20
Neural circuit mapping is generating datasets of tens of thousands of labeled neurons. New computational tools are needed to search and organize these data. We present NBLAST, a sensitive and rapid algorithm for measuring pairwise neuronal similarity. NBLAST considers both position and local geometry, decomposing neurons into short segments; matched segments are scored using a probabilistic scoring matrix defined by statistics of matches and non-matches. We validated NBLAST on a published dataset of 16,129 single Drosophila neurons. NBLAST can distinguish neuronal types down to the finest level (single identified neurons) without a priori information. Cluster analysis of extensively studied neuronal classes identified new types and unreported topographical features. Fully automated clustering organized the validation dataset into 1,052 clusters, many of which map onto previously described neuronal types. NBLAST supports additional query types, including searching neurons against transgene expression patterns. Finally, we show that NBLAST is effective with data from other invertebrates and zebrafish. Copyright © 2016 MRC Laboratory of Molecular Biology. Published by Elsevier Inc. All rights reserved.
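A rough sketch of the segment-matching idea follows, assuming each neuron is reduced to points with unit tangent vectors; a simple Gaussian-distance-times-alignment score stands in for NBLAST's fitted log-likelihood-ratio matrix, so this illustrates the structure of the algorithm rather than its actual scoring.

```python
# NBLAST-style pairwise scoring sketch: match each query segment to its
# nearest target segment, score by distance and tangent alignment.
# The scoring function is a stand-in, not NBLAST's fitted matrix.
import numpy as np
from scipy.spatial import cKDTree

def nblast_like_score(query_pts, query_vecs, target_pts, target_vecs,
                      sigma=3.0):
    tree = cKDTree(target_pts)
    dists, idx = tree.query(query_pts)        # nearest target segments
    dots = np.abs(np.sum(query_vecs * target_vecs[idx], axis=1))
    # Reward close, well-aligned segment pairs; penalize distant ones.
    return float(np.sum(np.exp(-dists**2 / (2 * sigma**2)) * dots))

rng = np.random.default_rng(2)
pts = rng.normal(size=(200, 3))
vecs = rng.normal(size=(200, 3))
vecs /= np.linalg.norm(vecs, axis=1, keepdims=True)
print(nblast_like_score(pts, vecs, pts, vecs))  # self-score is maximal
```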
Selecting a Classification Ensemble and Detecting Process Drift in an Evolving Data Stream
DOE Office of Scientific and Technical Information (OSTI.GOV)
Heredia-Langner, Alejandro; Rodriguez, Luke R.; Lin, Andy
2015-09-30
We characterize the commercial behavior of a group of companies in a common line of business using a small ensemble of classifiers on a stream of records containing commercial activity information. This approach is able to effectively find a subset of classifiers that can be used to predict company labels with reasonable accuracy. Performance of the ensemble, i.e., its error rate under stable conditions, can be characterized using an exponentially weighted moving average (EWMA) statistic. The behavior of the EWMA statistic can be used to monitor a record stream from the commercial network and determine when significant changes have occurred. Results indicate that larger classification ensembles may not necessarily be optimal, pointing to the need to search the combinatorial classifier space in a systematic way. Results also show that current and past performance of an ensemble can be used to detect when statistically significant changes in the activity of the network have occurred. The dataset used in this work contains tens of thousands of high-level commercial activity records with continuous and categorical variables and hundreds of labels, making classification challenging.
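To make the monitoring idea concrete, here is a minimal sketch of EWMA-based drift detection on a 0/1 error stream; the smoothing constant, control-limit multiplier, and error rates are illustrative choices, not the paper's settings.

```python
# Sketch of EWMA monitoring of an ensemble's error rate; z_t = lam*x_t +
# (1-lam)*z_{t-1}, alarmed when z leaves its asymptotic control limits.
import numpy as np

def ewma_monitor(errors, lam=0.2, L=3.0):
    """Return indices where the EWMA of a 0/1 error stream leaves its
    control band, suggesting process drift."""
    p0 = np.mean(errors[:200])                    # baseline error rate
    sigma = np.sqrt(p0 * (1 - p0))                # Bernoulli std dev
    limit = L * sigma * np.sqrt(lam / (2 - lam))  # asymptotic EWMA limit
    z, alarms = p0, []
    for t, e in enumerate(errors):
        z = lam * e + (1 - lam) * z
        if abs(z - p0) > limit:
            alarms.append(t)
    return alarms

rng = np.random.default_rng(3)
stable = rng.random(500) < 0.10        # 10% error rate under stability
drifted = rng.random(500) < 0.50       # abrupt process change
print(ewma_monitor(np.concatenate([stable, drifted]))[:1])  # first alarm
```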
Leong, Wai-Mun; Ripen, Adiratna Mat; Mirsafian, Hoda; Mohamad, Saharuddin Bin; Merican, Amir Feisal
2018-06-07
High-depth next generation sequencing data provide valuable insights into the number and distribution of RNA editing events. Here, we report RNA editing events at the cellular level in human primary monocytes using high-depth whole genomic and transcriptomic sequencing data. We identified over ten thousand putative RNA editing sites, 69% of which were A-to-I editing sites. The sites were enriched in repetitive sequences and intronic regions. High-depth sequencing datasets revealed that 90% of the canonical sites were edited at lower frequencies (<0.7). Single and multiple human monocyte and brain tissue samples were analyzed through a genome-sequence-independent approach; the latter approach was observed to identify more editing sites. Monocytes were observed to contain more C-to-U editing sites than brain tissues. Our results establish a comparable pipeline that can address current limitations and demonstrate the potential for highly sensitive detection of RNA editing events in a single cell type. Copyright © 2018 Elsevier Inc. All rights reserved.
Modern Church Construction in Urals. Problems and Prospects
NASA Astrophysics Data System (ADS)
Surin, D. N.; Tereshina, O. B.
2017-11-01
The article analyzes the problems of modern Orthodox church architecture in Russia, with special attention paid to the Ural region. It justifies the importance of addressing this issue, connected with the revival of Orthodox traditions in Russia over the last decades and the need to compensate for the tens of thousands of churches destroyed in the Soviet period. Works on the theory and history of Russian architecture and art, and studies of the architectural heritage and the building craft of the Ural masters, are used as the scientific and methodological base for the development of church architecture. The article discloses the historically formed architectural features of Russian Orthodox churches, whose artistic image is designed to create a certain religious and aesthetic experience. It is stated that the restoration of the Russian church construction tradition is possible on the basis of the architectural heritage. It sets out the tendencies and vital tasks in church construction and outlines a complex of measures to solve these tasks at the public and regional levels.
NASA Technical Reports Server (NTRS)
Wilson, Brad; Galatzer, Yishai
2008-01-01
The Space Shuttle is protected by a Thermal Protection System (TPS) made of tens of thousands of individually shaped heat protection tiles. With every flight, tiles are damaged on take-off and on return to Earth. After each mission, the heat tiles must be fixed or replaced depending on the level of damage. As part of the return-to-flight mission, the TPS requirements are more stringent, leading to a significant increase in heat tile replacements. The replacement operation requires scanning tile cavities, and in some cases the actual tiles. The 3D scan data are used to reverse engineer each tile into a precise CAD model, which, in turn, is exported to a CAM system for the manufacture of the heat protection tile. Scanning is performed while other activities are going on in the shuttle processing facility. Many technicians work simultaneously on the space shuttle structure, which results in structural movements and vibrations. This paper will cover a portable, ultra-fast data acquisition approach used to scan surfaces in this unstable environment.
Dynamics and Novel Mechanisms of SN2 Reactions on ab Initio Analytical Potential Energy Surfaces.
Szabó, István; Czakó, Gábor
2017-11-30
We describe a novel theoretical approach to bimolecular nucleophilic substitution (SN2) reactions that is based on analytical potential energy surfaces (PESs) obtained by fitting a few tens of thousands of high-level ab initio energy points. These PESs allow computing millions of quasi-classical trajectories, thereby providing unprecedented statistical accuracy for SN2 reactions, as well as performing high-dimensional quantum dynamics computations. We developed full-dimensional ab initio PESs for the F− + CH3Y [Y = F, Cl, I] systems, which describe the direct and indirect, complex-forming Walden-inversion, the frontside attack, and the new double-inversion pathways, as well as the proton-transfer channels. Reaction dynamics simulations on the new PESs revealed (a) a novel double-inversion SN2 mechanism, (b) frontside complex formation, (c) the dynamics of proton transfer, (d) vibrational and rotational mode specificity, (e) mode-specific product vibrational distributions, (f) agreement between classical and quantum dynamics, (g) good agreement with measured scattering angle and product internal energy distributions, and (h) a significant leaving-group effect in accord with experiments.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bobyshev, A.; Lamore, D.; Demar, P.
2004-12-01
In a large campus network, such as Fermilab's, with tens of thousands of nodes, scanning initiated from either outside or within the campus network raises security concerns. This scanning may have a very serious impact on network performance and can even disrupt the normal operation of many services. In this paper we introduce a system for detecting and automatically blocking excessive traffic from different kinds of scanning, DoS attacks, and virus-infected computers. The system, called AutoBlocker, is a distributed computing system based on quasi-real-time analysis of network flow data collected from the border router and core switches. AutoBlocker also has an interface to accept alerts from IDS systems (e.g., BRO, SNORT) that are based on other technologies. The system has multiple configurable alert levels for the detection of anomalous behavior and configurable trigger criteria for automated blocking of scans at the core or border routers. It has been in use at Fermilab for about 2 years and has become a very valuable tool to curtail scan activity within the Fermilab campus network.
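AutoBlocker's internals are not detailed in the abstract; the following toy sketch shows one common flow-based scan heuristic (fan-out counting over a time window) of the general kind such systems use. The record format and threshold are invented for the example.

```python
# Toy flow-based scan detection: count distinct destination hosts per
# source in a window and flag sources exceeding a threshold. Not the
# AutoBlocker implementation; format and threshold are illustrative.
from collections import defaultdict

FANOUT_THRESHOLD = 100        # distinct destinations before flagging

def detect_scanners(flows):
    """flows: iterable of (src_ip, dst_ip) pairs from one time window."""
    fanout = defaultdict(set)
    for src, dst in flows:
        fanout[src].add(dst)
    return {src for src, dsts in fanout.items()
            if len(dsts) >= FANOUT_THRESHOLD}

flows = [("10.0.0.5", f"192.168.1.{i}") for i in range(150)]  # a scanner
flows += [("10.0.0.7", "192.168.1.1")] * 20                   # normal host
print(detect_scanners(flows))   # {'10.0.0.5'}
```

In a production system the flagged sources would feed the configurable trigger criteria (sustained fan-out, alert level, whitelist checks) before any router ACL is pushed.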
Alignment and Integration of Lightweight Mirror Segments
NASA Technical Reports Server (NTRS)
Evans, Tyler; Biskach, Michael; Mazzarella, Jim; McClelland, Ryan; Saha, Timo; Zhang, Will; Chan, Kai-Wing
2011-01-01
The optics for the International X-Ray Observatory (IXO) require alignment and integration of about fourteen thousand thin mirror segments to achieve the mission goal of 3.0 square meters of effective area at 1.25 keV with an angular resolution of five arc-seconds. These mirror segments are 0.4 mm thick and 200 to 400 mm in size, which makes it difficult not to impart distortion at the sub-arc-second level. This paper outlines the precise alignment, permanent bonding, and verification testing techniques developed at NASA's Goddard Space Flight Center (GSFC). Improvements in alignment include new hardware and automation software. Improvements in bonding include two new module simulators to bond mirrors into: a glass housing for proving single-pair bonding, and a Kovar module for bonding multiple pairs of mirrors. Three separate bonding trials were X-ray tested, producing results meeting the requirement of sub-ten-arc-second alignment. This paper will highlight these recent advances in alignment, testing, and bonding techniques and the exciting developments in thin X-ray optic technology.
Adaptive Fusion of Information for Seeing into Ordos Basin, China: A China-Germany-US Joint Venture.
NASA Astrophysics Data System (ADS)
Yeh, T. C. J.; Yin, L.; Sauter, M.; Hu, R.; Ptak, T.; Hou, G. C.
2014-12-01
Adaptive fusion of information for seeing into geological basins is the theme of this joint venture. The objective of this venture is to initiate collaborations between scientists from China, Germany, and the US to develop innovative technologies that can be used to characterize geological and hydrological structures and processes, as well as other natural resources, in regional-scale geological basins of hundreds of thousands of square kilometers (i.e., the Ordos Basin, China). This adaptive fusion of information aims to assimilate active (manmade) and passive (natural) hydrologic and geophysical tomography surveys to enhance our ability to see into hydrogeological basins at the resolutions of interest. Active hydrogeophysical tomography refers to hydraulic tomographic surveys recently developed by Chinese and German scientists, as well as well-established geophysical tomographic surveys (such as electrical resistivity tomography, cross-borehole radar, and electromagnetic surveys). These active hydrogeophysical tomographic surveys have proven to be useful high-resolution surveys for geological media tens to hundreds of meters wide and deep. For basin-scale problems (i.e., tens to hundreds of kilometers), however, their applicability is rather limited. Passive hydrogeophysical tomography refers to unexplored technologies that exploit natural stimuli as energy sources for tomographic surveys, including direct lightning strikes, groundwater level fluctuations due to earthquakes, river stage fluctuations, precipitation storms, barometric pressure variations, and long-term climate changes. These natural stimuli are spatially varying, recurrent, and powerful, influencing geological media over great distances and depths (e.g., tens to hundreds of kilometers). Monitoring the hydrological and geophysical responses of geological media to these stimuli at different locations is tantamount to collecting data from naturally occurring tomographic surveys. Exploiting natural stimuli as tomographic surveys is a novel concept for cost-effective characterization and monitoring of subsurface processes in regional-scale basins at great depths.
Smoothed particle hydrodynamics with GRAPE-1A
NASA Technical Reports Server (NTRS)
Umemura, Masayuki; Fukushige, Toshiyuki; Makino, Junichiro; Ebisuzaki, Toshikazu; Sugimoto, Daiichiro; Turner, Edwin L.; Loeb, Abraham
1993-01-01
We describe the implementation of a smoothed particle hydrodynamics (SPH) scheme using GRAPE-1A, a special-purpose processor used for gravitational N-body simulations. The GRAPE-1A calculates the gravitational force exerted on a particle by all other particles in a system, while simultaneously making a list of the nearest neighbors of the particle. It is found that GRAPE-1A accelerates SPH calculations by direct summation by about two orders of magnitude for a ten-thousand-particle simulation. The effective speed is 80 Mflops, which is about 30 percent of the peak speed of GRAPE-1A. Also, in order to investigate the accuracy of GRAPE-SPH, some test simulations were executed. We found that the force and position errors are smaller than those due to representing a fluid by a finite number of particles. The total energy and momentum were conserved within 0.2-0.4 percent and 2-5 × 10^-5, respectively, in simulations with several thousand particles. We conclude that GRAPE-SPH is quite effective and sufficiently accurate for self-gravitating hydrodynamics.
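The direct summation that GRAPE-1A performs in hardware is conceptually simple; below is a plain numpy sketch of the O(N²) force loop together with a crude fixed-radius neighbour list of the kind the SPH step consumes. Softening, units, and the neighbour radius are illustrative, and this software loop is exactly what the special-purpose hardware accelerates.

```python
# O(N^2) direct-summation gravity plus a fixed-radius neighbour list,
# illustrating (in software) what GRAPE-1A computes in hardware.
import numpy as np

def direct_sum(pos, mass, eps=1e-2, h=0.1):
    n = len(pos)
    acc = np.zeros_like(pos)
    neighbours = []
    for i in range(n):
        d = pos - pos[i]                        # vectors to all particles
        r2 = np.sum(d * d, axis=1) + eps**2     # softened squared distances
        r2[i] = np.inf                          # skip self-interaction
        acc[i] = np.sum((mass / r2**1.5)[:, None] * d, axis=0)
        neighbours.append(np.flatnonzero(r2 < h**2))
    return acc, neighbours

rng = np.random.default_rng(4)
pos = rng.random((1000, 3))
mass = np.full(1000, 1.0 / 1000)
acc, nbrs = direct_sum(pos, mass)
print(acc.shape, len(nbrs[0]))
```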
Electroforming of optical tooling in high-strength Ni-Co alloy
NASA Astrophysics Data System (ADS)
Stein, Berl
2003-05-01
Plastic optics are often mass-produced by injection, compression, or injection-compression molding. Optical quality molds can be directly machined in appropriate materials (tool steels, electroless nickel, aluminum, etc.), but much greater cost efficiency can be achieved with electroformed mold inserts. Traditionally, electroforming of optical quality mold inserts has been carried out in nickel, a material much softer than tool steels, which, when hardened to 45-50 HRc, usually exhibit high wear resistance and long service life (hundreds of thousands of impressions per mold). Because of their low hardness (<20 HRc), nickel molds can produce only tens of thousands of parts before they are scrapped due to wear or accidental damage. This drawback has prevented their wider usage in general plastic and optical mold making. Recently, NiCoForm has developed a proprietary Ni-Co electroforming bath combining the high strength and wear resistance of the alloy with the low stress and high replication fidelity typical of pure nickel electroforming. This paper will outline the approach to electroforming optical quality tooling in the low-stress, high-strength Ni-Co alloy and present several examples of electroformed NiColoy mold inserts.
Genome-scale measurement of off-target activity using Cas9 toxicity in high-throughput screens.
Morgens, David W; Wainberg, Michael; Boyle, Evan A; Ursu, Oana; Araya, Carlos L; Tsui, C Kimberly; Haney, Michael S; Hess, Gaelen T; Han, Kyuho; Jeng, Edwin E; Li, Amy; Snyder, Michael P; Greenleaf, William J; Kundaje, Anshul; Bassik, Michael C
2017-05-05
CRISPR-Cas9 screens are powerful tools for high-throughput interrogation of genome function, but can be confounded by nuclease-induced toxicity at both on- and off-target sites, likely due to DNA damage. Here, to test potential solutions to this issue, we design and analyse a CRISPR-Cas9 library with 10 variable-length guides per gene and thousands of negative controls targeting non-functional, non-genic regions (termed safe-targeting guides), in addition to non-targeting controls. We find this library has excellent performance in identifying genes affecting growth and sensitivity to the ricin toxin. The safe-targeting guides allow for proper control of toxicity from on-target DNA damage. Using this toxicity as a proxy to measure off-target cutting, we demonstrate with tens of thousands of guides both the nucleotide position-dependent sensitivity to single mismatches and the reduction of off-target cutting using truncated guides. Our results demonstrate a simple strategy for high-throughput evaluation of target specificity and nuclease toxicity in Cas9 screens.
An efficient method to identify differentially expressed genes in microarray experiments
Qin, Huaizhen; Feng, Tao; Harding, Scott A.; Tsai, Chung-Jui; Zhang, Shuanglin
2013-01-01
Motivation: Microarray experiments typically analyze thousands to tens of thousands of genes from small numbers of biological replicates. The fact that genes are normally expressed in functionally relevant patterns suggests that gene-expression data can be stratified and clustered into relatively homogeneous groups. Cluster-wise dimensionality reduction should make it feasible to improve screening power while minimizing information loss.
Results: We propose a powerful and computationally simple method for finding differentially expressed genes in small microarray experiments. The method incorporates a novel stratification-based tight clustering algorithm, principal component analysis, and information pooling. Comprehensive simulations show that our method is substantially more powerful than the popular SAM and eBayes approaches. We applied the method to three real microarray datasets: one from a Populus nitrogen stress experiment with 3 biological replicates, and two from public microarray datasets of human cancers with 10 to 40 biological replicates. In all three analyses, our method proved more robust than the popular alternatives for identification of differentially expressed genes.
Availability: The C++ code to implement the proposed method is available upon request for academic use. PMID:18453554
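As a generic stand-in for the cluster-then-reduce idea (not the authors' specific algorithm), one can stratify genes into clusters, summarize each cluster by its first principal component across arrays, and test those components between conditions; a minimal sketch assuming scikit-learn and scipy:

```python
# Cluster-wise dimensionality reduction sketch: cluster genes, take the
# first principal component per cluster, and t-test it across conditions.
# A generic illustration, not the paper's stratification-based algorithm.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.decomposition import PCA
from scipy import stats

rng = np.random.default_rng(5)
X = rng.normal(size=(2000, 6))        # 2000 genes x 6 arrays (3 vs 3)
X[:50, 3:] += 2.0                     # a block of up-regulated genes

labels = KMeans(n_clusters=20, n_init=10, random_state=0).fit_predict(X)
for k in range(20):
    cluster = X[labels == k]
    pc1 = PCA(n_components=1).fit_transform(cluster.T).ravel()  # per-array
    t, p = stats.ttest_ind(pc1[:3], pc1[3:])   # condition A vs B
    if p < 0.01:
        print(f"cluster {k}: n={len(cluster)}, p={p:.3g}")
```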
Lab-to-Lab Cooperative Threat Reduction
NASA Astrophysics Data System (ADS)
Hecker, Siegfried S.
2017-11-01
It is difficult to imagine today how dramatically global nuclear risks changed 25 years ago as the Soviet Union disintegrated. Instead of the threat of mutual nuclear annihilation, the world became concerned that Russia and the other 14 former Soviet states would lose control of their huge nuclear assets - tens of thousands of nuclear weapons, more than a million kilograms of fissile materials, hundreds of thousands of nuclear workers, and a huge nuclear complex. I will describe how scientists and engineers at the DOE laboratories, with a focus on Los Alamos, Lawrence Livermore and Sandia national laboratories, joined forces with those at the Russian nuclear weapon institutes for more than 20 years to avoid what looked like the perfect nuclear storm - a story told in the two-volume book Doomed to Cooperate published in 2016.
Emerging Tools to Estimate and to Predict Exposures to ...
The timely assessment of the human and ecological risk posed by thousands of existing and emerging commercial chemicals is a critical challenge facing EPA in its mission to protect public health and the environment. The US EPA has been conducting research to enhance methods used to estimate and forecast exposures for tens of thousands of chemicals. This research is aimed at both assessing risks and supporting life cycle analysis, by developing new models and tools for high-throughput exposure screening and prioritization, as well as databases that support these and other tools, especially regarding consumer products. The models and data address usage and take advantage of quantitative structure-activity relationships (QSARs) for both inherent chemical properties and function (why the chemical is a product ingredient). To make them more useful and widely available, the new tools, data, and models are designed to be:
• Flexible
• Interoperable
• Modular (useful to more than one stand-alone application)
• Open (publicly available software)
Presented at the Society for Risk Analysis Forum: Risk Governance for Key Enabling Technologies, Venice, Italy, March 1-3, 2017.
Fundraising for Accelerated Study for the PhD in Nursing: A Community Partnership.
Starck, Patricia L
2015-01-01
This article describes fundraising strategies used by a School of Nursing to support a post-master's accelerated (3-year) PhD degree program. A sample proposal to solicit funds is included, as well as a contract that students sign before accepting the scholarship, agreeing to teach for 3 years or repay the money. The first campaign raised $2.3 million for ten students, and the second campaign raised $1.3 million for six students. One useful marketing strategy is to show the impact of an investment in educating ten doctoral students who will become faculty and teach 100 additional students per year, who will then become professionals caring for thousands of patients during their careers. Over a 10-year period, the impact of an accelerated program is enormous, with 660 students taught who in their lifetime will care for 2.4 million patients. The article also discusses motivation and mindsets for giving to promote success in fundraising. Copyright © 2015 Elsevier Inc. All rights reserved.
A Blocked Linear Method for Optimizing Large Parameter Sets in Variational Monte Carlo
Zhao, Luning; Neuscamman, Eric
2017-05-17
We present a modification to variational Monte Carlo's linear method optimization scheme that addresses a critical memory bottleneck while maintaining compatibility with both the traditional ground state variational principle and our recently introduced variational principle for excited states. For wave function ansatzes with tens of thousands of variables, our modification reduces the required memory per parallel process from tens of gigabytes to hundreds of megabytes, making the methodology a much better fit for modern supercomputer architectures in which data communication and per-process memory consumption are primary concerns. We verify the efficacy of the new optimization scheme in small molecule tests involving both the Hilbert space Jastrow antisymmetric geminal power ansatz and real space multi-Slater Jastrow expansions. Satisfied with its performance, we have added the optimizer to the QMCPACK software package, with which we demonstrate on a hydrogen ring a prototype approach for making systematically convergent, non-perturbative predictions of Mott insulators' optical band gaps.
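The memory claim follows from simple arithmetic on the linear-method matrices; a back-of-the-envelope check, where the parameter count, block size, and number of dense matrices held per process are illustrative assumptions rather than the paper's exact figures:

```python
# Back-of-the-envelope check of the memory reduction; all counts are
# illustrative assumptions, not the paper's exact configuration.
n_params = 30_000                 # "tens of thousands of variables"
block = 1_000                     # parameters optimized per block
bytes_per_double = 8
matrices = 2                      # e.g., Hamiltonian and overlap blocks

full_gb = matrices * (n_params + 1) ** 2 * bytes_per_double / 1e9
block_mb = matrices * (block + 1) ** 2 * bytes_per_double / 1e6
print(f"unblocked linear method: ~{full_gb:.0f} GB per process")  # ~14 GB
print(f"blocked linear method:  ~{block_mb:.0f} MB per block")    # ~16 MB
```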
NASA Astrophysics Data System (ADS)
Haider, Shahid A.; Tran, Megan Y.; Wong, Alexander
2018-02-01
Observing the circular dichroism (CD) caused by organic molecules in biological fluids can provide powerful indicators of patient health and diagnostic clues for treatment. Methods for this kind of analysis involve tabletop devices that weigh tens of kilograms and cost on the order of tens of thousands of dollars, making them prohibitive in point-of-care diagnostic applications. In an effort to reduce the size, cost, and complexity of CD estimation systems for point-of-care diagnostics, we propose a novel method for CD estimation that leverages a vortex half-wave retarder between two linear polarizers and a two-dimensional photodetector array to provide an overall complexity reduction in the system. This enables the simultaneous measurement of polarization variations across multiple polarization states after they interact with a biological sample, without the need for mechanical actuation. We further discuss design considerations of this methodology in the context of practical applications to point-of-care diagnostics.
Small volcanic edifices and volcanism in the plains of Venus
NASA Technical Reports Server (NTRS)
Guest, John E.; Bulmer, Mark H.; Aubele, Jayne; Beratan, Kathi; Greeley, Ronald; Head, James W.; Michaels, Gregory; Weitz, Catherine; Wiles, Charles
1992-01-01
The different types of eruption that have occurred over time in the Venusian plains are considered. The most extensive volcanic units consist of flood lavas, the largest of which have volumes of the order of thousands of cubic kilometers. They are inferred to have erupted at high effusion rates, and they exhibit a range of radar backscatter characteristics indicating different surface textures and ages. Small edifices on the plains occur mainly in clusters associated with fracture belts. The majority are shield volcanoes that may be up to a few tens of kilometers across but are generally 10 km or less in diameter. Volcanic domes have diameters up to several tens of kilometers and volumes of the order of 100 cubic kilometers. These are interpreted as being constructed of lava erupted with a relatively high effective viscosity and thus possibly composed of more silicic lava. For many domes, the flanks were unstable during and after eruption and experienced gravity sliding that produced steep, scalloped outer margins.
2018-05-14
This image from NASA's Mars Reconnaissance Orbiter shows barchan sand dunes, common on Mars and often forming vast dune fields within very large (tens to hundreds of kilometers) impact basins. The regions upwind of barchans are usually devoid of sandy bedforms, so if you were walking in a downwind direction, then the barchans would seem to appear out of nowhere. As you walk downwind, you would notice the barchans link up ("joining arms") and eventually slope into featureless sand sheets. We call this progression of dunes a "Herschel-type dune field" named after the first place this sequence was described: Herschel Crater. But here is something interesting: a barchan dune filling the upwind portion of a small impact crater in a Pac-Man-like shape. This "dune-in-a-crater" is nearly at the highest extent of the field. It is also probably a rare configuration, and over the next few tens of thousands of years the sand will be blown out of the crater. https://photojournal.jpl.nasa.gov/catalog/PIA22456
Adaptive real-time dual-comb spectroscopy.
Ideguchi, Takuro; Poisson, Antonin; Guelachvili, Guy; Picqué, Nathalie; Hänsch, Theodor W
2014-02-27
The spectrum of a laser frequency comb consists of several hundred thousand equally spaced lines over a broad spectral bandwidth. Such frequency combs have revolutionized optical frequency metrology and they now hold much promise for significant advances in a growing number of applications including molecular spectroscopy. Despite an intriguing potential for the measurement of molecular spectra spanning tens of nanometres within tens of microseconds at Doppler-limited resolution, the development of dual-comb spectroscopy is hindered by the demanding stability requirements of the laser combs. Here we overcome this difficulty and experimentally demonstrate a concept of real-time dual-comb spectroscopy, which compensates for laser instabilities by electronic signal processing. It only uses free-running mode-locked lasers without any phase-lock electronics. We record spectra spanning the full bandwidth of near-infrared fibre lasers with Doppler-limited line profiles highly suitable for measurements of concentrations or line intensities. Our new technique of adaptive dual-comb spectroscopy offers a powerful transdisciplinary instrument for analytical sciences.
Diverse Redundant Systems for Reliable Space Life Support
NASA Technical Reports Server (NTRS)
Jones, Harry W.
2015-01-01
Reliable life support systems are required for deep space missions. The probability of a fatal life support failure should be less than one in a thousand in a multi-year mission. It is far too expensive to develop a single system with such high reliability. Using three redundant units would require only that each have a failure probability of one in ten over the mission. Since system development cost scales inversely with the failure probability, this would cut cost by a factor of one hundred. Using replaceable subsystems instead of full systems would further cut cost. Using full sets of replaceable components improves reliability more than using complete systems as spares, since a set of components could repair many different failures instead of just one. Replaceable components would require more tools, space, and planning than full systems or replaceable subsystems. However, identical system redundancy cannot be relied on in practice. Common cause failures can disable all the identical redundant systems. Typical levels of common cause failures will defeat redundancy greater than two. Diverse redundant systems are therefore required for reliable space life support. Three, four, or five diverse redundant systems could be needed for sufficient reliability. One system with lower-level repair could be substituted for two diverse systems to save cost.
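The redundancy arithmetic, plus a simple beta-factor model for the common-cause effect the abstract warns about, can be worked through directly; the beta value below is an illustrative assumption, not a figure from the paper.

```python
# Worked version of the abstract's reliability arithmetic with a simple
# beta-factor common-cause model; beta is an assumed, illustrative value.
p_unit = 0.10              # failure probability of one unit per mission

# Independent failures: all three redundant units must fail together.
p_independent = p_unit ** 3
print(p_independent)       # 0.001 -> meets the 1-in-1000 target

# Common-cause failures defeat redundancy: one shared fault kills all units.
beta = 0.05                # assumed fraction of failures that are common cause
p_system = beta * p_unit + (1 - beta) ** 3 * p_unit ** 3
print(f"{p_system:.4f}")   # ~0.006, dominated by the common-cause term
```

Even a modest common-cause fraction makes the shared-fault term dominate, which is the quantitative reason identical triple redundancy falls short and diverse systems are needed.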
Xu, Wenting; Zheng, Mei; Zhou, Feng
2015-01-01
In 2009, a global epidemic of influenza A(H1N1) virus caused the deaths of tens of thousands of people. Vaccination is the most effective means of controlling an influenza epidemic and reducing the mortality rate. In this study, the long-term immunogenicity of influenza A/California/7/2009 (H1N1) split vaccine was observed for as long as 15 months (450 days) after immunization in a mouse model. Female BALB/c mice were immunized intraperitoneally with different doses of aluminum-adjuvanted vaccine. The mice were challenged with a lethal dose (10× the 50% lethal dose [LD50]) of homologous virus 450 days after immunization. The results showed that the supplemented aluminum adjuvant not only effectively enhanced the protective effect of the vaccine but also reduced the immunizing dose of the vaccine. In addition, the aluminum adjuvant enhanced the IgG antibody level of mice immunized with the H1N1 split vaccine. The IgG level was correlated with the survival rate of the mice. Aluminum-adjuvanted inactivated split-virion 2009 pandemic influenza A H1N1 vaccine has good immunogenicity and provided long-term protection against lethal influenza virus challenge in mice. PMID:25589552
Infrastructure to Support Ultra High Throughput Biodosimetry Screening after a Radiological Event
Garty, G.; Karam, P.A.; Brenner, D. J.
2011-01-01
Purpose: After a large-scale radiological event, there will be a pressing need to assess, within a few days, the radiation doses received by tens or hundreds of thousands of individuals. This is for triage, to prevent treatment locations from being overwhelmed in what is sure to be a resource-limited scenario, as well as to facilitate dose-dependent treatment decisions. In addition there are psychosocial considerations, in that active reassurance of minimal exposure is a potentially effective antidote to mass panic, as well as long-term considerations, to facilitate later studies of cancer and other long-term disease risks.
Materials and Methods: As described elsewhere in this issue, we are developing a Rapid Automated Biodosimetry Tool (RABiT). The RABiT allows high-throughput analysis of thousands of blood samples per day, providing a dose estimate that can be used to support clinical triage and treatment decisions.
Results: Development of the RABiT has motivated us to consider the logistics of incorporating such a system into the existing emergency response scenarios of a large metropolitan area. We present here a view of how one or more centralized biodosimetry readout devices might be incorporated into an infrastructure in which fingerstick blood samples are taken at many distributed locations within an affected city or region and transported to centralized locations.
Conclusions: High-throughput biodosimetry systems offer the opportunity to perform biodosimetric assessments on a large number of persons. As such systems reach a high level of maturity, emergency response scenarios will need to be tweaked to make use of these powerful tools. This can be done relatively easily within the framework of current scenarios. PMID:21675819
Perspective: Ab initio force field methods derived from quantum mechanics
NASA Astrophysics Data System (ADS)
Xu, Peng; Guidez, Emilie B.; Bertoni, Colleen; Gordon, Mark S.
2018-03-01
It is often desirable to accurately and efficiently model the behavior of large molecular systems in the condensed phase (thousands to tens of thousands of atoms) over long time scales (from nanoseconds to milliseconds). In these cases, ab initio methods are difficult due to the increasing computational cost with the number of electrons. A more computationally attractive alternative is to perform the simulations at the atomic level using a parameterized function to model the electronic energy. Many empirical force fields have been developed for this purpose. However, the functions that are used to model interatomic and intermolecular interactions contain many fitted parameters obtained from selected model systems, and such classical force fields cannot properly simulate important electronic effects. Furthermore, while such force fields are computationally affordable, they are not reliable when applied to systems that differ significantly from those used in their parameterization. They also cannot provide the information necessary to analyze the interactions that occur in the system, making the systematic improvement of the functional forms that are used difficult. Ab initio force field methods aim to combine the merits of both types of methods. The ideal ab initio force fields are built on first principles and require no fitted parameters. Ab initio force field methods surveyed in this perspective are based on fragmentation approaches and intermolecular perturbation theory. This perspective summarizes their theoretical foundation, key components in their formulation, and discusses key aspects of these methods such as accuracy and formal computational cost. The ab initio force fields considered here were developed for different targets, and this perspective also aims to provide a balanced presentation of their strengths and shortcomings. Finally, this perspective suggests some future directions for this actively developing area.
Geochemistry of waters in the Valley of Ten Thousand Smokes region, Alaska
Keith, T.E.C.; Thompson, J.M.; Hutchinson, R.A.; White, L.D.
1992-01-01
Meteoric waters from cold springs and streams outside of the 1912 eruptive deposits filling the Valley of Ten Thousand Smokes (VTTS) and in the upper parts of the two major rivers draining the 1912 deposits have similar chemical trends. Thermal springs issue in the mid-valley area along a 300-m lateral section of ash-flow tuff, and range in temperature from 21 to 29.8°C in early summer and from 15 to 17°C in mid-summer. Concentrations of major and minor chemical constituents in the thermal waters are nearly identical regardless of temperature. Waters in the downvalley parts of the rivers draining the 1912 deposits are mainly mixtures of cold meteoric waters and thermal waters of which the mid-valley thermal spring waters are representative. The weathering reactions of cold waters with the 1912 deposits appear to have stabilized and add only subordinate amounts of chemical constituents to the rivers relative to those contributed by the thermal waters. Isotopic data indicate that the mid-valley thermal spring waters are meteoric, but the data are inconclusive regarding the heat source. The thermal waters could be either from a shallow part of a hydrothermal system beneath the 1912 vent region or from an incompletely cooled, welded tuff lens deep in the 1912 ash-flow sheet of the upper River Lethe area. Bicarbonate-sulfate waters resulting from interaction of near-surface waters and the cooling 1953-1968 southwest Trident plug issue from thermal springs south of Katmai Pass and near Mageik Creek, although the Mageik Creek spring waters are from a well-established, more deeply circulating hydrothermal system. Katmai caldera lake waters are a result of acid gases from vigorous drowned fumaroles dissolving in lake waters composed of snowmelt and precipitation. © 1992.
Kodosky, L.G.; Keith, T.E.C.
1993-01-01
Factor and canonical correlation analysis of geochemical data from eight fossil fumaroles suggest that six major factors controlled the formation and evolution of fumarolic encrustations on the 1912 ash-flow sheet in the Valley of Ten Thousand Smokes (VTTS). The six-factor solution model explains a large proportion (low of 74% for Ni to high of 99% for Si) of the individual element data variance. Although the primary fumarolic deposits have been degraded by secondary alteration reactions and up to 75 years of weathering, the relict encrustations still preserve a signature of vapor-phase element transport. This vapor-phase transport probably occurred as halide or oxyhalide species and was significant for As, Sb and Br. At least three, and possibly four, varied-temperature leaching events affected the fumarolic deposits. High-temperature gases/liquids heavily altered the ejecta glass and mineral phases adjacent to the fumarolic conduit. As the fumaroles cooled, Fe-rich acidic condensate leached the ejecta and primary fumarolic deposits and resulted in the subsequent precipitation of Fe-hydroxides and/or Fe-oxides. Low- to ambient-temperature leaching and hydration reactions generated abundant hydrated amorphous phases. Up to 87% of the individual element data variance is apparently controlled by the chemistry of the ejecta on which the relict encrustations are found. This matrix chemistry factor illustrates that the primary fumarolic minerals surrounding the active VTTS vents observed by earlier workers have been effectively removed by the dissolution reactions. Element enrichment factors calculated for the VTTS relict encrustations support the statistical factor interpretations. On the average, the relict encrustations are enriched, relative to visibly unaltered matrix protolith, in As, Br, Cr, Sb, Cu, Ni, Pb, Fe, and LOI (an indirect measure of sample H2O content). © 1993.
Saturn's F Ring Core: Calm in the Midst of Chaos
NASA Technical Reports Server (NTRS)
Cuzzi, J. N.; Whizin, A. D.; Hogan, R. C.; Dobrovolskis, A. R.; Dones, L.; Showalter, M. R.; Colwell, J. E.; Scargle, J. D.
2013-01-01
The long-term stability of the narrow F Ring core has been hard to understand. Instead of acting as "shepherds", Prometheus and Pandora together stir the vast preponderance of the region into a chaotic state, consistent with the orbits of newly discovered objects like S/2004S6. We show how a comb of very narrow radial locations of high stability in semimajor axis is embedded within this otherwise chaotic region. The stability of these semimajor axes relies fundamentally on the unusual combination of rapid apse precession and long synodic period which characterizes the region. This situation allows stable "antiresonances" to fall on or very close to traditional Lindblad resonances which, under more common circumstances, are destabilizing. We present numerical integrations of tens of thousands of test particles over tens of thousands of Prometheus orbits that map out the effect. The stable antiresonance zones are most stable in a subset of the region where Prometheus first-order resonances are least cluttered by Pandora resonances. This region of optimum stability is paradoxically closer to Prometheus than a location more representative of "torque balance", helping explain a longstanding paradox. One stable zone corresponds closely to the currently observed semimajor axis of the F Ring core. While the model helps explain the stability of the narrow F Ring core, it does not explain why the F Ring material all shares a common apse longitude; we speculate that collisional damping at the preferred semimajor axis (not included in the current simulations) may provide that final step. Essentially, we find that the F Ring core is not confined by a combination of Prometheus and Pandora, but a combination of Prometheus and precession.
Fevers and Chills: Separating thermal and synchrotron components in SNR spectra
NASA Astrophysics Data System (ADS)
Fedor, Emily Elizabeth; Martina-Hood, Hyourin; Stage, Michael D.
2018-06-01
Spatially-resolved spectroscopy is an extremely powerful tool in X-ray analysis of extended sources, but can be computationally difficult if a source exhibits complex morphology. For example, high-resolution Chandra data of bright Galactic supernova remnants (Cas A, Tycho, etc.) allow extractions of high-quality spectra from tens to hundreds of thousands of regions, providing a rich laboratory for localizing emission from processes such as thermal line emission, bremsstrahlung, and synchrotron. This soft-band analysis informs our understanding of the typically nonthermal hard X-ray emission observed with other lower-resolution instruments. The analysis is complicated by both projection effects and the presence of multiple emission mechanisms in some regions. In particular, identifying regions with significant nonthermal emission is critical to understanding acceleration processes in remnants. Fitting tens of thousands of regions with complex, multi-component models can be time-consuming and involve so many free parameters that little constraint can be placed on the values. Previous work by Stage & Allen ('06, '07, '11) on Cas A used a technique to identify regions dominated by the highest-cutoff synchrotron emission by fitting with a simple thermal emission model and flagging regions with anomalously high apparent temperatures (caused by the presence of the high-energy tail of the synchrotron emission component). Here, we present a similar technique. We verify the previous approach and, more importantly, expand it to include a method to identify regions containing strong lower-cutoff synchrotron radiation. Such regions might be associated with the reverse shock of a supernova. Identification of a nonthermal electron population in the interior of an SNR would have significant implications for the energy balance and emission mechanisms producing the high-energy (>10 keV) spectrum.
Landers, Richard N; Brusso, Robert C; Cavanaugh, Katelyn J; Collmus, Andrew B
2016-12-01
The term big data encompasses a wide range of approaches of collecting and analyzing data in ways that were not possible before the era of modern personal computing. One approach to big data of great potential to psychologists is web scraping, which involves the automated collection of information from webpages. Although web scraping can create massive big datasets with tens of thousands of variables, it can also be used to create modestly sized, more manageable datasets with tens of variables but hundreds of thousands of cases, well within the skillset of most psychologists to analyze, in a matter of hours. In this article, we demystify web scraping methods as currently used to examine research questions of interest to psychologists. First, we introduce an approach called theory-driven web scraping, in which the choice to use web-based big data must follow substantive theory. Second, we introduce data source theories, a term used to describe the assumptions a researcher must make about a prospective big data source in order to meaningfully scrape data from it. Critically, researchers must derive specific hypotheses to be tested based upon their data source theory, and if these hypotheses are not empirically supported, plans to use that data source should be changed or eliminated. Third, we provide a case study and sample code in Python demonstrating how web scraping can be conducted to collect big data, along with links to a web tutorial designed for psychologists. Fourth, we describe a 4-step process to be followed in web scraping projects. Fifth and finally, we discuss legal, practical, and ethical concerns faced when conducting web scraping projects. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
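In the spirit of the article's Python case study (this is not the authors' sample code; the URL and the choice of selector are placeholders), a minimal scraper might look like the following, assuming the requests and beautifulsoup4 packages are installed:

```python
# Minimal web-scraping sketch: fetch one page and extract its headings.
# URL and selector are placeholders, not the article's case study.
import requests
from bs4 import BeautifulSoup

def scrape_titles(url):
    """Fetch one page and return the text of its <h2> headings."""
    resp = requests.get(url, timeout=10)
    resp.raise_for_status()               # fail loudly on HTTP errors
    soup = BeautifulSoup(resp.text, "html.parser")
    return [h2.get_text(strip=True) for h2 in soup.find_all("h2")]

if __name__ == "__main__":
    print(scrape_titles("https://example.com"))
```

A real project would add polite crawling (rate limits, robots.txt checks) and loop over many pages, accumulating rows into a dataset for analysis.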
Xu, Jingxiang; Higuchi, Yuji; Ozawa, Nobuki; Sato, Kazuhisa; Hashida, Toshiyuki; Kubo, Momoji
2017-09-20
Ni sintering in the Ni/YSZ porous anode of a solid oxide fuel cell changes the porous structure, leading to degradation. Preventing sintering and degradation during operation is a great challenge. Usually, a sintering molecular dynamics (MD) simulation model consisting of two particles on a substrate is used; however, the model cannot reflect the porous structure effect on sintering. In our previous study, a multi-nanoparticle sintering modeling method with tens of thousands of atoms revealed the effect of the particle framework and porosity on sintering. However, the method cannot reveal the effect of the particle size on sintering and the effect of sintering on the change in the porous structure. In the present study, we report a strategy to reveal them in the porous structure by using our multi-nanoparticle modeling method and a parallel large-scale multimillion-atom MD simulator. We used this method to investigate the effect of YSZ particle size and tortuosity on sintering and degradation in the Ni/YSZ anodes. Our parallel large-scale MD simulation showed that the sintering degree decreased as the YSZ particle size decreased. The gas fuel diffusion path, which reflects the overpotential, was blocked by pore coalescence during sintering. The degradation of gas diffusion performance increased as the YSZ particle size increased. Furthermore, the gas diffusion performance was quantified by a tortuosity parameter and an optimal YSZ particle size, which is equal to that of Ni, was found for good diffusion after sintering. These findings cannot be obtained by previous MD sintering studies with tens of thousands of atoms. The present parallel large-scale multimillion-atom MD simulation makes it possible to clarify the effects of the particle size and tortuosity on sintering and degradation.
2014-01-01
Building large supertrees involves the collection, storage, and processing of thousands of individual phylogenies to create large phylogenies with thousands to tens of thousands of taxa. Such large phylogenies are useful for macroevolutionary studies, comparative biology, and conservation and biodiversity work. No easy-to-use, fully integrated software package currently exists to carry out this task. Here, we present a new Python-based software package that uses a well-defined XML schema to manage both data and metadata. It builds on previous versions by 1) including new processing steps, such as Safe Taxonomic Reduction, 2) using a user-friendly GUI that guides the user to complete at least the minimum information required and includes context-sensitive documentation, and 3) adopting a revised storage format that integrates both tree data and metadata into a single file. These data can then be manipulated according to a well-defined but flexible processing pipeline using either the GUI or a command-line tool. Processing steps include standardising names, deleting or replacing taxa, ensuring adequate taxonomic overlap, ensuring data independence, and Safe Taxonomic Reduction. This software has been successfully used to store and process data consisting of over 1,000 trees ready for analysis using standard supertree methods. This software makes large supertree creation a much easier task and provides far greater flexibility for further work. PMID:24891820
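The package itself is not shown in this record; as a toy illustration of one processing step named above, standardising names, a synonym-substitution pass over Newick tree strings might look like this, with all names and trees hypothetical:

```python
# Toy illustration of standardising taxon names across source trees via a
# synonym table. Names and trees are hypothetical; not the package's code.
import re

SYNONYMS = {"Homo_sapien": "Homo_sapiens", "Felis_catus": "Felis_silvestris"}

def standardise_names(newick, synonyms=SYNONYMS):
    """Replace synonymous taxon labels in a Newick tree string."""
    return re.sub(r"\w+", lambda m: synonyms.get(m.group(0), m.group(0)), newick)

print(standardise_names("((Homo_sapien,Pan_troglodytes),Felis_catus);"))
# -> ((Homo_sapiens,Pan_troglodytes),Felis_silvestris);
```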
An iterative method for airway segmentation using multiscale leakage detection
NASA Astrophysics Data System (ADS)
Nadeem, Syed Ahmed; Jin, Dakai; Hoffman, Eric A.; Saha, Punam K.
2017-02-01
There are growing applications of quantitative computed tomography for the assessment of pulmonary diseases by characterizing the lung parenchyma as well as the bronchial tree. Many large multi-center studies incorporating lung imaging as a study component are interested in phenotypes relating airway branching patterns, wall thickness, and other morphological measures. To our knowledge, there are no fully automated airway tree segmentation methods free of the need for user review. Even when failures occur in only a small fraction of segmentation results, the airway tree masks must be manually reviewed for all results, which is laborious considering that several thousand image data sets are evaluated in large studies. In this paper, we present a novel CT-based airway tree segmentation algorithm using iterative multiscale leakage detection, freezing, and active seed detection. The method is fully automated, requiring no manual inputs or post-segmentation editing. It uses simple intensity-based connectivity and a new leakage detection algorithm to iteratively grow an airway tree starting from an initial seed inside the trachea. It begins with a conservative threshold and then iteratively shifts toward more generous values. The method was applied to chest CT scans of ten non-smoking subjects at total lung capacity and ten at functional residual capacity. Airway segmentation results were compared to an expert's manually edited segmentations. Branch-level accuracy of the new segmentation method was examined along five standardized segmental airway paths (RB1, RB4, RB10, LB1, LB10) and two generations beyond these branches. The method successfully detected all branches up to two generations beyond these segmental bronchi with no visual leakages.
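The paper's multiscale leakage detector is more involved than can be shown here; the following Python sketch illustrates only the grow-freeze skeleton, with a crude stand-in leakage criterion (a sudden jump in segmented volume) that is an assumption, not the authors' method:

```python
# Minimal sketch of iterative threshold relaxation with freezing. The
# leakage criterion here (a sudden jump in segmented volume) is a crude
# stand-in for the paper's multiscale detector.
import numpy as np
from scipy import ndimage

def grow_airway(ct, seed, thresholds, jump_factor=2.0):
    """Grow a seeded airway mask over increasingly generous HU thresholds."""
    mask = np.zeros(ct.shape, dtype=bool)
    mask[seed] = True
    prev_volume = None
    for t in thresholds:  # e.g., stepping from -1000 HU toward -800 HU
        candidate = ct < t
        labels, _ = ndimage.label(candidate)
        keep = np.unique(labels[mask & candidate])
        grown = np.isin(labels, keep[keep > 0]) | mask
        if prev_volume is not None and grown.sum() > jump_factor * prev_volume:
            break  # likely leakage into parenchyma: freeze the previous mask
        mask, prev_volume = grown, int(grown.sum())
    return mask
```

Freezing the mask at the last safe threshold is what lets the real algorithm then re-seed distal branches and continue growing elsewhere.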
Walker, Bruce F
2016-01-01
Physical manipulation and manual therapies are thousands of years old. The most popular western-world iteration of these therapies is delivered by chiropractors. It can be argued that the collective public health benefit from chiropractic for spinal pain has been very substantial; however, as chiropractic has transitioned from craft to profession it has encountered many internally and externally driven machinations that have retarded its progress toward becoming a fully accepted allied health profession. This article sets out a ten-point plan for a new chiropractic that will achieve full acceptance for this troubled profession. This article is based on a keynote speech known as the FG Roberts Memorial Address, delivered on October 10, 2015, in Melbourne, Australia at the Chiropractic & Osteopathic College of Australasia and Chiropractic Australia national conference. The ten-point plan consists of the following: improving the pre-professional education of chiropractors, establishing a progressive identity, developing a special interest for the profession, marginalising the nonsensical elements of the profession, being pro-public health, supporting the legitimate organised elements of the profession, improving clinical practice, embracing evidence-based practice, supporting research, and showing personal leadership. Adherence to this fresh ten-point plan will, over time, see the chiropractic profession gain full legitimacy in the allied health field and acceptance by other health providers, policy makers, and the public at large.
Staton, Margaret; Best, Teodora; Khodwekar, Sudhir; Owusu, Sandra; Xu, Tao; Xu, Yi; Jennings, Tara; Cronn, Richard; Arumuganathan, A. Kathiravetpilla; Coggeshall, Mark; Gailing, Oliver; Liang, Haiying; Romero-Severson, Jeanne; Schlarbaum, Scott; Carlson, John E.
2015-01-01
Forest health issues are on the rise in the United States, resulting from the introduction of alien pests and diseases, coupled with abiotic stresses related to climate change. Increasingly, forest scientists are finding genetic and genomic resources valuable in addressing forest health issues. For a set of ten ecologically and economically important native hardwood tree species representing a broad phylogenetic spectrum, we used low-coverage whole-genome sequencing from multiplexed Illumina paired ends to economically profile their genomic content. For six species, the genome content was further analyzed by flow cytometry in order to determine the nuclear genome size. Sequencing yielded a depth of 0.8X to 7.5X, from which in silico analysis yielded preliminary estimates of gene and repetitive sequence content in the genome for each species. Thousands of genomic SSRs were identified, with a clear predisposition toward dinucleotide repeats and AT-rich repeat motifs. Flanking primers were designed for SSR loci in all ten species, ranging from 891 loci in sugar maple to 18,167 in redbay. In summary, we have demonstrated that useful preliminary genome information, including repeat content, gene content, and useful SSR markers, can be obtained at low cost and time input from a single lane of Illumina multiplexed sequence. PMID:26698853
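As a toy illustration of the SSR mining step described above (real pipelines are more thorough about motif classes and imperfect repeats), a dinucleotide-repeat scan in Python might look like this:

```python
# Toy SSR scan: locate dinucleotide simple sequence repeats with a regex.
# Real SSR pipelines handle more motif classes and imperfect repeats.
import re

def find_dinucleotide_ssrs(seq, min_repeats=6):
    """Yield (start, unit, length) for each dinucleotide repeat in seq."""
    for m in re.finditer(r"([ACGT]{2})\1{%d,}" % (min_repeats - 1), seq):
        unit = m.group(1)
        if unit[0] != unit[1]:  # skip mononucleotide runs like AAAAAA
            yield m.start(), unit, len(m.group(0))

seq = "TTGACACACACACACACGGTATATATATATATATAT"
for hit in find_dinucleotide_ssrs(seq):
    print(hit)  # (3, 'AC', 14) and (20, 'AT', 16)
```

Once loci like these are located, flanking primers can be designed against the unique sequence on either side of each repeat.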
Kawabayashi, Yukari; Furuno, Makoto; Uchida, Marina; Kawana, Takashi
2013-04-01
The aim of this study was to estimate the budget impact, on a health insurance society and an industry, of promoting decision-making on grants for vaccination as prophylaxis against cervical cancer (CC) by the health insurance society for employees. The target population was Japanese female employees aged 20 to 34 and the partners and daughters of male employees working for an overseas IT company. Using a prevalence-based model, the authors estimated expected costs in non-vaccination and vaccination scenarios and evaluated the 10-year financial impact on the industry after vaccination by employing a cost-benefit analysis. The incidence of CC in the target group was derived from the actual number of patients with CC in addition to data from JMDC's receipt database and estimated by a Bayesian method. Epidemiological parameters such as the mortality rate, screening rate, detailed exam rate, and detailed exam consultation rate were taken from epidemiology statistics and published articles available in Japan. Healthcare costs for cancer treatment, screening, detailed exams, and vaccination, estimated based on medical fee points, were input into the model, but the analysis did not consider side effect-related costs. In addition, productivity costs for mortality among employees and their families due to CC, estimated from national employee statistics, were also input into the model. Discounting was not applied. From the perspective of the healthcare insurance society, expenditure of approximately 129 million yen in the non-vaccination scenario was expected over ten years, whereas healthcare-related costs fell to approximately 73 million yen with 100% of employees and their families vaccinated at a subsidy cost of approximately 55 million yen. The insurance society lost approximately 1.8 million yen in total if the subsidy for vaccination was set at ten thousand yen. In the case of a 100% vaccination rate, the company can avoid productivity losses of approximately 563 million yen over ten years as compared with non-inoculation. Furthermore, family finances can save approximately 2.6 million yen, based on our analysis. Sensitivity analyses suggested that subsidy expenses, the uptake rate of vaccination, and the time horizon influenced the mortality cost from the perspectives of the company and the employees' families. A grant by the health insurance society for vaccinating women who are not targeted by public grants is meaningful for the prevention of CC. An appropriate grant for vaccination by the health insurance society was deemed to be approximately ten thousand yen.
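As a back-of-the-envelope restatement of the insurer-side figures above (all values in million yen, "approximately" as reported, hence the rounding gap noted in the comments):

```python
# Back-of-the-envelope restatement of the insurer-side figures above
# (million yen, rounded as reported in the abstract).
non_vaccination_spend = 129   # expected 10-year healthcare spend, no vaccination
vaccination_spend = 73        # healthcare spend with 100% vaccination
subsidy_outlay = 55           # subsidies paid by the insurer at 10,000 yen each

insurer_net = non_vaccination_spend - (vaccination_spend + subsidy_outlay)
print(f"Insurer net over ten years: {insurer_net:+} million yen")
# Prints +1 here, whereas the abstract reports a loss of about 1.8 million
# yen; the gap comes from the rounding of the three "approximately" figures.
```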
Holographic studies of the vapor explosion of vaporizing water-in-fuel emulsion droplets
NASA Technical Reports Server (NTRS)
Sheffield, S. A.; Hess, C. F.; Trolinger, J. D.
1982-01-01
Holographic studies were performed which examined the fragmentation process during vapor explosion of a water-in-fuel (hexadecane/water) emulsion droplet. Holograms were taken at 700 to 1000 microseconds after the vapor explosion. Photographs of the reconstructed holograms reveal a wide range of fragment droplet sizes created during the explosion process. Fragment droplet diameters range from below 10 microns to over 100 microns. It is estimated that between ten thousand and a million fragment droplets can result from this extremely violent vapor explosion process. This enhanced atomization is thus expected to have a pronounced effect on vaporization processes which are present during combustion of emulsified fuels.
A program continuation to develop processing procedures for advanced silicon solar cells
NASA Technical Reports Server (NTRS)
Avery, J. E.; Scott-Monck, J. A.
1976-01-01
Shallow junctions, aluminum back surface fields, and tantalum pentoxide (Ta2O5) antireflection coatings, coupled with the development of a chromium-palladium-silver contact system, were used to produce a 2 x 4 cm wraparound contact silicon solar cell. One thousand cells were successfully fabricated using batch processing techniques. These cells were 0.020 mm thick, with the majority (800) made from nominal ten ohm-cm silicon and the remainder from nominal 30 ohm-cm material. Unfiltered, these cells delivered a minimum AM0 efficiency at 25 C of 11.5 percent and successfully passed all the normal in-process and acceptance tests required for space flight cells.
2014 Summer Series - Salman Khan - Khan Academy: Education Re-imagined
2014-06-26
In 2004, Khan began tutoring his young cousin in math. By 2006, word got around and Khan was tutoring 15 family friends and cousins as a hobby. He also began posting videos of his hand-scribbled tutorials on YouTube. In 2009, when the practice problems and instructional videos were reaching tens of thousands of students per month, he quit his day job to commit himself fully to the not-for-profit Khan Academy. It's now the most-used library of educational lessons on the web, with over 10 million unique students per month, over 300 million lessons delivered, and over a billion exercises completed.
Hybridization of Environmental Microbial Community Nucleic Acids by GeoChip.
Van Nostrand, Joy D; Yin, Huaqin; Wu, Liyou; Yuan, Tong; Zhou, Jizhong
2016-01-01
Functional gene arrays, like the GeoChip, allow for the study of tens of thousands of genes in a single assay. The GeoChip array (5.0) contains probes for genes involved in geochemical cycling (N, C, S, and P), metal homeostasis, stress response, organic contaminant degradation, antibiotic resistance, secondary metabolism, and virulence factors as well as genes specific for fungi, protists, and viruses. Here, we briefly describe GeoChip design strategies (gene selection and probe design) and discuss minimum quantity and quality requirements for nucleic acids. We then provide detailed protocols for amplification, labeling, and hybridization of samples to the GeoChip.
Archiving strategy for USGS EROS center and our future direction
Faundeen, John L.
2010-01-01
The U.S. Geological Survey's Earth Resources Observation and Science Center has the responsibility to acquire, manage, and preserve our Nation's land observations. These records are obtained primarily from airplanes and satellites dating back to the 1930s. The ability to compare landscapes from the past with current information enables change analysis at local and global scales. With new observations added daily, the records management challenges are daunting, involving petabytes of electronic data and tens of thousands of rolls of analog film. This paper focuses upon the appraisal and preservation functions employed to ensure that these records are available for current and future generations.
1995-01-01
The digital Data Matrix code, used to identify millions of Space Shuttle parts, is being commercialized to make barcoding tamper resistant and invisible to the naked eye. These codes are applied directly to the product regardless of shape, size, or color, and the markings can range from as small as four microns to as large as two square feet. The Vericode Symbol can encode such details as the manufacturer, serial number, lot number of the parent material, design changes, and any special processing to which the part was subjected: everything needed to determine, accurately and automatically, the extent of a recall, which might then involve a couple of hundred cars instead of tens of thousands.
Mud Volcanoes as Exploration Targets on Mars
NASA Technical Reports Server (NTRS)
Allen, Carlton C.; Oehler, Dorothy Z.
2010-01-01
Tens of thousands of high-albedo mounds occur across the southern part of the Acidalia impact basin on Mars. These structures have geologic, physical, mineralogic, and morphologic characteristics consistent with an origin from a sedimentary process similar to terrestrial mud volcanism. The potential for mud volcanism in the Northern Plains of Mars has been recognized for some time, with candidate mud volcanoes reported from Utopia, Isidis, northern Borealis, Scandia, and the Chryse-Acidalia region. We have proposed that the profusion of mounds in Acidalia is a consequence of this basin's unique geologic setting as the depocenter for the fine fraction of sediments delivered by the outflow channels from the highlands.
ICPS Turnover GSDO Employee Event
2017-11-07
In the Space Station Processing Facility at NASA's Kennedy Space Center in Florida, a ceremony is underway marking the agency's Spacecraft/Payload Integration and Evolution (SPIE) organization formally turning over processing of the Space Launch System (SLS) rocket's Interim Cryogenic Propulsion Stage (ICPS), to the center's Ground Systems Development and Operations (GSDO) Directorate. The ICPS is seen on the left in its shipping container and is the first integrated piece of flight hardware to arrive in preparation for the uncrewed Exploration Mission-1. With the Orion attached, the ICPS sits atop the SLS rocket and will provide the spacecraft with the additional thrust needed to travel tens of thousands of miles beyond the Moon.
NASA Technical Reports Server (NTRS)
Rosenfield, D.; Fiksel, J.
1980-01-01
A Poisson-type model was developed and exercised to estimate the risk of economic losses through 1993 due to potential electrical effects of carbon fibers released from United States general aviation aircraft in the aftermath of a fire. Of the 354 general aviation aircraft accidents with fire projected annually for 1993, approximately 88 could involve carbon fibers. The average annual loss was estimated to be about $250 (1977 dollars), and the likelihood of the annual loss exceeding $107,000 (1977 dollars) in any one year was estimated to be at most one in ten thousand.
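As a sketch of how such annual-loss figures can be explored by simulation, the following compound-Poisson Monte Carlo uses the abstract's figure of 88 expected fiber-involving accidents per year; the lognormal per-incident severity is a hypothetical stand-in, since the study's loss distributions are not given here:

```python
# Compound-Poisson Monte Carlo sketch of annual loss. The incident rate is
# taken from the abstract (88 fiber-involving accidents per year); the
# lognormal severity is a hypothetical stand-in for the study's distributions.
import numpy as np

rng = np.random.default_rng(0)
n_years = 100_000
incident_rate = 88            # expected fiber-involving accidents per year
mu, sigma = 0.0, 1.5          # assumed lognormal severity parameters

counts = rng.poisson(incident_rate, size=n_years)
annual_loss = np.array([rng.lognormal(mu, sigma, size=n).sum() for n in counts])

print("mean annual loss:", annual_loss.mean())
print("P(annual loss > 10x mean):", (annual_loss > 10 * annual_loss.mean()).mean())
```

Tail probabilities like the study's one-in-ten-thousand figure are read directly off the empirical distribution of simulated years.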
Over ten thousand cases and counting: acidbase.org is serving the critical care community.
Elbers, Paul W G; Van Regenmortel, Niels; Gatz, Rainer
2015-01-01
Acidbase.org has been serving the critical care community for over a decade. The backbone of this online resource is Peter Stewart's original text "How to Understand Acid-Base," which is freely available to everyone. In addition, Stewart's Textbook of Acid Base, which puts the theory in today's clinical context, is available for purchase from the website. However, many intensivists use acidbase.org on a daily basis for its educational content and in particular for its analysis module. This review provides an overview of the history of the website, a tutorial, and descriptive statistics of over 10,000 queries submitted to the analysis module.
Reconciling short recurrence intervals with minor deformation in the new madrid seismic zone.
Schweig, E S; Ellis, M A
1994-05-27
At least three great earthquakes occurred in the New Madrid seismic zone in 1811 and 1812. Estimates of present-day strain rates suggest that such events may have a repeat time of 1000 years or less. Paleoseismological data also indicate that earthquakes large enough to cause soil liquefaction have occurred several times in the past 5000 years. However, pervasive crustal deformation expected from such a high frequency of large earthquakes is not observed. This suggests that the seismic zone is a young feature, possibly as young as several tens of thousands of years old and no more than a few million years old.
Active Regions' Magnetic Connection
2017-05-22
Several bright bands of plasma connect from one active region to another, even though they are tens of thousands of miles away from each other (May 17-18, 2017). Active regions are, by their nature, strong magnetic areas with north and south poles. The plasma consists of charged particles that stream along the magnetic field lines between these two regions. These connecting lines are clearly visible in this wavelength of extreme ultraviolet light. Other loops and strands of bright plasma can be seen rising up and out of smaller active regions as well. The video covers about one day's worth of activity. Movies are available at https://photojournal.jpl.nasa.gov/catalog/PIA21638
Wang, Qian; He, An M; Gao, Bo; Chen, Lan; Yu, Qiang Z; Guo, Huan; Shi, Bin J; Jiang, Pu; Zhang, Zeng Y; Li, Ping L; Sheng, Ying G; Fu, Mo J; Wu, Chun T; Chen, Min X; Yuan, Jing
2011-01-01
In recent years, adverse health effects of chemicals from electronic waste (e-waste) have been reported. However, little is known about the genotoxic effects of chemicals in e-waste. In the present study, air concentrations of toxic metals at e-waste and control sites were analyzed using inductively coupled plasma mass spectrometry. Levels of toxic metals (lead, copper, and cadmium) in blood and urine were measured using atomic absorption spectrophotometry in 48 exposed individuals and 56 age- and sex-matched controls. The frequencies of lymphocytic micronucleated binucleated cells (MNBNCs) were determined using a cytokinesis-block micronucleus assay. Results indicated that blood lead levels were significantly higher in the exposed group (median: 11.449 μg/dL, 1st/3rd quartiles: 9.351-14.410 μg/dL) than in the control group (median: 9.104 μg/dL, 1st/3rd quartiles: 7.275-11.389 μg/dL). The exposed group had higher MNBNC frequencies (median: 4.0 per thousand, 1st/3rd quartiles: 2.0-7.0 per thousand) compared with the controls (median: 1.0 per thousand, 1st/3rd quartiles: 0.0-2.0 per thousand). Additionally, MNBNC frequencies and blood lead levels were positively correlated (r = 0.254, p<0.01). Further analysis suggested that a history of working with e-waste was a predictor of increased blood lead levels and MNBNC frequencies in the subjects. The results suggest that both the living and occupational environments at the e-waste site may be risk factors for increased MNBNC frequencies among exposed individuals.
Wakefield, Ewan D; Owen, Ellie; Baer, Julia; Carroll, Matthew J; Daunt, Francis; Dodd, Stephen G; Green, Jonathan A; Guilford, Tim; Mavor, Roddy A; Miller, Peter I; Newell, Mark A; Newton, Stephen F; Robertson, Gail S; Shoji, Akiko; Soanes, Louise M; Votier, Stephen C; Wanless, Sarah; Bolton, Mark
2017-10-01
Population-level estimates of species' distributions can reveal fundamental ecological processes and facilitate conservation. However, these may be difficult to obtain for mobile species, especially colonial central-place foragers (CCPFs; e.g., bats, corvids, social insects), because it is often impractical to determine the provenance of individuals observed beyond breeding sites. Moreover, some CCPFs, especially in the marine realm (e.g., pinnipeds, turtles, and seabirds), are difficult to observe because they range from tens to ten thousand kilometers from their colonies. It is hypothesized that the distribution of CCPFs depends largely on habitat availability and intraspecific competition. Modeling these effects may therefore allow distributions to be estimated from samples of individual spatial usage. Such data can be obtained for an increasing number of species using tracking technology. However, techniques for estimating population-level distributions from telemetry data are poorly developed. This is of concern because many marine CCPFs, such as seabirds, are threatened by anthropogenic activities. Here, we aim to estimate the distribution at sea of four seabird species foraging from approximately 5,500 breeding sites in Britain and Ireland. To do so, we GPS-tracked a sample of 230 European Shags Phalacrocorax aristotelis, 464 Black-legged Kittiwakes Rissa tridactyla, 178 Common Murres Uria aalge, and 281 Razorbills Alca torda from 13, 20, 12, and 14 colonies, respectively. Using Poisson point process habitat use models, we show that distribution at sea depends on (1) density-dependent competition among sympatric conspecifics (all species) and parapatric conspecifics (Kittiwakes and Murres); (2) habitat accessibility and coastal geometry, such that birds travel further from colonies with limited access to the sea; and (3) regional habitat availability. Using these models, we predict space use by birds from unobserved colonies and thereby map the distribution at sea of each species at both the colony and regional levels. Space use by all four species' British breeding populations is concentrated in the coastal waters of Scotland, highlighting the need for robust conservation measures in this area. The techniques we present are applicable to any CCPF. © 2017 by the Ecological Society of America.
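Poisson point process models of this kind are, in practice, often fitted as a Poisson regression over gridded usage counts with covariates such as distance to colony. The following sketch uses synthetic data and assumed covariates; it illustrates the model class, not the authors' fitted models:

```python
# Minimal sketch: a Poisson point process habitat-use model fitted as a
# Poisson GLM on gridded usage counts. Synthetic data and covariates; this
# approximates the model class, not the authors' fitted models.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n_cells = 2000
dist_to_colony = rng.uniform(0, 300, n_cells)   # km from colony
habitat_quality = rng.normal(0, 1, n_cells)     # e.g., a prey-availability proxy

# Simulate GPS-fix counts per grid cell from a known log-linear intensity.
log_rate = 1.0 - 0.02 * dist_to_colony + 0.5 * habitat_quality
counts = rng.poisson(np.exp(log_rate))

X = sm.add_constant(np.column_stack([dist_to_colony, habitat_quality]))
fit = sm.GLM(counts, X, family=sm.families.Poisson()).fit()
print(fit.params)  # should roughly recover (1.0, -0.02, 0.5)
```

With coefficients in hand, the fitted intensity surface can be evaluated at unobserved colonies, which is the step that lets tracking samples be scaled up to population-level maps.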
Climate uncertainty and implications for U.S. state-level risk assessment through 2050.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Loose, Verne W.; Lowry, Thomas Stephen; Malczynski, Leonard A.
2009-10-01
Decisions for climate policy will need to take place in advance of climate science resolving all relevant uncertainties. Further, if the concern of policy is to reduce risk, then the best estimate of climate change impacts may not be as important as the currently understood uncertainty associated with realizable conditions of high consequence. This study focuses on one of the most uncertain aspects of future climate change - precipitation - to understand the implications of uncertainty for risk and the near-term justification for interventions to mitigate the course of climate change. We show that the mean risk of damage to the economy from climate change, at the national level, is on the order of one trillion dollars over the next 40 years, with employment impacts of nearly 7 million labor-years. At a 1% exceedance probability, the impact is over twice the mean-risk value. Impacts at the level of individual U.S. states are typically in the range of multiple tens of billions of dollars, with employment losses exceeding hundreds of thousands of labor-years. We used results of the Intergovernmental Panel on Climate Change's (IPCC) Fourth Assessment Report (AR4) climate-model ensemble as the referent for climate uncertainty over the next 40 years, mapped the simulated weather hydrologically to the county level to determine the physical consequence for economic activity at the state level, and then performed a detailed, seventy-industry analysis of economic impact among the interacting lower-48 states. We determined industry GDP and employment impacts at the state level, as well as interstate population migration, effects on personal income, and the consequences for the U.S. trade balance.
Sheron, Nick
2016-04-01
In the World Health Organisation European Region, more than 2,370,000 years of life are lost to liver disease before the age of 50; more than to cancers of the lung, trachea, bronchus, oesophagus, stomach, colon, rectum, and pancreas combined. Between 60% and 80% of these deaths are alcohol related, and no pharmaceutical therapy has yet been shown to improve long-term survival in alcohol-related liver disease. The toxicity of alcohol is dose related at the individual level and at the population level; overall liver mortality is largely determined by population alcohol consumption. Trends in alcohol consumption correlate closely with trends in overall liver mortality, with three- to five-fold decreases or increases in liver mortality in different European countries over the last few decades. The evidence base for alcohol control measures aimed at reducing population alcohol consumption has been subjected to rigorous evaluation, most recently by the Organisation for Economic Co-operation and Development (OECD). Effective alcohol policy measures reduce alcohol mortality, including mortality from liver disease. The most effective and cost-effective measures have been summarised by the OECD and the World Health Organisation: regular incremental above-inflation tax increases, a minimum price for alcohol, effective protection of children from alcohol marketing, and low-level interventions from clinicians. Simple, cheap, and effective changes to alcohol policy by European institutions and member states have the potential to dramatically reduce liver mortality in Europe. Copyright © 2016. Published by Elsevier B.V.
Efficient implementation of core-excitation Bethe-Salpeter equation calculations
NASA Astrophysics Data System (ADS)
Gilmore, K.; Vinson, John; Shirley, E. L.; Prendergast, D.; Pemmaraju, C. D.; Kas, J. J.; Vila, F. D.; Rehr, J. J.
2015-12-01
We present an efficient implementation of the Bethe-Salpeter equation (BSE) method for obtaining core-level spectra, including X-ray absorption (XAS), X-ray emission (XES), and both resonant and non-resonant inelastic X-ray scattering spectra (N/RIXS). Calculations are based on density functional theory (DFT) electronic structures generated either by ABINIT or QuantumESPRESSO, both plane-wave-basis, pseudopotential codes. This electronic structure is improved through the inclusion of a GW self-energy. The projector augmented wave technique is used to evaluate transition matrix elements between core-level and band states. Final two-particle scattering states are obtained with the NIST core-level BSE solver (NBSE). We have previously reported this implementation, which we refer to as OCEAN (Obtaining Core Excitations from Ab initio electronic structure and NBSE) (Vinson et al., 2011). Here, we present additional efficiencies that enable us to evaluate spectra for systems ten times larger than previously possible, containing up to a few thousand electrons. These improvements include the implementation of optimal basis functions that reduce the cost of the initial DFT calculations, more complete parallelization of the screening calculation and of the action of the BSE Hamiltonian, and various memory reductions. Scaling is demonstrated on supercells of SrTiO3, and example spectra for the organic light-emitting molecule Tris-(8-hydroxyquinoline)aluminum (Alq3) are presented. The ability to perform large-scale spectral calculations is particularly advantageous for investigating dilute or non-periodic systems such as doped materials, amorphous systems, or complex nanostructures.