Pouria Bahmani; John W. van de Lindt; Mikhail Gershfeld; Gary L. Mochizuki; Steven E. Pryor; Douglas Rammer
2016-01-01
Soft-story wood-frame buildings have been recognized as a disaster preparedness problem for decades. There are tens of thousands of these multifamily three- and four-story structures throughout California and other parts of the United States. The majority were constructed between 1920 and 1970 and are prevalent in regions such as the San Francisco Bay Area in...
Pouria Bahmani; John van de Lindt; Asif Iqbal; Douglas Rammer
2017-01-01
Soft-story wood-frame buildings have been recognized as a disaster preparedness problem for decades. There are tens of thousands of these multi-family three- and four-story structures throughout California and the United States. The majority were constructed between 1920 and 1970, with many being prevalent in the San Francisco Bay Area in California. The NEES Soft...
47 CFR 52.20 - Thousands-block number pooling.
Code of Federal Regulations, 2010 CFR
2010-10-01
... separated into ten sequential blocks of 1,000 numbers each (thousands-blocks), and allocated separately... required to participate in thousands-block number pooling shall donate thousands-blocks with ten percent or... ten percent or less contaminated, as an initial block or footprint block. (d) Thousands-Block Pooling...
2013-10-03
the Stanford NLP Suite to create annotated dictionaries based on word morphologies; the human descriptions provide the input. The predicted...keywords from the low level topic models are labeled through these dictionaries. For more than two POS for the same morphology, we prefer verbs, but other...redundancy particularly retaining subjects like "man," "woman" etc. and verb morphologies (which otherwise stem to the same prefix) as proxies for ten
Gerend, Mary A.; Shepherd, Janet E.
2012-01-01
Background Although theories of health behavior have guided thousands of studies, relatively few studies have compared these theories against one another. Purpose The purpose of the current study was to compare two classic theories of health behavior—the Health Belief Model (HBM) and the Theory of Planned Behavior (TPB)—in their prediction of human papillomavirus (HPV) vaccination. Methods After watching a gain-framed, loss-framed, or control video, women (N=739) ages 18–26 completed a survey assessing HBM and TPB constructs. HPV vaccine uptake was assessed ten months later. Results Although the message framing intervention had no effect on vaccine uptake, support was observed for both the TPB and HBM. Nevertheless, the TPB consistently outperformed the HBM. Key predictors of uptake included subjective norms, self-efficacy, and vaccine cost. Conclusions Despite the observed advantage of the TPB, findings revealed considerable overlap between the two theories and highlighted the importance of proximal versus distal predictors of health behavior. PMID:22547155
Real-Time Interactive Tree Animation.
Quigley, Ed; Yu, Yue; Huang, Jingwei; Lin, Winnie; Fedkiw, Ronald
2018-05-01
We present a novel method for posing and animating botanical tree models interactively in real time. Unlike other state of the art methods which tend to produce trees that are overly flexible, bending and deforming as if they were underwater plants, our approach allows for arbitrarily high stiffness while still maintaining real-time frame rates without spurious artifacts, even on quite large trees with over ten thousand branches. This is accomplished by using an articulated rigid body model with as-stiff-as-desired rotational springs in conjunction with our newly proposed simulation technique, which is motivated both by position based dynamics and the typical algorithms for articulated rigid bodies. The efficiency of our algorithm allows us to pose and animate trees with millions of branches or alternatively simulate a small forest comprised of many highly detailed trees. Even using only a single CPU core, we can simulate ten thousand branches in real time while still maintaining quite crisp user interactivity. This has allowed us to incorporate our framework into a commodity game engine to run interactively even on a low-budget tablet. We show that our method is amenable to the incorporation of a large variety of desirable effects such as wind, leaves, fictitious forces, collisions, fracture, etc.
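To illustrate the constraint-projection idea that position based dynamics contributes to the approach above, here is a minimal planar sketch: joint angles of a branch chain are repeatedly blended toward their rest pose with a stiffness factor in [0, 1], which remains stable for arbitrarily stiff branches, unlike an explicit rotational-spring integrator that needs tiny time steps. This is an illustrative toy under assumed values, not the authors' articulated rigid body formulation.

```python
import numpy as np

def relax_branch(angles, rest_angles, stiffness=0.9, iters=10):
    """Project joint angles toward their rest pose, PBD-style. A stiffness in
    [0, 1] is unconditionally stable, so branches can be made as stiff as
    desired without the small time steps an explicit spring integrator needs."""
    a = np.array(angles, dtype=float)
    rest = np.asarray(rest_angles, dtype=float)
    for _ in range(iters):
        a += stiffness * (rest - a)
    return a

def forward_kinematics(angles, seg_len=1.0):
    """Rebuild planar branch positions from accumulated joint angles."""
    pts, heading = [np.zeros(2)], 0.0
    for a in angles:
        heading += a
        pts.append(pts[-1] + seg_len * np.array([np.cos(heading), np.sin(heading)]))
    return np.array(pts)

bent = [0.4, -0.2, 0.3]            # perturbed pose (radians)
rest = [0.1, 0.0, 0.05]            # rest pose of the branch
print(forward_kinematics(relax_branch(bent, rest)))
```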
High-contrast imaging in the cloud with klipReduce and Findr
NASA Astrophysics Data System (ADS)
Haug-Baltzell, Asher; Males, Jared R.; Morzinski, Katie M.; Wu, Ya-Lin; Merchant, Nirav; Lyons, Eric; Close, Laird M.
2016-08-01
Astronomical data sets are growing ever larger, and the area of high contrast imaging of exoplanets is no exception. With the advent of fast, low-noise detectors operating at 10 to 1000 Hz, huge numbers of images can be taken during a single hours-long observation. High frame rates offer several advantages, such as improved registration, frame selection, and improved speckle calibration. However, advanced image processing algorithms are computationally challenging to apply. Here we describe a parallelized, cloud-based data reduction system developed for the Magellan Adaptive Optics VisAO camera, which is capable of rapidly exploring tens of thousands of parameter sets affecting the Karhunen-Loève image processing (KLIP) algorithm to produce high-quality direct images of exoplanets. We demonstrate these capabilities with a visible wavelength high contrast data set of a hydrogen-accreting brown dwarf companion.
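A reduction like the one described is commonly organized as an embarrassingly parallel sweep over KLIP parameter combinations. The sketch below shows that pattern with Python's multiprocessing; reduce_one and its toy quality metric are hypothetical stand-ins for a call to the real klipReduce executable, and the parameter names and grid values are assumptions, not the survey's actual settings.

```python
from itertools import product
from multiprocessing import Pool

def reduce_one(params):
    """Hypothetical stand-in for one klipReduce-style reduction; a real
    pipeline would invoke the compiled code and return a detection metric."""
    n_modes, annuli, min_rot = params
    quality = -(n_modes - 20) ** 2 - (annuli - 8) ** 2 + min_rot   # toy metric
    return params, quality

grid = list(product([5, 10, 20, 50, 100],   # number of KL modes (assumed values)
                    [4, 8, 16],             # search annuli
                    [0.5, 1.0, 2.0]))       # minimum field rotation (deg)

if __name__ == "__main__":
    with Pool() as pool:
        results = pool.map(reduce_one, grid)
    best_params, best_quality = max(results, key=lambda r: r[1])
    print("best parameter set:", best_params)
```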
DITTY - a computer program for calculating population dose integrated over ten thousand years
DOE Office of Scientific and Technical Information (OSTI.GOV)
Napier, B.A.; Peloquin, R.A.; Strenge, D.L.
The computer program DITTY (Dose Integrated Over Ten Thousand Years) was developed to determine the collective dose from long term nuclear waste disposal sites resulting from the ground-water pathways. DITTY estimates the time integral of collective dose over a ten-thousand-year period for time-variant radionuclide releases to surface waters, wells, or the atmosphere. This document includes the following information on DITTY: a description of the mathematical models, program designs, data file requirements, input preparation, output interpretations, sample problems, and program-generated diagnostic messages.
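The core quantity DITTY reports is a time integral of collective dose over a 10,000-year assessment period. A minimal numerical sketch of such an integral is shown below, assuming a single exponentially decaying source term; the half-life and dose-rate values are illustrative only and unrelated to the program's actual pathway models.

```python
import numpy as np

# Illustrative time integral of a collective dose rate over 10,000 years.
half_life = 1_600.0                          # years (assumed)
lam = np.log(2.0) / half_life
years = np.linspace(0.0, 10_000.0, 100_001)
dose_rate = 50.0 * np.exp(-lam * years)      # person-Sv per year reaching the population (assumed)

# trapezoidal time integral -> collective dose in person-Sv over 10,000 years
collective_dose = np.sum(0.5 * (dose_rate[1:] + dose_rate[:-1]) * np.diff(years))
print(f"integrated collective dose: {collective_dose:,.0f} person-Sv")
```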
Raccoon removal reduces sea turtle nest depredation in the Ten Thousand Islands of Florida
Garmestani, A.S.; Percival, H.F.
2005-01-01
Predation by raccoons, Procyon lotor marinus (L.), is the primary cause of sea turtle nest loss in the Ten Thousand Islands archipelago. Four islands within Ten Thousand Islands National Wildlife Refuge were surveyed for sea turtle nesting activity from 1991-95. Raccoons depredated 76-100% of nests on Panther Key from 1991-94, until 14 raccoons were removed in 1995 resulting in 0% depredation and 2 more were removed in 1996 resulting in 0% depredation. Raccoon removal may be an effective management option for increasing sea turtle nest survival on barrier islands.
ERIC Educational Resources Information Center
Snyder, Robin M.
2014-01-01
Just as high quality laser printing started in the tens of thousands of dollars and can now be purchased for under $100, so too has 3D printing technology started in the tens of thousands of dollars and is now in the thousand-dollar range. Current 3D printing technology takes 2D printing into a third dimension. Many 3D printers are…
Nuclear waste's human dimension
DOE Office of Scientific and Technical Information (OSTI.GOV)
Erikson, K.; Colglazier, E.W.; White, G.F.
1994-12-31
The United States has pinned its hopes for a permanent underground repository for its high-level nuclear wastes on Yucca Mountain, Nevada. Nevertheless, the Department of Energy's (DOE) site research efforts have failed "to adequately consider human behavior and emotions," write Kai Erikson of Yale University, E. William Colglazier of the National Academy of Sciences, and Gilbert F. White of the University of Colorado. The authors maintain that it is impossible to predict changes in geology, seismology, and hydrology that may affect the Yucca Mountain area over the next 1,000 years. Predicting human behavior in that time frame remains even more daunting, they insist. They admit that "DOE...has been given the impossible assignment to take tens of thousands of metric tons of the most hazardous materials ever created and, in the face of growing opposition, entomb them so that they will do little harm for thousands of years." The researchers suggest that the government seek a secure, retrievable storage arrangement while it continues its search for safer long-term options.
International Students in Australia: Read Ten Thousand Volumes of Books and Walk Ten Thousand Miles
ERIC Educational Resources Information Center
Arkoudis, Sophie; Tran, Ly Thi
2007-01-01
A number of international students, predominately from Asian countries, are present in universities in the UK, United States, and Australia. There is little research exploring their experiences as they negotiate the disciplinary requirements of their courses. This paper investigates students' agency as they write their first assignment for their…
NASA Astrophysics Data System (ADS)
Campbell, M.; Heijne, E. H. M.; Llopart, X.; Colas, P.; Giganon, A.; Giomataris, Y.; Chefdeville, M.; Colijn, A. P.; Fornaini, A.; van der Graaf, H.; Kluit, P.; Timmermans, J.; Visschers, J. L.; Schmitz, J.
2006-05-01
A small TPC has been read out by means of a Medipix2 chip as direct anode. A Micromegas foil was placed 50 μm above the chip, and electron multiplication occurred in the gap. With a He/isobutane 80/20 mixture, gas multiplication factors up to tens of thousands were achieved, resulting in an efficiency for detecting single electrons of better than 90%. With this new readout technology for gas-filled detectors we recorded many image frames containing 2D images with tracks from cosmic muons. Along these tracks, electron clusters were observed, as well as δ-rays. With a gas layer thickness of only 1 mm, the device could be applied as vertex detector, outperforming all Si-based detectors.
Estuarine River Data for the Ten Thousand Islands Area, Florida, Water Year 2005
Byrne, Michael J.; Patino, Eduardo
2008-01-01
The U.S. Geological Survey collected stream discharge, stage, salinity, and water-temperature data near the mouths of 11 tributaries flowing into the Ten Thousand Islands area of Florida from October 2004 to June 2005. Maximum positive discharge from Barron River and Faka Union River was 6,000 and 3,200 ft³/s, respectively; no other tributary exceeded 2,600 ft³/s. Salinity variation was greatest at Barron River and Faka Union River, ranging from 2 to 37 ppt, and from 3 to 34 ppt, respectively. Salinity maximums were greatest at Wood River and Little Wood River, each exceeding 40 ppt. All data were collected prior to the commencement of the Picayune Strand Restoration Project, which is designed to establish a more natural flow regime to the tributaries of the Ten Thousand Islands area.
ERIC Educational Resources Information Center
Rahimian, Hamid; Kazemi, Mojtaba; Abbspour, Abbas
2017-01-01
This research aims to determine the effectiveness of training based on learning organization in the staff of cement industry with production capacity over ten thousand tons. The purpose of this study is to propose a training model based on learning organization. For this purpose, the factors of organizational learning were introduced by…
Fann, Neal; Nolte, Christopher G; Dolwick, Patrick; Spero, Tanya L; Brown, Amanda Curry; Phillips, Sharon; Anenberg, Susan
2015-05-01
In this United States-focused analysis we use outputs from two general circulation models (GCMs) driven by different greenhouse gas forcing scenarios as inputs to regional climate and chemical transport models to investigate potential changes in near-term U.S. air quality due to climate change. We conduct multiyear simulations to account for interannual variability and characterize the near-term influence of a changing climate on tropospheric ozone-related health impacts near the year 2030, which is a policy-relevant time frame that is subject to fewer uncertainties than other approaches employed in the literature. We adopt a 2030 emissions inventory that accounts for fully implementing anthropogenic emissions controls required by federal, state, and/or local policies, which is projected to strongly influence future ozone levels. We quantify a comprehensive suite of ozone-related mortality and morbidity impacts including emergency department visits, hospital admissions, acute respiratory symptoms, and lost school days, and estimate the economic value of these impacts. Both GCMs project average daily maximum temperature to increase by 1-4°C and 1-5 ppb increases in daily 8-hr maximum ozone at 2030, though each climate scenario produces ozone levels that vary greatly over space and time. We estimate tens to thousands of additional ozone-related premature deaths and illnesses per year for these two scenarios and calculate an economic burden of these health outcomes of hundreds of millions to tens of billions of U.S. dollars (2010$). Near-term changes to the climate have the potential to greatly affect ground-level ozone. Using a 2030 emission inventory with regional climate fields downscaled from two general circulation models, we project mean temperature increases of 1 to 4°C and climate-driven mean daily 8-hr maximum ozone increases of 1-5 ppb, though each climate scenario produces ozone levels that vary significantly over space and time. These increased ozone levels are estimated to result in tens to thousands of ozone-related premature deaths and illnesses per year and an economic burden of hundreds of millions to tens of billions of U.S. dollars (2010$).
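Health impact assessments of this kind typically apply a log-linear concentration-response function to translate an ozone change into excess deaths, and then monetize the result with a value of a statistical life. The sketch below shows that arithmetic with entirely illustrative coefficients; the beta, baseline rate, population, and VSL values are assumptions, not the study's inputs or results.

```python
import numpy as np

# Illustrative log-linear health impact calculation; every number is assumed.
beta = 5.0e-4                     # concentration-response coefficient per ppb
y0 = 0.008                        # baseline annual all-cause mortality rate
population = 250e6                # exposed population
vsl = 9.0e6                       # value of a statistical life, 2010$
delta_ozone = np.array([1.0, 3.0, 5.0])   # ppb increases in daily 8-hr max ozone

delta_deaths = y0 * (1.0 - np.exp(-beta * delta_ozone)) * population
value = delta_deaths * vsl
for dx, dy, v in zip(delta_ozone, delta_deaths, value):
    print(f"{dx:.0f} ppb -> {dy:,.0f} excess deaths/yr, ${v / 1e9:.1f} billion")
```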
DOT National Transportation Integrated Search
1966-09-01
General aviation pilots are increasingly ascending to altitudes exceeding ten thousand feet. As one becomes exposed to heights above twelve thousand feet, blood oxygen saturation diminishes in accordance with a predictable schedule. Recommended measur...
Field investigation on severely damaged aseismic buildings in 2014 Ludian earthquake
NASA Astrophysics Data System (ADS)
Lin, Xuchuan; Zhang, Haoyu; Chen, Hongfu; Chen, Hao; Lin, Junqi
2015-03-01
The 2014 magnitude 6.5 Ludian earthquake caused a death toll of 617, many landslides, and tens of thousands of collapsed buildings. A field investigation to evaluate the damage to buildings was carried out immediately after the occurrence of the earthquake. Severely damaged aseismic buildings, which were mostly observed in the downtown of Longtoushan Town, were carefully examined one by one with the aim of improving design codes. This paper summarizes the damage observed in the investigated aseismic buildings at both the structural and local levels. A common failure mode was observed: most of the aseismic buildings, such as RC frame structures and confined masonry structures, suffered severe damage or complete collapse of the first story. The related strong ground motion, which was recorded at a nearby station, had a short duration of less than 20 s but a very large PGA of up to 1.0 g. The RC frames based on the new design codes still failed to achieve the design target of "strong column, weak beam". Typical local failure details, which were related to the interaction between RC columns and infill walls and between constructional columns and masonry walls, are summarized with preliminary analyses.
Multiple Hypothesis Tracking (MHT) for Space Surveillance: Results and Simulation Studies
NASA Astrophysics Data System (ADS)
Singh, N.; Poore, A.; Sheaff, C.; Aristoff, J.; Jah, M.
2013-09-01
With the anticipated installation of more accurate sensors and the increased probability of future collisions between space objects, the potential number of observable space objects is likely to increase by an order of magnitude within the next decade, thereby placing an ever-increasing burden on current operational systems. Moreover, the need to track closely-spaced objects due, for example, to breakups as illustrated by the recent Chinese ASAT test or the Iridium-Kosmos collision, requires new, robust, and autonomous methods for space surveillance to enable the development and maintenance of the present and future space catalog and to support the overall space surveillance mission. The problem of correctly associating a stream of uncorrelated tracks (UCTs) and uncorrelated optical observations (UCOs) into common objects is critical to mitigating the number of UCTs and is a prerequisite to subsequent space catalog maintenance. Presently, such association operations are mainly performed using non-statistical simple fixed-gate association logic. In this paper, we report on the salient features and the performance of a newly-developed statistically-robust system-level multiple hypothesis tracking (MHT) system for advanced space surveillance. The multiple-frame assignment (MFA) formulation of MHT, together with supporting astrodynamics algorithms, provides a new joint capability for space catalog maintenance, UCT/UCO resolution, and initial orbit determination. The MFA-MHT framework incorporates multiple hypotheses for report to system track data association and uses a multi-arc construction to accommodate recently developed algorithms for multiple hypothesis filtering (e.g., AEGIS, CAR-MHF, UMAP, and MMAE). This MHT framework allows us to evaluate the benefits of many different algorithms ranging from single- and multiple-frame data association to filtering and uncertainty quantification. In this paper, it will be shown that the MHT system can provide superior tracking performance compared to existing methods at a lower computational cost, especially for closely-spaced objects, in realistic multi-sensor multi-object tracking scenarios over multiple regimes of space. Specifically, we demonstrate that the prototype MHT system can accurately and efficiently process tens of thousands of UCTs and angles-only UCOs emanating from thousands of objects in LEO, GEO, MEO and HELO, many of which are closely-spaced, in real-time on a single laptop computer, thereby making it well-suited for large-scale breakup and tracking scenarios. This is possible in part because complexity reduction techniques are used to control the runtime of MHT without sacrificing accuracy. We assess the performance of MHT in relation to other tracking methods in multi-target, multi-sensor scenarios ranging from easy to difficult (i.e., widely-spaced objects to closely-spaced objects), using realistic physics and probabilities of detection less than one. In LEO, it is shown that the MHT system is able to address the challenges of processing breakups by analyzing multiple frames of data simultaneously in order to improve association decisions, reduce cross-tagging, and reduce unassociated UCTs. As a result, the multi-frame MHT system can establish orbits up to ten times faster than single-frame methods. Finally, it is shown that in GEO, MEO and HELO, the MHT system is able to address the challenges of processing angles-only optical observations by providing a unified multi-frame framework.
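For contrast with the simple fixed-gate logic mentioned above, the sketch below shows a statistically gated, single-frame 2-D assignment step: Mahalanobis distances populate a cost matrix that is solved with the Hungarian algorithm. This is a deliberately simplified stand-in for the multiple-frame MFA-MHT system described in the abstract; the gate threshold and data layout are assumptions.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def associate(track_preds, track_covs, observations, gate=9.21):
    """Mahalanobis-gated cost matrix solved as a 2-D assignment problem.
    A much-simplified, single-frame stand-in for a full MFA-MHT system;
    gate = chi-square 99% point for 2 degrees of freedom."""
    cost = np.full((len(track_preds), len(observations)), 1e6)
    for i, (x, P) in enumerate(zip(track_preds, track_covs)):
        P_inv = np.linalg.inv(P)
        for j, z in enumerate(observations):
            d2 = float((z - x) @ P_inv @ (z - x))     # squared Mahalanobis distance
            if d2 <= gate:
                cost[i, j] = d2
    rows, cols = linear_sum_assignment(cost)
    return [(i, j) for i, j in zip(rows, cols) if cost[i, j] < 1e6]

tracks = [np.array([0.0, 0.0]), np.array([10.0, 10.0])]
covs = [np.eye(2), np.eye(2)]
obs = [np.array([0.3, -0.2]), np.array([9.6, 10.4]), np.array([50.0, 50.0])]
print(associate(tracks, covs, obs))   # -> [(0, 0), (1, 1)]; the far observation is left unassociated
```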
Tomlinson, Robert
2018-05-01
Reacting to a never event is difficult and often embarrassing for staff involved. East Lancashire Hospitals NHS Trust has demonstrated that treating staff with respect after a never event, creates an open culture that encourages problem solving and service improvement. The approach has allowed learning to be shared and paved the way for the trust to be the first in the UK to launch the patient centric behavioural noise reduction strategy 'Below ten thousand'.
Computational Fact Checking from Knowledge Networks
Ciampaglia, Giovanni Luca; Shiralkar, Prashant; Rocha, Luis M.; Bollen, Johan; Menczer, Filippo; Flammini, Alessandro
2015-01-01
Traditional fact checking by expert journalists cannot keep up with the enormous volume of information that is now generated online. Computational fact checking may significantly enhance our ability to evaluate the veracity of dubious information. Here we show that the complexities of human fact checking can be approximated quite well by finding the shortest path between concept nodes under properly defined semantic proximity metrics on knowledge graphs. Framed as a network problem this approach is feasible with efficient computational techniques. We evaluate this approach by examining tens of thousands of claims related to history, entertainment, geography, and biographical information using a public knowledge graph extracted from Wikipedia. Statements independently known to be true consistently receive higher support via our method than do false ones. These findings represent a significant step toward scalable computational fact-checking methods that may one day mitigate the spread of harmful misinformation. PMID:26083336
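The paper's semantic proximity can be approximated by a shortest path in which traversing generic, high-degree hub nodes is expensive. The sketch below implements that idea with networkx; the exact cost function and the toy graph are illustrative assumptions rather than the authors' published metric.

```python
import math
import networkx as nx

def semantic_proximity(G, source, target):
    """Degree-penalized shortest path: passing through generic, high-degree
    hub nodes is costly, so specific connections score higher proximity."""
    def cost(u, v, data):
        return math.log(max(G.degree(v), 2))
    length = nx.dijkstra_path_length(G, source, target, weight=cost)
    path = nx.dijkstra_path(G, source, target, weight=cost)
    return path, 1.0 / (1.0 + length)

# toy knowledge graph (illustrative facts only)
G = nx.Graph()
G.add_edges_from([("Barack Obama", "United States"), ("United States", "Hawaii"),
                  ("Barack Obama", "Honolulu"), ("Honolulu", "Hawaii")])
print(semantic_proximity(G, "Barack Obama", "Hawaii"))
```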
NASA Astrophysics Data System (ADS)
Javh, Jaka; Slavič, Janko; Boltežar, Miha
2018-02-01
Instantaneous full-field displacement fields can be measured using cameras. In fact, using high-speed cameras full-field spectral information up to a couple of kHz can be measured. The trouble is that high-speed cameras capable of measuring high-resolution fields-of-view at high frame rates prove to be very expensive (from tens to hundreds of thousands of euro per camera). This paper introduces a measurement set-up capable of measuring high-frequency vibrations using slow cameras such as DSLR, mirrorless and others. The high-frequency displacements are measured by harmonically blinking the lights at specified frequencies. This harmonic blinking of the lights modulates the intensity changes of the filmed scene and the camera-image acquisition makes the integration over time, thereby producing full-field Fourier coefficients of the filmed structure's displacements.
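The measurement principle is that lights blinking at a chosen frequency act like a lock-in reference: the camera's long exposure integrates the modulated scene intensity, leaving (up to a constant) the Fourier coefficient of the motion-induced intensity change at the blinking frequency. The toy numerical check below demonstrates this with assumed signal parameters.

```python
import numpy as np

f_mod = 80.0            # Hz, blinking frequency of the lights (assumed)
amp, phase = 1e-3, 0.7  # amplitude/phase of the motion-induced intensity change (assumed)
t_exp = 2.0             # s, one long exposure of a slow camera
t = np.linspace(0.0, t_exp, 400_000)

scene = amp * np.cos(2 * np.pi * f_mod * t + phase)       # intensity change at f_mod
lights = 0.5 * (1.0 + np.cos(2 * np.pi * f_mod * t))      # harmonically blinking illumination

# the sensor integrates lights * scene over the exposure; the mean approximates that integral
pixel = np.mean(lights * scene)
print(pixel, 0.25 * amp * np.cos(phase))                  # both approximate the Fourier coefficient term
```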
Observation of a cavitation cloud in tissue using correlation between ultrafast ultrasound images.
Prieur, Fabrice; Zorgani, Ali; Catheline, Stefan; Souchon, Rémi; Mestas, Jean-Louis; Lafond, Maxime; Lafon, Cyril
2015-07-01
The local application of ultrasound is known to improve drug intake by tumors. Cavitating bubbles are one of the contributing effects. A setup in which two ultrasound transducers are placed confocally is used to generate cavitation in ex vivo tissue. As the transducers emit a series of short excitation bursts, the evolution of the cavitation activity is monitored using an ultrafast ultrasound imaging system. The frame rate of the system is several thousands of images per second, which provides several tens of images between consecutive excitation bursts. Using the correlation between consecutive images for speckle tracking, a decorrelation of the imaging signal appears due to the creation, fast movement, and dissolution of the bubbles in the cavitation cloud. By analyzing this area of decorrelation, the cavitation cloud can be localized and the spatial extent of the cavitation activity characterized.
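A decorrelation map of the kind described can be built by computing a local normalized cross-correlation between consecutive ultrafast frames and flagging the regions where it drops. The sketch below is a generic block-wise version with an assumed window size, not the authors' speckle-tracking implementation.

```python
import numpy as np

def decorrelation_map(frame_a, frame_b, win=16):
    """Blockwise 1 - normalized cross-correlation between consecutive frames;
    large values flag fast-changing speckle such as a cavitation cloud."""
    rows, cols = frame_a.shape[0] // win, frame_a.shape[1] // win
    out = np.zeros((rows, cols))
    for i in range(rows):
        for j in range(cols):
            a = frame_a[i*win:(i+1)*win, j*win:(j+1)*win].ravel()
            b = frame_b[i*win:(i+1)*win, j*win:(j+1)*win].ravel()
            a, b = a - a.mean(), b - b.mean()
            denom = np.linalg.norm(a) * np.linalg.norm(b) + 1e-12
            out[i, j] = 1.0 - float(a @ b) / denom
    return out

rng = np.random.default_rng(0)
f1 = rng.standard_normal((128, 128))
f2 = f1.copy()
f2[48:80, 48:80] = rng.standard_normal((32, 32))   # a decorrelated "cloud" region
print(decorrelation_map(f1, f2).round(2))
```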
A discourse analysis of the construction of mental illness in two UK newspapers from 1985-2000.
Paterson, Brodie
2007-10-01
This study explored the discourse of mental illness contained within two UK newspapers over a 15-year period, excluding those stories that mentioned any reference to a diagnosis. Using frame analysis, a form of discourse analysis, ten distinct frames were identified and classified into "stories." These ten stories were categorized as: foreign, legal, drug, feature, trauma, tragedy, community care tragedy, social policy, inquiry report, and sports/celebrity stories. Each frame is described and the potential influence of such frames on both social policy and nursing practice is discussed.
Vision-based object detection and recognition system for intelligent vehicles
NASA Astrophysics Data System (ADS)
Ran, Bin; Liu, Henry X.; Martono, Wilfung
1999-01-01
Recently, a proactive crash mitigation system has been proposed to enhance the crash avoidance and survivability of Intelligent Vehicles. An accurate object detection and recognition system is a prerequisite for a proactive crash mitigation system, as system component deployment algorithms rely on accurate hazard detection, recognition, and tracking information. In this paper, we present a vision-based approach to detect and recognize vehicles and traffic signs, obtain their information, and track multiple objects by using a sequence of color images taken from a moving vehicle. The entire system consists of two sub-systems: the vehicle detection and recognition sub-system and the traffic sign detection and recognition sub-system. Both sub-systems consist of four models: an object detection model, an object recognition model, an object information model, and an object tracking model. In order to detect potential objects on the road, several features of the objects are investigated, including the symmetrical shape and aspect ratio of a vehicle and the color and shape information of the signs. A two-layer neural network is trained to recognize different types of vehicles, and a parameterized traffic sign model is established in the process of recognizing a sign. Tracking is accomplished by combining the analysis of a single image frame with the analysis of consecutive image frames. The analysis of the single image frame is performed every ten full-size images. The information model obtains information related to the object, such as time to collision for the object vehicle and relative distance from the traffic signs. Experimental results demonstrated robust and accurate real-time object detection and recognition over thousands of image frames.
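The scheduling described, a full single-frame analysis every ten frames with cheaper consecutive-frame tracking in between, can be organized as in the sketch below. The detector and tracker here are empty placeholders standing in for the symmetry/aspect-ratio and color/shape models of the paper; only the control flow is illustrated.

```python
def detect_objects(frame):
    """Placeholder for the full single-frame analysis (vehicle symmetry and
    aspect-ratio cues, sign colour and shape cues); returns bounding boxes."""
    return []

def track_objects(previous_boxes, frame):
    """Placeholder for the cheaper consecutive-frame tracking step."""
    return previous_boxes

def process_sequence(frames, full_analysis_every=10):
    boxes, results = [], []
    for i, frame in enumerate(frames):
        if i % full_analysis_every == 0:
            boxes = detect_objects(frame)          # full single-frame analysis
        else:
            boxes = track_objects(boxes, frame)    # frame-to-frame tracking
        results.append(list(boxes))
    return results

print(len(process_sequence(range(100))))           # 100 per-frame results
```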
Predicting point-of-departure values from the ToxCast data (TDS)
There are fewer than two thousand health assessments available for the tens of thousands of chemicals in commerce today. Traditional toxicity testing takes time, money, and resources, leading in part to this large discrepancy. Faster and more efficient ways of understanding adverse...
Number Concepts and Special Needs Students: The Power of Ten-Frame Tiles
ERIC Educational Resources Information Center
Losq, Christine S.
2005-01-01
For the last several years teachers have been using counters and connected cube "trains" and creating base-10 block models to help students develop number sense and understand number concepts. The article describes how ten-frame tiles can be a more useful tool for building number understanding for many students.
Thousands of military personnel and tens of thousands of civilian workers perform tank entry procedures. OSHA regulations (1910.146) require the internal atmosphere be tested, with a calibrated direct-reading instrument, for oxygen content, flammable gases and vapors, and poten...
Madison County energy conservation study: 2012-2013 survey of roadside vegetation [summary]
DOT National Transportation Integrated Search
2014-02-01
The many thousands of miles of roads in Florida's State Highway System (SHS) are flanked by tens of thousands of acres of planted right-of-way and medians. The nature of the plants and soils in the right-of-way is important in helping to ...
Proving communal warfare among hunter-gatherers: The Quasi-Rousseauan error.
Gat, Azar
2015-01-01
Was human fighting always there, as old as our species? Or is it a late cultural invention, emerging after the transition to agriculture and the rise of the state, which began, respectively, only around ten thousand and five thousand years ago? Viewed against the life span of our species, Homo sapiens, stretching back 150,000-200,000 years, let alone the roughly two million years of our genus Homo, this is the tip of the iceberg. We now have a temporal frame and plenty of empirical evidence for the "state of nature" that Thomas Hobbes and Jean-Jacques Rousseau discussed in the abstract and described in diametrically opposed terms. All human populations during the Pleistocene, until about 12,000 years ago, were hunter-gatherers, or foragers, of the simple, mobile sort that lacked accumulated resources. Studying such human populations that survived until recently or still survive in remote corners of the world, anthropology should have been uniquely positioned to answer the question of aboriginal human fighting or lack thereof. Yet access to, and the interpretation of, that information has been intrinsically problematic. The main problem has been the "contact paradox." Prestate societies have no written records of their own. Therefore, documenting them requires contact with literate state societies that necessarily affects the former and potentially changes their behavior, including fighting. © 2015 Wiley Periodicals, Inc.
[Actinomycetes of the genus Micromonospora in meadow ecosystems].
Zenova, G M; Zviagintsev, D G
2002-01-01
Investigations showed that micromonosporas, along with streptomycetes, are the major inhabitants of floodplain meadow ecosystems, where their population varies from tens of thousands to hundreds of thousands of CFU per g substrate. In spring, the population of micromonosporas in soil and on the plant roots was found to be denser than that of streptomycetes.
Thousands of military personnel and tens of thousands of civilian workers perform jet fuel tank entry procedures. Before entering the confined space of a jet fuel tank, OSHA regulations (29CFR1910.146) require the internal atmosphere be tested with a calibrated, direct-reading...
Matches, Mismatches, and Methods: Multiple-View Workflows for Energy Portfolio Analysis.
Brehmer, Matthew; Ng, Jocelyn; Tate, Kevin; Munzner, Tamara
2016-01-01
The energy performance of large building portfolios is challenging to analyze and monitor, as current analysis tools are not scalable or they present derived and aggregated data at too coarse a level. We conducted a visualization design study, beginning with a thorough work domain analysis and a characterization of data and task abstractions. We describe generalizable visual encoding design choices for time-oriented data framed in terms of matches and mismatches, as well as considerations for workflow design. Our designs address several research questions pertaining to scalability, view coordination, and the inappropriateness of line charts for derived and aggregated data due to a combination of data semantics and domain convention. We also present guidelines relating to familiarity and trust, as well as methodological considerations for visualization design studies. Our designs were adopted by our collaborators and incorporated into the design of an energy analysis software application that will be deployed to tens of thousands of energy workers in their client base.
NASA Astrophysics Data System (ADS)
Yokoyama, Ryouta; Yagi, Shin-ichi; Tamura, Kiyoshi; Sato, Masakazu
2009-07-01
Ultrahigh-speed dynamic elastography has promising potential for clinical diagnosis and therapy of living soft tissues. In order to realize ultrahigh-speed motion tracking at rates of over a thousand frames per second, synthetic aperture (SA) array signal processing technology must be introduced. Furthermore, the overall system must support fine quantitative evaluation of the accuracy and variance of echo phase changes distributed across a tissue medium. For the spatial evaluation of local phase changes caused by pulsed excitation of a tissue phantom, the proposed SA system was investigated using different virtual point sources generated by an array transducer to probe each component of the local tissue displacement vectors. The final results derived from the cross-correlation method (CCM) showed almost the same performance as the constrained least square method (LSM) extended to successive echo frames. These frames were reconstructed by SA processing after real-time acquisition triggered by the pulsed irradiation from a point source. The continuous behavior of the spatial motion vectors demonstrated the dynamic generation and traveling of the pulsed shear wave at a frame rate of one thousand frames per second.
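The cross-correlation method referred to above estimates, for each window of an echo line, the lag that maximizes the normalized correlation between successive frames. A minimal integer-lag version is sketched below with assumed window and search parameters; the authors' implementation additionally uses sub-sample estimation and SA-reconstructed frames.

```python
import numpy as np

def window_shift(ref, cur, start, win, max_lag):
    """Integer-sample shift of one window between two echo frames, taken as
    the lag that maximizes the normalized cross-correlation.  Assumes
    start - max_lag >= 0 and start + win + max_lag <= len(cur)."""
    a = ref[start:start + win]
    a = (a - a.mean()) / (a.std() + 1e-12)
    best_lag, best_cc = 0, -np.inf
    for lag in range(-max_lag, max_lag + 1):
        b = cur[start + lag:start + lag + win]
        b = (b - b.mean()) / (b.std() + 1e-12)
        cc = float(a @ b) / win
        if cc > best_cc:
            best_cc, best_lag = cc, lag
    return best_lag, best_cc

rng = np.random.default_rng(1)
ref = rng.standard_normal(4000)
cur = np.roll(ref, 3)                                            # simulate a 3-sample displacement
print(window_shift(ref, cur, start=1000, win=128, max_lag=10))   # -> (3, ~1.0)
```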
Ten Thousand Years of Solitude
DOE Office of Scientific and Technical Information (OSTI.GOV)
Benford, G.; Kirkwood, C.W.; Harry, O.
1991-03-01
This report documents the authors' work as an expert team advising the US Department of Energy on modes of inadvertent intrusion over the next 10,000 years into the Waste Isolation Pilot Project (WIPP) nuclear waste repository. Credible types of potential future accidental intrusion into the WIPP are estimated as a basis for creating warning markers to prevent inadvertent intrusion. A six-step process is used to structure possible scenarios for such intrusion, and it is concluded that the probability of inadvertent intrusion into the WIPP repository over the next ten thousand years lies between one and twenty-five percent. 3 figs., 5 tabs.
Large scale exact quantum dynamics calculations: Ten thousand quantum states of acetonitrile
NASA Astrophysics Data System (ADS)
Halverson, Thomas; Poirier, Bill
2015-03-01
'Exact' quantum dynamics (EQD) calculations of the vibrational spectrum of acetonitrile (CH3CN) are performed, using two different methods: (1) phase-space-truncated momentum-symmetrized Gaussian basis and (2) correlated truncated harmonic oscillator basis. In both cases, a simple classical phase space picture is used to optimize the selection of individual basis functions, leading to drastic reductions in basis size in comparison with existing methods. Massive parallelization is also employed. Together, these tools, implemented into a single, easy-to-use computer code, enable a calculation of tens of thousands of vibrational states of CH3CN to an accuracy of 0.001-10 cm⁻¹.
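The phase-space truncation idea can be illustrated for a separable harmonic model: keep only the index tuples whose total harmonic energy lies below a cutoff, which prunes the direct-product basis drastically. The sketch below enumerates such a truncated basis for a toy four-mode system; the frequencies and cutoff are assumed and far smaller than the real 12-mode CH3CN problem.

```python
import math

def enumerate_basis(omegas, e_cut, prefix=(), e_zero=None):
    """Yield harmonic-oscillator index tuples (n_1, ..., n_D) whose total
    energy sum_i omega_i * (n_i + 1/2) stays below e_cut (hbar = 1)."""
    if e_zero is None:
        e_zero = 0.5 * sum(omegas)
    d = len(prefix)
    if d == len(omegas):
        yield prefix
        return
    e_used = e_zero + sum(n * w for n, w in zip(prefix, omegas))
    n = 0
    while e_used + n * omegas[d] <= e_cut:
        yield from enumerate_basis(omegas, e_cut, prefix + (n,), e_zero)
        n += 1

omegas = (365.0, 920.0, 1390.0, 2267.0)     # cm^-1, an assumed 4-mode toy system
e_cut = 8000.0                              # cm^-1, assumed energy cutoff
basis = list(enumerate_basis(omegas, e_cut))
per_mode = [int((e_cut - 0.5 * sum(omegas)) // w) + 1 for w in omegas]
print(len(basis), "retained vs", math.prod(per_mode), "in the equivalent direct product")
```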
ERIC Educational Resources Information Center
Wager, J. James
2012-01-01
Thousands, if not tens of thousands, of books, monographs, and articles have been written on the subject of leadership. A Google search of the word returns nearly a half-billion Web sites. As a professional who has spent nearly 40 years in the higher education sector, the author has been blessed with opportunities to view and practice leadership…
Exploring Algorithms for Stellar Light Curves With TESS
NASA Astrophysics Data System (ADS)
Buzasi, Derek
2018-01-01
The Kepler and K2 missions have produced tens of thousands of stellar light curves, which have been used to measure rotation periods, characterize photometric activity levels, and explore phenomena such as differential rotation. The quasi-periodic nature of rotational light curves, combined with the potential presence of additional periodicities not due to rotation, complicates the analysis of these time series and makes characterization of uncertainties difficult. A variety of algorithms have been used for the extraction of rotational signals, including autocorrelation functions, discrete Fourier transforms, Lomb-Scargle periodograms, wavelet transforms, and the Hilbert-Huang transform. In addition, in the case of K2 a number of different pipelines have been used to produce initial detrended light curves from the raw image frames. In the near future, TESS photometry, particularly that deriving from the full-frame images, will dramatically further expand the number of such light curves, but details of the pipeline to be used to produce photometry from the FFIs remain under development. K2 data offers us an opportunity to explore the utility of different reduction and analysis tool combinations applied to these astrophysically important tasks. In this work, we apply a wide range of algorithms to light curves produced by a number of popular K2 pipeline products to better understand the advantages and limitations of each approach and provide guidance for the most reliable and most efficient analysis of TESS stellar data.
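As a concrete example of one of the algorithms listed, the snippet below recovers a rotation period from an irregularly sampled toy light curve with astropy's Lomb-Scargle periodogram; the period, noise level, and sampling are assumptions for illustration, not drawn from any K2 or TESS target.

```python
import numpy as np
from astropy.timeseries import LombScargle

rng = np.random.default_rng(7)
t = np.sort(rng.uniform(0.0, 80.0, 2000))                 # days, irregular sampling
true_period = 12.3                                        # days (assumed rotation period)
flux = 0.01 * np.sin(2 * np.pi * t / true_period) + 0.002 * rng.standard_normal(t.size)

frequency, power = LombScargle(t, flux).autopower(maximum_frequency=1.0)   # cycles/day
best_period = 1.0 / frequency[np.argmax(power)]
print(f"recovered rotation period: {best_period:.2f} d (true {true_period} d)")
```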
NASA Astrophysics Data System (ADS)
Rath, K.; Rooney-varga, J. N.; Jones, A.; Johnston, E.; Sterman, J.
2015-12-01
As a simulation-based role-playing exercise, World Climate provides an opportunity for participants to have an immersive experience in which they learn first-hand about both the social dynamics of climate change decision-making, through role-play, and the geophysical dynamics of the climate system, through an interactive computer simulation. In June 2015, we launched the World Climate Project with the intent of bringing this powerful tool to students, citizens, and decision-makers across government, NGO, and private sectors around the world. Within a period of six weeks from the launch date, 440 educators from 36 states and 56 countries have enrolled in the initiative, offering the potential to reach tens of thousands of participants around the world. While this project is clearly in its infancy, we see several characteristics that may be contributing to widespread interest in it. These factors include the ease-of-use, real-world relevance, and scientific rigor of the decision-support simulation, C-ROADS, that frames the World Climate Exercise. Other characteristics of World Climate include its potential to evoke an emotional response that is arousing and inspirational and its use of positive framing and a call to action. Similarly, the World Climate Project takes a collaborative approach, enabling educators to be innovators and valued contributors and regularly communicating with people who join the initiative through webinars, social media, and resources.
Fu, Lili; Han, Bingying; Tan, Deguan; Wang, Meng; Ding, Mei; Zhang, Jiaming
2016-02-22
Myrosinases are β-thioglucoside glucohydrolases and serve as defense mechanisms against insect pests and pathogens by producing toxic compounds. AtTGG6 in Arabidopsis thaliana was previously reported to be a myrosinase pseudogene but specifically expressed in pollen. However, we found that AlTGG6, an ortholog to AtTGG6 in A. lyrata (an outcrossing relative of A. thaliana) was functional, suggesting that functional AtTGG6 alleles may still exist in A. thaliana. AtTGG6 alleles in 29 A. thaliana ecotypes were cloned and sequenced. Results indicate that ten alleles were functional and encoded Myr II type myrosinase of 512 amino acids, and myrosinase activity was confirmed by overexpressing AtTGG6 in Pichia pastoris. However, the 19 other ecotypes had disabled alleles with highly polymorphic frame-shift mutations and diversified sequences. Thirteen frame-shift mutation types were identified, which occurred independently many times in the evolutionary history within a few thousand years. The functional allele was expressed specifically in pollen similar to the disabled alleles but at a higher expression level, suggesting its role in defense of pollen against insect pests such as pollen beetles. However, the defense function may have become less critical after A. thaliana evolved to self-fertilization, and thus resulted in loss of function in most ecotypes.
Temporal enhancement of two-dimensional color doppler echocardiography
NASA Astrophysics Data System (ADS)
Terentjev, Alexey B.; Settlemier, Scott H.; Perrin, Douglas P.; del Nido, Pedro J.; Shturts, Igor V.; Vasilyev, Nikolay V.
2016-03-01
Two-dimensional color Doppler echocardiography is widely used for assessing blood flow inside the heart and blood vessels. Currently, frame acquisition time for this method varies from tens to hundreds of milliseconds, depending on Doppler sector parameters. This leads to low frame rates in the resulting video sequences, on the order of tens of Hz, which is insufficient for some diagnostic purposes, especially in pediatrics. In this paper, we present a new approach for the reconstruction of 2D color Doppler cardiac images, which increases the frame rate to hundreds of Hz. This approach relies on a modified method of frame reordering originally applied to real-time 3D echocardiography. There are no previous publications describing the application of this method to 2D color Doppler data. The approach has been tested on several in-vivo cardiac 2D color Doppler datasets with an approximate duration of 30 sec and a native frame rate of 15 Hz. The resulting image sequences had frame rates equivalent to 500 Hz.
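Retrospective reordering of frames acquired over many cardiac cycles is one common way to synthesize a high effective frame rate from a slow acquisition. The sketch below sorts frames by their phase within the cardiac cycle using R-peak times; it illustrates the general gating idea, not necessarily the modified reordering method the authors adapted from 3D echocardiography.

```python
import numpy as np

def reorder_by_cardiac_phase(frame_times, r_peak_times):
    """Assign each frame a phase in [0, 1) within its own cardiac cycle and
    return the frame indices sorted by phase (retrospective gating)."""
    t, r = np.asarray(frame_times), np.asarray(r_peak_times)
    cycle = np.searchsorted(r, t, side="right") - 1
    valid = (cycle >= 0) & (cycle < len(r) - 1)
    phase = (t[valid] - r[cycle[valid]]) / (r[cycle[valid] + 1] - r[cycle[valid]])
    order = np.argsort(phase)
    return np.flatnonzero(valid)[order], phase[order]

frames = np.arange(0.0, 30.0, 1.0 / 15.0)     # 15 Hz acquisition over ~30 s, as in the datasets above
r_peaks = np.arange(0.0, 30.5, 0.8)           # assumed regular 75-bpm heartbeat
order, phase = reorder_by_cardiac_phase(frames, r_peaks)
print(len(order), "frames reordered into one synthetic high-rate cycle")
```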
ERIC Educational Resources Information Center
Murphy, Edward; Bell, Randy L.
2005-01-01
On any night, the stars seen in the sky can be as close to Earth as a few light-years or as distant as a few thousand light-years. Distances this large are hard to comprehend. The stars are so far away that the fastest spacecraft would take tens of thousands of years to reach even the nearest one. Yet, astronomers have been able to accurately…
Building to Scale: An Analysis of Web-Based Services in CIC (Big Ten) Libraries.
ERIC Educational Resources Information Center
Dewey, Barbara I.
Advancing library services in large universities requires creative approaches for "building to scale." This is the case for CIC, Committee on Institutional Cooperation (Big Ten), libraries whose home institutions serve thousands of students, faculty, staff, and others. Developing virtual Web-based services is an increasingly viable…
John W. van de Lindt; Pouria Bahmani; Mikhail Gershfeld; Gary Mochizuki; Xiaoyun Shao; Steven E. Pryor; Weichiang Pang; Michael D. Symans; Jingjing Tian; Ershad Ziaei; Elaina N. Jennings; Douglas Rammer
2014-01-01
There are thousands of soft-story wood-frame buildings in California which have been recognized as a disaster preparedness problem with concerted mitigation efforts underway in many cities throughout the state. The vast majority of those efforts are based on numerical modelling, often with half-century old data in which assumptions have to be made based on engineering...
Developing Number Sense in Pre-K with Five-Frames
ERIC Educational Resources Information Center
McGuire, Patrick; Kinzie, Mable B.; Berch, Daniel B.
2012-01-01
Teachers in early childhood and elementary classrooms (grades K-5) have been using ten-frames as an instructional tool to support students' mathematics skill development for many years. Use of the similar five-frame has been limited, however, despite its apparent potential as an instructional scaffold in the early elementary grades. Due to scant…
Middleweight black holes found at last
NASA Astrophysics Data System (ADS)
Clery, Daniel
2018-06-01
How did giant black holes grow so big? Astronomers have long had evidence of baby black holes with masses of no more than tens of suns, and of million- or billion-solar-mass behemoths lurking at the centers of galaxies. But middle-size ones, weighing thousands or tens of thousands of suns, seemed to be missing. Their absence forced theorists to propose that supermassive black holes didn't grow gradually by slowly consuming matter, but somehow emerged as ready-made giants. Now, astronomers appear to have located some missing middleweights. An international team has scoured an archive of galaxy spectra and found more than 300 small galaxies that have the signature of intermediate mass black holes in their cores, opening new questions for theorists.
Online tools for nucleosynthesis studies
NASA Astrophysics Data System (ADS)
Göbel, K.; Glorius, J.; Koloczek, A.; Pignatari, M.; Plag, R.; Reifarth, R.; Ritter, C.; Schmidt, S.; Sonnabend, K.; Thomas, B.; Travaglio, C.
2018-01-01
The nucleosynthesis of the elements between iron and uranium involves many different astrophysical scenarios covering wide ranges of temperatures and densities. Thousands of nuclei and tens of thousands of reaction rates have to be included in the corresponding simulations. We investigate the impact of single rates on the predicted abundance distributions with post-processing nucleosynthesis simulations. We present online tools, which allow the investigation of sensitivities and integrated mass fluxes in different astrophysical scenarios.
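A rate-sensitivity study of the kind these online tools support can be pictured with a toy two-reaction chain: vary one rate, re-integrate the network, and compare the final abundances. The sketch below does this for an A -> B -> C chain with assumed rates; real post-processing networks involve thousands of species and far more complex thermodynamic histories.

```python
import numpy as np
from scipy.integrate import solve_ivp

def chain(t, y, lam1, lam2):
    """Toy A -> B -> C network; lam1 is the rate being varied."""
    a, b, c = y
    return [-lam1 * a, lam1 * a - lam2 * b, lam2 * b]

def final_abundances(lam1, lam2=0.5, t_end=2.0):
    sol = solve_ivp(chain, (0.0, t_end), [1.0, 0.0, 0.0], args=(lam1, lam2), rtol=1e-8)
    return sol.y[:, -1]

for label, lam1 in [("nominal", 1.0), ("rate x2", 2.0), ("rate /2", 0.5)]:
    print(label, np.round(final_abundances(lam1), 3))
```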
The power dynamics perpetuating unsafe abortion in Africa: a feminist perspective.
Braam, Tamara; Hessini, Leila
2004-04-01
Tens of thousands of African women die every year because societies and governments either ignore the issue of unsafe abortion or actively refuse to address it. This paper explores the issue of abortion from a feminist perspective, centrally arguing that finding appropriate strategies to reclaim women's power at an individual and social level is a central lever for developing effective strategies to increase women's access to safe abortion services. The paper emphasises the central role of patriarchy in shaping the ways power plays itself out in individual relationships, and at social, economic and political levels. The ideology of male superiority denies abortion as an important issue of status and frames the morality, legality and socio-cultural attitudes towards abortion. Patriarchy sculpts unequal gender power relationships and takes power away from women in making decisions about their bodies. Other forms of power such as economic inequality, discourse and power within relationships are also explored. Recommended solutions to shifting the power dynamics around the issue include a combination of public health, rights-based, legal reform and social justice approaches.
In-situ investigations of the ionosphere of comet 67P
NASA Astrophysics Data System (ADS)
Eriksson, A. I.; Edberg, N. J. T.; Odelstad, E.; Vigren, E.; Engelhardt, I.; Henri, P.; Lebreton, J.-P.; Galand, M.; Carr, C. M.; Koenders, C.; Nilsson, H.; Broiles, T.; Rubin, M.
2015-10-01
Since the arrival of Rosetta at its target comet 67P/Churyumov-Gerasimenko in August 2014, the plasma environment has been dominated by ionized gas emanating from the comet nucleus rather than by solar wind plasma. This was evident early on from the strong modulation seen with Rosetta's position in a reference frame fixed to the rotating nucleus, with higher plasma densities observed when the spacecraft is above the neck region and when the comet exposes maximum area to the sun. In this respect, Rosetta is inside the comet ionosphere, providing excellent in situ investigation opportunities for the instruments of the Rosetta Plasma Consortium (RPC). In contrast to the often modelled scenario for a very active comet, the Langmuir probe instrument (RPC-LAP) finds electron temperatures mainly in the range of tens of thousands of kelvin around this less active comet. This can be attributed to the lower density of neutral gas, meaning little cooling of recently produced electrons. A side effect of this is that the spacecraft charges negatively when within about 100 km from the nucleus. Interesting in itself, this also may point to similar charging for dust grains in the coma, with implications for the detection of the smallest particles and possibly for processes like electrostatic fragmentation. The inner coma also proves to be very dynamic, with large variations not only with latitude and longitude in a comet frame, but also with the solar wind and various wave phenomena.
Krauss, Ken W.; From, Andrew S.; Doyle, Thomas W.; Doyle, Terry J.; Barry, Michael J.
2011-01-01
The Ten Thousand Islands region of southwestern Florida, USA is a major feeding and resting destination for breeding, migrating, and wintering birds. Many species of waterbirds rely specifically on marshes as foraging habitat, making mangrove encroachment a concern for wildlife managers. With the alteration of freshwater flow and sea-level rise trends for the region, mangroves have migrated upstream into traditionally salt and brackish marshes, mirroring similar descriptions around the world. Aside from localized freezes in some years, very little seems to be preventing mangrove encroachment. We mapped changes in mangrove stand boundaries from the Gulf of Mexico inland to the northern boundary of Ten Thousand Islands National Wildlife Refuge (TTINWR) from 1927 to 2005, and determined the area of mangroves to be approximately 7,281 hectares in 2005, representing an 1,878 hectare increase since 1927. Overall change represents an approximately 35% increase in mangrove coverage on TTINWR over 78 years. Sea-level rise is likely the primary driver of this change; however, the construction of new waterways facilitates the dispersal of mangrove propagules into new areas by extending tidal influence, exacerbating encroachment. Reduced volume of freshwater delivery to TTINWR via overland flow and localized rainfall may influence the balance between marsh and mangrove as well, potentially offering some options to managers interested in conserving marsh over mangrove.
Data resulting from the CFD analysis of ten window frames according to the UNI EN ISO 10077-2.
Baglivo, Cristina; Malvoni, Maria; Congedo, Paolo Maria
2016-09-01
Data are related to the numerical simulation performed in the study entitled "CFD modeling to evaluate the thermal performances of window frames in accordance with the ISO 10077" (Malvoni et al., 2016) [1]. The paper focuses on the results from a two-dimensional numerical analysis for ten frame sections suggested by the ISO 10077-2 and performed using GAMBIT 2.2 and ANSYS FLUENT 14.5 CFD code. The dataset specifically includes information about the CFD setup and boundary conditions considered as the input values of the simulations. The trend of the isotherms points out the different impacts on the thermal behaviour of all sections with air solid material or ideal gas into the cavities.
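After the 2-D simulation of each frame section, ISO 10077-2 turns the computed heat flow into a frame transmittance by subtracting the contribution of the calibrated insulation panel that replaces the glazing. A minimal sketch of that post-processing step is given below with assumed example numbers; it covers only the standard's bookkeeping, not the CFD model in the dataset.

```python
def frame_u_value(q_total, delta_t, b_f, b_p, u_p):
    """ISO 10077-2 style post-processing of a 2-D frame-section simulation.
      q_total : heat flow through the section per metre of frame length [W/m]
      delta_t : temperature difference between the two environments [K]
      b_f, b_p: projected widths of the frame and of the insulation panel [m]
      u_p     : thermal transmittance of the calibration panel [W/(m2 K)]"""
    l_2d = q_total / delta_t                 # two-dimensional thermal conductance [W/(m K)]
    return (l_2d - u_p * b_p) / b_f          # frame transmittance U_f [W/(m2 K)]

# assumed example values, not taken from the dataset
print(frame_u_value(q_total=9.5, delta_t=20.0, b_f=0.11, b_p=0.19, u_p=0.7))
```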
Optical dipole forces: Working together
NASA Astrophysics Data System (ADS)
Aiello, Clarice D.
2017-03-01
Strength lies in numbers and in teamwork: tens of thousands of artificial atoms tightly packed in a nanodiamond act cooperatively, enhancing the optical trapping forces beyond the expected classical bulk polarizability contribution.
NASA Astrophysics Data System (ADS)
Fung, Kenneth K. H.; Lewis, Geraint F.; Wu, Xiaofeng
2017-04-01
A vast wealth of literature exists on the topic of rocket trajectory optimisation, particularly in the area of interplanetary trajectories due to its relevance today. Studies on optimising interstellar and intergalactic trajectories are usually performed in flat spacetime using an analytical approach, with very little focus on optimising interstellar trajectories in a general relativistic framework. This paper examines the use of low-acceleration rockets to reach galactic destinations in the least possible time, with a genetic algorithm being employed for the optimisation process. The fuel required for each journey was calculated for various types of propulsion systems to determine the viability of low-acceleration rockets to colonise the Milky Way. The results showed that to limit the amount of fuel carried on board, an antimatter propulsion system would likely be the minimum technological requirement to reach star systems tens of thousands of light years away. However, using a low-acceleration rocket would require several hundreds of thousands of years to reach these star systems, with minimal time dilation effects since maximum velocities only reached about 0.2 c . Such transit times are clearly impractical, and thus, any kind of colonisation using low acceleration rockets would be difficult. High accelerations, on the order of 1 g, are likely required to complete interstellar journeys within a reasonable time frame, though they may require prohibitively large amounts of fuel. So for now, it appears that humanity's ultimate goal of a galactic empire may only be possible at significantly higher accelerations, though the propulsion technology requirement for a journey that uses realistic amounts of fuel remains to be determined.
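The fuel argument above follows from the relativistic rocket equation: the required mass ratio grows exponentially with c/v_exhaust, so only propellants with exhaust velocities near c keep it finite for a 0.2c journey. The sketch below evaluates log10 of the mass ratio for a few assumed exhaust velocities; the labels and values are illustrative, not the paper's propulsion models.

```python
import math

C = 299_792_458.0                                # speed of light, m/s

def log10_mass_ratio(delta_v, v_exhaust, burns=2):
    """Relativistic rocket equation, reported as log10(M_initial / M_final) so
    that hopeless cases do not overflow; burns=2 covers accelerating to
    delta_v and braking to rest again."""
    return burns * (C / v_exhaust) * math.atanh(delta_v / C) / math.log(10.0)

target = 0.2 * C                                 # roughly the peak speed quoted above
for name, v_ex in [("chemical, ~4.4 km/s", 4.4e3),
                   ("fusion, ~0.05 c (assumed)", 0.05 * C),
                   ("antimatter photon rocket, ~c", C)]:
    print(f"{name:30s} log10(mass ratio) = {log10_mass_ratio(target, v_ex):,.2f}")
```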
NASA Astrophysics Data System (ADS)
Shapley, Alan H.; Hart, Pembroke J.
One of the lasting heritages of the International Geophysical Year (1957-58) is the system of world data centers (WDC) through which there has been international exchange of a wide variety of geophysical data on a continuing basis. This voluntary exchange mechanism has been remarkably successful. The basic operating costs of the centers are provided by the host country. The international exchanges are mainly by barter. The data providers number in the thousands and the users in the tens of thousands.
Ice Loads and Ship Response to Ice. Summer 1982/Winter 1983 Test Program
1984-12-01
approximately 100 ft² (9.2 m²) was instrumented to measure ice pressures by measuring compressive strains in the webs of transverse frames. The panel was divided into 60 sub-panel areas, six rows of ten frames, over which uniform pressures...the Web and the Selection of Gage Spacing...Across the Frame Influence on Strain...Construction of the Data
15 CFR 950.8 - Satellite Data Services Division (SDSD).
Code of Federal Regulations, 2012 CFR
2012-01-01
... Technology Satellites (ATS) I and III geostationary research spacecraft; tens of thousands of images from the... geostationary spacecraft. In addition to visible light imagery, infrared data are available from the NIMBUS...
15 CFR 950.8 - Satellite Data Services Division (SDSD).
Code of Federal Regulations, 2014 CFR
2014-01-01
... Technology Satellites (ATS) I and III geostationary research spacecraft; tens of thousands of images from the... geostationary spacecraft. In addition to visible light imagery, infrared data are available from the NIMBUS...
15 CFR 950.8 - Satellite Data Services Division (SDSD).
Code of Federal Regulations, 2013 CFR
2013-01-01
... Technology Satellites (ATS) I and III geostationary research spacecraft; tens of thousands of images from the... geostationary spacecraft. In addition to visible light imagery, infrared data are available from the NIMBUS...
15 CFR 950.8 - Satellite Data Services Division (SDSD).
Code of Federal Regulations, 2011 CFR
2011-01-01
... Technology Satellites (ATS) I and III geostationary research spacecraft; tens of thousands of images from the... geostationary spacecraft. In addition to visible light imagery, infrared data are available from the NIMBUS...
15 CFR 950.8 - Satellite Data Services Division (SDSD).
Code of Federal Regulations, 2010 CFR
2010-01-01
... Technology Satellites (ATS) I and III geostationary research spacecraft; tens of thousands of images from the... geostationary spacecraft. In addition to visible light imagery, infrared data are available from the NIMBUS...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Duchaineau, M.; Wolinsky, M.; Sigeti, D.E.
Terrain visualization is a difficult problem for applications requiring accurate images of large datasets at high frame rates, such as flight simulation and ground-based aircraft testing using synthetic sensor stimulation. On current graphics hardware, the problem is to maintain dynamic, view-dependent triangle meshes and texture maps that produce good images at the required frame rate. We present an algorithm for constructing triangle meshes that optimizes flexible view-dependent error metrics, produces guaranteed error bounds, achieves specified triangle counts directly, and uses frame-to-frame coherence to operate at high frame rates for thousands of triangles per frame. Our method, dubbed Real-time Optimally Adapting Meshes (ROAM), uses two priority queues to drive split and merge operations that maintain continuous triangulations built from pre-processed bintree triangles. We introduce two additional performance optimizations: incremental triangle stripping and priority-computation deferral lists. ROAM execution time is proportionate to the number of triangle changes per frame, which is typically a few percent of the output mesh size, hence ROAM performance is insensitive to the resolution and extent of the input terrain. Dynamic terrain and simple vertex morphing are supported.
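The split side of the dual-queue idea can be sketched as a greedy loop: pop the triangle with the largest error, bintree-split it, and push the children until the triangle budget is met. The toy version below uses triangle area as a stand-in error metric and omits the merge queue, frame-to-frame coherence, and error guarantees of the full ROAM algorithm.

```python
import heapq

def split(tri):
    """Bintree split of a right triangle at the midpoint of the edge between
    its first two vertices (its hypotenuse by convention here)."""
    a, b, c = tri
    m = ((a[0] + b[0]) / 2.0, (a[1] + b[1]) / 2.0)
    return [(c, a, m), (b, c, m)]

def area(tri):
    """Toy error metric: larger triangles are assumed to need refinement first."""
    (x1, y1), (x2, y2), (x3, y3) = tri
    return abs((x2 - x1) * (y3 - y1) - (x3 - x1) * (y2 - y1)) / 2.0

def refine(root_tris, error, max_tris):
    """Greedy split-only refinement driven by a priority queue.  Full ROAM also
    maintains a merge queue and reuses the previous frame's mesh for coherence."""
    mesh = list(root_tris)
    heap, counter = [], 0
    for t in mesh:
        heap.append((-error(t), counter, t))
        counter += 1
    heapq.heapify(heap)
    while len(mesh) < max_tris and heap:
        _, _, tri = heapq.heappop(heap)
        children = split(tri)
        mesh.remove(tri)
        mesh.extend(children)
        for c in children:
            counter += 1
            heapq.heappush(heap, (-error(c), counter, c))
    return mesh

base = [((0.0, 0.0), (1.0, 1.0), (0.0, 1.0)), ((1.0, 1.0), (0.0, 0.0), (1.0, 0.0))]
print(len(refine(base, area, max_tris=64)), "triangles in the refined mesh")
```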
High-resolution 18 CM spectra of OH/IR stars
NASA Astrophysics Data System (ADS)
Fix, John D.
1987-02-01
High-velocity-resolution, high-signal-to-noise spectra have been obtained for the 18 cm maser emission lines from a number of optically visible OH/IR stars. The spectra have been interpreted in terms of a recent model by Alcock and Ross (1986), in which OH/IR stars lose mass in discrete elements rather than by a continuous wind. Comparison of the observed spectra with synthetic spectra shows that the lines are the composite emission from thousands or tens of thousands of individual elements.
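A synthetic composite spectrum of the kind compared against the observations can be built by summing many narrow maser features whose line-of-sight velocities follow a thin expanding shell (uniform between -v_exp and +v_exp). The sketch below does this with an assumed shell speed, feature width, and brightness distribution; it is a generic illustration, not the Alcock and Ross model itself.

```python
import numpy as np

rng = np.random.default_rng(42)
n_elements = 20_000                       # number of discrete mass-loss elements (assumed)
v_exp, feature_width = 15.0, 0.25         # km/s: shell expansion speed and per-feature line width (assumed)

# For a thin, uniformly populated expanding shell the line-of-sight velocities
# are distributed uniformly between -v_exp and +v_exp.
v_los = v_exp * rng.uniform(-1.0, 1.0, n_elements)
amps = rng.lognormal(mean=0.0, sigma=1.0, size=n_elements)    # broad spread of feature strengths

v_grid = np.linspace(-20.0, 20.0, 1601)
dv = v_grid[1] - v_grid[0]
spectrum = np.zeros_like(v_grid)
np.add.at(spectrum, np.clip(np.digitize(v_los, v_grid), 0, v_grid.size - 1), amps)
kernel = np.exp(-0.5 * (np.arange(-40, 41) * dv / feature_width) ** 2)
spectrum = np.convolve(spectrum, kernel / kernel.sum(), mode="same")  # composite of ~20,000 narrow features
print(spectrum.max(), spectrum.sum() * dv)
```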
77 FR 19648 - Receipt of Application for a Permit Modification
Federal Register 2010, 2011, 2012, 2013, 2014
2012-04-02
... primarily in the region of the Florida coast from Naples to Key West, encompassing the Ten Thousand Islands... following offices: Permits and Conservation Division, Office of Protected Resources, NMFS, 1315 East-West...
US Army Evaluations: A Study of Inaccurate and Inflated Reporting
2012-04-26
decisions such as promotions directly impacting the careers of tens of thousands of the Army's leaders, both officer and NCO, has few equals in the...accurate, and equitable performance ratings throughout the Army. Many of the revisions were caused by the inability of selection boards to discern a...should be assigned a numerical percentage; superior equals top ten percent, excellence equals top twenty-five percent, success equals top fifty
Federal Register 2010, 2011, 2012, 2013, 2014
2011-01-24
... susceptibility of enteric bacteria to antimicrobial agents of medical importance. The NARMS program, established... infected with these bacteria, resulting in tens of thousands of hospitalizations and hundreds of deaths...
75 FR 75845 - National Impaired Driving Prevention Month, 2010
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-07
... United States of America A Proclamation Every day, millions of Americans travel on our Nation's roadways... hand this first day of December, in the year of our Lord two thousand ten, and of the Independence of...
76 FR 45395 - National Korean War Veterans Armistice Day, 2011
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-29
... Agreement at Panmunjom secured the border near the 38th parallel. Together, American service members and... cause of freedom and stability in East Asia and around the world. Today, we honor the tens of thousands...
Simple sequence repeats in Escherichia coli: abundance, distribution, composition, and polymorphism.
Gur-Arie, R; Cohen, C J; Eitan, Y; Shelef, L; Hallerman, E M; Kashi, Y
2000-01-01
Computer-based genome-wide screening of the DNA sequence of Escherichia coli strain K12 revealed tens of thousands of tandem simple sequence repeat (SSR) tracts, with motifs ranging from 1 to 6 nucleotides. SSRs were well distributed throughout the genome. Mononucleotide SSRs were over-represented in noncoding regions and under-represented in open reading frames (ORFs). Nucleotide composition of mono- and dinucleotide SSRs, both in ORFs and in noncoding regions, differed from that of the genomic region in which they occurred, with 93% of all mononucleotide SSRs proving to be of A or T. Computer-based analysis of the fine position of every SSR locus in the noncoding portion of the genome relative to downstream ORFs showed SSRs located in areas that could affect gene regulation. DNA sequences at 14 arbitrarily chosen SSR tracts were compared among E. coli strains. Polymorphisms of SSR copy number were observed at four of seven mononucleotide SSR tracts screened, with all polymorphisms occurring in noncoding regions. SSR polymorphism could prove important as a genome-wide source of variation, both for practical applications (including rapid detection, strain identification, and detection of loci affecting key phenotypes) and for evolutionary adaptation of microbes.
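As a hedged illustration of the kind of computer-based SSR screening described above (not the authors' software; the motif range, minimum tract length, and example sequence are arbitrary choices for demonstration):

```python
import re

# Minimal sketch of a tandem simple sequence repeat (SSR) scan: for each motif
# length 1-6 nt, find runs of immediately repeated copies at least `min_len`
# bases long. Tracts may be reported redundantly at several motif lengths
# (e.g. a poly-A run matches both the 1 nt and 2 nt scans); no de-duplication
# is attempted in this sketch.

def find_ssrs(seq, max_motif=6, min_len=8):
    hits = []
    for k in range(1, max_motif + 1):
        for m in re.finditer(r"([ACGT]{%d})\1+" % k, seq):
            if len(m.group(0)) >= min_len:
                hits.append((m.start(), m.group(1), len(m.group(0)) // k))
    return hits  # (0-based position, motif, tandem copy number)

example = "ATGCAAAAAAAAATGCGCGCGCGCTTACACACACACAGG"
for pos, motif, copies in find_ssrs(example):
    print(f"pos={pos}  motif={motif}  copies={copies}")
```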
Probing Reionization at z ≳ 7 with HST's Near-Infrared Grisms
NASA Astrophysics Data System (ADS)
Schmidt, Kasper B.
The epoch of reionization, i.e. the phase transition of the inter-galactic medium from neutral to fully ionized, is essential for our understanding of the evolution of the Universe and the formation of the first stars and galaxies. The Grism Lens-Amplified Survey from Space (GLASS) has obtained spectra of tens of thousands of objects in and behind 10 massive galaxy clusters, including the six Hubble Frontier Fields. The grism spectroscopy from GLASS results in hundreds of spectra of z ≳ 7 galaxy candidates. Taking advantage of the lensing magnification from the foreground clusters, the GLASS spectra reach unprecedented depths in the near-infrared with observed flux limits of ~5 × 10⁻¹⁸ erg/s/cm² before correcting for the lens magnification. This has resulted in several Lyα detections at z ~ 7 and tight limits on the emission line fluxes for non-detections. From an ensemble of different photometric selections, we have assembled more than 150 z ≳ 7 galaxy candidates from six of the ten GLASS clusters. Among these, more than 20 objects show emission lines consistent with being Lyα at z ≳ 7. The spatial extent of Lyα estimated from a stack of the most promising Lyα emitters at
Hybrid Residual Flexibility/Mass-Additive Method for Structural Dynamic Testing
NASA Technical Reports Server (NTRS)
Tinker, M. L.
2003-01-01
A large fixture was designed and constructed for modal vibration testing of International Space Station elements. This fixed-base test fixture, which weighs thousands of pounds and is anchored to a massive concrete floor, initially utilized spherical bearings and pendulum mechanisms to simulate Shuttle orbiter boundary constraints for launch of the hardware. Many difficulties were encountered during a checkout test of the common module prototype structure, mainly due to undesirable friction and excessive clearances in the test-article-to-fixture interface bearings. Measured mode shapes and frequencies were not representative of orbiter-constrained modes due to the friction and clearance effects in the bearings. As a result, a major redesign effort for the interface mechanisms was undertaken. The total cost of the fixture design, construction and checkout, and redesign was over $2 million. Because of the problems experienced with fixed-base testing, alternative free-suspension methods were studied, including the residual flexibility and mass-additive approaches. Free-suspension structural dynamics test methods utilize soft elastic bungee cords and overhead frame suspension systems that are less complex and much less expensive than fixed-base systems. The cost of free-suspension fixturing is on the order of tens of thousands of dollars as opposed to millions, for large fixed-base fixturing. In addition, free-suspension test configurations are portable, allowing modal tests to be done at sites without modal test facilities. For example, a mass-additive modal test of the ASTRO-1 Shuttle payload was done at the Kennedy Space Center launch site. In this Technical Memorandum, the mass-additive and residual flexibility test methods are described in detail. A discussion of a hybrid approach that combines the best characteristics of each method follows and is the focus of the study.
Physical and Chemical Properties of Anthropogenic Aerosols: An Overview
Aerosol chemical composition is complex. Combustion aerosols can comprise tens of thousands of organic compounds, refractory brown and black carbon, heavy metals, cations, anions, salts, and other inorganic phases. Aerosol organic matter normally contains semivolatile material th...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-09
... Lock and Dam No. 5; (2) a 109-foot-wide, 40-foot-high lock frame module placed east of the movable... the grid. Each lock frame module would consist of ten 7-foot-diameter hydropower turbines each rated... the powerhouse on the east section of the dam in the existing levee of the east abutment; (2) a...
Optical Diagnostic System For Observation Of Laser-Produced Shock Waves
NASA Astrophysics Data System (ADS)
Wilke, Mark D.; Stone, Sidney N.
1980-11-01
Several standard plasma and gas dynamic diagnostic techniques have been integrated into a system for observing the formation and propagation of high-power Nd:glass-laser generated one- and two-dimensional shockwaves in air from 0.1 torr to atmospheric pressures. Diagnostics include either single-frame, two-wavelength holographic ruby-laser interferometry or single-frame, single-wavelength interferometry with ten frames of shadowgraphy. Streaks or ten frames of the early luminous shocked region also are taken on all shots, as well as time-resolved luminosity measurements using high-speed biplanar vacuum photodiodes with various wavelength interference filters. Shadowgraphy frames are 200-ns long at 1-μs intervals, while emission frames are variable with a maximum 10-ns exposure and 50-ns interval. Both the streak mode and emission measurements with the vacuum diode allow subnanosecond time resolution. The interferometry provides 20-ns exposures from 500 ns to late times. Methods for reducing and interpreting the data have been, or are currently being, developed. Interactive computer programs for digitizing the fringe patterns provide fringe-shift profiles for Abel inversion. This has provided neutral gas and electron density information in the spherical, one-dimensional cases. Diagrams and photographs of the experiment will be shown as well as examples of the data that have been taken. Methods for data reduction will be outlined and some of the results shown.
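For reference, the fringe-shift profiles mentioned above are related to the radial refractive-index (and hence density) profile by the standard Abel transform pair; the notation below is generic and not taken from the paper:

```latex
F(y) \;=\; 2\int_{y}^{R}\frac{f(r)\,r\,\mathrm{d}r}{\sqrt{r^{2}-y^{2}}},
\qquad
f(r) \;=\; -\frac{1}{\pi}\int_{r}^{R}\frac{\mathrm{d}F/\mathrm{d}y}{\sqrt{y^{2}-r^{2}}}\,\mathrm{d}y,
```

where F(y) is the measured line-of-sight (fringe-shift) profile at lateral offset y and f(r) is the recovered radial profile of an axisymmetric object of outer radius R; the spherically symmetric case mentioned in the abstract leads to a line-of-sight integral of the same form.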
Improving data quality in neuronal population recordings
Harris, Kenneth D.; Quian Quiroga, Rodrigo; Freeman, Jeremy; Smith, Spencer
2017-01-01
Understanding how the brain operates requires understanding how large sets of neurons function together. Modern recording technology makes it possible to simultaneously record the activity of hundreds of neurons, and technological developments will soon allow recording of thousands or tens of thousands. As with all experimental techniques, these methods are subject to confounds that complicate the interpretation of such recordings, and could lead to erroneous scientific conclusions. Here, we discuss methods for assessing and improving the quality of data from these techniques, and outline likely future directions in this field. PMID:27571195
Aviation Security: Slow Progress in Addressing Long-Standing Screener Performance Problems
2000-03-16
aviation security, in particular airport screeners. Securing an air transportation system the size of this nation’s, with hundreds of airports, thousands of aircraft, and tens of thousands of flights daily carrying millions of passengers and pieces of baggage, is a difficult task. Events over the past decade have shown that the threat of terrorism against the United States is an ever-present danger. Aviation is an attractive target for terrorists, and because the air transportation system is critical to the nation’s well-being, protecting it is an important
Cosmic impact: What are the odds?
NASA Astrophysics Data System (ADS)
Harris, A. W.
2009-12-01
Firestone et al. (PNAS 104, 16016-16021, 2007) propose that the impact of a ~4 km diameter comet (or multiple bodies making up a similar mass) led to the Younger Dryas cooling and extinction of megafauna in North America, 12,900 years ago. Even more provocatively, Firestone et al. (Cycle of Cosmic Catastrophes, Bear & Co. Books, 2006, 392pp), suggest that a nearby supernova may have produced a comet shower leading to the impact event, either by disturbing the Oort Cloud or by direct injection of 4-km comet-like bodies to the solar neighborhood. Here we show: (a) A supernova shockwave or mass ejection is not capable of triggering a shower of comets from the Oort Cloud. (b) An Oort Cloud shower from whatever cause would take 100,000 years or more for the perturbed comets to arrive in the inner solar system, and the peak flux would persist for some hundreds of thousands more years. (c) Even if all 20 solar masses or so of ejected matter from a SN were in the form of 4-km diameter balls, the probability of even one such ball hitting the Earth from an event 100 light years away would be about 3e-5. (d) A 4-km diameter ball traveling fast enough to get here from 100 LY away in some tens of thousands of years would deliver the energy of a 50 km diameter impactor traveling at typical Earth-impact velocity (~20 km/sec). We review the current impact flux on the Earth from asteroids and comets, and show that the probability of an impact of a 4-km diameter asteroid in an interval of 13,000 years is about one in a thousand, and the probability of a comet impact of that size is a few in a million. An "impact shower" caused by the injection or breakup of comets or asteroids in the inner solar system by whatever means would take tens to hundreds of thousands of years to clear out, thus the population of NEOs we see now with our telescopic surveys is what we’ve had for the last few tens of thousands of years, at least. Faced with such low odds, the evidence that such a large cosmic impact is the cause of the Younger Dryas boundary and cooling, and that there is no other possible cause, needs to be extraordinary indeed.
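A hedged back-of-envelope reconstruction of the ~3e-5 figure in item (c), using assumed values not stated in the abstract (cometary bulk density of roughly 0.6 g/cm³, ejecta mass of 20 solar masses, distance of 100 light years):

```latex
N \;\approx\; \frac{20\,M_{\odot}}{\tfrac{4}{3}\pi r^{3}\rho}
 \;\approx\; \frac{4\times10^{31}\,\mathrm{kg}}
                  {\tfrac{4}{3}\pi\,(2\,\mathrm{km})^{3}\times 600\,\mathrm{kg\,m^{-3}}}
 \;\approx\; 2\times10^{18},
\qquad
P \;\approx\; N\,\frac{\pi R_{\oplus}^{2}}{4\pi d^{2}}
 \;\approx\; 2\times10^{18}\times
   \frac{1.3\times10^{14}\,\mathrm{m^{2}}}{4\pi\,(9.5\times10^{17}\,\mathrm{m})^{2}}
 \;\approx\; 2\times10^{-5},
```

the same order as the quoted value; the exact number depends on the assumed density and on treating the ejecta as isotropic.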
Estimation of toxicity using the Toxicity Estimation Software Tool (TEST)
Tens of thousands of chemicals are currently in commerce, and hundreds more are introduced every year. Since experimental measurements of toxicity are extremely time consuming and expensive, it is imperative that alternative methods to estimate toxicity are developed.
Developing Non-Targeted Measurement Methods to Characterize the Human Exposome
The exposome represents all exposures experienced by an individual during their lifetime. Registered chemicals currently number in the tens-of-thousands, and therefore comprise a significant portion of the human exposome. To date, quantitative monitoring methods have been develop...
Atmospheric Science Data Center
2013-04-15
... of which occurred north of Khartoum. According to the Food and Agriculture Organization of the United Nations, tens of thousands of ... fled their homes, and the number of people in need of urgent food assistance in Sudan, estimated at three million earlier in the year, was ...
ERIC Educational Resources Information Center
Wilson, David L.
1994-01-01
College administrators recently appealed to students and faculty to change their computer passwords after security experts announced that tens of thousands had been stolen by computer hackers. Federal officials are investigating. Such attacks are not uncommon, but the most effective solutions are either inconvenient or cumbersome. (MSE)
Federal Initiative: Tick-Borne Disease Integrated Pest Management White Paper
The numbers of human cases of Lyme disease and other tick-borne diseases (TBDs) reported each year to CDC have been increasing steadily in the United States (US), currently totaling tens of thousands of diagnosed human cases annually.
Exposure-Based Prioritization of Chemicals for Risk Assessment
Manufactured chemicals are used extensively to produce a wide variety of consumer goods and are required by important industrial sectors. Presently, information is insufficient to estimate risks posed to human health and the environment from the over ten thousand chemical substan...
Ashy Aftermath of Indonesian Volcano Eruption seen by NASA Spacecraft
2014-02-23
On Feb. 13, 2014, a violent eruption of the Kelud stratovolcano in Java, Indonesia, sent volcanic ash over an area of 70,000 square miles, prompting the evacuation of tens of thousands of people. This image is from NASA's Terra spacecraft.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Buck, J.W.; Whelan, G.; Strenge, D.L.
This paper responds to the ten questions posed by the US Nuclear Regulatory Commission (NRC) at the Modeling Workshop held November 13 and 14, 1997. The ten questions were developed in advance of the workshop to allow model developers to prepare a presentation at the Workshop. This paper is an expanded version of the Multimedia Environmental Pollutant Assessment System (MEPAS) presentation given at the Modeling Workshop by Pacific Northwest National Laboratory (PNNL) staff. This paper is organized by the ten questions asked by the NRC, with each section devoted to a single question. The current version of the methodology is MEPAS 3.2 (NRC 1997) and the discussion in this paper pertains to that version. In some cases, MEPAS 4.0, which is currently being developed under the Framework for Risk Analysis in Multimedia Environmental Systems (FRAMES) (Whelan et al. 1997), will be referenced to inform the reader of potential capabilities in the near future. A separate paper is included in the document that discusses the FRAMES concept.
ERIC Educational Resources Information Center
Perez Huber, Lindsay
2009-01-01
Using the critical race "testimonios" of ten Chicana undergraduate students at a top-tier research university, Lindsay Perez Huber interrogates and challenges the racist nativist framing of undocumented Latina/o immigrants as problematic, burdensome, and "illegal." Specifically, a community cultural wealth framework (Yosso, 2005) is utilized and…
Cicero and Burkholderia cepacia: What’s in a Name?
Williams, Frederick
2003-01-01
“Then said they unto him, Say now Shibboleth: and he said Sibboleth: for he could not frame to pronounce it right. Then they took him and slew him at the passes of Jordan: and there fell at that time of the Ephraimites forty and two thousand.” Judges 12:6 PMID:12702238
Afghanistan and Multiculturalism in Khaled Hosseini's Novels: Study of Place and Diversity
ERIC Educational Resources Information Center
Agnello, Mary F.; Todd, Reese H.; Olaniran, Bolanle; Lucey, Thomas A.
2009-01-01
Purpose: The purpose of this paper is to frame Khaled Hosseini's novels, "The Kite Runner" and "A Thousand Splendid Suns", as literature to expand and enhance the American secondary curriculum with multicultural themes based on Afghanistan as a geographical and cultural place in a dynamic, diverse, and complex world more…
Predicting organ toxicity using in vitro bioactivity data and chemical structure
Animal testing alone cannot practically evaluate the health hazard posed by tens of thousands of environmental chemicals. Computational approaches together with high-throughput experimental data may provide more efficient means to predict chemical toxicity. Here, we use a superv...
Strategies to identify microRNA targets: New advances
USDA-ARS?s Scientific Manuscript database
MicroRNAs (miRNAs) are small regulatory RNA molecules functioning to modulate gene expression at the post-transcriptional level, and playing an important role in many developmental and physiological processes. Ten thousand miRNAs have been discovered in various organisms. Although considerable progr...
Acupuncture Reduces Breast Cancer Joint Pain | Division of Cancer Prevention
In the largest, most rigorous study of its kind, acupuncture was found to significantly reduce the debilitating joint pain experienced by tens of thousands of women each year while being treated for early stage breast cancer with aromatase inhibitors (AIs).
TOXCAST: A TOOL FOR THE PRIORITIZATION OF CHEMICALS FOR TOXICOLOGICAL EVALUATION
Due to various legislative mandates, the US EPA is faced with evaluating the potential of tens of thousands of chemicals (e.g., high production volume chemicals, pesticidal inerts, and drinking water contaminants) to cause adverse human health & environmental effects.
TOXCAST: A PROGRAM FOR PRIORTITIZING TOXICITY TESTING OF ENVIRONMENTAL CHEMICALS
Evaluating the potential of tens of thousands of chemicals for risk to human health and the environment is beyond the resource limits of the Environmental Protection Agency. The EPA's ToxCast program will explore alternative methods comprising computational chemistry, high-throug...
3 CFR 8695 - Proclamation 8695 of July 26, 2011. National Korean War Veterans Armistice Day, 2011
Code of Federal Regulations, 2012 CFR
2012-01-01
... Armistice Agreement at Panmunjom secured the border near the 38th parallel. Together, American service... cause of freedom and stability in East Asia and around the world. Today, we honor the tens of thousands...
Analysis of the chemical and physical properties of combustion aerosols: Properties overview
Aerosol chemical composition is remarkably complex. Combustion aerosols can comprise tens of thousands of organic compounds and fragments, refractory carbon, metals, cations, anions, salts, and other inorganic phases and substituents [Hays et al., 2004]. Aerosol organic matter no...
Etoile Project : Social Intelligent ICT-System for very large scale education in complex systems
NASA Astrophysics Data System (ADS)
Bourgine, P.; Johnson, J.
2009-04-01
The project will devise new theory and implement new ICT-based methods of delivering high-quality low-cost postgraduate education to many thousands of people in a scalable way, with the cost of each extra student being negligible (< a few Euros). The research will create an in vivo laboratory of one to ten thousand postgraduate students studying courses in complex systems. This community is chosen because it is large and interdisciplinary and there is a known requirement for courses for thousands of students across Europe. The project involves every aspect of course production and delivery. Within this, the research focused on the creation of a Socially Intelligent Resource Mining system to gather large volumes of high quality educational resources from the internet; new methods to deconstruct these to produce a semantically tagged Learning Object Database; a Living Course Ecology to support the creation and maintenance of evolving course materials; systems to deliver courses; and a ‘socially intelligent assessment system’. The system will be tested on one to ten thousand postgraduate students in Europe working towards the Complex System Society's title of European PhD in Complex Systems. Étoile will have a very high impact both scientifically and socially by (i) the provision of new scalable ICT-based methods for providing very low cost scientific education, (ii) the creation of new mathematical and statistical theory for the multiscale dynamics of complex systems, (iii) the provision of a working example of adaptation and emergence in complex socio-technical systems, and (iv) making a major educational contribution to European complex systems science and its applications.
NASA Astrophysics Data System (ADS)
Draisma, Stefano G. A.; Prud'homme van Reine, Willem F.; Herandarudewi, Sekar M. C.; Hoeksema, Bert W.
2018-01-01
The Jakarta Bay - Thousand Islands reef complex extends to more than 80 km in northwest direction from the major conurbation Jakarta (Indonesia) along a pronounced inshore to offshore environmental gradient. The present study aims to determine to what extent environmental factors can explain the composition of macroalgal communities on the reefs off Jakarta. Therefore, the presence-absence of 67 macroalgal taxa was recorded for 27 sampling sites along the inshore-offshore disturbance gradient and analysed with substrate variables and water quality variables. The macroalgal richness pattern matches the pattern of other reef taxa. The 27 sites could be assigned to one of four geographical zones with 85% certainty based on their macroalgal taxon assemblages. These four zones (i.e., Jakarta Bay and, respectively, South, Central, and North Thousand Islands) had significantly different macroalgal assemblages, except for the North and South zones. Along the nearshore gradient there was a greater shift in taxon composition than within the central Thousand Islands. The patterns of ten habitat and water quality variables resembled the macroalgal diversity patterns by 56%. All ten variables together explained 69% of the variation in macroalgal composition. Shelf depth, % sand cover, gelbstoff/detrital material, chlorophyll a concentration, seawater surface temperature, and % dead coral cover were the best predictors of seaweed flora composition. Furthermore, 44 macroalgal species represented new records for the area. The present study provides important baseline data of macroalgae in the area for comparison in future biodiversity assessments in the area and elsewhere in the region.
Utility of acoustical detection of Coptotermes Formosanus (Isoptera: Rhinotermitidae)
USDA-ARS?s Scientific Manuscript database
The AED 2000 and 2010 are extremely sensitive listening devices which can effectively detect and monitor termite activity through a wave guide (e.g. bolt) both qualitatively and quantitatively. Experiments conducted with one to ten thousand termites from differing colonies infesting wood in buckets...
Mining Human Biomonitoring Data to Identify Prevalent Chemical Mixtures (SOT abstract)
Through food, water, air, and consumer products, humans are exposed to tens of thousands of environmental chemicals, and most of these have not been evaluated to determine their potential toxicities. In recent years, high-throughput screening (HTS) methods have been developed tha...
Exposure Considerations for Chemical Prioritization and Toxicity Testing
Globally there is a need to characterize potential risk to human health and the environment that arises from the manufacture and use of tens of thousands of chemicals. Currently, a significant research effort is underway to apply new technologies to screen and prioritize chemica...
Source-to-Dose Modeling of Phthalates: Lessons for Prioritization
Globally there is a need to characterize potential risk to human health and the environment that arises from the manufacture and use of tens of thousands of chemicals. The US EPA is developing methods for using computational chemistry, high-throughput screening, and toxicogenomi...
20180312 - Structure-based QSAR Models to Predict Systemic Toxicity Points of Departure (SOT)
Human health risk assessment associated with environmental chemical exposure is limited by the tens of thousands of chemicals with little or no experimental in vivo toxicity data. Data gap filling techniques, such as quantitative structure activity relationship (QSAR) models base...
Quantitative Prediction of Systemic Toxicity Points of Departure (OpenTox USA 2017)
Human health risk assessment associated with environmental chemical exposure is limited by the tens of thousands of chemicals with little or no experimental in vivo toxicity data. Data gap filling techniques, such as quantitative models based on chemical structure information, are c...
Developing an Experimental Model of Vascular Toxicity in Embryonic Zebrafish
Developing an Experimental Model of Vascular Toxicity in Embryonic Zebrafish Tamara Tal, Integrated Systems Toxicology Division, U.S. EPA Background: There are tens of thousands of chemicals that have yet to be fully evaluated for their toxicity by validated in vivo testing ...
Extraterrestrial Communications.
ERIC Educational Resources Information Center
Deardorff, James W.
1987-01-01
Discusses the embargo hypothesis--the theory that Earth is apparently free from alien exploitation because of a presumed cosmic quarantine against this planet--which implies that, instead of being only a few hundred years technologically in advance of earthly civilization, extraterrestrials in charge are likely tens of thousands of years in…
NASA Astrophysics Data System (ADS)
Baek, Seung Ki; Minnhagen, Petter; Kim, Beom Jun
2011-07-01
In Korean culture, the names of family members are recorded in special family books. This makes it possible to follow the distribution of Korean family names far back in history. It is shown here that these name distributions are well described by a simple null model, the random group formation (RGF) model. This model makes it possible to predict how the name distributions change and these predictions are shown to be borne out. In particular, the RGF model predicts that for married women entering a collection of family books in a certain year, the occurrence of the most common family name 'Kim' should be directly proportional to the total number of married women with the same proportionality constant for all the years. This prediction is also borne out to a high degree. We speculate that it reflects some inherent social stability in the Korean culture. In addition, we obtain an estimate of the total population of the Korean culture down to the year 500 AD, based on the RGF model, and find about ten thousand Kims.
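A hypothetical sketch of how the stated proportionality (Kim entries proportional to the total number of married women entering the books, with a single constant across years) could be checked; the year labels and counts below are invented for illustration, not data from the paper:

```python
# Fit a single proportionality constant c in kims ≈ c * total (least squares
# through the origin) and compare predictions with the "observed" counts.
# All numbers are made up for illustration.

years = [1700, 1750, 1800, 1850, 1900]
total = [1200, 2100, 3500, 5200, 8100]   # married women entering the books
kims  = [260,  450,  760, 1120, 1750]    # of which surnamed Kim

c = sum(t * k for t, k in zip(total, kims)) / sum(t * t for t in total)

for y, t, k in zip(years, total, kims):
    print(f"{y}: observed {k:4d}, predicted {c * t:6.0f}")
print(f"fitted proportionality constant c ≈ {c:.3f}")
```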
Perez, Cristina R; Moye, John K; Cacela, Dave; Dean, Karen M; Pritsos, Chris A
2017-12-01
In 2010, the Deepwater Horizon oil spill released 134 million gallons of crude oil into the Gulf of Mexico, making it the largest oil spill in US history. The three-month oil spill left tens of thousands of birds dead; however, the fate of tens of thousands of other migratory birds that were affected but did not immediately die is unknown. We used the homing pigeon as a surrogate species for migratory birds to investigate the effects of a single external oiling event on the flight performance of birds. Data from GPS data loggers revealed that lightly oiled pigeons took significantly longer to return home and spent more time stopped en route than unoiled birds. This suggests that migratory birds affected by the oil spill could have experienced long term flight impairment and delayed arrival to breeding, wintering, or crucial stopover sites and subsequently suffered reductions in survival and reproductive success. Copyright © 2017 Elsevier Inc. All rights reserved.
An aeromagnetic survey in the Valley of Ten Thousand Smokes, Alaska. M.S. Thesis
NASA Technical Reports Server (NTRS)
Anma, K.
1971-01-01
Geologic and magnetic studies of the Katmai area have further demonstrated the close relationship between the Katmai Caldera, Novarupta plug, and the pyroclastic flows in the Valley of Ten Thousand Smokes. The magnetic fields observed appear to be associated with the thickness of the pyroclastic flow and the different rock units within it for lower flight levels, and also the contrast between the valley fill and the rock units at the Valley margins. Consistent magnetic anomalies are associated with the larger fumarole lines, which were presumably sites of large scale activity, while the smaller fumaroles are not usually seen in the aeromagnetic map. A possible correlation between low positive anomalies and nuee ardente deposits was revealed by the aeromagnetic survey, but was not strong. A ground survey was also carried out in several parts of the Valley with a view to detailed delineation of the magnetic signatures of the pyroclastic flow, as an aid to interpreting the aeromagnetic data.
The Eleventh Plague: The Politics of Biological and Chemical Warfare
NASA Astrophysics Data System (ADS)
Kovac, Jeffrey
1997-07-01
Leonard A. Cole. W. H. Freeman: New York, 1997. 250 pp. ISBN 0-7167-2950-4. $22.95 hc. The Eleventh Plague begins with a recitation of the ten plagues brought down upon Egypt, part of the Passover Seder celebrated each spring by Jews all over the world. Spring is also the anniversary of the first use of chemical weapons. On April 22, 1915, German soldiers released chlorine gas from 5,739 cylinders installed along the battle line at Ypres in western Belgium. Germany achieved complete surprise. The gas drifted across no man's land, causing widespread terror and creating ten thousand serious casualties and five thousand deaths. Chlorine, of course, was a poor weapon, easily neutralized, but German scientists, including future Nobel laureates Fritz Haber, Otto Hahn, and James Franck, and the German chemical industry created ever more dangerous chemical weapons, culminating with the introduction of mustard gas in 1917. Despite cries of moral outrage, the Allies countered with their own chemical weapons efforts. The eleventh plague had been unleashed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Holland, M.
In the last fifteen years, the introduction of plane or diverging wave transmissions rather than line by line scanning focused beams has broken the conventional barriers of ultrasound imaging. By using such large field of view transmissions, the frame rate reaches the theoretical limit of physics dictated by the ultrasound speed and an ultrasonic map can be provided typically in tens of micro-seconds (several thousands of frames per second). Interestingly, this leap in frame rate is not only a technological breakthrough but it permits the advent of completely new ultrasound imaging modes, including shear wave elastography, electromechanical wave imaging, ultrafast Doppler, ultrafast contrast imaging, and even functional ultrasound imaging of brain activity (fUltrasound) introducing Ultrasound as an emerging full-fledged neuroimaging modality. At ultrafast frame rates, it becomes possible to track in real time the transient vibrations – known as shear waves – propagating through organs. Such “human body seismology” provides quantitative maps of local tissue stiffness whose added value for diagnosis has been recently demonstrated in many fields of radiology (breast, prostate and liver cancer, cardiovascular imaging, …). Today, Supersonic Imagine is commercializing the first clinical ultrafast ultrasound scanner, Aixplorer, with real time Shear Wave Elastography. This is the first example of an ultrafast Ultrasound approach surpassing the research phase and now widely spread in the clinical medical ultrasound community with an installed base of more than 1000 Aixplorer systems in 54 countries worldwide. For blood flow imaging, ultrafast Doppler permits high-precision characterization of complex vascular and cardiac flows. It also gives ultrasound the ability to detect very subtle blood flow in very small vessels. In the brain, such ultrasensitive Doppler paves the way for fUltrasound (functional ultrasound imaging) of brain activity with unprecedented spatial and temporal resolution compared to fMRI. Combined with contrast agents, our group demonstrated that Ultrafast Ultrasound Localization could provide a first in vivo and non invasive imaging modality at microscopic scales deep into organs. Many of these ultrafast modes should lead to major improvements in ultrasound screening, diagnosis, and therapeutic monitoring. Learning Objectives: Achieve familiarity with recent advances in ultrafast ultrasound imaging technology. Develop an understanding of potential applications of ultrafast ultrasound imaging for diagnosis and therapeutic monitoring. Dr. Tanter is a co-founder of Supersonic Imagine, a French company positioned in the field of medical ultrasound imaging and therapy.
WE-B-210-00: Carson/Zagzebski Distinguished Lectureship
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
In the last fifteen years, the introduction of plane or diverging wave transmissions rather than line by line scanning focused beams has broken the conventional barriers of ultrasound imaging. By using such large field of view transmissions, the frame rate reaches the theoretical limit of physics dictated by the ultrasound speed and an ultrasonic map can be provided typically in tens of micro-seconds (several thousands of frames per second). Interestingly, this leap in frame rate is not only a technological breakthrough but it permits the advent of completely new ultrasound imaging modes, including shear wave elastography, electromechanical wave imaging, ultrafast Doppler, ultrafast contrast imaging, and even functional ultrasound imaging of brain activity (fUltrasound) introducing Ultrasound as an emerging full-fledged neuroimaging modality. At ultrafast frame rates, it becomes possible to track in real time the transient vibrations – known as shear waves – propagating through organs. Such “human body seismology” provides quantitative maps of local tissue stiffness whose added value for diagnosis has been recently demonstrated in many fields of radiology (breast, prostate and liver cancer, cardiovascular imaging, …). Today, Supersonic Imagine is commercializing the first clinical ultrafast ultrasound scanner, Aixplorer, with real time Shear Wave Elastography. This is the first example of an ultrafast Ultrasound approach surpassing the research phase and now widely spread in the clinical medical ultrasound community with an installed base of more than 1000 Aixplorer systems in 54 countries worldwide. For blood flow imaging, ultrafast Doppler permits high-precision characterization of complex vascular and cardiac flows. It also gives ultrasound the ability to detect very subtle blood flow in very small vessels. In the brain, such ultrasensitive Doppler paves the way for fUltrasound (functional ultrasound imaging) of brain activity with unprecedented spatial and temporal resolution compared to fMRI. Combined with contrast agents, our group demonstrated that Ultrafast Ultrasound Localization could provide a first in vivo and non invasive imaging modality at microscopic scales deep into organs. Many of these ultrafast modes should lead to major improvements in ultrasound screening, diagnosis, and therapeutic monitoring. Learning Objectives: Achieve familiarity with recent advances in ultrafast ultrasound imaging technology. Develop an understanding of potential applications of ultrafast ultrasound imaging for diagnosis and therapeutic monitoring. Dr. Tanter is a co-founder of Supersonic Imagine, a French company positioned in the field of medical ultrasound imaging and therapy.
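A hedged worked example of the physical frame-rate limit mentioned in the abstract above, using assumed typical values (8 cm imaging depth, 1540 m/s sound speed) rather than figures from the text: a single plane-wave transmission must wait one round trip to the maximum depth, so

```latex
T_{\min} \;=\; \frac{2\,z_{\max}}{c}
 \;\approx\; \frac{2\times 0.08\ \mathrm{m}}{1540\ \mathrm{m\,s^{-1}}}
 \;\approx\; 104\ \mu\mathrm{s},
\qquad
f_{\max} \;=\; \frac{1}{T_{\min}} \;\approx\; 9.6\times10^{3}\ \mathrm{frames\,s^{-1}},
```

consistent with the "tens of micro-seconds per frame, several thousands of frames per second" quoted above.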
THE SOURCE STRUCTURE OF 0642+449 DETECTED FROM THE CONT14 OBSERVATIONS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xu, Ming H.; Wang, Guang L.; Heinkelmann, Robert
2016-11-01
The CONT14 campaign with state-of-the-art very long baseline interferometry (VLBI) data has observed the source 0642+449 with about 1000 observables each day during a continuous observing period of 15 days, providing tens of thousands of closure delays—the sum of the delays around a closed loop of baselines. The closure delay is independent of the instrumental and propagation delays and provides valuable additional information about the source structure. We demonstrate the use of this new “observable” for the determination of the structure in the radio source 0642+449. This source, as one of the defining sources in the second realization of the International Celestial Reference Frame, is found to have two point-like components with a relative position offset of −426 microarcseconds (μas) in R.A. and −66 μas in decl. The two components are almost equally bright, with a flux-density ratio of 0.92. The standard deviation of closure delays for source 0642+449 was reduced from 139 to 90 ps by using this two-component model. Closure delays larger than 1 ns are found to be related to the source structure, demonstrating that structure effects for a source with this simple structure could be up to tens of nanoseconds. The method described in this paper does not rely on a priori source structure information, such as knowledge of source structure determined from direct (Fourier) imaging of the same observations or observations at other epochs. We anticipate our study to be a starting point for more effective determination of the structure effect in VLBI observations.
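A hedged sketch of why the closure delay is insensitive to instrumental and propagation errors (the notation here is generic, not the paper's): writing each observed baseline delay as a geometric term plus station-based clock/instrumental/atmospheric terms and a structure term,

```latex
\tau^{\mathrm{obs}}_{AB} \;=\; \tau^{\mathrm{geo}}_{AB} + (\delta_{B}-\delta_{A}) + s_{AB},
\qquad
\tau_{ABC} \;\equiv\; \tau^{\mathrm{obs}}_{AB}+\tau^{\mathrm{obs}}_{BC}+\tau^{\mathrm{obs}}_{CA}
 \;=\; \bigl(\tau^{\mathrm{geo}}_{AB}+\tau^{\mathrm{geo}}_{BC}+\tau^{\mathrm{geo}}_{CA}\bigr)
      + s_{AB}+s_{BC}+s_{CA},
```

since the station terms cancel around the closed triangle, (δ_B−δ_A)+(δ_C−δ_B)+(δ_A−δ_C)=0, and for a point source the geometric terms also sum to approximately zero, a non-zero closure delay is dominated by the structure terms s.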
Simulation of Chronic Liver Injury Due to Environmental Chemicals
US EPA Virtual Liver (v-Liver) is a cellular systems model of hepatic tissues to predict the effects of chronic exposure to chemicals. Tens of thousands of chemicals are currently in commerce and hundreds more are introduced every year. Few of these chemicals have been adequate...
EDSP Prioritization: Collaborative Estrogen Receptor Activity Prediction Project (CERAPP) (SOT)
Humans are potentially exposed to tens of thousands of man-made chemicals in the environment. It is well known that some environmental chemicals mimic natural hormones and thus have the potential to be endocrine disruptors. Most of these environmental chemicals have never been te...
Invited presentation at Dalton College, Dalton, GA, to the Alliance for Innovation & Sustainability, April 20, 2017. U.S. EPA’s Computational Toxicology Program: Innovation Powered by Chemistry. It is estimated that tens of thousands of commercial and industrial chemicals are ...
Characterization of the potential adverse effects is lacking for tens of thousands of chemicals that are present in the environment, and characterization of developmental neurotoxicity (DNT) hazard lags behind that of other adverse outcomes (e.g. hepatotoxicity). This is due in p...
Theoretical Framework to Extend Adverse Outcome Pathways to Include Pharmacokinetic Considerations
Adverse Outcome Pathways (AOPs) have generated intense interest for their utility in linking known population outcomes to a molecular initiating event (MIE) that can be quantified using in vitro methods. While there are tens of thousands of chemicals in commercial use, biology h...
Recessionary Layoffs in Museum Education: Survey Results and Implications
ERIC Educational Resources Information Center
Kley, Ron
2009-01-01
A recent survey of recession-driven museum staff reductions suggests the possible loss of tens of thousands of museum personnel nationwide and identifies educators as among those most severely impacted. Survey findings are summarized, and the implications for both affected personnel and downsized institutions are considered.
SPATIAL ASSOCIATION BETWEEN SPECIATED FINE PARTICLES AND MORTALITY
Particulate matter (PM) has been linked to a range of serious cardiovascular and respiratory health problems. Some of the recent epidemiologic studies suggest that exposures to PM may result in tens of thousands of excess deaths per year and many more cases of illness among the ...
2010-06-11
International Labour Organisation; Office for the Coordination of Humanitarian Affairs; Peacebuilding Support Office; United Kingdom Cabinet Office... absentees numbered in the tens of thousands (GAO 2005b, 7). Additionally, the total number of forces is misleading because both the trained
Booth, Amanda C.; Soderqvist, Lars E.
2016-12-12
Freshwater flow to the Ten Thousand Islands estuary has been altered by the construction of the Tamiami Trail and the Southern Golden Gate Estates. The Picayune Strand Restoration Project, which is associated with the Comprehensive Everglades Restoration Plan, has been implemented to improve freshwater delivery to the Ten Thousand Islands estuary by removing hundreds of miles of roads, emplacing hundreds of canal plugs, removing exotic vegetation, and constructing three pump stations. Quantifying the tributary flows and salinity patterns prior to, during, and after the restoration is essential to assessing the effectiveness of upstream restoration efforts. Tributary flow and salinity patterns during preliminary restoration efforts and prior to the installation of pump stations were analyzed to provide baseline data and preliminary analysis of changes due to restoration efforts. The study assessed streamflow and salinity data for water years 2007–2014 for the Faka Union River (canal flow included), East River, Little Wood River, Pumpkin River, and Blackwater River. Salinity data from the Palm River and Faka Union Boundary water-quality stations were also assessed. Faka Union River was the dominant contributor of freshwater during water years 2007–14 to the Ten Thousand Islands estuary, followed by Little Wood and East Rivers. Pumpkin River and Blackwater River were the least substantial contributors of freshwater flow. The lowest annual flow volumes, the highest annual mean salinities, and the highest percentage of salinity values greater than 35 parts per thousand (ppt) occurred in water year 2011 at all sites with available data, corresponding with the lowest annual rainfall during the study. The highest annual flow volumes and the lowest percentage of salinities greater than 35 ppt occurred in water year 2013 for all sites with available data, corresponding with the highest rainfall during the study. In water year 2014, the percentage of monitored annual flow contributed by East River increased and the percentage of flow contributed by Faka Union River decreased, compared to the earlier years. No changes in annual flow occurred at any sites west of Faka Union River. No changes in the relative flow contributions were observed during the wet season; however, the relative amounts of streamflow increased during the dry season at East River in 2014. East River had only 1 month of negative flow in 2014 compared to 6 months in 2011 and 7 months in 2008. Higher dry season flows in East River may be in response to restoration efforts. The sites to the west of Faka Union River had higher salinities on average than Faka Union River and East River. Faka Union River had the highest range in salinities, and Faka Union Boundary had the lowest range in salinities. Pumpkin River was the tributary with the lowest range in salinities. A water year is defined as the 12-month period from October 1, for any given year, through September 30 of the following year.
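As a hedged illustration of the water-year bookkeeping used above (synthetic 15-minute records stand in for the real gauge data; the values and thresholds are placeholders):

```python
import numpy as np
import pandas as pd

# Summarize a synthetic discharge/salinity record by water year (Oct 1 - Sep 30,
# labeled by the calendar year in which it ends): total flow volume and the
# percentage of salinity readings above 35 ppt.

idx = pd.date_range("2006-10-01", "2014-09-30 23:45", freq="15min")
rng = np.random.default_rng(0)
discharge = pd.Series(rng.gamma(2.0, 50.0, len(idx)), index=idx)   # ft^3/s
salinity = pd.Series(rng.uniform(0.0, 40.0, len(idx)), index=idx)  # ppt

water_year = np.where(idx.month >= 10, idx.year + 1, idx.year)

annual_volume = discharge.groupby(water_year).sum() * 900          # ft^3 (900 s steps)
pct_over_35 = salinity.groupby(water_year).apply(lambda s: 100.0 * (s > 35).mean())

summary = pd.DataFrame({"volume_ft3": annual_volume,
                        "pct_salinity_gt_35ppt": pct_over_35})
print(summary.round(1))
```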
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wiley, H. S.
There comes a time in every field of science when things suddenly change. While it might not be immediately apparent that things are different, a tipping point has occurred. Biology is now at such a point. The reason is the introduction of high-throughput genomics-based technologies. I am not talking about the consequences of the sequencing of the human genome (and every other genome within reach). The change is due to new technologies that generate an enormous amount of data about the molecular composition of cells. These include proteomics, transcriptional profiling by sequencing, and the ability to globally measure microRNAs and post-translational modifications of proteins. These mountains of digital data can be mapped to a common frame of reference: the organism’s genome. With the new high-throughput technologies, we can generate tens of thousands of data points from each sample. Data are now measured in terabytes and the time necessary to analyze data can now require years. Obviously, we can’t wait to interpret the data fully before the next experiment. In fact, we might never be able to even look at all of it, much less understand it. This volume of data requires sophisticated computational and statistical methods for its analysis and is forcing biologists to approach data interpretation as a collaborative venture.
NASA Astrophysics Data System (ADS)
Spinoglio, L.; Alonso-Herrero, A.; Armus, L.; Baes, M.; Bernard-Salas, J.; Bianchi, S.; Bocchio, M.; Bolatto, A.; Bradford, C.; Braine, J.; Carrera, F. J.; Ciesla, L.; Clements, D. L.; Dannerbauer, H.; Doi, Y.; Efstathiou, A.; Egami, E.; Fernández-Ontiveros, J. A.; Ferrara, A.; Fischer, J.; Franceschini, A.; Gallerani, S.; Giard, M.; González-Alfonso, E.; Gruppioni, C.; Guillard, P.; Hatziminaoglou, E.; Imanishi, M.; Ishihara, D.; Isobe, N.; Kaneda, H.; Kawada, M.; Kohno, K.; Kwon, J.; Madden, S.; Malkan, M. A.; Marassi, S.; Matsuhara, H.; Matsuura, M.; Miniutti, G.; Nagamine, K.; Nagao, T.; Najarro, F.; Nakagawa, T.; Onaka, T.; Oyabu, S.; Pallottini, A.; Piro, L.; Pozzi, F.; Rodighiero, G.; Roelfsema, P.; Sakon, I.; Santini, P.; Schaerer, D.; Schneider, R.; Scott, D.; Serjeant, S.; Shibai, H.; Smith, J.-D. T.; Sobacchi, E.; Sturm, E.; Suzuki, T.; Vallini, L.; van der Tak, F.; Vignali, C.; Yamada, T.; Wada, T.; Wang, L.
2017-11-01
IR spectroscopy in the range 12-230 μm with the SPace IR telescope for Cosmology and Astrophysics (SPICA) will reveal the physical processes governing the formation and evolution of galaxies and black holes through cosmic time, bridging the gap between the James Webb Space Telescope and the upcoming Extremely Large Telescopes at shorter wavelengths and the Atacama Large Millimeter Array at longer wavelengths. The SPICA, with its 2.5-m telescope actively cooled to below 8 K, will obtain the first spectroscopic determination, in the mid-IR rest-frame, of both the star-formation rate and black hole accretion rate histories of galaxies, reaching lookback times of 12 Gyr, for large statistically significant samples. Densities, temperatures, radiation fields, and gas-phase metallicities will be measured in dust-obscured galaxies and active galactic nuclei, sampling a large range in mass and luminosity, from faint local dwarf galaxies to luminous quasars in the distant Universe. Active galactic nuclei and starburst feedback and feeding mechanisms in distant galaxies will be uncovered through detailed measurements of molecular and atomic line profiles. The SPICA's large-area deep spectrophotometric surveys will provide mid-IR spectra and continuum fluxes for unbiased samples of tens of thousands of galaxies, out to redshifts of z ≈ 6.
77 FR 60605 - National Breast Cancer Awareness Month, 2012
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-04
... National Breast Cancer Awareness Month, 2012 By the President of the United States of America A Proclamation Breast cancer touches the lives of Americans from every background and in every community across...,000 women will be diagnosed with breast cancer this year, and tens of thousands are expected to lose...
Predictive Toxicology and In Vitro to In Vivo Extrapolation (AsiaTox2015)
A significant challenge in toxicology is the “too many chemicals” problem. Humans and environmental species are exposed to as many as tens of thousands of chemicals, few of which have been thoroughly tested using standard in vivo test methods. This talk will discuss several appro...
Humans are potentially exposed to tens of thousands of man-made chemicals in the environment, some of which may mimic natural endocrine hormones and thus have the potential to be endocrine disruptors. Predictive in silico tools can be used to quickly and efficiently evaluate thes...
Spatial Ability: A Neglected Talent in Educational and Occupational Settings
ERIC Educational Resources Information Center
Kell, Harrison J.; Lubinski, David
2013-01-01
For over 60 years, longitudinal research on tens of thousands of high ability and intellectually precocious youth has consistently revealed the importance of spatial ability for hands-on creative accomplishments and the development of expertise in science, technology, engineering, and mathematical (STEM) disciplines. Yet, individual differences in…
ERIC Educational Resources Information Center
Meisenhelder, Susan
2013-01-01
The push for increased use of online teaching in colleges and universities has been gaining momentum for some time, but even in that context the recent enthusiasm for MOOCs (Massive Open Online Courses), free online courses that often enroll tens of thousands of students, is remarkable and rightly dubbed "MOOC Mania." As with so many…
25 CFR 141.13 - Amusement company licenses.
Code of Federal Regulations, 2011 CFR
2011-04-01
... licenses. (a) No person may operate a portable dance pavilion, mechanical amusement device such as a ferris... amount not exceeding ten thousand dollars ($10,000) and a personal injury and property damage liability... to the tribe and for the protection of the public against personal injury and property damage by bond...
25 CFR 141.13 - Amusement company licenses.
Code of Federal Regulations, 2010 CFR
2010-04-01
... licenses. (a) No person may operate a portable dance pavilion, mechanical amusement device such as a ferris... amount not exceeding ten thousand dollars ($10,000) and a personal injury and property damage liability... to the tribe and for the protection of the public against personal injury and property damage by bond...
A Conceptual Framework for U.S. EPA’s National Exposure Research Laboratory
Fulfilling the U.S. EPA mission to protect human health and the environment carries with it the challenge of understanding exposures for tens of thousands of chemical contaminants, a wide range of biological stressors, and many physical stressors. The U.S. EPA’s National Exposur...
The EPA ToxCast Program: Developing Predictive Bioactivity Signatures for Chemicals
There are tens of thousands of chemicals used in the environment for which little or no toxicology information is known. Current testing paradigms that use large numbers of animals to perform in vivo toxicology are too slow and expensive to apply to this large number of chemicals...
The Second Phase of ToxCast and Initial Applications to Chemical Prioritization
Tens of thousands of chemicals and other contaminants exist in our environment, but only a fraction of these have been characterized for their potential hazard to humans. ToxCast is focused on closing this data gap and improving the management of chemical risk through a high thro...
Sudden Oak Death - Western (Pest Alert)
Susan Frankel
2002-01-01
Tens of thousands of tanoak (Lithocarpus densiflorus), coast live oak (Quercus agrifolia), California black oak (Quercus kelloggii), Shreve oak (Quercus parvula var. shrevei), and madrone (Arbutus menziesii) have been killed by a newly identified species, Phytophthora ramorum, which causes Sudden Oak Death. Sudden Oak Death was first reported in 1995 in central coastal...
Visualizing the Solute Vaporization Interference in Flame Atomic Absorption Spectroscopy
ERIC Educational Resources Information Center
Dockery, Christopher R.; Blew, Michael J.; Goode, Scott R.
2008-01-01
Every day, tens of thousands of chemists use analytical atomic spectroscopy in their work, often without knowledge of possible interferences. We present a unique approach to study these interferences by using modern response surface methods to visualize an interference in which aluminum depresses the calcium atomic absorption signal. Calcium…
Development of a Context-Rich Database of ToxCast Assay Annotations (SOT)
Major concerns exist for the large number of environmental chemicals which lack toxicity data. The tens of thousands of commercial substances in need of screening for potential human health effects would cost millions of dollars and take several decades to test in traditional animal-b...
A Method for Identifying Prevalent Chemical Combinations in the US Population
Through the food and water they ingest, the air they breathe, and the consumer products with which they interact at home and at work, humans are exposed to tens of thousands of chemicals, many of which have not been evaluated to determine their potential toxicities. In recent yea...
Triaging Chemical Exposure Data Needs and Tools for Advancing Next-Generation Risk Assessment
The timely assessment of the risks posed to public health by tens of thousands of existing and emerging commercial chemicals is a critical challenge facing the U.S. Environmental Protection Agency and regulatory bodies worldwide. The pace of conducting risk assessments is limited...
Neuroimaging Research: from Null-Hypothesis Falsification to Out-Of-Sample Generalization
ERIC Educational Resources Information Center
Bzdok, Danilo; Varoquaux, Gaël; Thirion, Bertrand
2017-01-01
Brain-imaging technology has boosted the quantification of neurobiological phenomena underlying human mental operations and their disturbances. Since its inception, drawing inference on neurophysiological effects hinged on classical statistical methods, especially, the general linear model. The tens of thousands of variables per brain scan were…
ERIC Educational Resources Information Center
Simon, David
2008-01-01
Energy costs are projected to rise as much as 12 percent in 2008, and a facility's "carbon footprint" has become an issue of increasing importance. So, many schools and universities are taking a hard look at their energy consumption. Education facilities can save tens of thousands of dollars in yearly electric costs, and cut harmful emissions by…
Where's the Beef in Administrator Pay?
ERIC Educational Resources Information Center
Cunningham, William G.; Sperry, J. Brent
2001-01-01
Salary differences between educators and business leaders range from tens of thousands of dollars for principals to millions for superintendents. Employees valuing monetary incentives will not be attracted to or remain in the education field. Wealthy taxpayers get too many breaks. Progressive income taxes should replace skewed property taxes. (MLH)
The challenge of assessing the potential developmental health risks for the tens of thousands of environmental chemicals is beyond the capacity for resource-intensive animal protocols. Large data streams coming from high-throughput (HTS) and high-content (HCS) profiling of biolog...
Agricultural Career Education in the City of New York
ERIC Educational Resources Information Center
Chrein, George
1975-01-01
More than one thousand students in ten high schools throughout the City of New York are presently enrolled in an agricultural career program, specializing in farm production and management, ornamental horticulture, animal care, or conservation. More than 90 percent continue in occupational agriculture in the post-secondary schools. (Author/AJ)
Weight Misperception and Health Risk Behaviors among Early Adolescents
ERIC Educational Resources Information Center
Pasch, Keryn E.; Klein, Elizabeth G.; Laska, Melissa N.; Velazquez, Cayley E.; Moe, Stacey G.; Lytle, Leslie A.
2011-01-01
Objectives: To examine associations between weight misperception and youth health risk and protective factors. Methods: Three thousand ten US seventh-graders (72.1% white, mean age: 12.7 years) self-reported height, weight, risk, and protective factors. Analyses were conducted to determine cross-sectional and longitudinal associations between…
Operation and Maintenance Support Information (OMSI) Creation, Management, and Repurposing With XML
2004-09-01
engines that cost tens of thousands of dollars. There are many middleware applications on the commercial and open-source market. The "Big Four"... planners can begin an incremental planning effort early in the facility construction phase. This thesis provides a non-proprietary, no-cost solution to
Big Results from Small Samples: Evaluation of Amplification Protocols for Gene Expression Profiling
Microarrays have revolutionized many areas of biology due to our technical ability to quantify tens of thousands of transcripts within a single experiment. However, there are still many areas that cannot benefit from this technology due to the amount of biological material needed...
Developing Young Children's Multidigit Number Sense.
ERIC Educational Resources Information Center
Diezmann, Carmel M.; English, Lyn D.
2001-01-01
This article describes a series of enrichment experiences designed to develop young (ages 5 to 8) gifted children's understanding of large numbers, central to their investigation of space travel. It describes activities designed to teach reading of large numbers and exploring numbers to a thousand and then a million. (Contains ten references.) (DB)
ERIC Educational Resources Information Center
Guilbert, Juliette
2006-01-01
This article focusses on defining the Parent Academy. The Parent Academy is a deeply ambitious, privately funded project aimed at improving students' education by improving their parents'. Since Miami-Dade County Public Schools superintendent Rudy Crew launched it last year, TPA has reached tens of thousands of parents through hundreds of free…
A Web-Hosted R Workflow to Simplify and Automate the Analysis of 16S NGS Data
Next-Generation Sequencing (NGS) produces large data sets that include tens-of-thousands of sequence reads per sample. For analysis of bacterial diversity, 16S NGS sequences are typically analyzed in a workflow containing best-of-breed bioinformatics packages that may levera...
Core Principles for Transforming Remediation within a Comprehensive: Student Success Strategy
ERIC Educational Resources Information Center
Achieving the Dream, 2015
2015-01-01
Colleges and postsecondary systems across the nation are demonstrating remarkable progress in phasing out standalone or multi-course remediation sequences, resulting in tens of thousands of students more quickly enrolling in and completing college-level courses. These organizations have collaborated to describe the principles they see in common…
Over the past ten years, the US government has invested in high-throughput (HT) methods to screen chemicals for biological activity. Under the interagency Tox21 consortium and the US Environmental Protection Agency’s (EPA) ToxCast™ program, thousands of chemicals have...
Romano, Michael
2003-03-24
HealthSouth and its chief executive Richard Scrushy, left, find themselves coping with a public relations nightmare after federal officials last week charged the rehabilitation giant with "massive accounting fraud" and a systematic betrayal of tens of thousands of investors.
ERIC Educational Resources Information Center
Morrison, David
1982-01-01
Discusses the effects on astronomy courses/curriculum if equal time were given to the concept that the universe was created in its present form about ten thousand years ago. Includes the full text on a resolution concerning creationism passed by the Board of Directors of the Astronomical Society of the Pacific. (Author/JN)
Predictive Modeling of Apical Toxicity Endpoints Using Data From ToxCast
The US EPA and other regulatory agencies face a daunting challenge of evaluating potential toxicity for tens of thousands of environmental chemicals about which little is currently known. The EPA’s ToxCast program is testing a novel approach to this problem by screening compounds...
Faurie, Julia; Baudet, Mathilde; Assi, Kondo Claude; Auger, Dominique; Gilbert, Guillaume; Tournoux, Francois; Garcia, Damien
2017-02-01
Recent studies have suggested that intracardiac vortex flow imaging could be of clinical interest for early diagnosis of diastolic heart function. Doppler vortography has been introduced as a simple color Doppler method to detect and quantify intraventricular vortices. This method is able to locate a vortex core based on the recognition of an antisymmetric pattern in the Doppler velocity field. Because the heart is a fast-moving organ, high frame rates are needed to decipher the whole blood vortex dynamics during diastole. In this paper, we adapted the vortography method to high-frame-rate echocardiography using circular waves. Time-resolved Doppler vortography was first validated in vitro in an ideal forced vortex. We observed a strong correlation between the core vorticity determined by high-frame-rate vortography and the ground-truth vorticity. Vortography was also tested in vivo in ten healthy volunteers using high-frame-rate duplex ultrasonography. The main vortex that forms during left ventricular filling was tracked during two to three successive cardiac cycles, and its core vorticity was determined at a sampling rate of up to 80 duplex images per heartbeat. Three echocardiographic apical views were evaluated. Vortography-derived vorticities were compared with those returned by the 2-D vector flow mapping approach. Comparison with 4-D flow magnetic resonance imaging was also performed in four of the ten volunteers. Strong intermethod agreements were observed when determining the peak vorticity during early filling. It is concluded that high-frame-rate Doppler vortography can accurately investigate the diastolic vortex dynamics.
A photoelastic modulator-based birefringence imaging microscope for measuring biological specimens
NASA Astrophysics Data System (ADS)
Freudenthal, John; Leadbetter, Andy; Wolf, Jacob; Wang, Baoliang; Segal, Solomon
2014-11-01
The photoelastic modulator (PEM) has been applied to a variety of polarimetric measurements. However, nearly all such applications use point measurements, where each point (spot) on the sample is measured one at a time. The main challenge for employing the PEM in a camera-based imaging instrument is that the PEM modulates too fast for typical cameras. The PEM modulates at tens of kHz. To capture the specific polarization information that is carried on the modulation frequency of the PEM, the camera needs to be at least ten times faster. However, the typical frame rates of common cameras are only in the tens or hundreds of frames per second. In this paper, we report a PEM-camera birefringence imaging microscope. We use the so-called stroboscopic illumination method to overcome the incompatibility of the high frequency of the PEM with the relatively slow frame rate of a camera. We trigger the LED light source using a field-programmable gate array (FPGA) in synchrony with the modulation of the PEM. We show the measurement results of several standard birefringent samples as a part of the instrument calibration. Furthermore, we show results observed in two birefringent biological specimens, a human skin tissue that contains collagen and a slice of mouse brain that contains bundles of myelinated axonal fibers. Novel applications of this PEM-based birefringence imaging microscope to both research communities and industrial applications are being tested.
ERIC Educational Resources Information Center
Santa Ana, Otto; Lopez, Layza; Munguia, Edgar
2010-01-01
This study examines two successive days of U.S. television news coverage of the May 1, 2007, immigration rights rally in Los Angeles. As thousands of demonstrators appealed peacefully for comprehensive immigration policy reform, they were assailed by 450 police officers firing munitions and using truncheons. We evaluated fifty-one television news…
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-18
... is proposing to limit the time in which these parties have to decide to adjust to a price other than...) and 6.25.05. \\5\\ Please note that limiting the time frame to ten minutes would also align the... professionals.\\8\\ \\8\\ Id. Finally, the Exchange believes that the proposal to change the time to ten minutes...
Developing seismogenic source models based on geologic fault data
Haller, Kathleen M.; Basili, Roberto
2011-01-01
Calculating seismic hazard usually requires input that includes seismicity associated with known faults, historical earthquake catalogs, geodesy, and models of ground shaking. This paper will address the input generally derived from geologic studies that augment the short historical catalog to predict ground shaking at time scales of tens, hundreds, or thousands of years (e.g., SSHAC 1997). A seismogenic source model, terminology we adopt here for a fault source model, includes explicit three-dimensional faults deemed capable of generating ground motions of engineering significance within a specified time frame of interest. In tectonically active regions of the world, such as near plate boundaries, multiple seismic cycles span a few hundred to a few thousand years. In contrast, in less active regions hundreds of kilometers from the nearest plate boundary, seismic cycles generally are thousands to tens of thousands of years long. Therefore, one should include sources having both longer recurrence intervals and possibly older times of most recent rupture in less active regions of the world rather than restricting the model to include only Holocene faults (i.e., those with evidence of large-magnitude earthquakes in the past 11,500 years) as is the practice in tectonically active regions with high deformation rates. During the past 15 years, our institutions independently developed databases to characterize seismogenic sources based on geologic data at a national scale. Our goal here is to compare the content of these two publicly available seismogenic source models compiled for the primary purpose of supporting seismic hazard calculations by the Istituto Nazionale di Geofisica e Vulcanologia (INGV) and the U.S. Geological Survey (USGS); hereinafter we refer to the two seismogenic source models as INGV and USGS, respectively. This comparison is timely because new initiatives are emerging to characterize seismogenic sources at the continental scale (e.g., SHARE in the Euro-Mediterranean, http://www.share-eu.org/; EMME in the Middle East, http://www.emme-gem.org/) and global scale (e.g., GEM, http://www.globalquakemodel.org/; Anonymous 2008). To some extent, each of these efforts is still trying to resolve the level of optimal detail required for this type of compilation. The comparison we provide defines a common standard for consideration by the international community for future regional and global seismogenic source models by identifying the necessary parameters that capture the essence of geological fault data in order to characterize seismogenic sources. In addition, we inform potential users of differences in our usage of common geological/seismological terms to avoid inappropriate use of the data in our models and provide guidance to convert the data from one model to the other (for detailed instructions, see the electronic supplement to this article). Applying our recommendations will permit probabilistic seismic hazard assessment codes to run seamlessly using either seismogenic source input. The USGS and INGV database schema compare well at a first-level inspection. Both databases contain a set of fields representing generalized fault three-dimensional geometry and additional fields that capture the essence of past earthquake occurrences. Nevertheless, there are important differences. When we further analyze supposedly comparable fields, many are defined differently. These differences would cause anomalous results in hazard prediction if one assumes the values are similarly defined. 
The data, however, can be made fully compatible using simple transformations.
Joint Service Aircrew Mask (JSAM) Extended Wear Comfort Evaluation
2009-10-01
JSAM eyeglass frames containing his optical prescription. Comfort Questionnaire (CQ). The CQ was completed ten times as scheduled by the USAF...and comfort of the JSAM while wearing eyeglass frames was very acceptable. Their ability to see was reported as acceptable. No fogging of the...Performance Wing acceleration and altitude test subject panels. The subjects were male, ranged in age from 28-39 years, physically fit, and considered
Mycelial actinobacteria in salt-affected soils of arid territories of Ukraine and Russia
NASA Astrophysics Data System (ADS)
Grishko, V. N.; Syshchikova, O. V.; Zenova, G. M.; Kozhevin, P. A.; Dubrova, M. S.; Lubsanova, D. A.; Chernov, I. Yu.
2015-01-01
A high population density (up to hundreds of thousands or millions CFU/g soil) of mycelial bacteria (actinomycetes) is determined in salt-affected soils of arid territories of Ukraine, Russia, and Turkmenistan. Of all the studied soils, the lowest amounts of actinomycetes (thousands and tens of thousands CFU/g soil) are isolated from sor (playa) and soda solonchaks developed on the bottoms of drying salt lakes in Buryatia and in the Amu Darya Delta. Actinomycetes of the Streptomyces, Micromonospora, and Nocardiopsis genera were recorded in the studied soils. It is found that conditions of preincubation greatly affect the activity of substrate consumption by the cultures of actinomycetes. This could be attributed to changes in the metabolism of actinomycetes as a mechanism of their adaptation to the increased osmotic pressure of the medium. The alkali tolerance of halotolerant actinomycetes isolated from the salt-affected soils is experimentally proved.
2012-02-06
This frame from an animation, which depicts the growth of the Kamoamoa Flow Field, Kilauea Volcano, Hawaii, was generated from a sequence of ten multispectral images acquired between September 3 and 17, 1995.
Motion in Jupiter's Atmospheric Vortices (Near-infrared filters)
1998-03-26
Two frame "movie" of a pair of vortices in Jupiter's southern hemisphere. The two frames are separated by ten hours. The right oval is rotating counterclockwise, like other anticyclonic bright vortices in Jupiter's atmosphere. The left vortex is a cyclonic (clockwise) vortex. The differences between them (their brightness, their symmetry, and their behavior) are clues to how Jupiter's atmosphere works. The frames span about fifteen degrees in latitude and longitude and are centered at 141 degrees west longitude and 36 degrees south planetocentric latitude. Both vortices are about 3500 kilometers in diameter in the north-south direction. The images were taken in near infrared light at 756 nanometers and show clouds that are at a pressure level of about 1 bar in Jupiter's atmosphere. North is at the top. The smallest resolved features are tens of kilometers in size. These images were taken on May 7, 1997, at a range of 1.5 million kilometers by the Solid State Imaging system on NASA's Galileo spacecraft. An animation is available at http://photojournal.jpl.nasa.gov/catalog/PIA01230
Increasing awareness about endocrine disrupting chemicals (EDCs) in the environment has driven concern about their potential impact on human health and wildlife. Tens of thousands of natural and synthetic xenobiotics are presently in commerce with little to no toxicity data and t...
HTS Data and In Silico Models for High-Throughout Risk Assessment (FutureTox II)
A significant challenge in toxicology is the “too many chemicals” problem. Humans and environmental species are exposed to as many as tens of thousands of chemicals, few of which have been thoroughly tested using standard in vivo test methods. This talk will discuss several appro...
Tens of thousands of chemicals and other man-made contaminants exist in our environment, but only a fraction of these have been characterized for their potential risk to humans and there is widespread interest in closing this data gap in order to better manage contaminant risk. C...
Enabling Easier Information Access in Online Discussion Forums
ERIC Educational Resources Information Center
Bhatia, Sumit
2013-01-01
Online discussion forums have become popular in recent times. They provide a platform for people from different parts of the world sharing a common interest to come together, discuss topics of mutual interest, and seek solutions to their problems. There are hundreds of thousands of internet forums containing tens of millions of discussion threads and…
Sudden Oak Death - Eastern (Pest Alert)
Joseph O' Brien; Manfred Mielke; Steve Oak; Bruce Moltzan
2002-01-01
A phenomenon known as Sudden Oak Death was first reported in 1995 in central coastal California. Since then, tens of thousands of tanoaks (Lithocarpus densiflorus), coast live oaks (Quercus agrifolia), and California black oaks (Quercus kelloggii) have been killed by a newly identified fungus, Phytophthora ramorum. On these hosts, the fungus causes a bleeding canker on...
Teacher as Trickster on the Learner's Journey
ERIC Educational Resources Information Center
Davis, Kenneth W.; Weeden, Scott R.
2009-01-01
For tens of thousands of years, teachers have used stories to promote learning. Today's teachers can do the same. In particular, we can employ Joseph Campbell's "monomyth"--with its stages of separation, initiation, and return--as a model for structuring learning experiences. Within the monomyth, one tempting role for teachers is the sage, but we…
Creating a Sustainable University and Community through a Common Experience
ERIC Educational Resources Information Center
Lopez, Omar S.
2013-01-01
Purpose: This article aims to provide an overview of Texas State University's Common Experience, an innovative initiative that engaged tens of thousands of people in shared consideration of sustainability as a single topic during academic year 2010-2011. Design/methodology/approach: The discourse begins with an overview of the Common Experience…
Ten Qualities of a Strong Community College Leader
ERIC Educational Resources Information Center
Wheelan, Belle
2012-01-01
There are thousands of articles, books, essays, dissertations, and more devoted to leadership in higher education. All of them highlight the importance of a person "out front" who is charged with moving the organization forward and people who follow to ensure that movement takes place. The author's favorite definition of leadership is not found in…
USDA-ARS?s Scientific Manuscript database
Mosquitoes of various species mate in swarms composed of tens to thousands of flying males. Yet little is known about the mosquito swarming mechanism. Discovering chemical cues involved in mosquito biology leads to better adaptation of disease control interventions. In this study, we aimed ...
The High Price of For-Profit Colleges
ERIC Educational Resources Information Center
Yeoman, Barry
2011-01-01
Critics say that for-profit career colleges--which, according to industry figures, enrolled 3.2 million students in the United States in 2009--have been plagued by deceptive recruiting practices that lure students into programs they could find elsewhere for much less money. Students often borrow tens of thousands of dollars to attend these…
Guest-Host Encounters in Diaspora-Heritage Tourism: The Taglit-Birthright Israel Mifgash (Encounter)
ERIC Educational Resources Information Center
Sasson, Theodore; Mittelberg, David; Hecht, Shahar; Saxe, Leonard
2011-01-01
More than 300,000 diaspora Jewish young adults and tens of thousands of their Israeli peers have participated in structured, cross-cultural encounters--"mifgashim"--in the context of an experiential education program known as Taglit-Birthright Israel. Drawing on field observations, interviews, and surveys, the formal and informal…
Characterization of the potential adverse effects is lacking for tens of thousands of chemicals that are present in the environment, and characterization of developmental neurotoxicity (DNT) hazard lags behind that of other adverse outcomes (e.g. hepatotoxicity). This is due in p...
Modeling belowground biomass of black cohosh, a medicinal forest product.
James Chamberlain; Gabrielle Ness; Christine Small; Simon Bonner; Elizabeth Hiebert
2014-01-01
Tens of thousands of kilograms of rhizomes and roots of Actaea racemosa L., a native Appalachian forest perennial, are harvested every year and used for the treatment of menopausal conditions. Sustainable management of this and other wild-harvested non-timber forest products requires the ability to effectively and reliably inventory marketable plant...
Extension Online: Utilizing Technology to Enhance Educational Outreach
ERIC Educational Resources Information Center
Green, Stephen
2012-01-01
Extension Online is an Internet-based online course platform that enables the Texas AgriLife Extension Service's Family Development and Resource Management (FDRM) unit to reach tens of thousands of users across the U.S. annually with research-based information. This article introduces readers to Extension Online by describing the history of its…
Time and Practice: Learning to Become a Geographer
ERIC Educational Resources Information Center
Downs, Roger M.
2014-01-01
A goal of geography education is fostering geographic literacy for all and building significant expertise for some. How much time and practice do students need to become literate or expert in geography? There is not an answer to this question. Using two concepts from cognitive psychology--the ideas of ten thousand hours and deliberate…
USDA-ARS?s Scientific Manuscript database
Background: Faced with tens of thousands of food choices, consumers frequently turn to promotional advertising, such as Sunday sales circulars, to make purchasing decisions. To date, little research has examined the content of sales circulars over multiple seasons. Methods: Food items from 12 months...
College Savings Plans: A Bad Gamble
ERIC Educational Resources Information Center
Carey, Kevin
2009-01-01
With all the economic pain and consternation--surging unemployment, enormous corporate bankruptcy, trillions becoming the new billions--it's easy to overlook the fact that tens of thousands of families have suddenly lost a great deal of the money they socked away to pay for college. They lost it because public officials told them to risk their…
77 FR 64019 - National School Lunch Week, 2012
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-17
... meals for tens of millions of students every day. These meals are a vital source of fruits, vegetables... Michelle Obama's Let's Move! initiative, we are continuing to bring together stakeholders at every level of... hand this twelfth day of October, in the year of our Lord two thousand twelve, and of the Independence...
The Big Fixes Now Needed for "No Child Left Behind"
ERIC Educational Resources Information Center
Stover, Del
2007-01-01
The underlying principles of No Child Left Behind (NCLB)--the demand for high standards, greater accountability, and the focus on long-overlooked student populations--are good. NCLB has done well for public education. Still, tens of thousands of educators nationwide are hoping that this year's reauthorization debate in Congress will lead to…
Tens of thousands of chemicals are currently in commerce, and hundreds more are introduced every year. Because current chemical testing is resource intensive, only a small fraction of chemicals have been adequately evaluated for potential human health effects. To address this ch...
Google Books Mutilates the Printed Past
ERIC Educational Resources Information Center
Musto, Ronald G.
2009-01-01
In this article, the author discusses a mutilation that he has encountered involving Google Book Search. That massive text-digitization project, working in collaboration with several of the world's most important library collections, has now made available, in both PDF and text view, tens of thousands of 19th-century titles while it awaits the…
There are tens of thousands of closed landfills in the United States, many of which are unlined and sited on alluvial deposits. Landfills are of concern because leachate contains a variety of pollutants that can contaminate ground and surface water. Data from chemical analysis...
High-throughput toxicity testing (HTT) holds the promise of providing data for tens of thousands of chemicals that currently have no data due to the cost and time required for animal testing. Interpretation of these results requires information linking the perturbations seen in vi...
Abstract: There are tens of thousands of man-made chemicals to which humans are exposed, but only a fraction of these have the extensive in vivo toxicity data used in most traditional risk assessments. This lack of data, coupled with concerns about testing costs and animal use, a...
TSA and Standards-Based Learning through TECH-Know
ERIC Educational Resources Information Center
Taylor, Jerianne S.; Peterson, Richard E.; Ernst, Jeremy
2005-01-01
Career and technical student organizations (CTSOs) serve as an integral part of many career and technical education (CTE) programs across the country. Their activities and competitions make up many of the strongest CTE programs due to their co-curricular nature. With memberships ranging from tens of thousands to almost a half million, it is hard…
Betsy DeVos, the (Relatively Mainstream) Reformer
ERIC Educational Resources Information Center
McShane, Michael Q.
2017-01-01
A privatization extremist. A religious zealot. A culture warrior. The new Secretary of Education, Betsy DeVos, was painted as any or all of these things in the fevered weeks between the 2016 presidential election and her confirmation hearing. In the days following that hearing, tens of thousands of people flooded the lines of congressional…
Tens of thousands of stream kilometers around the world are degraded by a legacy of environmental impacts and acid mine drainage (AMD) caused by abandoned underground and surface mines, piles of discarded coal wastes, and tailings. Increased acidity, high concentrations of metals...
Ready, Aim, Perform! Targeted Micro-Training for Performance Intervention
ERIC Educational Resources Information Center
Carpenter, Julia; Forde, Dahlia S.; Stevens, Denise R.; Flango, Vincent; Babcock, Lisa K.
2016-01-01
The Department of Veterans Affairs has an immediate problem at hand. Tens of thousands of employees are working in a high-stress work environment where fast-paced daily production requirements are critical. Employees are faced with a tremendous backlog of veterans' claims. Unfortunately, not only are the claims extremely complex, but there is…
ERIC Educational Resources Information Center
Education Commission of the States, 2015
2015-01-01
Colleges and postsecondary systems across the nation have demonstrated remarkable progress since "Core Principles for Transforming Remediation" was published in 2012. States and institutions are phasing out stand alone or multi-course remediation sequences, resulting in tens of thousands of students more quickly enrolling in and…
Libraries Achieving Greatness: Technology at the Helm
ERIC Educational Resources Information Center
Muir, Scott P.
2009-01-01
Libraries have been around for thousands of years. Many of them are considered great because of their magnificent architecture or because of the size of their collections. This paper offers ten case studies of libraries that have used technology to achieve greatness. Because almost any library can implement technology, a library does not have to…
Violence: innate or acquired? A survey and some opinions.
Bacciagaluppi, Marco
2004-01-01
Freud's psychoanalysis and Lorenz's ethology consider human aggressiveness to be innate. According to recent archaeological excavations and evolutionary studies, human groups in the Upper Paleolithic and Early Neolithic were peaceful and cooperative. This culture was replaced ten thousand years ago by a predatory hierarchical structure, which is here viewed as a cultural variant.
CAMUS: Automatically Mapping Cyber Assets to Mission and Users (PREPRINT)
2009-10-01
which machines regularly use a particular mail server. Armed with these basic data sources – LDAP, NetFlow traffic and user logs – fuselets were created... NetFlow traffic used in the demonstration has over ten thousand unique IP Addresses and is over one gigabyte in size. A number of high performance
ERIC Educational Resources Information Center
Barnett, R. Michael
2013-01-01
After half a century of waiting, the drama was intense. Physicists slept overnight outside the auditorium to get seats for the seminar at the CERN lab in Geneva, Switzerland. Ten thousand miles away on the other side of the planet, at the world's most prestigious international particle physics conference, hundreds of physicists from every corner…
School-Aged Victims of Sexual Abuse: Implications for Educators.
ERIC Educational Resources Information Center
Wishon, Phillip M.
Each year in the United States, thousands of school-aged children become involved in sexual activities arranged by adults for purposes of pleasure and profit. Nationwide, annual profits from the child pornography industry and from female and male child prostitution are in the tens of millions of dollars. Heretofore, the majority of…
There is an urgent need to characterize potential risk to human health and the environment that arises from the manufacture and use of tens of thousands of chemicals. Computational tools and approaches for characterizing and prioritizing exposure are required: to provide input f...
Analyzing Pulse-Code Modulation On A Small Computer
NASA Technical Reports Server (NTRS)
Massey, David E.
1988-01-01
System for analysis of pulse-code modulation (PCM) comprises personal computer, computer program, and peripheral interface adapter on circuit board that plugs into expansion bus of computer. Functions essentially as "snapshot" PCM decommutator, which accepts and stores thousands of frames of PCM data, sifts through them repeatedly to process according to routines specified by operator. Enables faster testing and involves less equipment than older testing systems.
NASA Astrophysics Data System (ADS)
Moyer, R. P.; Khan, N.; Radabaugh, K.; Engelhart, S. E.; Smoak, J. M.; Horton, B.; Rosenheim, B. E.; Kemp, A.; Chappel, A. R.; Schafer, C.; Jacobs, J. A.; Dontis, E. E.; Lynch, J.; Joyse, K.; Walker, J. S.; Halavik, B. T.; Bownik, M.
2017-12-01
Since 2014, our collaborative group has been working in coastal marshes and mangroves across Southwest Florida, including Tampa Bay, Charlotte Harbor, Ten Thousand Islands, Biscayne Bay, and the lower Florida Keys. All existing field sites were located within 50 km of Hurricane Irma's eye path, with a few sites in the Lower Florida Keys and Naples/Ten Thousand Islands region suffering direct eyewall hits. As a result, we have been conducting storm-impact and damage assessments at these locations with the primary goal of understanding how major hurricanes contribute to and/or modify the sedimentary record of mangroves and salt marshes. We have also assessed changes to the vegetative structure of the mangrove forests at each site. Preliminary findings indicate a reduction in mangrove canopy cover from 70-90% pre-storm, to 30-50% post-Irma, and a reduction in tree height of approximately 1.2 m. Sedimentary deposits consisting of fine carbonate mud up to 12 cm thick were imported into the mangroves of the lower Florida Keys, Biscayne Bay, and the Ten Thousand Islands. Import of siliciclastic mud up to 5 cm thick was observed in Charlotte Harbor. In addition to fine mud, all sites had imported tidal wrack consisting of a mixed seagrass and mangrove leaf litter, with some deposits as thick as 6 cm. In areas with newly opened canopy, a microbial layer was coating the surface of the imported wrack layer. Overwash and shoreline erosion were also documented at two sites in the lower Keys and Biscayne Bay, and will be monitored for change and recovery over the next few years. Because active research was being conducted, a wealth of pre-storm data exists, thus these locations are uniquely positioned to quantify hurricane impacts to the sedimentary record and standing biomass across a wide geographic area. Due to changes in intensity along the storm path, direct comparisons of damage metrics can be made to environmental setting, wind speed, storm surge, and distance to eyewall.
Ongoing hydrothermal heat loss from the 1912 ash-flow sheet, Valley of Ten Thousand Smokes, Alaska
Hogeweg, N.; Keith, T.E.C.; Colvard, E.M.; Ingebritsen, S.E.
2005-01-01
The June 1912 eruption of Novarupta filled nearby glacial valleys on the Alaska Peninsula with ash-flow tuff (ignimbrite), and post-eruption observations of thousands of steaming fumaroles led to the name 'Valley of Ten Thousand Smokes' (VTTS). By the late 1980s most fumarolic activity had ceased, but the discovery of thermal springs in mid-valley in 1987 suggested continued cooling of the ash-flow sheet. Data collected at the mid-valley springs between 1987 and 2001 show a statistically significant correlation between maximum observed chloride (Cl) concentration and temperature. These data also show a statistically significant decline in the maximum Cl concentration. The observed variation in stream chemistry across the sheet strongly implies that most solutes, including Cl, originate within the area of the VTTS occupied by the 1912 deposits. Numerous measurements of Cl flux in the Ukak River just below the ash-flow sheet suggest an ongoing heat loss of ≈250 MW. This represents one of the largest hydrothermal heat discharges in North America. Other hydrothermal discharges of comparable magnitude are related to heat obtained from silicic magma bodies at depth, and are quasi-steady on a multidecadal time scale. However, the VTTS hydrothermal flux is not obviously related to a magma body and is clearly declining. Available data provide reasonable boundary and initial conditions for simple transient modeling. Both an analytical, conduction-only model and a numerical model predict large rates of heat loss from the sheet 90 years after deposition.
BEAM web server: a tool for structural RNA motif discovery.
Pietrosanto, Marco; Adinolfi, Marta; Casula, Riccardo; Ausiello, Gabriele; Ferrè, Fabrizio; Helmer-Citterich, Manuela
2018-03-15
RNA structural motif finding is a relevant problem that becomes computationally hard when working on high-throughput data (e.g. eCLIP, PAR-CLIP), often represented by thousands of RNA molecules. Currently, the BEAM server is the only web tool capable of handling tens of thousands of RNAs as input with a motif discovery procedure that is limited only by current secondary structure prediction accuracies. The recently developed method BEAM (BEAr Motifs finder) can analyze tens of thousands of RNA molecules and identify RNA secondary structure motifs associated with a measure of their statistical significance. BEAM is extremely fast thanks to the BEAR encoding, which transforms each RNA secondary structure into a string of characters. BEAM also exploits the evolutionary knowledge contained in a substitution matrix of secondary structure elements, extracted from the RFAM database of families of homologous RNAs. The BEAM web server has been designed to streamline data pre-processing by automatically handling folding and encoding of RNA sequences, giving users a choice of the preferred folding program. The server provides an intuitive and informative results page with the list of secondary structure motifs identified, the logo of each motif, its significance, graphic representation and information about its position in the RNA molecules sharing it. The web server is freely available at http://beam.uniroma2.it/ and is implemented in NodeJS and Python with all major browsers supported. marco.pietrosanto@uniroma2.it. Supplementary data are available at Bioinformatics online.
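The abstract above attributes BEAM's speed to the BEAR encoding, which turns each secondary structure into a character string so that ordinary string algorithms can carry the motif search. The sketch below illustrates that general idea with a deliberately simplified, made-up alphabet and a naive shared k-mer count; it is not the actual BEAR encoding or the BEAM algorithm, just a toy of the encode-then-search pattern.

```python
# Toy illustration of a structure-string encoding (NOT the real BEAR alphabet):
# once each RNA secondary structure is a plain character string, motif
# discovery can reuse fast string techniques such as shared k-mer counting.

from collections import Counter

def encode(dot_bracket: str) -> str:
    """Map dot-bracket structure to a coarse per-position alphabet:
    'S' = paired (stem), 'H' = unpaired inside a stem region,
    'E' = unpaired outside any pair (external)."""
    depth, out = 0, []
    for ch in dot_bracket:
        if ch == "(":
            depth += 1
            out.append("S")
        elif ch == ")":
            out.append("S")
            depth -= 1
        else:  # '.'
            out.append("H" if depth > 0 else "E")
    return "".join(out)

def kmer_counts(encoded_structures, k=4):
    """Count k-mers of structural characters across many molecules;
    heavily shared k-mers are crude candidate structural motifs."""
    counts = Counter()
    for s in encoded_structures:
        for i in range(len(s) - k + 1):
            counts[s[i:i + k]] += 1
    return counts

structures = ["..((((....))))..", "(((..)))....", "..(((....)))"]
encoded = [encode(s) for s in structures]
print(encoded)
print(kmer_counts(encoded, k=4).most_common(3))
```

The real method additionally scores candidate motifs with a substitution matrix of secondary structure elements and reports statistical significance, which this toy omits.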
Straw man trade between multi-junction, gallium arsenide, and silicon solar cells
NASA Technical Reports Server (NTRS)
Gaddy, Edward M.
1995-01-01
Multi-junction (MJ), gallium arsenide (GaAs), and silicon (Si) solar cells have respective test efficiencies of approximately 24%, 18.5%, and 14.8%. Multi-junction and gallium arsenide solar cells weigh more than silicon solar cells and cost approximately five times as much per unit power at the cell level. A straw man trade is performed for the TRMM spacecraft to determine which of these cell types would have offered an overall performance and price advantage to the spacecraft. A straw man trade is also performed for the multi-junction cells under the assumption that they will cost over ten times that of silicon cells at the cell level. The trade shows that the TRMM project, less the cost of the instrument, ground systems, and mission operations, would spend approximately $552 thousand per kilogram to launch and service science in the case of the spacecraft equipped with silicon solar cells. If these cells are changed out for gallium arsenide solar cells, an additional 31 kilograms of science can be launched and serviced at a price of approximately $90 thousand per kilogram. The weight reduction is shown to derive from the smaller area of the array and hence reductions in the weight of the array substrate and supporting structure. If the silicon solar cells are changed out for multi-junction solar cells, an additional 45 kilograms of science above the silicon baseline can be launched and serviced at a price of approximately $58 thousand per kilogram. The trade shows that even if the multi-junction arrays are priced at over ten times that of silicon cells, a price much higher than projected, the additional 45 kilograms of science are launched and serviced at $182 thousand per kilogram. This is still much less than the original $552 thousand per kilogram to launch and service the science. Data and qualitative factors are presented to show that these figures are subject to a great deal of uncertainty. Nonetheless, the benefit of the higher-efficiency solar cells for TRMM is far greater than the uncertainties in the analysis.
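The trade above boils down to incremental arithmetic: extra science mass enabled by the smaller, higher-efficiency array versus the extra price attributed to that mass. A minimal sketch of that arithmetic, using only the $/kg and kilogram figures quoted in the abstract (the derived dollar totals are illustrative back-calculations, not numbers from the original study):

```python
# Back-of-envelope restatement of the solar-cell trade described above.
# The $/kg figures come from the abstract; totals derived from them here
# are illustrative arithmetic, not data from the original trade study.

BASELINE = 552_000   # $/kg to launch and service science with Si arrays
cases = {
    "GaAs":           {"extra_science_kg": 31, "price_per_kg": 90_000},
    "Multi-junction": {"extra_science_kg": 45, "price_per_kg": 58_000},
}

for name, c in cases.items():
    kg, price = c["extra_science_kg"], c["price_per_kg"]
    implied_extra_cost = kg * price       # cost attributed to the extra science
    at_baseline_rate   = kg * BASELINE    # what the same mass costs at the Si rate
    print(f"{name}: {kg} kg extra science for ~${implied_extra_cost/1e6:.1f}M "
          f"(vs ~${at_baseline_rate/1e6:.1f}M at the Si baseline rate)")
```

The comparison makes the abstract's conclusion concrete: even generous cost assumptions for the advanced cells leave the incremental $/kg of added science well below the baseline cost of science on the silicon-equipped spacecraft.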
A national voice network with satellite and small transceivers
NASA Technical Reports Server (NTRS)
Reilly, N. B.; Smith, J. G.
1978-01-01
A geostationary satellite utilizing a large multiple-beam UHF antenna is shown to be potentially capable of providing tens of thousands of voice channels for hundreds of thousands of mobile ground terminals using hand-held or vehicular-mounted transceivers with whip antennas. Inclusion of on-board network switching facilities permits full interconnection between any terminal pair within the continental United States (CONUS). Configuration tradeoff studies at selected frequencies from 150 to 1500 MHz, with antenna diameters ranging from 20 to 200 m, and CONUS-coverage multiple beams down to 0.3 deg beamwidth, establish that monthly system user costs in the range of $90 to $150, including leased and maintained ground equipment, are feasible.
A significant challenge in toxicology is the “too many chemicals” problem. Humans and environmental species are exposed to as many as tens of thousands of chemicals, only a small percentage of which have been tested thoroughly using standard in vivo test methods. This paper revie...
Single-cell genomics for the masses
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tringe, Susannah G.
In this issue of Nature Biotechnology, Lan et al. describe a new tool in the toolkit for studying uncultivated microbial communities, enabling orders of magnitude higher single cell genome throughput than previous methods. This is achieved by a complex droplet microfluidics workflow encompassing steps from physical cell isolation through genome sequencing, producing tens of thousands of low-coverage genomes from individual cells.
Single-cell genomics for the masses
Tringe, Susannah G.
2017-07-12
In this issue of Nature Biotechnology, Lan et al. describe a new tool in the toolkit for studying uncultivated microbial communities, enabling orders of magnitude higher single cell genome throughput than previous methods. This is achieved by a complex droplet microfluidics workflow encompassing steps from physical cell isolation through genome sequencing, producing tens of thousands of low-coverage genomes from individual cells.
ERIC Educational Resources Information Center
Steinberg, Adria; Almeida, Cheryl
2015-01-01
Few Americans know the importance of community-based organizations, or CBOs, in helping tens of thousands of undereducated, underemployed young people find a job or go back to school. But the role of CBOs is growing more critical as the business, education, and philanthropic sectors increasingly recognize the need to enable the nation's millions…
2003-07-25
This is the first Deep Imaging Survey image taken by NASA Galaxy Evolution Explorer. On June 22 and 23, 2003, the spacecraft obtained this near ultraviolet image of the Groth region by adding multiple orbits for a total exposure time of 14,000 seconds. Tens of thousands of objects can be identified in this picture. http://photojournal.jpl.nasa.gov/catalog/PIA04627
ERIC Educational Resources Information Center
Hoang, Hai; Huang, Melrose; Sulcer, Brian; Yesilyurt, Suleyman
2017-01-01
College math is a gateway course that has become a constraining gatekeeper for tens of thousands of students annually. Every year, over 500,000 students fail developmental mathematics, preventing them from achieving their college and career goals. The Carnegie Math Pathways initiative offers students an alternative. It comprises two Pathways…
The National Association of Charter School Authorizers' Index of Essential Practices
ERIC Educational Resources Information Center
National Association of Charter School Authorizers (NJ1), 2011
2011-01-01
Authorizers are as varied as the schools they oversee. Some are responsible for just one charter, while others monitor hundreds of charters serving tens of thousands of students. Some are school districts, while others are independent statewide boards, universities, not-for-profits, or state education agencies. Regardless of their size and type,…
Inferences of Recent and Ancient Human Population History Using Genetic and Non-Genetic Data
ERIC Educational Resources Information Center
Kitchen, Andrew
2008-01-01
I have adopted complementary approaches to inferring human demographic history utilizing human and non-human genetic data as well as cultural data. These complementary approaches form an interdisciplinary perspective that allows one to make inferences of human history at varying timescales, from the events that occurred tens of thousands of years…
Announcing the First Results from Daya Bay: Discovery of a New Kind of
... collaboration observed tens of thousands of interactions of electron antineutrinos, caught by six massive ... was the sizable disappearance, equal to about six percent. Although disappearance has been observed in ... "Even with only the six detectors already operating, we have more target mass than any similar...
Update: Report on Innovations in Developmental Mathematics--Moving Mathematical Graveyards
ERIC Educational Resources Information Center
Merseth, Katherine K.
2011-01-01
Every year tens of thousands of students step foot on community college campuses, many for the first time. These students all have one thing in common: hope. They enter these institutions with lofty goals and a fervent expectation that the educative experience they are about to embark upon will fundamentally improve their lives. Yet, their hopes…
Israel’s Efforts to Defeat Iran’s Nuclear Program: An Integrated Use of National Power
2013-05-03
such as Chernobyl, Fukushima, Three Mile Island or Bhopal;" would likely cause the deaths of tens of thousands of noncombatants; and spread...a Chernobyl or Fukushima type disaster transpire. Most Iranians are not aware of the potential risks to which they and their country are being
Be That Teacher! Breaking the Cycle for Struggling Readers
ERIC Educational Resources Information Center
Risko, Victoria J.; Walker-Dalhouse, Doris
2012-01-01
Tens of thousands of students begin each new school year with the hope that they will finally find "the" teacher who will help them succeed as readers, writers, and learners. This book shows how teachers can provide the type of differentiated instruction that struggling readers need by drawing on students' individual and cultural backgrounds, as…
ERIC Educational Resources Information Center
Ellison, L. Marc
2013-01-01
This study explores the current ability of higher education to effectively educate and support college students diagnosed with Asperger's Disorder. As the prevalence of autism spectrum disorders increased dramatically during the past decade, it is estimated that tens of thousands of individuals diagnosed with Asperger's Disorder are…
Emerging Economies Make Ripe Markets for Recruiting Industry
ERIC Educational Resources Information Center
Overland, Martha Ann
2008-01-01
Tens of thousands of international students every year use local recruiters in their homeland to help them get into colleges abroad. Despite the proliferation of the Internet, with e-mail and applications that can be submitted online, students in the developing world still heavily depend on commissioned agents to help them navigate what is to many…
Publishing landscape ecology research in the 21st Century
Eric J. Gustafson
2011-01-01
With the proliferation of journals and scientific papers, it has become impossible to sustain a familiarity with the corpus of ecological literature, which totals tens of thousands of pages per year. Given the number of papers that a well-read ecologist should read, it takes an inordinate amount of time to extract the critical details necessary to superficially...
Emergency Systems Save Tens of Thousands of Lives
NASA Technical Reports Server (NTRS)
2013-01-01
To improve distress signal communications, NASA pioneered the Search and Rescue Satellite Aided Tracking (SARSAT) system. Since its inception, the international system known as Cospas-Sarsat has resulted in the rescue of more than 30,000 people. Techno-Sciences Inc., of Beltsville, Maryland, has been involved with the ground station component of the system from its earliest days.
Student Learning, Student Achievement: How Do Teachers Measure up?
ERIC Educational Resources Information Center
National Board for Professional Teaching Standards, 2011
2011-01-01
The National Board for Professional Teaching Standards (NBPTS) welcomes the efforts of federal, state, and local policymakers to find new ways to ensure an accomplished teacher for every student in America. The National Board has advanced this mission since its inception in 1987. Today, that mission is carried out by the tens of thousands of…
Thrilling but Pointless: General JO Shelby’s 1863 Cavalry Raid
2013-12-13
painfully acute. The air seems filled with exquisite music; cities and towns rise up on every hand, crowned with spires and radiant with ten thousand...Raid. By the end of festivities, at nearly 2 a.m., Captain Hart recited a prepared poem entitled "Jo Shelby's Raid." The spirit of Shelby's Brigade
How Military Service Affects Student Veteran Success at Community Colleges
ERIC Educational Resources Information Center
O'Rourke, Patrick C., Jr.
2013-01-01
Increasingly more service members are separating from the military as the United States draws down the force and moves towards a post-war era. Tens of thousands of these veterans will leverage their GI Bill tuition and housing benefits in an attempt to access Southern California community colleges and bolster their transition into mainstream…
ERIC Educational Resources Information Center
Cenziper, Debbie; Grotto, Jason
This series of articles examines the condition of public schools and public school construction in Florida's Miami and Dade Counties. To prepare the series, the Miami Herald studied thousands of pages of construction records, correspondence, school district reports, and accounting statements over 15 years. It analyzed state and national…
ERIC Educational Resources Information Center
Carlson, Scott; Lipka, Sara
2009-01-01
In today's tough economy, students and parents alike are looking for ways to save on college tuition. With sticker prices well into the tens of thousands per year at any private liberal-arts institution, the prospect of shaving a year off the typical four-year journey is an added attraction at a number of colleges, like Franklin & Marshall,…
Gravity waves in the thermosphere observed by the AE satellites
NASA Technical Reports Server (NTRS)
Gross, S. H.; Reber, C. A.; Huang, F. T.
1983-01-01
Atmospheric Explorer (AE) satellite data were used to investigate the spectral characteristics of wave-like structure observed in the neutral and ionized components of the thermosphere. Power spectral analysis derived by the maximum entropy method indicates the existence of a broad spectrum of scale sizes for the fluctuations, ranging from tens to thousands of kilometers.
Case Study: Youth Transitions Task Force--A Ten-Year Retrospective, Spring 2015
ERIC Educational Resources Information Center
Poulos, Jennifer; d'Entremont, Chad; Culbertson, Nina
2015-01-01
In 2004, Boston Public Schools reported that more than 8% of its students dropped out of school that year. The city faced a crisis. Thousands of students were failing to earn a high-school diploma, a necessary credential for entrance into postsecondary education and/or the twenty-first century workforce. Factors driving students' decisions to…
ERIC Educational Resources Information Center
Hauser, Daniel C.; Johnston, Alison
2016-01-01
American students graduate from college with tens of thousands of dollars in debt, leading to substantial repayment burdens and potentially inefficient shifts in spending patterns and career choices. A political trend towards austerity coupled with the rising student debt make the effective allocation of federal higher education resources and…
Earth Observations taken by the Expedition 16 Crew
2008-01-01
ISS016-E-023723 (January 2008) --- This nocturnal view of the Glendale/Phoenix/Mesa, Arizona area was photographed by one of the Expedition 16 crewmembers aboard the International Space Station. During the last week, this area has been teeming with tens of thousands of football fans here for a big football game in Glendale on Feb. 3.
Tripping with Stephen Gaskin: An Exploration of a Hippy Adult Educator
ERIC Educational Resources Information Center
Morley, Gabriel Patrick
2012-01-01
For the last 40 years, Stephen Gaskin has been an adult educator on the fringe, working with tens of thousands of adults in the counterculture movement in pursuit of social change regarding marijuana legalization, women's rights, environmental justice issues and beyond. Gaskin has written 11 books about his experiences teaching and learning…
Con Artists Attack Colleges with Fake Help-Desk E-Mail
ERIC Educational Resources Information Center
Young, Jeffrey R.
2008-01-01
An e-mail scam has hit tens of thousands of users at dozens of colleges over the past few weeks, leaving network administrators scrambling to respond before campus computer accounts are taken over by spammers. Students, professors, and staff members at the affected colleges received e-mail messages that purported to come from the colleges' help…
Pentecost, M; Ross, F C; Macnab, A
2018-02-01
Pregnant women, children under 2 and the first thousand days of life have been principal targets for Developmental Origins of Health and Disease interventions. This paradigm has been criticized for laying responsibility for health outcomes on pregnant women and mothers and, through the thousand-days focus, inadvertently deflecting attention from other windows for intervention. Drawing on insights from the South African context, this commentary argues for integrated and inclusive interventions that encompass broader social framings. First, future interventions should include a wider range of actors. Second, broader action frameworks should encompass life-course approaches that identify multiple windows of opportunity for intervention. Using two examples - the inclusion of men, and engagement with adolescents - this commentary offers strategies for producing more inclusive interventions by using a broader social framework.
NASA Astrophysics Data System (ADS)
Junaidi, Junaidi; Yandra, Alexsander; Hamuddin, Budianto
2018-05-01
Indonesia is a maritime country with the largest number of islands in the world, covering more than seventeen thousand islands. There are thousands of tribes and ethnic groups whose cultures are supposed to enrich the diversity of Indonesia. However, a series of riots in Indonesia, including Aksi Bela Islam (ABI), has recently challenged the country's unity in diversity. The present study tries to describe how the Aksi Bela Islam (the Peaceful Action to Defend Islam) rally, which ran peacefully, affected Indonesia's economic sector, since a stable social and political condition has a significant impact on the economy. The rally was part of a long journey by Indonesia's Muslim majority to seek justice, as the Republic of Indonesia State Police appeared not to be serious about handling the Islamic blasphemy case. Through a framing approach, the study describes how ABI affected the economic sector, focusing on media perspectives, in particular that of one leading economic magazine in Indonesia, Bisnis Indonesia. The framing from the media showed that the rally, followed by millions of Indonesian Muslims, did not bring negative impacts on Indonesia's economic sector: because it ran peacefully, was well managed, and was safe, the market gave a positive response and appreciation to the action.
Pulse Code Modulation (PCM) data storage and analysis using a microcomputer
NASA Technical Reports Server (NTRS)
Massey, D. E.
1986-01-01
A PCM storage device/data analyzer is described. This instrument is a peripheral plug-in board especially built to enable a personal computer to store and analyze data from a PCM source. This board and custom-written software turn a computer into a snapshot PCM decommutator. The instrument will take in and store many hundreds or thousands of PCM telemetry data frames, then sift through them repeatedly. The data can be converted to any number base and displayed, examined for any bit dropouts or changes in particular words or frames, graphically plotted, or statistically analyzed. This device was designed and built for use on the NASA Sounding Rocket Program for PCM encoder configuration and testing.
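The word-by-word inspection described above (converting words to other number bases and checking particular words across stored frames for bit dropouts) is easy to sketch in software. The following Python fragment is only an illustration of the idea, not the instrument's custom software; the frame layout, word width, and sample values are invented for the example.

# Scan stored PCM frames for changes in a selected word position and show the
# differing values in binary. Frame format and values are illustrative only.
def scan_word(frames, word_index):
    changes, previous = [], None
    for frame_number, frame in enumerate(frames):
        value = frame[word_index]
        if previous is not None and value != previous:
            changes.append((frame_number, previous, value))
        previous = value
    return changes

def to_base(value, base):
    digits, out = "0123456789ABCDEF", ""
    while True:
        value, r = divmod(value, base)
        out = digits[r] + out
        if value == 0:
            return out

frames = [[0xAA, 0x55, 0xFF, 0x10],   # three stored frames; word 2 of frame 1
          [0xAA, 0x55, 0xFE, 0x10],   # has a single-bit dropout
          [0xAA, 0x55, 0xFF, 0x10]]
for n, old, new in scan_word(frames, 2):
    print(f"frame {n}: word 2 changed {to_base(old, 2)} -> {to_base(new, 2)}")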
WE-B-210-02: The Advent of Ultrafast Imaging in Biomedical Ultrasound
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tanter, M.
In the last fifteen years, the introduction of plane or diverging wave transmissions rather than line by line scanning focused beams has broken the conventional barriers of ultrasound imaging. By using such large field of view transmissions, the frame rate reaches the theoretical limit of physics dictated by the ultrasound speed and an ultrasonic map can be provided typically in tens of micro-seconds (several thousands of frames per second). Interestingly, this leap in frame rate is not only a technological breakthrough but it permits the advent of completely new ultrasound imaging modes, including shear wave elastography, electromechanical wave imaging, ultrafast Doppler, ultrafast contrast imaging, and even functional ultrasound imaging of brain activity (fUltrasound) introducing Ultrasound as an emerging full-fledged neuroimaging modality. At ultrafast frame rates, it becomes possible to track in real time the transient vibrations – known as shear waves – propagating through organs. Such "human body seismology" provides quantitative maps of local tissue stiffness whose added value for diagnosis has been recently demonstrated in many fields of radiology (breast, prostate and liver cancer, cardiovascular imaging, …). Today, Supersonic Imagine company is commercializing the first clinical ultrafast ultrasound scanner, Aixplorer with real time Shear Wave Elastography. This is the first example of an ultrafast Ultrasound approach surpassing the research phase and now widely spread in the clinical medical ultrasound community with an installed base of more than 1000 Aixplorer systems in 54 countries worldwide. For blood flow imaging, ultrafast Doppler permits high-precision characterization of complex vascular and cardiac flows. It also gives ultrasound the ability to detect very subtle blood flow in very small vessels. In the brain, such ultrasensitive Doppler paves the way for fUltrasound (functional ultrasound imaging) of brain activity with unprecedented spatial and temporal resolution compared to fMRI. Combined with contrast agents, our group demonstrated that Ultrafast Ultrasound Localization could provide a first in vivo and non invasive imaging modality at microscopic scales deep into organs. Many of these ultrafast modes should lead to major improvements in ultrasound screening, diagnosis, and therapeutic monitoring. Learning Objectives: Achieve familiarity with recent advances in ultrafast ultrasound imaging technology. Develop an understanding of potential applications of ultrafast ultrasound imaging for diagnosis and therapeutic monitoring. Dr. Tanter is a co-founder of Supersonic Imagine, a French company positioned in the field of medical ultrasound imaging and therapy.
Simple Sequence Repeats in Escherichia coli: Abundance, Distribution, Composition, and Polymorphism
Gur-Arie, Riva; Cohen, Cyril J.; Eitan, Yuval; Shelef, Leora; Hallerman, Eric M.; Kashi, Yechezkel
2000-01-01
Computer-based genome-wide screening of the DNA sequence of Escherichia coli strain K12 revealed tens of thousands of tandem simple sequence repeat (SSR) tracts, with motifs ranging from 1 to 6 nucleotides. SSRs were well distributed throughout the genome. Mononucleotide SSRs were over-represented in noncoding regions and under-represented in open reading frames (ORFs). Nucleotide composition of mono- and dinucleotide SSRs, both in ORFs and in noncoding regions, differed from that of the genomic region in which they occurred, with 93% of all mononucleotide SSRs proving to be of A or T. Computer-based analysis of the fine position of every SSR locus in the noncoding portion of the genome relative to downstream ORFs showed SSRs located in areas that could affect gene regulation. DNA sequences at 14 arbitrarily chosen SSR tracts were compared among E. coli strains. Polymorphisms of SSR copy number were observed at four of seven mononucleotide SSR tracts screened, with all polymorphisms occurring in noncoding regions. SSR polymorphism could prove important as a genome-wide source of variation, both for practical applications (including rapid detection, strain identification, and detection of loci affecting key phenotypes) and for evolutionary adaptation of microbes.[The sequence data described in this paper have been submitted to the GenBank data library under accession numbers AF209020–209030 and AF209508–209518.] PMID:10645951
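The genome-wide screen for SSR tracts can be illustrated with a few lines of code. The sketch below is not the authors' pipeline; it simply scans a DNA string for mononucleotide runs above an arbitrary minimum length (6 here), which is the flavour of analysis the abstract describes.

# Locate mononucleotide SSR tracts (runs of a single base) in a DNA string.
# The minimum tract length and the toy sequence are illustrative choices only.
import re

def mononucleotide_ssrs(sequence, min_length=6):
    tracts = []
    for match in re.finditer(r"(A+|C+|G+|T+)", sequence.upper()):
        run = match.group()
        if len(run) >= min_length:
            tracts.append((match.start(), run[0], len(run)))
    return tracts

print(mononucleotide_ssrs("ACGTAAAAAAATTGCCCCCCCCGA"))
# -> [(4, 'A', 7), (14, 'C', 8)]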
NASA Astrophysics Data System (ADS)
Yamamoto, H.; Nakajima, K.; Zhang, K.; Nanai, S.
2015-12-01
Powerful numerical codes that are capable of modeling complex coupled processes of physics and chemistry have been developed for predicting the fate of CO2 in reservoirs as well as its potential impacts on groundwater and subsurface environments. However, they are often computationally demanding for solving highly non-linear models in sufficient spatial and temporal resolutions. Geological heterogeneity and uncertainties further increase the challenges in modeling work. Two-phase flow simulations in heterogeneous media usually require much longer computational time than those in homogeneous media. Uncertainties in reservoir properties may necessitate stochastic simulations with multiple realizations. Recently, massively parallel supercomputers with thousands of processors or more have become available to scientific and engineering communities. Such supercomputers may attract attention from geoscientists and reservoir engineers for solving large, non-linear models at higher resolution within a reasonable time. However, to make them a useful tool, it is essential to tackle several practical obstacles so that general-purpose reservoir simulators can utilize large numbers of processors effectively. We have implemented massively-parallel versions of two TOUGH2 family codes (a multi-phase flow simulator TOUGH2 and a chemically reactive transport simulator TOUGHREACT) on two different types (vector- and scalar-type) of supercomputers with a thousand to tens of thousands of processors. After completing implementation and extensive tune-up on the supercomputers, the computational performance was measured for three simulations with multi-million grid models, including a simulation of the dissolution-diffusion-convection process that requires high spatial and temporal resolutions to simulate the growth of small convective fingers of CO2-dissolved water to larger ones at reservoir scale. The performance measurements confirmed that both simulators exhibit excellent scalability, showing almost linear speedup with the number of processors up to more than ten thousand cores. Generally, this allows us to perform coupled multi-physics (THC) simulations on high-resolution geologic models with multi-million-cell grids in a practical time (e.g., less than a second per time step).
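The reported near-linear speedup can be expressed with the usual speedup and parallel-efficiency ratios; the timings in the sketch below are invented purely to show the arithmetic and are not measurements from the TOUGH2/TOUGHREACT runs.

# Speedup and parallel efficiency relative to a baseline core count.
# The wall-clock timings are invented for illustration only.
timings = {1024: 620.0, 4096: 160.0, 16384: 43.0}   # cores -> seconds per time step
base_cores = min(timings)
base_time = timings[base_cores]
for cores in sorted(timings):
    speedup = base_time / timings[cores]
    ideal = cores / base_cores
    print(f"{cores:6d} cores: speedup {speedup:5.1f}x of ideal {ideal:4.0f}x "
          f"(efficiency {speedup / ideal:.0%})")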
Ten Years of Speckle Interferometry at SOAR
NASA Astrophysics Data System (ADS)
Tokovinin, Andrei
2018-03-01
Since 2007, close binary and multiple stars have been observed by speckle interferometry at the 4.1 m Southern Astrophysical Research (SOAR) telescope. The HRCam instrument, observing strategy and planning, data processing, and calibration methods, developed and improved over ten years, are presented here in a concise way. Thousands of binary stars were measured with diffraction-limited resolution (29 mas at 540 nm wavelength) and a high accuracy reaching 1 mas; 200 new pairs or subsystems were discovered. To date, HRCam has performed over 11,000 observations with a high efficiency (up to 300 stars per night). An overview of the main results delivered by this instrument is given.
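The quoted 29 mas resolution is close to the λ/D diffraction scale of a 4.1 m aperture at 540 nm; the short check below uses λ/D, and the exact figure depends on the resolution criterion adopted.

# Diffraction-limited angular scale lambda/D for a 4.1 m aperture at 540 nm.
import math
wavelength = 540e-9                                     # m
aperture = 4.1                                          # m
mas = wavelength / aperture * 180 / math.pi * 3600e3    # radians -> milliarcseconds
print(f"lambda/D = {mas:.1f} mas")                      # ~27 mas (1.22*lambda/D ~ 33 mas)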
Pettengill, James B; Pightling, Arthur W; Baugher, Joseph D; Rand, Hugh; Strain, Errol
2016-01-01
The adoption of whole-genome sequencing within the public health realm for molecular characterization of bacterial pathogens has been followed by an increased emphasis on real-time detection of emerging outbreaks (e.g., food-borne Salmonellosis). In turn, large databases of whole-genome sequence data are being populated. These databases currently contain tens of thousands of samples and are expected to grow to hundreds of thousands within a few years. For these databases to be of optimal use one must be able to quickly interrogate them to accurately determine the genetic distances among a set of samples. Being able to do so is challenging due to both biological (evolutionarily diverse samples) and computational (petabytes of sequence data) issues. We evaluated seven measures of genetic distance, which were estimated from either k-mer profiles (Jaccard, Euclidean, Manhattan, Mash Jaccard, and Mash distances) or nucleotide sites (NUCmer and an extended multi-locus sequence typing (MLST) scheme). When analyzing empirical data (whole-genome sequence data from 18,997 Salmonella isolates) there are features (e.g., genomic, assembly, and contamination) that cause distances inferred from k-mer profiles, which treat absent data as informative, to fail to accurately capture the distance between samples when compared to distances inferred from differences in nucleotide sites. Thus, site-based distances, like NUCmer and extended MLST, are superior in performance, but accessing the computing resources necessary to perform them may be challenging when analyzing large databases.
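Of the k-mer-profile measures mentioned, the Jaccard distance is the simplest: one minus the fraction of distinct k-mers shared by two genomes. The sketch below is a toy illustration of that definition (short strings, small k), not the implementation evaluated in the study.

# Jaccard distance between the k-mer sets of two sequences. The toy sequences and
# small k are illustrative; real genome comparisons use much larger k.
def kmer_set(sequence, k):
    return {sequence[i:i + k] for i in range(len(sequence) - k + 1)}

def jaccard_distance(seq_a, seq_b, k):
    a, b = kmer_set(seq_a, k), kmer_set(seq_b, k)
    return 1.0 - len(a & b) / len(a | b)

print(jaccard_distance("ACGTACGTAC", "ACGTACGAAC", k=4))   # ~0.43 for these toy strings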
Pettengill, James B.; Pightling, Arthur W.; Baugher, Joseph D.; ...
2016-11-10
The adoption of whole-genome sequencing within the public health realm for molecular characterization of bacterial pathogens has been followed by an increased emphasis on real-time detection of emerging outbreaks (e.g., food-borne Salmonellosis). In turn, large databases of whole-genome sequence data are being populated. These databases currently contain tens of thousands of samples and are expected to grow to hundreds of thousands within a few years. For these databases to be of optimal use one must be able to quickly interrogate them to accurately determine the genetic distances among a set of samples. Being able to do so is challenging due to both biological (evolutionarily diverse samples) and computational (petabytes of sequence data) issues. We evaluated seven measures of genetic distance, which were estimated from either k-mer profiles (Jaccard, Euclidean, Manhattan, Mash Jaccard, and Mash distances) or nucleotide sites (NUCmer and an extended multi-locus sequence typing (MLST) scheme). Finally, when analyzing empirical data (whole-genome sequence data from 18,997 Salmonella isolates) there are features (e.g., genomic, assembly, and contamination) that cause distances inferred from k-mer profiles, which treat absent data as informative, to fail to accurately capture the distance between samples when compared to distances inferred from differences in nucleotide sites. Thus, site-based distances, like NUCmer and extended MLST, are superior in performance, but accessing the computing resources necessary to perform them may be challenging when analyzing large databases.
Video-based measurements for wireless capsule endoscope tracking
NASA Astrophysics Data System (ADS)
Spyrou, Evaggelos; Iakovidis, Dimitris K.
2014-01-01
The wireless capsule endoscope is a swallowable medical device equipped with a miniature camera enabling the visual examination of the gastrointestinal (GI) tract. It wirelessly transmits thousands of images to an external video recording system, while its location and orientation are being tracked approximately by external sensor arrays. In this paper we investigate a video-based approach to tracking the capsule endoscope without requiring any external equipment. The proposed method involves extraction of speeded up robust features from video frames, registration of consecutive frames based on the random sample consensus algorithm, and estimation of the displacement and rotation of interest points within these frames. The results obtained by the application of this method on wireless capsule endoscopy videos indicate its effectiveness and improved performance over the state of the art. The findings of this research pave the way for a cost-effective localization and travel distance measurement of capsule endoscopes in the GI tract, which could contribute in the planning of more accurate surgical interventions.
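The processing chain described (feature extraction from video frames, consensus-based registration of consecutive frames, then displacement and rotation estimates) can be sketched with standard computer-vision tools. The snippet below substitutes ORB features and OpenCV's RANSAC-based similarity estimation for the speeded up robust features used in the paper, so it illustrates the idea rather than reproducing the method; the file names are placeholders.

# Register two consecutive frames and estimate in-plane rotation and displacement.
# ORB + RANSAC similarity estimation stand in for the paper's SURF/RANSAC pipeline.
import cv2
import numpy as np
import math

def frame_motion(frame_a, frame_b):
    gray_a = cv2.cvtColor(frame_a, cv2.COLOR_BGR2GRAY)
    gray_b = cv2.cvtColor(frame_b, cv2.COLOR_BGR2GRAY)
    orb = cv2.ORB_create(nfeatures=1000)
    kp_a, des_a = orb.detectAndCompute(gray_a, None)
    kp_b, des_b = orb.detectAndCompute(gray_b, None)
    matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(des_a, des_b)
    pts_a = np.float32([kp_a[m.queryIdx].pt for m in matches])
    pts_b = np.float32([kp_b[m.trainIdx].pt for m in matches])
    # Rotation + translation + scale with RANSAC rejection of mismatched points.
    M, inliers = cv2.estimateAffinePartial2D(pts_a, pts_b, method=cv2.RANSAC)
    rotation_deg = math.degrees(math.atan2(M[1, 0], M[0, 0]))
    displacement = (float(M[0, 2]), float(M[1, 2]))
    return rotation_deg, displacement

# a, b = cv2.imread("frame_0001.png"), cv2.imread("frame_0002.png")   # placeholder paths
# print(frame_motion(a, b))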
Southern Pine Beetle Ecology: Populations within Stands
Matthew P. Ayres; Sharon J. Martinson; Nicholas A. Friedenberg
2011-01-01
Populations of southern pine beetle (SPB) are typically substructured into local aggregations, each with tens of thousands of individual beetles. These aggregations, known as "spots" because of their appearance during aerial surveys, are the basic unit for the monitoring and management of SPB populations in forested regions. They typically have a maximum lifespan of 1...
ERIC Educational Resources Information Center
Carnegie Foundation for the Advancement of Teaching, 2017
2017-01-01
The Carnegie Foundation launched its Math Pathways initiative nearly six years ago at 29 colleges across the country with the aim of improving success rates in developmental math. Tens of thousands of students a year, who need additional preparation for college-level math, are shut out of earning degrees and fulfilling careers due to the huge…
ERIC Educational Resources Information Center
Ollerenshaw, Alison; Aidman, Eugene; Kidd, Garry
1997-01-01
This study examined comprehension in four groups of undergraduates under text only, multimedia, and two diagram conditions of text supplementation. Results indicated that effects of text supplementation are mediated by prior knowledge and learning style: multimedia appears more beneficial to surface learners with little prior knowledge and makes…
Variable Selection Strategies for Small-area Estimation Using FIA Plots and Remotely Sensed Data
Andrew Lister; Rachel Riemann; James Westfall; Mike Hoppus
2005-01-01
The USDA Forest Service's Forest Inventory and Analysis (FIA) unit maintains a network of tens of thousands of georeferenced forest inventory plots distributed across the United States. Data collected on these plots include direct measurements of tree diameter and height and other variables. We present a technique by which FIA plot data and coregistered...
James L. Chamberlain; Gabrielle Ness; Christine J. Small; Simon J. Bonner; Elizabeth B. Hiebert
2013-01-01
Non-timber forest products, particularly herbaceous understory plants, support a multi-billion dollar industry and are extracted from forests worldwide for their therapeutic value. Tens of thousands of kilograms of rhizomes and roots of Actaea racemosa L., a native Appalachian forest perennial, are harvested every year and used for the treatment of...
The Choctaw Nation: Changing the Appearance of American Higher Education, 1830-1907
ERIC Educational Resources Information Center
Crum, Steven
2007-01-01
In September 1830 the U.S. government negotiated the Treaty of Dancing Rabbit Creek with some leaders of the Choctaw Nation. The treaty reinforced the congressional Indian Removal Act of 1830, which paved the way for the large-scale physical removal of tens of thousands of tribal people of the southeast, including many of the Choctaw. It provided…
Leaving No Worker Behind: Community Colleges Retrain the Michigan Workforce--and Themselves
ERIC Educational Resources Information Center
Hilliard, Tom
2011-01-01
In 2007, Michigan undertook a bold mission: to retrain tens of thousands of adults to qualify for jobs in emerging and expanding sectors of the economy. The state's proposal to jobless, dislocated, and low-income residents was simple but appealing: enroll in up to two years of postsecondary education, and Michigan would cover up to $5,000 in…
Which Learning Style is Most Effective in Learning Chinese as a Second Language
ERIC Educational Resources Information Center
Ren, Guanxin
2013-01-01
Chinese is not only a tonal but also a visual language represented by tens of thousands of characters which are pictographic in nature. This presents a great challenge to learners whose mother tongue is alphabetical-based such as English. To assist English-speaking background learners to learn Chinese as a Second Language (CSL) well, a good…
Sen. Udall, Mark [D-CO
2012-07-16
Senate - 07/16/2012 Submitted in the Senate, considered, and agreed to without amendment and with a preamble by Unanimous Consent.
ERIC Educational Resources Information Center
O'Gorman, Lyndal
2017-01-01
Through the multiple languages of the arts, many ideas about sustainability can be explored with young children. This paper discusses the ethical issues involved in the implementation of a research study that uses artist Chris Jordan's confronting images about sustainability. Jordan's images typically depict tens of thousands of objects such as…
3 CFR 8543 - Proclamation 8543 of July 26, 2010. National Korean War Veterans Armistice Day, 2010
Code of Federal Regulations, 2011 CFR
2011-01-01
... of the United States of America A Proclamation Today we celebrate the signing of the Military... respect, and this partnership is vital to peace and stability in Asia and the world. Since our Nation’s... rallied to the young republic’s defense. Tens of thousands of our Nation’s servicemembers lost their lives...
Ten Things You Should Know about Today's Student Veteran
ERIC Educational Resources Information Center
Lighthall, Alison
2012-01-01
With America's military out of Iraq, and funding for global military operations on the decline, thousands of newly discharged men and women are trying to figure out "What's next?" Most of the Soldiers, Marines, Airmen, and Sailors joined the military before their 21st birthday, and it is often the only job they have ever held. While it is true…
Opening Doors to Nursing Degrees: Time for Action. A Proposal from Ontario's Colleges
ERIC Educational Resources Information Center
Colleges Ontario, 2015
2015-01-01
This report argues that Ontario must expand the educational options for people who want to become registered nurses (RNs). It argues that the change Ontario requires is to authorize colleges to offer their own high-quality nursing degrees. Until 2005, about 70 per cent of Ontario's RNs were educated at colleges. Today, tens of thousands of RNs who…
Microlearning as Innovative Pedagogy for Mobile Learning in MOOCs
ERIC Educational Resources Information Center
Kamilali, Despina; Sofianopoulou, Chryssa
2015-01-01
MOOCs are open online courses offered by major universities, free to everyone, anywhere in the world. Hundreds or even tens of thousands of learners enroll in MOOCs, but the completion rate is extremely low, sometimes less than 10%. There is a need to explore new and more engaging forms of pedagogy to improve retention. Focusing on this need, this paper,…
ERIC Educational Resources Information Center
Reder, Stephen
2012-01-01
Professor Stephen Reder presented the Longitudinal Study of Adult Learning (LSAL) at The Centre's 2011 Fall Institute--IALS: Its Meaning and Impact for Policy and Practice--whose findings had implications far beyond assessment. Based on evidence from the ten-year study of more than a thousand adult high school drop-outs, Dr. Reder challenges many…
Generation of Global Geodetic Networks for GGOS
NASA Astrophysics Data System (ADS)
MacMillan, Daniel; Pavlis, Erricos C.; Kuzmicz-Cieslak, Magda; Koenig, Daniel
2016-12-01
We simulated future networks of VLBI+SLR sites to assess their performance. The objective is to build a global network of geographically well distributed, co-located next-generation sites from each of the space geodetic techniques. The network is being designed to meet the GGOS terrestrial reference frame goals of 1 mm in accuracy and 0.1 mm/yr in stability. We simulated the next generation networks that should be available in five years and in ten years to assess the likelihood that these networks will meet the reference frame goals. Simulations were based on the expectation that 17 broadband VLBI stations will be available in five years and 27 stations in ten years. We also consider the improvement resulting from expanding the network by six additional VLBI sites to improve the global distribution of the network. In the simulations, the networks will operate continuously, but we account for station downtime for maintenance or because of bad weather. We ran SLR+VLBI combination TRF solutions, where site ties were used to connect the two networks in the same way as in combination solutions with observed data. The strengths of VLBI and SLR allow them to provide the necessary reference frame accuracy in scale, geocenter, and orientation. With the +10-year extended network operating for ten years, simulations indicate that scale, origin, and orientation accuracies will be at the level of 0.02 ppb, 0.2 mm, and 6 μas. Combining the +5-year and +10-year network realizations will provide better estimates of accuracy and estimates of stability.
Updating of visual orientation in a gravity-based reference frame.
Niehof, Nynke; Tramper, Julian J; Doeller, Christian F; Medendorp, W Pieter
2017-10-01
The brain can use multiple reference frames to code line orientation, including head-, object-, and gravity-centered references. If these frames change orientation, their representations must be updated to keep register with actual line orientation. We tested this internal updating during head rotation in roll, exploiting the rod-and-frame effect: The illusory tilt of a vertical line surrounded by a tilted visual frame. If line orientation is stored relative to gravity, these distortions should also affect the updating process. Alternatively, if coding is head- or frame-centered, updating errors should be related to the changes in their orientation. Ten subjects were instructed to memorize the orientation of a briefly flashed line, surrounded by a tilted visual frame, then rotate their head, and subsequently judge the orientation of a second line relative to the memorized first while the frame was upright. Results showed that updating errors were mostly related to the amount of subjective distortion of gravity at both the initial and final head orientation, rather than to the amount of intervening head rotation. In some subjects, a smaller part of the updating error was also related to the change of visual frame orientation. We conclude that the brain relies primarily on a gravity-based reference to remember line orientation during head roll.
Evaluation of complete streets policy implementation by metropolitan planning organizations.
DOT National Transportation Integrated Search
2015-09-01
Over the last ten years, communities around the country have begun to implement comprehensive reforms designed to ensure that roadway users of all ages and abilities can safely utilize the transportation system. This complete streets policy frame...
Reduction of capsule endoscopy reading times by unsupervised image mining.
Iakovidis, D K; Tsevas, S; Polydorou, A
2010-09-01
The screening of the small intestine has become painless and easy with wireless capsule endoscopy (WCE), a revolutionary, relatively non-invasive imaging technique performed by a swallowable endoscopic capsule that wirelessly transmits thousands of video frames per examination. The average time required for the visual inspection of a full 8-h WCE video ranges from 45 to 120 min, depending on the experience of the examiner. In this paper, we propose a novel approach to WCE reading time reduction by unsupervised mining of video frames. The proposed methodology is based on a data reduction algorithm which is applied according to a novel scheme for the extraction of representative video frames from a full-length WCE video. It can be used either as a video summarization or as a video bookmarking tool, providing the comparative advantage of being general, unbounded by the finiteness of a training set. The number of frames extracted is controlled by a parameter that can be tuned automatically. Comprehensive experiments on real WCE videos indicate that a significant reduction in the reading times is feasible. In the case of the WCE videos used, this reduction reached 85% without any loss of abnormalities.
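The underlying data-reduction idea (extracting a controllable number of representative frames from a full-length video) can be illustrated generically by clustering per-frame colour histograms and keeping the frame nearest each cluster centre. This is only a sketch of the concept, not the specific algorithm proposed in the paper, and the parameter values are arbitrary.

# Pick n representative frames by k-means clustering of per-frame colour histograms.
# A generic video-summarization sketch, not the paper's algorithm.
import cv2
import numpy as np

def representative_frames(video_path, n_frames=50):
    capture = cv2.VideoCapture(video_path)
    features = []
    while True:
        ok, frame = capture.read()
        if not ok:
            break
        hist = cv2.calcHist([frame], [0, 1, 2], None, [8, 8, 8],
                            [0, 256, 0, 256, 0, 256]).flatten()
        features.append(hist / (hist.sum() + 1e-9))
    capture.release()
    data = np.float32(features)
    criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 50, 1e-4)
    _, labels, centres = cv2.kmeans(data, n_frames, None, criteria, 5,
                                    cv2.KMEANS_PP_CENTERS)
    chosen = []
    for c in range(n_frames):
        members = np.where(labels.ravel() == c)[0]
        if len(members):
            nearest = members[np.argmin(np.linalg.norm(data[members] - centres[c], axis=1))]
            chosen.append(int(nearest))
    return sorted(chosen)   # indices of the representative frames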
European Security and Defense Policy (ESDP) After Ten Years - Current Situation and Perspectives
2010-01-01
ABSTRACT: After ten years the ESDP has reached an important... security of the Union, including the eventual framing of a common defense policy…. First and foremost the Treaty required member nations to build
Pulse Code Modulation (PCM) data storage and analysis using a microcomputer
NASA Technical Reports Server (NTRS)
Massey, D. E.
1986-01-01
The current widespread use of microcomputers has led to the creation of some very low-cost instrumentation. A Pulse Code Modulation (PCM) storage device/data analyzer -- a peripheral plug-in board especially constructed to enable a personal computer to store and analyze data from a PCM source -- was designed and built for use on the NASA Sounding Rocket Program for PCM encoder configuration and testing. This board and custom-written software turn a computer into a snapshot PCM decommutator which will accept and store many hundreds or thousands of PCM telemetry data frames, then sift through them repeatedly. These data can be converted to any number base and displayed, examined for any bit dropouts or changes in particular words or frames, graphically plotted, or statistically analyzed.
ERIC Educational Resources Information Center
Moore, Michael G.
2016-01-01
A systems methodology was employed to design and deliver a highly successful demonstration of the effectiveness of distance education as a means of providing high quality training to tens of thousands of teachers in the most remote areas of Brazil. Key elements in the success of the program were significant funding, top political buy-in, and…
Focusing on function to mine cancer genome data | Center for Cancer Research
CCR scientists have devised a strategy to sift through the tens of thousands of mutations in cancer genome data to find mutations that actually drive the disease. They have used the method to discover that the JNK signaling pathway, which in different contexts can either spur cancerous growth or rein it in, acts as a tumor suppressor in gastric cancers.
A Very Small Astrometry Satellite, Nano-JASMINE: Its Telescope and Mission Goals
NASA Astrophysics Data System (ADS)
Hatsutori, Yoichi; Suganuma, Masahiro; Kobayashi, Yukiyasu; Gouda, Naoteru; Yano, Taihei; Yamada, Yoshiyuki; Yamauchi, Masahiro
This paper introduces a small astrometry satellite, Nano-JASMINE. Nano-JASMINE carries a telescope with a 5-cm effective diameter and aims to measure the positions of ten to twenty thousand stars of z ≤ 8 mag over the whole sky with an accuracy of a few milli-arcseconds. The mission goals are clarified and the current status of development of the telescope is reported.
L.B. Brown; B. Allen-Diaz
2009-01-01
Sudden oak death (SOD), caused by the recently discovered non-native invasive pathogen, Phytophthora ramorum, has already killed tens of thousands of native coast live oak and tanoak trees in California. Little is known of potential short and long term impacts of this novel plant-pathogen interaction on forest structure and composition. Coast live...
Major Software Vendor Puts Students on Many Campuses at Risk of Identity Theft
ERIC Educational Resources Information Center
Foster, Andrea
2008-01-01
At least 18 colleges are scrambling to inform tens of thousands of students that they are at risk of having their identities stolen after SunGard, a leading software vendor, reported that a laptop owned by one of its consultants was stolen. The extent of the problem is still unknown, though many of the campuses that have been identified are in…
Risk Management and At-Risk Students: Pernicious Fantasies of Educator Omnipotence. The Cutting Edge
ERIC Educational Resources Information Center
Clabaugh, Gary K.
2004-01-01
For tens of thousands of years human beings relied on oracles, prophets, medicine men, and resignation to try to manage unknown risks. Then, in the transformative 200-year period from the mid-17th through the mid-19th centuries, a series of brilliant insights created groundbreaking tools for rational risk taking. Discoveries such as the theory of…
Deformation and Failure of Protein Materials in Physiologically Extreme Conditions and Disease
2009-03-01
resonance (NMR) spectroscopy and X-ray crystallography have advanced our ability to identify 3D protein structures. Site-specific studies using NMR, a... ray crystallography, providing structural and temporal information about mechanisms of deformation and assembly (for example in intermediate...tens of thousands of 3D atomistic protein structures, identifying the structure of numerous proteins from varying species sources. X-ray
ERIC Educational Resources Information Center
Granato, Mona; Krekel, Elisabeth M.; Ulrich, Joachim Gerd
2015-01-01
Every year, tens of thousands of young people in Germany fail to find access to dual vocational education and training (VET), because they cannot find a company to hire them as apprentices. This particularly affects persons with poor school leaving qualifications, socially deprived persons or people with a migrant background. In order to improve…
ERIC Educational Resources Information Center
Miech, Edward J.; Nave, Bill; Mosteller, Frederick
2005-01-01
This article describes what a structured abstract is and how a structured abstract can help researchers sort out information. Today over 1,000 education journals publish more than 20,000 articles in the English language each year. No systematic tool is available at present to get the research findings from these tens of thousands of articles to…
Registering the Human Terrain: A Valuation of Cadastre
2008-01-01
which is also an intelligence topic of increasing salience. Ethno-linguistic maps, such as Figure 1 depicting languages spoken or religions ...Desert] to Congo, tens of thousands of people are at war. You might think these struggles are about religion, or ethnicity, or even political diff...Nazi pseudoscience responsible for 70 million deaths. Academia quickly distanced itself from environmental determinism, the theory behind Geopolitik
Bryan D. Watts; Dana S. Bradshaw
2005-01-01
Within the mid-Atlantic Coastal Plain, lands owned or controlled by government agencies and organizations within the Partners in Flight (PIF) program are highly fragmented. These lands represent tens of thousands of habitat patches that are managed by hundreds of individuals responding to a diversity of directives. Moving this patchwork of lands forward to achieve...
Shared Data Reveal the Invisible Achievement Gap of Students in Foster Care
ERIC Educational Resources Information Center
WestEd, 2014
2014-01-01
At any given time, tens of thousands of children and youth in the U.S. are in the foster care system. Many have been abused, neglected, or abandoned, and they face a challenging journey of uncertainty, often not knowing where they will live next, where they will go to school, or whether they will have contact with friends and relatives. Child…
A Numerical Study on the Streams of Star Debris after Tidal Disruption
NASA Astrophysics Data System (ADS)
Camacho Olachea, Priscila; Ramirez-Ruiz, Enrico; Law-Smith, Jamie
2017-01-01
Lurking at the centers of most galaxies are gigantic star and gas devouring monsters. These monsters are supermassive black holes (SMBHs), some of which are larger than our solar system and ten billion times as massive as our own Sun. The vast majority of stars in the universe live for tens of billions of years, eventually dying from old age as the nuclear reactions that power them become progressively less effective. But for every ten thousand stars that die peacefully, one star will be brutally torn apart by the extreme tidal forces present as it passes near a SMBH. My recent work has been to develop computational tools necessary to study the fates of stars disrupted by SMBHs. In this research project I present the results of my numerical study aimed at understanding the streams of star debris that result after disruption.
Interconnection requirements in avionic systems
NASA Astrophysics Data System (ADS)
Vergnolle, Claude; Houssay, Bruno
1991-04-01
The future aircraft generation will have thousands of smart electromagnetic sensors distributed throughout the aircraft. Each sensor is connected by fiber links to the main-frame computer in charge of real-time signal correlation. Such a computer must be compactly built and massively parallel: it needs 3D optical free-space interconnects between neighbouring boards and reconfigurable interconnects via a holographic backplane. The optical interconnect facilities will also be used to build a fault-tolerant computer through large redundancy.
Hinkle, Stephen R; Böhlke, J K; Fisher, Lawrence H
2008-12-15
Septic tank systems are an important source of NO3(-) to many aquifers, yet characterization of N mass balance and isotope systematics following septic tank effluent discharge into unsaturated sediments has received limited attention. In this study, samples of septic tank effluent before and after transport through single-pass packed-bed filters (sand filters) were evaluated to elucidate mass balance and isotope effects associated with septic tank effluent discharge to unsaturated sediments. Chemical and isotopic data from five newly installed pairs and ten established pairs of septic tanks and packed-bed filters serving single homes in Oregon indicate that aqueous solute concentrations are affected by variations in recharge (precipitation, evapotranspiration), NH4+ sorption (primarily in immature systems), nitrification, and gaseous N loss via NH3 volatilization and(or) N2 or N2O release during nitrification/denitrification. Substantial NH4+ sorption capacity was also observed in laboratory columns with synthetic effluent. Septic tank effluent δ15N-NH4+ values were almost constant and averaged +4.9 ± 0.4‰ (1σ). In contrast, δ15N values of NO3(-) leaving mature packed-bed filters were variable (+0.8 to +14.4‰) and averaged +7.2 ± 2.6‰. Net N loss in the two networks of packed-bed filters was indicated by average 10-30% decreases in Cl(-)-normalized N concentrations and 2-3‰ increases in δ15N, consistent with fractionation accompanying gaseous N losses and corroborating established links between septic tank effluent and NO3(-) in a local, shallow aquifer. Values of δ18O-NO3(-) leaving mature packed-bed filters ranged from -10.2 to -2.3‰ (mean -6.4 ± 1.8‰), and were intermediate between a 2/3 H2O-O + 1/3 O2-O conceptualization and a 100% H2O-O conceptualization of δ18O-NO3(-) generation during nitrification.
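The stated consistency between 10-30% N loss and 2-3‰ increases in δ15N can be pictured with a Rayleigh fractionation calculation on the residual N pool. The enrichment factor used below (-10‰) is an assumed, illustrative value, not a parameter reported in the study.

# Rayleigh model: isotopic shift of the residual N pool after partial gaseous N loss.
# delta_residual ~= delta0 + epsilon * ln(f), where f is the fraction of N remaining.
# The enrichment factor epsilon is an assumed illustrative value.
import math

delta0 = 4.9        # per mil, initial delta15N of effluent N
epsilon = -10.0     # per mil, assumed net enrichment factor for gaseous N loss
for f in (0.9, 0.8, 0.7):                    # 10-30% N loss
    shift = epsilon * math.log(f)
    print(f"{1 - f:.0%} N loss -> delta15N increase of about +{shift:.1f} per mil")

With this assumed factor, 10-30% loss shifts the residual pool by roughly +1 to +4‰, the same order as the 2-3‰ increases reported.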
Five different types of framing effects in medical situation: a preliminary exploration.
Peng, Jiaxi; Li, Hongzheng; Miao, Danmin; Feng, Xi; Xiao, Wei
2013-02-01
Numerous reports have examined the framing effect in medical situations, but few have explored the differences among the various kinds of framing effects. In the present study, five different types of framing effects were examined and their effect sizes were compared. Medical decision-making problems concerning evaluation of medicine effects, patient compliance, and the selection of treatment and doctor options were constructed. All the problems were described in both positive and negative frames. Five hundred undergraduates were randomly divided into ten groups, and participants in each group were asked to complete one decision-making task. All the frames that were examined led to significant framing effects: when the Asian Disease Problem was described in a positive frame, the participants preferred the conservative option to the risky one, while in a negative frame the preference reversed (P < 0.01). If the drug effect was described as "of 100 patients taking this kind of medicine, 70 patients became better", people tended to make more positive evaluations than when it was described as "of 100 patients taking this kind of medicine, 30 patients didn't become better" (P < 0.01). Doctors' advice was described in either a harm-framed or a benefit-framed form, and the former resulted in better compliance (P < 0.05). If treatment options were described with a survival rate, people tended to choose the risky option, while if described with a mortality rate, people tended to choose the conservative option (P < 0.05). The number-size framing effect was also found to be significant (P < 0.01). The five types of framing effects ranged from small to large in effect size. Medical decision making can be affected by frame descriptions, and attention should be paid to standardizing descriptions in medical practice.
Single-cell barcoding and sequencing using droplet microfluidics.
Zilionis, Rapolas; Nainys, Juozas; Veres, Adrian; Savova, Virginia; Zemmour, David; Klein, Allon M; Mazutis, Linas
2017-01-01
Single-cell RNA sequencing has recently emerged as a powerful tool for mapping cellular heterogeneity in diseased and healthy tissues, yet high-throughput methods are needed for capturing the unbiased diversity of cells. Droplet microfluidics is among the most promising candidates for capturing and processing thousands of individual cells for whole-transcriptome or genomic analysis in a massively parallel manner with minimal reagent use. We recently established a method called inDrops, which has the capability to index >15,000 cells in an hour. A suspension of cells is first encapsulated into nanoliter droplets with hydrogel beads (HBs) bearing barcoding DNA primers. Cells are then lysed and mRNA is barcoded (indexed) by a reverse transcription (RT) reaction. Here we provide details for (i) establishing an inDrops platform (1 d); (ii) performing hydrogel bead synthesis (4 d); (iii) encapsulating and barcoding cells (1 d); and (iv) RNA-seq library preparation (2 d). inDrops is a robust and scalable platform, and it is unique in its ability to capture and profile >75% of cells in even very small samples, on a scale of thousands or tens of thousands of cells.
NASA Astrophysics Data System (ADS)
Schulthess, Thomas C.
2013-03-01
The continued thousand-fold improvement in sustained application performance per decade on modern supercomputers keeps opening new opportunities for scientific simulations. But supercomputers have become very complex machines, built with thousands or tens of thousands of complex nodes consisting of multiple CPU cores or, most recently, a combination of CPU and GPU processors. Efficient simulations on such high-end computing systems require tailored algorithms that optimally map numerical methods to particular architectures. These intricacies will be illustrated with simulations of strongly correlated electron systems, where the development of quantum cluster methods, Monte Carlo techniques, as well as their optimal implementation by means of algorithms with improved data locality and high arithmetic density have gone hand in hand with evolving computer architectures. The present work would not have been possible without continued access to computing resources at the National Center for Computational Science of Oak Ridge National Laboratory, which is funded by the Facilities Division of the Office of Advanced Scientific Computing Research, and the Swiss National Supercomputing Center (CSCS) that is funded by ETH Zurich.
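The remark about algorithms with improved data locality can be made concrete with the textbook example of loop blocking (tiling), which reuses a small tile of each operand while it is still in cache. The sketch below is a generic illustration of that idea, not code from the quantum cluster simulations.

# Blocked (tiled) matrix multiply: operate on small tiles so each tile stays in cache,
# improving data locality over a naive triple loop. Generic illustration only.
import numpy as np

def matmul_blocked(a, b, block=64):
    n = a.shape[0]
    c = np.zeros((n, n), dtype=a.dtype)
    for i in range(0, n, block):
        for k in range(0, n, block):
            for j in range(0, n, block):
                c[i:i + block, j:j + block] += (
                    a[i:i + block, k:k + block] @ b[k:k + block, j:j + block])
    return c

a, b = np.random.rand(256, 256), np.random.rand(256, 256)
assert np.allclose(matmul_blocked(a, b), a @ b)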
Roebroeks, Wil; Soressi, Marie
2016-01-01
The last decade has seen a significant growth of our knowledge of the Neandertals, a population of Pleistocene hunter-gatherers who lived in (western) Eurasia between ∼400,000 and 40,000 y ago. Starting from a source population deep in the Middle Pleistocene, the hundreds of thousands of years of relative separation between African and Eurasian groups led to the emergence of different phenotypes in Late Pleistocene Europe and Africa. Both recently obtained genetic evidence and archeological data show that the biological and cultural gaps between these populations were probably smaller than previously thought. These data, reviewed here, falsify inferences to the effect that, compared with their near-modern contemporaries in Africa, Neandertals were outliers in terms of behavioral complexity. It is only around 40,000 y ago, tens of thousands of years after anatomically modern humans first left Africa and thousands of years after documented interbreeding between modern humans, Neandertals and Denisovans, that we see major changes in the archeological record, from western Eurasia to Southeast Asia, e.g., the emergence of representational imagery and the colonization of arctic areas and of greater Australia (Sahul). PMID:27274044
Dunes on Titan observed by Cassini Radar
Radebaugh, J.; Lorenz, R.D.; Lunine, J.I.; Wall, S.D.; Boubin, G.; Reffet, E.; Kirk, R.L.; Lopes, R.M.; Stofan, E.R.; Soderblom, L.; Allison, M.; Janssen, M.; Paillou, P.; Callahan, P.; Spencer, C.; ,
2008-01-01
Thousands of longitudinal dunes have recently been discovered by the Titan Radar Mapper on the surface of Titan. These are found mainly within ±30° of the equator in optically-, near-infrared-, and radar-dark regions, indicating a strong proportion of organics, and cover well over 5% of Titan's surface. Their longitudinal duneform, interactions with topography, and correlation with other aeolian forms indicate a single, dominant wind direction aligned with the dune axis plus lesser, off-axis or seasonally alternating winds. Global compilations of dune orientations reveal the mean wind direction is dominantly eastwards, with regional and local variations where winds are diverted around topographically high features, such as mountain blocks or broad landforms. Global winds may carry sediments from high latitude regions to equatorial regions, where relatively drier conditions prevail, and the particles are reworked into dunes, perhaps on timescales of thousands to tens of thousands of years. On Titan, adequate sediment supply, sufficient wind, and the absence of sediment carriage and trapping by fluids are the dominant factors in the presence of dunes. © 2007 Elsevier Inc. All rights reserved.
Inexpensive Cable Space Launcher of High Capability
NASA Technical Reports Server (NTRS)
Bolonkin, Alexander
2002-01-01
This paper proposes a new method and transportation system to fly into space, to the Moon, Mars, and other planets. The transportation system uses a mechanical energy transfer and requires only minimal energy, so that it provides a 'Free Trip' into space. The method uses the rotary and kinetic energy of planets, asteroids, moons, satellites, and other natural space bodies. Computations are presented for the following projects: 1. A non-rocket method for free launch of payloads into space and to other planets; the low-cost project will accommodate one hundred thousand tourists annually. 2. Free trips to Mars for two thousand people annually. 3. Free trips to the Moon for ten thousand people annually. The projects use artificial materials such as nanotubes and whiskers that have a ratio of tensile strength to density equal to 4 million meters. In the future, nanotubes will be produced that can reach a specific stress of up to 100 million meters, which will significantly improve the parameters of the suggested projects. The author is prepared to discuss the problems with serious organizations that want to research and develop these inventions.
Mining for metals in society's waste
Smith, Kathleen S.; Plumlee, Geoffrey S.; Hageman, Philip L.
2015-01-01
Metals and minerals are natural resources that human beings have been mining for thousands of years. Contemporary metal mining is dominated by iron ore, copper and gold, with 2 billion tons of iron ore, nearly 20 million tons of copper and 2,000 tons of gold produced every year. Tens to hundreds of tons of other metals that are essential components for electronics, green energy production, and high-technology products are produced annually.
Ph.D.'s Spend Big Bucks Hunting for Academic Jobs, with No Guaranteed Results
ERIC Educational Resources Information Center
Patton, Stacey
2013-01-01
Ph.D.'s are used to shelling out tens of thousands of dollars in the name of education. But earning the top graduate degree doesn't mean their spending has come to an end. An industry designed to help aspiring academics manage the job-application process and land tenure-track jobs is growing, and reaping the benefits of a tight market in many…
UCLA High Speed, High Volume Laboratory Network for Infectious Diseases. Addendum
2009-08-01
Design: Because of current public health and national security threats, influenza surveillance and analysis will be the initial focus. In the upcoming... throughput and automated systems will enable processing of tens of thousands of samples and provide critical laboratory capacity. Its overall design and
Strategic Studies Quarterly. Volume 4, Number 1, Spring 2010
2010-01-01
scientists to produce more infectious pathogens through the use of genetic manipulation. Indeed, the reproductive capacity of bacteria and viruses...eries will give potential bioterrorists the ability to genetically engineer and produce new biological weapons for only tens of thousands of dollars...United States, because of its dominant economy and political clout, was able to levy neo-liberal policy prescriptions under the rubric of the
Burma: Assessing Options for U.S. Engagement
2009-06-01
2009). 28 Christina Fink, Living Silence: Burma Under Military Rule (Bangkok: White Lotus Company Ltd. 2001) 125. 29 John F. Cady, The United...death.' Deprived of food, shelter, and medical treatment, tens of thousands died laying tracks through the fever-ridden mountainous jungle. The...hamlet' operations in Vietnam, Ne Win implemented a policy called "Four Cuts" which was intended to cut all links to food, funds, intelligence, and
Removing the Stigma: For God and Country
2013-03-01
Virginians to the Puritans. George Washington exemplified the sentiments of our founding fathers in his response to the address from the Hebrew Congregation...distribution of Bibles to the Japanese people. He declared, "We must have ten thousand Christian missionaries and a million bibles to complete the...crosses and Christian messages were painted on military vehicles driving through Iraq; images of U.S. soldiers holding rifles and bibles were posted on
NASA Technical Reports Server (NTRS)
Kalelkar, A. S.; Fiksel, J.; Rosenfield, D.; Richardson, D. L.; Hagopian, J.
1980-01-01
The risks associated with electrical effects arising from carbon fibers released from commercial aviation aircraft fires were estimated for 1993. The expected annual losses were estimated to be about $470 (1977 dollars) in 1993. The chances of total losses from electrical effects exceeding $100,000 (1977 dollars) in 1993 were established to be about one in ten thousand.
ERIC Educational Resources Information Center
Udell, Monique A. R.; Wynne, C. D. L.
2008-01-01
Dogs likely were the first animals to be domesticated and as such have shared a common environment with humans for over ten thousand years. Only recently, however, has this species' behavior been subject to scientific scrutiny. Most of this work has been inspired by research in human cognitive psychology and suggests that in many ways dogs are…
ERIC Educational Resources Information Center
Joo, Hee-Jung Serenity
2015-01-01
In the last two decades, the issue of comfort women--the women and girls who were forced into sex slavery for the Japanese army before and during WWII--has risen to global attention. Tens of thousands of comfort women (the average estimate is anywhere between 80,000 and 200,000) were confined at comfort stations managed by the Japanese Imperial…
ERIC Educational Resources Information Center
Shultz, Ginger V.; Gottfried, Amy C.; Winschel, Grace A.
2015-01-01
General chemistry is a gateway course that impacts the STEM trajectory of tens of thousands of students each year, and its role in the introductory curriculum as well as its pedagogical design are the center of an ongoing debate. To investigate the role of general chemistry in the curriculum, we report the results of a posthoc analysis of 10 years…
STS-47 Pilot Brown on OV-105's flight deck ten minutes after SSME cutoff
1992-09-12
STS047-28-002 (20 Sept. 1992) --- Astronaut Curtis L. Brown, Jr., STS-47 pilot, is photographed at the Space Shuttle Endeavour's pilot station about ten minutes after main engine cutoff on launch day of the eight-day Spacelab-J mission. Wearing the partial-pressure launch and entry suit, Brown shared the forward cabin with astronaut Robert L. Gibson (out of frame at left), mission commander. Endeavour was beginning its second mission in space, this one devoted to research supporting the Spacelab-J mission.
NASA Astrophysics Data System (ADS)
Allen, R. L.
2016-12-01
Computer enhancement of side-scan sonar plots revealed images of massive art, apparent ruins of cities, and subsea temples. Some images are about four to twenty kilometers in length. Present water depths imply that many of the finds must have been created over ten thousand years ago. Also, large carvings of giant sloths, Ice Age elk, mammoths, mastodons, and other cold climate creatures concurrently indicate great age. In offshore areas of North America, some human faces have beards and what appear to be Caucasian characteristics that clearly contrast with the native tribal images. A few images have possible physical appearances associated with Polynesians. Contacts and at least limited migrations must have occurred much further in the ancient past than previously believed. Greatly rising sea levels and radical changes away from late Ice Age climates had to be devastating to very ancient civilizations. Many images indicate that these cultures were capable of construction and massive art at or near the technological level of the Old Kingdom in Egypt. Paleo astronomy is obvious in some plots. Major concerns are how to further evaluate, catalog, protect, and conserve the creations of those cultures.
Long-term evolution of an Oligocene/Miocene maar lake from Otago, New Zealand
NASA Astrophysics Data System (ADS)
Fox, B. R. S.; Wartho, J.; Wilson, G. S.; Lee, D. E.; Nelson, F. E.; Kaulfuss, U.
2015-01-01
Foulden Maar is a highly resolved maar lake deposit from the South Island of New Zealand comprising laminated diatomite punctuated by numerous diatomaceous turbidites. Basaltic clasts found in debris flow deposits near the base of the cored sedimentary sequence yielded two new 40Ar/39Ar dates of 24.51 ± 0.24 and 23.38 ± 0.24 Ma (2σ). The younger date agrees within error with a previously published 40Ar/39Ar date of 23.17 ± 0.19 Ma from a basaltic dyke adjacent to the maar crater. The diatomite is inferred to have been deposited over several tens of thousands of years in the latest Oligocene/earliest Miocene, and may have been coeval with the period of rapid glaciation and subsequent deglaciation of Antarctica known as the Mi-1 event. Sediment magnetic properties and SEM measurements indicate that the magnetic signal is dominated by pseudo-single domain pyrrhotite. The most likely source of detrital pyrrhotite is schist country rock fragments from the inferred tephra ring created by the phreatomagmatic eruption that formed the maar. Variations in magnetic mineral concentration indicate a decrease in erosional input throughout the depositional period, suggesting long-term (tens of thousands of years) environmental change in New Zealand in the latest Oligocene/earliest Miocene.
The Galaxy platform for accessible, reproducible and collaborative biomedical analyses: 2018 update.
Afgan, Enis; Baker, Dannon; Batut, Bérénice; van den Beek, Marius; Bouvier, Dave; Cech, Martin; Chilton, John; Clements, Dave; Coraor, Nate; Grüning, Björn A; Guerler, Aysam; Hillman-Jackson, Jennifer; Hiltemann, Saskia; Jalili, Vahid; Rasche, Helena; Soranzo, Nicola; Goecks, Jeremy; Taylor, James; Nekrutenko, Anton; Blankenberg, Daniel
2018-05-22
Galaxy (homepage: https://galaxyproject.org, main public server: https://usegalaxy.org) is a web-based scientific analysis platform used by tens of thousands of scientists across the world to analyze large biomedical datasets such as those found in genomics, proteomics, metabolomics and imaging. Started in 2005, Galaxy continues to focus on three key challenges of data-driven biomedical science: making analyses accessible to all researchers, ensuring analyses are completely reproducible, and making it simple to communicate analyses so that they can be reused and extended. During the last two years, the Galaxy team and the open-source community around Galaxy have made substantial improvements to Galaxy's core framework, user interface, tools, and training materials. Framework and user interface improvements now enable Galaxy to be used for analyzing tens of thousands of datasets, and >5500 tools are now available from the Galaxy ToolShed. The Galaxy community has led an effort to create numerous high-quality tutorials focused on common types of genomic analyses. The Galaxy developer and user communities continue to grow and be integral to Galaxy's development. The number of Galaxy public servers, developers contributing to the Galaxy framework and its tools, and users of the main Galaxy server have all increased substantially.
Flow Cytometry: Impact on Early Drug Discovery.
Edwards, Bruce S; Sklar, Larry A
2015-07-01
Modern flow cytometers can make optical measurements of 10 or more parameters per cell at tens of thousands of cells per second and more than five orders of magnitude dynamic range. Although flow cytometry is used in most drug discovery stages, "sip-and-spit" sampling technology has restricted it to low-sample-throughput applications. The advent of HyperCyt sampling technology has recently made possible primary screening applications in which tens of thousands of compounds are analyzed per day. Target-multiplexing methodologies in combination with extended multiparameter analyses enable profiling of lead candidates early in the discovery process, when the greatest numbers of candidates are available for evaluation. The ability to sample small volumes with negligible waste reduces reagent costs, compound usage, and consumption of cells. Improved compound library formatting strategies can further extend primary screening opportunities when samples are scarce. Dozens of targets have been screened in 384- and 1536-well assay formats, predominantly in academic screening lab settings. In concert with commercial platform evolution and trending drug discovery strategies, HyperCyt-based systems are now finding their way into mainstream screening labs. Recent advances in flow-based imaging, mass spectrometry, and parallel sample processing promise dramatically expanded single-cell profiling capabilities to bolster systems-level approaches to drug discovery. © 2015 Society for Laboratory Automation and Screening.
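The throughput claims translate into simple arithmetic. The sketch below estimates wells (compounds) screened per day from an assumed per-well sampling interval; both the interval and the hours of operation are illustrative assumptions rather than HyperCyt specifications.

# Back-of-envelope screening throughput from an assumed per-well sampling interval.
seconds_per_well = 2.0        # assumed sampling interval, not a HyperCyt specification
hours_of_operation = 20       # assumed hours of screening per day
wells_per_day = hours_of_operation * 3600 / seconds_per_well
print(f"~{wells_per_day:,.0f} wells (compounds) per day")   # ~36,000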
Flow Cytometry: Impact On Early Drug Discovery
Edwards, Bruce S.; Sklar, Larry A.
2015-01-01
Summary Modern flow cytometers can make optical measurements of 10 or more parameters per cell at tens-of-thousands of cells per second and over five orders of magnitude dynamic range. Although flow cytometry is used in most drug discovery stages, “sip-and-spit” sampling technology has restricted it to low sample throughput applications. The advent of HyperCyt sampling technology has recently made possible primary screening applications in which tens-of-thousands of compounds are analyzed per day. Target-multiplexing methodologies in combination with extended multi-parameter analyses enable profiling of lead candidates early in the discovery process, when the greatest numbers of candidates are available for evaluation. The ability to sample small volumes with negligible waste reduces reagent costs, compound usage and consumption of cells. Improved compound library formatting strategies can further extend primary screening opportunities when samples are scarce. Dozens of targets have been screened in 384- and 1536-well assay formats, predominantly in academic screening lab settings. In concert with commercial platform evolution and trending drug discovery strategies, HyperCyt-based systems are now finding their way into mainstream screening labs. Recent advances in flow-based imaging, mass spectrometry and parallel sample processing promise dramatically expanded single cell profiling capabilities to bolster systems level approaches to drug discovery. PMID:25805180
Reconciling short recurrence intervals with minor deformation in the New Madrid seismic zone
Schweig, E.S.; Ellis, M.A.
1994-01-01
At least three great earthquakes occurred in the New Madrid seismic zone in 1811 and 1812. Estimates of present-day strain rates suggest that such events may have a repeat time of 1000 years or less. Paleoseismological data also indicate that earthquakes large enough to cause soil liquefaction have occurred several times in the past 5000 years. However, pervasive crustal deformation expected from such a high frequency of large earthquakes is not observed. This suggests that the seismic zone is a young feature, possibly as young as several tens of thousands of years old and no more than a few million years old.
Bhaskar, Anand; Wang, Y X Rachel; Song, Yun S
2015-02-01
With the recent increase in study sample sizes in human genetics, there has been growing interest in inferring historical population demography from genomic variation data. Here, we present an efficient inference method that can scale up to very large samples, with tens or hundreds of thousands of individuals. Specifically, by utilizing analytic results on the expected frequency spectrum under the coalescent and by leveraging the technique of automatic differentiation, which allows us to compute gradients exactly, we develop a very efficient algorithm to infer piecewise-exponential models of the historical effective population size from the distribution of sample allele frequencies. Our method is orders of magnitude faster than previous demographic inference methods based on the frequency spectrum. In addition to inferring demography, our method can also accurately estimate locus-specific mutation rates. We perform extensive validation of our method on simulated data and show that it can accurately infer multiple recent epochs of rapid exponential growth, a signal that is difficult to pick up with small sample sizes. Lastly, we use our method to analyze data from recent sequencing studies, including a large-sample exome-sequencing data set of tens of thousands of individuals assayed at a few hundred genic regions. © 2015 Bhaskar et al.; Published by Cold Spring Harbor Laboratory Press.
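As a much-simplified sketch of fitting parameters to the sample frequency spectrum (SFS) by gradient-based maximum likelihood: the toy below assumes a constant-size coalescent, under which the expected unfolded SFS is proportional to theta/i, and uses a hand-coded gradient, whereas the paper fits piecewise-exponential histories with exact gradients obtained via automatic differentiation. The sample size, theta and optimizer settings are illustrative.

```python
import numpy as np

# Toy stand-in for the inference idea: fit a parameter of the expected SFS by
# gradient-based maximum likelihood. Under a constant-size coalescent,
# E[xi_i] = theta / i; the paper instead uses analytic SFS expressions for
# piecewise-exponential histories with exact gradients from automatic
# differentiation, which this hand-coded gradient only imitates.
rng = np.random.default_rng(0)
n = 200                                   # hypothetical number of sampled haploid genomes
i = np.arange(1, n)                       # derived-allele counts 1 .. n-1
theta_true = 50.0
observed_sfs = rng.poisson(theta_true / i)

def grad_loglik(theta):
    # d/dtheta of the Poisson log-likelihood sum_i [ xi_i*log(theta/i) - theta/i ]
    return observed_sfs.sum() / theta - (1.0 / i).sum()

theta = 1.0
for _ in range(1000):                     # plain gradient ascent
    theta = max(theta + 0.5 * grad_loglik(theta), 1e-6)

closed_form = observed_sfs.sum() / (1.0 / i).sum()
print(f"fitted theta = {theta:.2f}, closed-form MLE = {closed_form:.2f}")
```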
International Laser Ranging Service (ILRS) 2003-2004 Annual Report
NASA Technical Reports Server (NTRS)
Pearlman, Michael (Editor); Noll, Carey (Editor)
2005-01-01
The International Laser Ranging Service (ILRS) organizes and coordinates Satellite Laser Ranging (SLR) and Lunar Laser Ranging (LLR) to support programs in geodetic, geophysical, and lunar research activities and provides the International Earth Rotation and Reference Systems Service (IERS) with products important to the maintenance of an accurate International Terrestrial Reference Frame (ITRF). This reference frame provides the stability through which systematic measurements of the Earth can be made over thousands of kilometers, decades of time, and evolution of measurement technology. This 2003-2004 ILRS annual report is comprised of individual contributions from ILRS components within the international geodetic community for the years 2003-2004. The report documents changes and progress of the ILRS and is also available on the ILRS Web site at http://ilrs.gsfc.nasa.gov/reports/ilrs_reports/ilrsar_2003.html.
NASA Astrophysics Data System (ADS)
Ciufolini, Ignazio; Pavlis, Erricos C.; Sindoni, Giampiero; Ries, John C.; Paolozzi, Antonio; Matzner, Richard; Koenig, Rolf; Paris, Claudio
2017-08-01
In a previous paper we introduced the LARES 2 space experiment. The LARES 2 laser-ranged satellite is planned for launch in 2019 on the new VEGA C launch vehicle of the Italian Space Agency (ASI), ESA and ELV. The main objectives of the LARES 2 experiment are accurate measurements in General Relativity, gravitational and fundamental physics, and accurate determinations in space geodesy and geodynamics. In particular, LARES 2 aims to achieve a very accurate test of frame-dragging, an intriguing phenomenon predicted by General Relativity. Here we report the results of Monte Carlo simulations and covariance analyses that fully confirm an error budget of a few parts in one thousand in the measurement of frame-dragging with LARES 2, as calculated in our previous paper.
Feasibility of pulse wave velocity estimation from low frame rate US sequences in vivo
NASA Astrophysics Data System (ADS)
Zontak, Maria; Bruce, Matthew; Hippke, Michelle; Schwartz, Alan; O'Donnell, Matthew
2017-03-01
The pulse wave velocity (PWV) is considered one of the most important clinical parameters to evaluate CV risk, vascular adaptation, etc. There has been substantial work attempting to measure the PWV in peripheral vessels using ultrasound (US). This paper presents a fully automatic algorithm for PWV estimation from the human carotid using US sequences acquired with a LOGIQ E9 scanner (modified for RF data capture) and a 9L probe. Our algorithm samples the pressure wave in time by tracking wall displacements over the sequence, and estimates the PWV by calculating the temporal shift between two sampled waves at two distinct locations. Several recent studies have utilized similar ideas along with speckle tracking tools and high frame rate (above 1 kHz) sequences to estimate the PWV. To explore PWV estimation in a more typical clinical setting, we used focused-beam scanning, which yields relatively low frame rates and small fields of view (e.g., 200 Hz for a 16.7 mm field of view). For our application, a 200 Hz frame rate is low. In particular, the sub-frame temporal accuracy required for PWV estimation between locations 16.7 mm apart ranges from 0.82 of a frame for 4 m/s to 0.33 for 10 m/s. When the distance is further reduced (to 0.28 mm between two beams), the sub-frame precision is in parts per thousand (ppt) of the frame (5 ppt for 10 m/s). As such, the contributions of our algorithm and this paper are: 1. Ability to work with a low frame rate (~200 Hz) and a decreased lateral field of view. 2. Fully automatic segmentation of the wall intima (using raw RF images). 3. Collaborative speckle tracking of 2D axial and lateral carotid wall motion. 4. Outlier-robust PWV calculation from multiple votes using RANSAC. 5. Algorithm evaluation on volunteers of different ages and health conditions.
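A minimal sketch of the central PWV computation, assuming the wall-displacement waveforms at the two beam locations have already been extracted: the sub-frame delay is estimated by cross-correlation with parabolic peak interpolation, and PWV = separation / delay. The synthetic pulse shape and noise-free setup are assumptions; the paper's wall segmentation, 2-D speckle tracking and RANSAC voting are omitted.

```python
import numpy as np

# Sketch of the core PWV estimate (not the authors' full pipeline): given wall
# displacement waveforms d1(t), d2(t) tracked at two beam locations a known
# distance apart, estimate the sub-frame delay by cross-correlation with
# parabolic peak interpolation, then PWV = distance / delay.
def pwv_estimate(d1, d2, frame_rate_hz, separation_m):
    d1 = d1 - d1.mean()
    d2 = d2 - d2.mean()
    xc = np.correlate(d2, d1, mode="full")          # lag of d2 relative to d1
    lags = np.arange(-len(d1) + 1, len(d2))
    k = np.argmax(xc)
    # Parabolic interpolation around the correlation peak for sub-sample delay.
    if 0 < k < len(xc) - 1:
        y0, y1, y2 = xc[k - 1], xc[k], xc[k + 1]
        k_frac = 0.5 * (y0 - y2) / (y0 - 2 * y1 + y2)
    else:
        k_frac = 0.0
    delay_s = (lags[k] + k_frac) / frame_rate_hz
    return separation_m / delay_s

# Synthetic check: a 4 m/s wave sampled at 200 Hz over a 16.7 mm separation
# corresponds to a delay of ~0.84 frames, i.e. well below one frame period.
fs, sep, pwv_true = 200.0, 0.0167, 4.0
t = np.arange(0, 1.0, 1 / fs)
pulse = np.exp(-((t - 0.3) ** 2) / (2 * 0.02 ** 2))                 # idealized pulse
delayed = np.exp(-((t - 0.3 - sep / pwv_true) ** 2) / (2 * 0.02 ** 2))
print(f"estimated PWV = {pwv_estimate(pulse, delayed, fs, sep):.2f} m/s")
```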
Attitudes towards assisted dying are influenced by question wording and order: a survey experiment.
Magelssen, Morten; Supphellen, Magne; Nortvedt, Per; Materstvedt, Lars Johan
2016-04-27
Surveys on attitudes towards assisted dying play an important role in informing public debate, policy and legislation. Unfortunately, surveys are often designed with insufficient attention to framing effects; that is, effects on the respondents' stated attitudes caused by question wording and context. The purpose of this study was to demonstrate and measure such framing effects. Survey experiment in which an eight-question survey on attitudes towards assisted dying was distributed to Norwegian citizens through a web-based panel. Two variations of question wording as well as two variations of question order were employed. Respondents were randomized to receive one of four questionnaire versions. Three thousand and fifty responses were received. There were moderate to large question wording and question order effects. A majority of Norwegian citizens favour the legalization of assisted dying for patients with terminal or chronic disease. Stakeholders in the assisted dying debate need to acknowledge potential framing effects, and accordingly should interpret survey results with caution. The same holds for researchers who conduct attitude surveys in the field of bioethics.
Ravì, Daniele; Szczotka, Agnieszka Barbara; Shakir, Dzhoshkun Ismail; Pereira, Stephen P; Vercauteren, Tom
2018-06-01
Probe-based confocal laser endomicroscopy (pCLE) is a recent imaging modality that allows performing in vivo optical biopsies. The design of pCLE hardware, and its reliance on an optical fibre bundle, fundamentally limits the image quality with a few tens of thousands fibres, each acting as the equivalent of a single-pixel detector, assembled into a single fibre bundle. Video registration techniques can be used to estimate high-resolution (HR) images by exploiting the temporal information contained in a sequence of low-resolution (LR) images. However, the alignment of LR frames, required for the fusion, is computationally demanding and prone to artefacts. In this work, we propose a novel synthetic data generation approach to train exemplar-based Deep Neural Networks (DNNs). HR pCLE images with enhanced quality are recovered by the models trained on pairs of estimated HR images (generated by the video registration algorithm) and realistic synthetic LR images. Performance of three different state-of-the-art DNNs techniques were analysed on a Smart Atlas database of 8806 images from 238 pCLE video sequences. The results were validated through an extensive image quality assessment that takes into account different quality scores, including a Mean Opinion Score (MOS). Results indicate that the proposed solution produces an effective improvement in the quality of the obtained reconstructed image. The proposed training strategy and associated DNNs allows us to perform convincing super-resolution of pCLE images.
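A toy sketch of building (LR, HR) training pairs by imitating fibre-bundle sampling: sample an HR image at scattered "fibre core" positions, add noise, and interpolate the sparse samples back onto the pixel grid. The number of cores, noise level and interpolation choice are illustrative; realistic pCLE simulation (and the paper's actual pipeline) is considerably more involved.

```python
import numpy as np
from scipy.interpolate import griddata

# Toy generation of one (LR, HR) training pair: sample the HR image at random
# "fibre core" locations, add noise, and interpolate back onto the pixel grid.
# Core count, noise level and linear interpolation are illustrative choices.
def make_lr_from_hr(hr, n_cores=3000, noise_sigma=0.02, seed=0):
    rng = np.random.default_rng(seed)
    h, w = hr.shape
    cores = np.column_stack([rng.uniform(0, h - 1, n_cores),
                             rng.uniform(0, w - 1, n_cores)])
    samples = hr[cores[:, 0].astype(int), cores[:, 1].astype(int)]
    samples = samples + rng.normal(scale=noise_sigma, size=n_cores)
    yy, xx = np.mgrid[0:h, 0:w]
    return griddata(cores, samples, (yy, xx), method="linear",
                    fill_value=samples.mean())

# Example HR "image": a smooth synthetic pattern; (lr, hr) forms one training pair.
yy, xx = np.mgrid[0:128, 0:128]
hr = 0.5 + 0.5 * np.sin(xx / 6.0) * np.cos(yy / 9.0)
lr = make_lr_from_hr(hr)
print("HR vs LR mean absolute difference:", float(np.mean(np.abs(hr - lr))))
```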
NASA Astrophysics Data System (ADS)
Mor, Ilan; Vartsky, David; Dangendorf, Volker; Tittelmeier, Kai.; Weierganz, Mathias; Goldberg, Mark Benjamin; Bar, Doron; Brandis, Michal
2018-06-01
We describe an analysis procedure for automatic unambiguous detection of fast-neutron-induced recoil proton tracks in a micro-capillary array filled with organic liquid scintillator. The detector is viewed by an intensified CCD camera. This imaging neutron detector possesses the capability to perform high position-resolution (few tens of μm), energy-dispersive transmission-imaging using ns-pulsed beams. However, when operated with CW or DC beams, it also features medium-quality spectroscopic capabilities for incident neutrons in the energy range 2-20 MeV. In addition to the recoil proton events which display a continuous extended track structure, the raw images exhibit complex ion-tracks from nuclear interactions of fast-neutrons in the scintillator, capillaries quartz-matrix and CCD. Moreover, as expected, one also observes a multitude of isolated scintillation spots of varying intensity (henceforth denoted "blobs") that originate from several different sources, such as: fragmented proton tracks, gamma-rays, heavy-ion reactions as well as events and noise that occur in the image-intensifier and CCD. In order to identify the continuous-track recoil proton events and distinguish them from all these background events, a rapid, computerized and automatic track-recognition-procedure was developed. Based on an appropriately weighted analysis of track parameters such as: length, width, area and overall light intensity, the method is capable of distinguishing a single continuous-track recoil proton from typically surrounding several thousands of background events that are found in each CCD frame.
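A rough sketch of the kind of track-parameter scoring described above: threshold the frame, label connected components, characterize each by length, width (principal-axis extents), area and total intensity, and apply a weighted score. The threshold, weights and score cut below are arbitrary placeholders, not the detector's calibrated values.

```python
import numpy as np
from scipy import ndimage

# Illustrative track/blob discrimination on a single CCD frame: threshold,
# label connected components, then score each component from its length,
# width, area and total light intensity. Weights and cuts are placeholders.
def classify_events(frame, threshold, weights=(1.0, -1.0, 0.2, 0.001)):
    mask = frame > threshold
    labels, n = ndimage.label(mask)
    results = []
    for idx in range(1, n + 1):
        ys, xs = np.nonzero(labels == idx)
        area = len(xs)
        if area > 2:
            # Principal-axis extents from the eigenvalues of the coordinate covariance.
            cov = np.cov(np.vstack([xs, ys]))
            evals = np.sort(np.linalg.eigvalsh(cov))[::-1]
            length, width = 4.0 * np.sqrt(np.maximum(evals, 0.0))
        else:
            length, width = 1.0, 1.0
        total_intensity = frame[ys, xs].sum()
        w_len, w_wid, w_area, w_int = weights
        score = w_len * length + w_wid * width + w_area * area + w_int * total_intensity
        results.append({"label": idx, "length": length, "width": width,
                        "area": area, "intensity": total_intensity,
                        "is_track_candidate": bool(score > 20.0)})  # arbitrary cut
    return results

# Tiny synthetic frame: one elongated "track" and one compact "blob".
frame = np.zeros((64, 64))
frame[30, 10:40] = 50.0          # straight 30-pixel track
frame[10:13, 50:53] = 80.0       # 3x3 bright blob
for ev in classify_events(frame, threshold=10.0):
    print(ev)
```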
Designing Agent Utilities for Coordinated, Scalable and Robust Multi-Agent Systems
NASA Technical Reports Server (NTRS)
Tumer, Kagan
2005-01-01
Coordinating the behavior of a large number of agents to achieve a system level goal poses unique design challenges. In particular, problems of scaling (number of agents in the thousands to tens of thousands), observability (agents have limited sensing capabilities), and robustness (the agents are unreliable) make it impossible to simply apply methods developed for small multi-agent systems composed of reliable agents. To address these problems, we present an approach based on deriving agent goals that are aligned with the overall system goal, and can be computed using information readily available to the agents. Then, each agent uses a simple reinforcement learning algorithm to pursue its own goals. Because of the way in which those goals are derived, there is no need to use difficult to scale external mechanisms to force collaboration or coordination among the agents, or to ensure that agents actively attempt to appropriate the tasks of agents that suffered failures. To present these results in a concrete setting, we focus on the problem of finding the sub-set of a set of imperfect devices that results in the best aggregate device. This is a large distributed agent coordination problem where each agent (e.g., device) needs to determine whether to be part of the aggregate device. Our results show that the approach proposed in this work provides improvements of over an order of magnitude over both traditional search methods and traditional multi-agent methods. Furthermore, the results show that even in extreme cases of agent failures (i.e., half the agents failed midway through the simulation) the system's performance degrades gracefully and still outperforms a failure-free and centralized search algorithm. The results also show that the gains increase as the size of the system (e.g., number of agents) increases. This latter result is particularly encouraging and suggests that this method is ideally suited for domains where the number of agents is currently in the thousands and will reach tens or hundreds of thousands in the near future.
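A toy sketch of the underlying idea of giving each agent a goal aligned with the system objective, here via a difference-style utility (system utility with the agent's choice minus system utility with that agent removed) and simple value learning on the device-selection problem. The aggregate objective G, the learning rule and all parameters are invented for illustration and are not the report's specific formulation.

```python
import numpy as np

# Toy alignment of agent utilities with a system objective via a difference-
# style utility: each agent is credited with G(joint action) minus G(joint
# action with that agent removed). G, learning rule and parameters are invented.
rng = np.random.default_rng(6)
n_agents = 200
quality = rng.uniform(0.0, 1.0, size=n_agents)      # quality of each imperfect device

def G(selected):
    """Assumed system utility: diminishing returns minus a per-device cost."""
    return np.sqrt(quality[selected].sum()) - 0.05 * selected.sum()

q_values = np.zeros((n_agents, 2))                   # estimated value of {stay out, join}
alpha, epsilon = 0.1, 0.1
for _ in range(3000):
    explore = rng.random(n_agents) < epsilon
    greedy = q_values.argmax(axis=1).astype(bool)
    actions = np.where(explore, rng.integers(0, 2, n_agents).astype(bool), greedy)

    total_q, count = quality[actions].sum(), actions.sum()
    g_full = np.sqrt(total_q) - 0.05 * count
    # Difference utility: zero for agents that stayed out, marginal contribution otherwise.
    g_without = np.sqrt(total_q - quality * actions) - 0.05 * (count - actions)
    d = g_full - g_without
    idx, a = np.arange(n_agents), actions.astype(int)
    q_values[idx, a] += alpha * (d - q_values[idx, a])

chosen = q_values.argmax(axis=1).astype(bool)
print(f"learned subset: {chosen.sum()} devices, G = {G(chosen):.2f} "
      f"(select-all baseline G = {G(np.ones(n_agents, bool)):.2f})")
```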
Young, Gregory J; Zhang, Shiping; Mirsky, Henry P; Cressman, Robert F; Cong, Bin; Ladics, Gregory S; Zhong, Cathy X
2012-10-01
Before a genetically modified (GM) crop can be commercialized it must pass through a rigorous regulatory process to verify that it is safe for human and animal consumption and for the environment. One particular area of focus is the potential introduction of a known or cross-reactive allergen not previously present within the crop. The assessment of possible allergenicity uses the guidelines outlined by the Food and Agriculture Organization (FAO) and World Health Organization (WHO) Codex Alimentarius Commission (Codex) to evaluate all newly expressed proteins. Some regulatory authorities have broadened the scope of the assessment to include all DNA reading frames between stop codons across the insert and spanning the insert/genomic DNA junctions. To investigate the utility of this bioinformatic assessment, all naturally occurring stop-to-stop frames in the non-transgenic genomes of maize, rice, and soybean, as well as the human genome, were compared against the AllergenOnline (www.allergenonline.org) database using the Codex criteria. We discovered thousands of frames that exceeded the Codex-defined threshold for potential cross-reactivity, suggesting that evaluating hypothetical ORFs (stop-to-stop frames) has questionable value for making decisions on the safety of GM crops. Copyright © 2012 Elsevier Ltd. All rights reserved.
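For illustration, a sketch of enumerating stop-to-stop reading frames in all six frames of an insert and screening the translated peptides for an 8-contiguous-amino-acid identity with allergen sequences; the Codex sliding-window criterion (greater than 35% identity over 80 amino acids) used in the full assessment is omitted, and the insert and allergen sequences below are dummies.

```python
# Sketch of a stop-to-stop reading-frame screen: translate all six reading
# frames, split on stop codons, and flag any 8-amino-acid window shared with
# an allergen sequence. The 35%-identity-over-80-aa sliding-window test is
# omitted; sequences below are dummies.
BASES = "TCAG"
AMINO_ACIDS = "FFLLSSSSYY**CC*WLLLLPPPPHHQQRRRRIIIMTTTTNNKKSSRRVVVVAAAADDEEGGGG"
CODONS = [a + b + c for a in BASES for b in BASES for c in BASES]
CODON_TABLE = dict(zip(CODONS, AMINO_ACIDS))
COMPLEMENT = str.maketrans("ACGT", "TGCA")

def translate_dna(dna):
    return "".join(CODON_TABLE.get(dna[i:i + 3], "X")
                   for i in range(0, len(dna) - 2, 3))

def stop_to_stop_peptides(dna, min_len=8):
    """All peptides between stop codons, in all six reading frames."""
    rc = dna.translate(COMPLEMENT)[::-1]       # reverse complement
    for strand in (dna, rc):
        for frame in range(3):
            for pep in translate_dna(strand[frame:]).split("*"):
                if len(pep) >= min_len:
                    yield pep

def shares_8mer(peptide, allergens):
    windows = {peptide[i:i + 8] for i in range(len(peptide) - 7)}
    return any(w in a for a in allergens for w in windows)

allergens = ["MKLLAVLFCLAAVSSATQ"]              # dummy allergen sequence
insert_dna = "ATGAAGCTTCTGGCTGTTCTGTTCTAA"      # dummy insert sequence
for pep in stop_to_stop_peptides(insert_dna):
    if shares_8mer(pep, allergens):
        print("potential cross-reactivity flagged:", pep)
```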
JOB REDESIGN FOR OLDER WORKERS, TEN CASE STUDIES.
ERIC Educational Resources Information Center
MITNICK, EDWARD; ROTHBERG, HERMAN
After identifying firms which had used job redesign to salvage the skill of older employees, research investigators made 10 intensive case studies in firms producing aircraft engines, aluminum framing, building materials, carpets, computers, copper pipe fittings, footwear, heavy iron pipe, precision instruments, and printed novelties. Each study…
Automotive Body Repair Competencies.
ERIC Educational Resources Information Center
D'Armond, Jack; And Others
Designed to provide a model curriculum and guidelines, this manual presents tasks that were identified by employers, employees, and teachers as important in a postsecondary auto body repair curriculum. The tasks are divided into ten major component areas of instruction: metalworking and fiberglass, painting, frame and suspension, glass and trim,…
Star Students Make Connections
ERIC Educational Resources Information Center
Marshall, Anne Marie; Superfine, Alison Castro; Canty, Reality S.
2010-01-01
Ms. Beyer's first graders have been working for several weeks on solving problems that encourage the use of such multiple representations as ten frames and number lines. The class is using Math Trailblazers, a National Science Foundation-supported elementary school math curriculum developed to reflect recent reform efforts in mathematics…
NASA Astrophysics Data System (ADS)
Shaw, Glenn E.
1988-02-01
Tropospheric aerosols with the diameter range of half a micron reside in the atmosphere for tens of days and teleconnect Antarctica with other regions by transport that reaches planetary scales of distances; thus, the aerosol on the Antarctic ice represents 'memory modules' of events that took place at regions separated from Antarctica by tens of thousands of kilometers. In terms of aerosol mass, the aerosol species include insoluble crustal products (less than 5 percent), transported sea-salt residues (highly variable but averaging about 10 percent), Ni-rich meteoric material, and anomalously enriched material with an unknown origin. Most (70-90 percent by mass) of the aerosol over the Antarctic ice shield, however, is the 'natural acid sulfate aerosol', apparently deriving from biological processes taking place in the surrounding oceans.
The Prophylactic Extraction of Third Molars: A Public Health Hazard
Friedman, Jay W.
2007-01-01
Ten million third molars (wisdom teeth) are extracted from approximately 5 million people in the United States each year at an annual cost of over $3 billion. In addition, more than 11 million patient days of “standard discomfort or disability”—pain, swelling, bruising, and malaise—result postoperatively, and more than 11000 people suffer permanent paresthesia—numbness of the lip, tongue, and cheek—as a consequence of nerve injury during the surgery. At least two thirds of these extractions, associated costs, and injuries are unnecessary, constituting a silent epidemic of iatrogenic injury that afflicts tens of thousands of people with lifelong discomfort and disability. Avoidance of prophylactic extraction of third molars can prevent this public health hazard. PMID:17666691
NASA Astrophysics Data System (ADS)
Townsley, Leisa
2016-09-01
Massive star-forming regions (MSFRs) are engines of change across the Galaxy, providing its ionization, fueling the hot ISM, and seeding spiral arms with tens of thousands of new stars. Galactic MSFRs are springboards for understanding their extragalactic counterparts, which provide the basis for star formation rate calibrations and form the building blocks of starburst galaxies. This archive program will extend Chandra's lexicon of the Galaxy's MSFRs with in-depth analysis of 16 complexes, studying star formation and evolution on scales of tenths to tens of parsecs, distances <1 to >10 kpc, and ages <1 to >15 Myr. It fuses a "Physics of the Cosmos" mission with "Cosmic Origins" science, bringing new insight into star formation and feedback through Chandra's unique X-ray perspective.
More MAGiX in the Chandra Archive
NASA Astrophysics Data System (ADS)
Townsley, Leisa
2017-09-01
Massive star-forming regions (MSFRs) are engines of change across the Galaxy, providing its ionization, fueling the hot ISM, and seeding spiral arms with tens of thousands of new stars. Resolvable MSFRs are microscopes for understanding their more distant extragalactic counterparts, which provide the basis for star formation rate calibrations and form the building blocks of starburst galaxies. This archive program will extend Chandra's lexicon of MSFRs with in-depth analysis of 16 complexes, studying star formation and evolution on scales of tenths to tens of parsecs, distances <1 to >50 kpc, and ages <1 to 25 Myr. It fuses a "Physics of the Cosmos" mission with "Cosmic Origins" science, bringing new insight into star formation and feedback through Chandra's unique X-ray perspective.
Dynamic characteristics of far-field radiation of current modulated phase-locked diode laser arrays
NASA Technical Reports Server (NTRS)
Elliott, R. A.; Hartnett, K.
1987-01-01
A versatile and powerful streak camera/frame grabber system for studying the evolution of the near and far field radiation patterns of diode lasers was assembled and tested. Software needed to analyze and display the data acquired with the streak camera/frame grabber system was written and the total package used to record and perform preliminary analyses on the behavior of two types of laser, a ten-emitter gain-guided array and a flared waveguide Y-coupled array. Examples of the information which can be gathered with this system are presented.
Teacher Beliefs, Knowledge, and Practice of Self-Regulated Learning
ERIC Educational Resources Information Center
Spruce, Robin; Bol, Linda
2015-01-01
This study examined teacher beliefs, knowledge, and classroom practice of self-regulated learning for ten elementary and middle school teachers. Using Zimmerman's SRL model to frame our method and results, we administered questionnaires, observed classrooms and conducted interviews with these teachers. Teachers had positive beliefs about the role…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Heffner, M.; Riot, V.; Fabris, L.
Medium to large channel count detectors are usually faced with a few unattractive options for data acquisition (DAQ). Small to medium sized TPC experiments, for example, can be too small to justify the high expense and long development time of application specific integrated circuit (ASIC) development. In some cases an experiment can piggy-back on a larger experiment and the associated ASIC development, but this puts the time line of development out of the hands of the smaller experiment. Another option is to run perhaps thousands of cables to rack mounted equipment, which is clearly undesirable. The development of commercial high-speed high-density FPGAs and ADCs combined with the small discrete components and robotic assembly open a new option that scales to tens of thousands of channels and is only slightly larger than ASICs using off-the-shelf components.
Rechargeable nickel-3D zinc batteries: An energy-dense, safer alternative to lithium-ion.
Parker, Joseph F; Chervin, Christopher N; Pala, Irina R; Machler, Meinrad; Burz, Michael F; Long, Jeffrey W; Rolison, Debra R
2017-04-28
The next generation of high-performance batteries should include alternative chemistries that are inherently safer to operate than nonaqueous lithium-based batteries. Aqueous zinc-based batteries can answer that challenge because monolithic zinc sponge anodes can be cycled in nickel-zinc alkaline cells hundreds to thousands of times without undergoing passivation or macroscale dendrite formation. We demonstrate that the three-dimensional (3D) zinc form-factor elevates the performance of nickel-zinc alkaline cells in three fields of use: (i) >90% theoretical depth of discharge (DOD Zn ) in primary (single-use) cells, (ii) >100 high-rate cycles at 40% DOD Zn at lithium-ion-commensurate specific energy, and (iii) the tens of thousands of power-demanding duty cycles required for start-stop microhybrid vehicles. Copyright © 2017, American Association for the Advancement of Science.
Statistical Detection of Atypical Aircraft Flights
NASA Technical Reports Server (NTRS)
Statler, Irving; Chidester, Thomas; Shafto, Michael; Ferryman, Thomas; Amidan, Brett; Whitney, Paul; White, Amanda; Willse, Alan; Cooley, Scott; Jay, Joseph;
2006-01-01
A computational method and software to implement the method have been developed to sift through vast quantities of digital flight data to alert human analysts to aircraft flights that are statistically atypical in ways that signify that safety may be adversely affected. On a typical day, there are tens of thousands of flights in the United States and several times that number throughout the world. Depending on the specific aircraft design, the volume of data collected by sensors and flight recorders can range from a few dozen to several thousand parameters per second during a flight. Whereas these data have long been utilized in investigating crashes, the present method is oriented toward helping to prevent crashes by enabling routine monitoring of flight operations to identify portions of flights that may be of interest with respect to safety issues.
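A generic sketch of statistical atypicality screening in the spirit described above (not the specific algorithm developed for this work): each flight is summarized by a feature vector and flights are ranked by Mahalanobis distance from the fleet-wide distribution, with the extreme tail flagged for analyst review. The feature dimensions and flagging quantile are illustrative.

```python
import numpy as np

# Generic atypicality screen (a stand-in, not the NASA method): summarize each
# flight by a feature vector, rank flights by squared Mahalanobis distance from
# the fleet-wide mean, and flag the extreme tail for analyst review.
rng = np.random.default_rng(1)
n_flights, n_features = 10_000, 12           # e.g. per-flight summary statistics
X = rng.normal(size=(n_flights, n_features))
X[:5] += rng.normal(scale=6.0, size=(5, n_features))   # inject a few atypical flights

mu = X.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(X, rowvar=False))
diff = X - mu
d2 = np.einsum("ij,jk,ik->i", diff, cov_inv, diff)      # squared Mahalanobis distance

cutoff = np.quantile(d2, 0.999)                         # flag the top 0.1%
flagged = np.flatnonzero(d2 > cutoff)
print("flights flagged for review:", flagged)
```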
Nuclear Energy in Southeast Asia: Pull Rods or Scram
2009-06-01
December 29, 2008); Seth Mydans, “Tens of thousands join Myanmar protest,” International Herald Tribune, September 24, 2007, http://www.iht.com...articles/2007/09/24/news/myanmar.php (accessed December 29, 2008); Seth Mydans, “Myanmar monk protest contained by Junta forces,” The New York Times...Nuclear Plant for Electricity.” Associated Press, September 26, 2008. http://www.ap.org (accessed October 20, 2008). Mydans, Seth. “Myanmar monk
Dental Calculus and the Evolution of the Human Oral Microbiome.
Warinner, Christina
2016-07-01
Characterizing the evolution of the oral microbiome is a challenging, but increasingly feasible, task. Recently, dental calculus has been shown to preserve ancient biomolecules from the oral microbiota, host tissues and diet for tens of thousands of years. As such, it provides a unique window into the ancestral oral microbiome. This article reviews recent advancements in ancient dental calculus research and emerging insights into the evolution and ecology of the human oral microbiome.
Sub-Saharan Africa Report, No. 2828
1983-08-03
EEC Food Aid 59 - d - TEN MILLION AFRICANS MAY STARVE THIS WINTER Johannesburg THE STAR in English 7 Jul 83 p 1 INTER-AFRICAN AFFAIRS [Text...BULAWAYO - At least 10 million people in five Southern African countries will need emergency food aid if they are to survive the winter. This...about half the population of about one million are already receiving emergency food rations while thousands of cattle are being slaughtered
Probability of illness definition for the Skylab flight crew health stabilization program
NASA Technical Reports Server (NTRS)
1974-01-01
Management and analysis of crew and environmental microbiological data from SMEAT and Skylab are discussed. Samples were collected from ten different body sites on each SMEAT and Skylab crew-member on approximately 50 occasions and since several different organisms could be isolated from each sample, several thousand lab reports were generated. These lab reports were coded and entered in a computer file and from the file various tabular summaries were constructed.
Big Data Quality Case Study Preliminary Findings, U.S. Army MEDCOM MODS
2013-09-01
captured in electronic form is relatively small, on the order of hundreds of thousands of health profiles at say around 500K per profile, or in the...in electronic form, then different language identification, handwriting recognition, and Natural Language Processing (NLP) techniques could be used...and patterns” [15]. Volume - The free text fields vary in length from say ten characters to several hundred characters. Other materials can be much
Israel: Possible Military Strike Against Iran’s Nuclear Facilities
2012-03-27
centrifuge facility and a larger commercial facility located at this site. The commercial facility is reportedly hardened by steel-reinforced concrete, buried...prime minister has had to contemplate. A strike against Iran’s nuclear facilities could lead to regional conflagration, tens of thousands of...high explosives, and can penetrate more than 6 feet of reinforced concrete. The GBU-28 5000-lb class weapon penetrates at least 20 feet of concrete
Reclamation of Wood Materials Coated with Lead-Based Paint
2008-05-01
Tens of thousands of temporary wooden buildings from the World War II (WWII) era, consisting of more than 50 million sf of floor area, await...Camp Roberts. The amount of heartwood and sapwood is an important quality characteristic in the redwood species, with a higher content of...landfilling that material as C&D debris. There are two general areas of opportunity to improve efficiency and reduce costs in a deconstruction and
2016-04-01
AFRL-AFOSR-VA-TR-2016-0145 Quasi-continuum reduction of field theories: A route to seamlessly bridge quantum and atomistic length-scales with continuum. Principal Investigator: Vikram Gavini, Department of...calculations on tens of thousands of atoms, and enable continuing efforts towards a seamless bridging of the quantum and continuum length-scales
Effect of semen preparation on casa motility results in cryopreserved bull spermatozoa.
Contri, Alberto; Valorz, Claudio; Faustini, Massimo; Wegher, Laura; Carluccio, Augusto
2010-08-01
Computer-assisted sperm analyzers (CASA) have become the standard tool for evaluating sperm motility and kinetic patterns because they provide objective data for thousands of sperm tracks. However, these devices are not ready-to-use and standardization of analytical practices is a fundamental requirement. In this study, we evaluated the effects of some settings, such as frame rate and frames per field, chamber and time of analysis, and samples preparations, including thawing temperature, sperm sample concentration, and media used for dilution, on the kinetic results of bovine frozen-thawed semen using a CASA. In Experiment 1, the frame rate (30-60 frame/s) significantly affected motility parameters, whereas the number of frames per field (30 or 45) did not seem to affect sperm kinetics. In Experiment 2, the thawing protocol affects sperm motility and kinetic parameters. Sperm sample concentration significantly limited the opportunity to perform the analysis and the kinetic results. A concentration of 100 and 50 x 10(6) sperm/mL limited the device's ability to perform the analysis or gave wrong results, whereas 5, 10, 20, and 30 x 10(6) sperm/mL concentrations allowed the analysis to be performed, but with different results (Experiment 3). The medium used for the dilution of the sample, which is fundamental for a correct sperm head detection, affects sperm motility results (Experiment 4). In this study, Makler and Leja chambers were used to perform the semen analysis with CASA devices. The chamber used significantly affected motility results (Experiment 5). The time between chamber loading and analysis affected sperm velocities, regardless of chamber used. Based on results recorded in this study, we propose that the CASA evaluation of motility of bovine frozen-thawed semen using Hamilton-Thorne IVOS 12.3 should be performed using a frame rate of 60 frame/s and 30 frames per field. Semen should be diluted at least at 20 x 10(6) sperm/mL using PBS. Furthermore, it is necessary to consider the type of chamber used and perform the analysis within 1 or 2 min, regardless of the chamber used. Copyright 2010 Elsevier Inc. All rights reserved.
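For context on the kinetic parameters being standardized above, a sketch of how common CASA kinematics (VCL, VSL, VAP, LIN, STR) can be computed from one tracked head-centroid path at a given frame rate; the 5-point smoothing used here for the average path is an assumption, as commercial analyzers apply their own proprietary smoothing.

```python
import numpy as np

# Sketch of standard CASA kinematic parameters from one tracked sperm
# head-centroid path (x, y in micrometres) at a given frame rate.
def casa_kinematics(xy_um, frame_rate_hz, smooth=5):
    xy = np.asarray(xy_um, dtype=float)
    duration_s = (len(xy) - 1) / frame_rate_hz
    # VCL: curvilinear velocity = total point-to-point path length / time.
    vcl = np.sum(np.linalg.norm(np.diff(xy, axis=0), axis=1)) / duration_s
    # VSL: straight-line velocity = net first-to-last displacement / time.
    vsl = np.linalg.norm(xy[-1] - xy[0]) / duration_s
    # VAP: velocity along a smoothed "average path" (5-point running mean assumed).
    kernel = np.ones(smooth) / smooth
    avg_path = np.column_stack([np.convolve(xy[:, 0], kernel, mode="valid"),
                                np.convolve(xy[:, 1], kernel, mode="valid")])
    vap = (np.sum(np.linalg.norm(np.diff(avg_path, axis=0), axis=1))
           / ((len(avg_path) - 1) / frame_rate_hz))
    return {"VCL": vcl, "VSL": vsl, "VAP": vap, "LIN": vsl / vcl, "STR": vsl / vap}

# Example: a zig-zag track sampled at 60 frames/s (the frame rate recommended
# in the study) gives a VCL well above the VSL.
t = np.arange(60)
track = np.column_stack([0.8 * t, 3.0 * np.sin(t / 2.0)])
print(casa_kinematics(track, frame_rate_hz=60))
```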
Track following of Ξ-hyperons in nuclear emulsion for the E07 experiment
NASA Astrophysics Data System (ADS)
Mishina, Akihiro; Nakazawa, Kazuma; Hoshino, Kaoru; Itonaga, Kazunori; Yoshida, Junya; Than Tint, Khin; Kyaw Soe, Myint; Kinbara, Shinji; Itoh, Hiroki; Endo, Yoko; Kobayashi, Hidetaka; Umehara, Kaori; Yokoyama, Hiroyuki; Nakashima, Daisuke; J-PARC E07 Collaboration
2014-09-01
Events of double-Λ and twin single-Λ hypernuclei are very important for understanding the Λ-Λ and Ξ--N interactions. We planned the E07 experiment to determine their nuclear mass dependence with ten times higher statistics than before. In the experiment, the number of Ξ- hyperons stopping at rest is about ten thousand, roughly ten times larger than in previous work. This many candidate Ξ- hyperon tracks must be followed in the nuclear emulsion plates up to their stopping points. To complete this within one year, it was necessary to develop an automated track-following system. The key requirement for track following is track connection from plate to plate. To achieve this, we developed new image-processing methods; in particular, we applied pattern matching of K- beam tracks for the plate-to-plate connection. The position accuracy of this method was 1.4 +/- 0.8 μm. If this can be applied in about one minute per track in each plate, all track following can be finished in one year.
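A rough sketch of the plate-to-plate connection step described above: estimate the global offset between adjacent emulsion plates by cross-correlating 2-D histograms of the beam-track point patterns, then connect each followed track to its nearest candidate on the next plate within a tolerance. The bin size, tolerance and synthetic geometry are assumptions, not the experiment's calibrated values.

```python
import numpy as np

# Sketch of plate-to-plate track connection: (1) estimate the alignment offset
# between adjacent plates from the beam-track point patterns via 2-D histogram
# cross-correlation, (2) connect a track to its nearest candidate on the next
# plate within a tolerance. The estimate is quantized to the histogram bin size.
def estimate_offset(beam_xy_a, beam_xy_b, extent_um=2_000.0, bin_um=2.0):
    bins = int(extent_um / bin_um)
    hist_range = [[0.0, extent_um], [0.0, extent_um]]
    ha, _, _ = np.histogram2d(beam_xy_a[:, 0], beam_xy_a[:, 1], bins=bins, range=hist_range)
    hb, _, _ = np.histogram2d(beam_xy_b[:, 0], beam_xy_b[:, 1], bins=bins, range=hist_range)
    xc = np.real(np.fft.ifft2(np.fft.fft2(hb) * np.conj(np.fft.fft2(ha))))
    sx, sy = np.unravel_index(np.argmax(xc), xc.shape)   # shift of b relative to a, in bins
    shift = np.array([sx, sy], dtype=float)
    shift[shift > bins // 2] -= bins                     # wrap negative shifts
    return shift * bin_um                                # (dx, dy) in micrometres

def connect(track_xy_a, candidates_xy_b, offset, tol_um=5.0):
    d = np.linalg.norm(candidates_xy_b - (track_xy_a + offset), axis=1)
    j = int(np.argmin(d))
    return j if d[j] < tol_um else None

rng = np.random.default_rng(2)
beams_a = rng.uniform(0, 2_000, size=(300, 2))
true_offset = np.array([42.0, -17.0])
beams_b = beams_a + true_offset + rng.normal(scale=1.0, size=beams_a.shape)
offset = estimate_offset(beams_a, beams_b)
print("estimated offset (um):", offset, "  true:", true_offset)
print("track 0 connects to candidate", connect(beams_a[0], beams_b, offset))
```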
Photon Counting Imaging with an Electron-Bombarded Pixel Image Sensor
Hirvonen, Liisa M.; Suhling, Klaus
2016-01-01
Electron-bombarded pixel image sensors, where a single photoelectron is accelerated directly into a CCD or CMOS sensor, allow wide-field imaging at extremely low light levels as they are sensitive enough to detect single photons. This technology allows the detection of up to hundreds or thousands of photon events per frame, depending on the sensor size, and photon event centroiding can be employed to recover resolution lost in the detection process. Unlike photon events from electron-multiplying sensors, the photon events from electron-bombarded sensors have a narrow, acceleration-voltage-dependent pulse height distribution. Thus a gain voltage sweep during exposure in an electron-bombarded sensor could allow photon arrival time determination from the pulse height with sub-frame exposure time resolution. We give a brief overview of our work with electron-bombarded pixel image sensor technology and recent developments in this field for single photon counting imaging, and examples of some applications. PMID:27136556
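A minimal sketch of the photon-event centroiding mentioned above: threshold a frame, label connected photon events, and take the intensity-weighted centroid of each to recover sub-pixel positions. The threshold and synthetic event shapes are illustrative.

```python
import numpy as np
from scipy import ndimage

# Photon-event centroiding on one frame: threshold, label connected events,
# and take the intensity-weighted centroid of each event. Threshold is a placeholder.
def centroid_events(frame, threshold):
    mask = frame > threshold
    labels, n_events = ndimage.label(mask)
    centroids = ndimage.center_of_mass(frame, labels, index=range(1, n_events + 1))
    return np.array(centroids)          # (row, col) per detected photon event

# Synthetic frame: a few Gaussian-like photon splashes on a noisy background.
rng = np.random.default_rng(0)
frame = rng.normal(loc=5.0, scale=1.0, size=(128, 128))
yy, xx = np.mgrid[0:128, 0:128]
for y, x in [(20.3, 40.7), (64.5, 64.5), (100.1, 10.9)]:
    frame += 200.0 * np.exp(-((yy - y) ** 2 + (xx - x) ** 2) / (2 * 1.2 ** 2))
print(centroid_events(frame, threshold=30.0))
```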
3-D Velocimetry of Strombolian Explosions
NASA Astrophysics Data System (ADS)
Taddeucci, J.; Gaudin, D.; Orr, T. R.; Scarlato, P.; Houghton, B. F.; Del Bello, E.
2014-12-01
Using two synchronized high-speed cameras we were able to reconstruct the three-dimensional displacement and velocity field of bomb-sized pyroclasts in Strombolian explosions at Stromboli Volcano. Relatively low-intensity Strombolian-style activity offers a rare opportunity to observe volcanic processes that remain hidden from view during more violent explosive activity. Such processes include the ejection and emplacement of bomb-sized clasts along pure or drag-modified ballistic trajectories, in-flight bomb collision, and gas liberation dynamics. High-speed imaging of Strombolian activity has already opened new windows for the study of the abovementioned processes, but to date has only utilized two-dimensional analysis with limited motion detection and ability to record motion towards or away from the observer. To overcome this limitation, we deployed two synchronized high-speed video cameras at Stromboli. The two cameras, located sixty meters apart, filmed Strombolian explosions at 500 and 1000 frames per second and with different resolutions. Frames from the two cameras were pre-processed and combined into a single video showing frames alternating from one to the other camera. Bomb-sized pyroclasts were then manually identified and tracked in the combined video, together with fixed reference points located as close as possible to the vent. The results from manual tracking were fed to a custom software routine that, knowing the relative position of the vent and cameras, and the field of view of the latter, provided the position of each bomb relative to the reference points. By tracking tens of bombs over five to ten frames at different intervals during one explosion, we were able to reconstruct the three-dimensional evolution of the displacement and velocity fields of bomb-sized pyroclasts during individual Strombolian explosions. Shifting jet directivity and dispersal angle clearly appear from the three-dimensional analysis.
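A sketch of the core triangulation step, assuming each synchronized camera provides its position and a unit viewing direction toward the tracked bomb in a given frame: the 3-D position is recovered as the least-squares closest point to the two rays. The camera geometry below is synthetic, not the Stromboli deployment.

```python
import numpy as np

# Least-squares intersection of viewing rays: each camera contributes a ray
# (position + unit direction); the 3-D point minimizes the summed squared
# distance to all rays.
def triangulate(origins, directions):
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for o, d in zip(origins, directions):
        P = np.eye(3) - np.outer(d, d)     # projector onto the plane normal to d
        A += P
        b += P @ o
    return np.linalg.solve(A, b)

true_point = np.array([10.0, 25.0, 80.0])             # bomb position (m), synthetic
cams = np.array([[0.0, 0.0, 0.0], [60.0, 0.0, 0.0]])  # two cameras ~60 m apart
dirs = np.array([(true_point - c) / np.linalg.norm(true_point - c) for c in cams])
dirs += np.random.default_rng(3).normal(scale=1e-3, size=dirs.shape)   # pointing noise
dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
print("triangulated position (m):", triangulate(cams, dirs))
```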
Effects of visual information regarding allocentric processing in haptic parallelity matching.
Van Mier, Hanneke I
2013-10-01
Research has revealed that haptic perception of parallelity deviates from physical reality. Large and systematic deviations have been found in haptic parallelity matching most likely due to the influence of the hand-centered egocentric reference frame. Providing information that increases the influence of allocentric processing has been shown to improve performance on haptic matching. In this study allocentric processing was stimulated by providing informative vision in haptic matching tasks that were performed using hand- and arm-centered reference frames. Twenty blindfolded participants (ten men, ten women) explored the orientation of a reference bar with the non-dominant hand and subsequently matched (task HP) or mirrored (task HM) its orientation on a test bar with the dominant hand. Visual information was provided by means of informative vision with participants having full view of the test bar, while the reference bar was blocked from their view (task VHP). To decrease the egocentric bias of the hands, participants also performed a visual haptic parallelity drawing task (task VHPD) using an arm-centered reference frame, by drawing the orientation of the reference bar. In all tasks, the distance between and orientation of the bars were manipulated. A significant effect of task was found; performance improved from task HP, to VHP to VHPD, and HM. Significant effects of distance were found in the first three tasks, whereas orientation and gender effects were only significant in tasks HP and VHP. The results showed that stimulating allocentric processing by means of informative vision and reducing the egocentric bias by using an arm-centered reference frame led to most accurate performance on parallelity matching. © 2013 Elsevier B.V. All rights reserved.
Radiometrically accurate scene-based nonuniformity correction for array sensors.
Ratliff, Bradley M; Hayat, Majeed M; Tyo, J Scott
2003-10-01
A novel radiometrically accurate scene-based nonuniformity correction (NUC) algorithm is described. The technique combines absolute calibration with a recently reported algebraic scene-based NUC algorithm. The technique is based on the following principle: First, detectors that are along the perimeter of the focal-plane array are absolutely calibrated; then the calibration is transported to the remaining uncalibrated interior detectors through the application of the algebraic scene-based algorithm, which utilizes pairs of image frames exhibiting arbitrary global motion. The key advantage of this technique is that it can obtain radiometric accuracy during NUC without disrupting camera operation. Accurate estimates of the bias nonuniformity can be achieved with relatively few frames, which can be fewer than ten frame pairs. Advantages of this technique are discussed, and a thorough performance analysis is presented with use of simulated and real infrared imagery.
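A simplified illustration of the bias-transport principle, assuming a bias-only nonuniformity model and a frame pair related by an exact one-pixel horizontal shift (the published algorithm handles arbitrary global motion and does not need the motion known in advance): the difference between neighbouring detectors across the frame pair observes their bias difference, so calibration from the perimeter column can be propagated across each row.

```python
import numpy as np

# Bias-only model y = x + b. If frame 2 views the scene shifted one pixel to
# the right, detector (i, j+1) in frame 2 sees what detector (i, j) saw in
# frame 1, so y2[i, j+1] - y1[i, j] = b[i, j+1] - b[i, j]; the absolutely
# calibrated perimeter bias can therefore be transported across each row.
rng = np.random.default_rng(4)
H, W = 64, 64
bias = rng.normal(scale=10.0, size=(H, W))               # fixed-pattern bias
scene = np.cumsum(rng.normal(size=(H, W + 1)), axis=1)   # arbitrary scene, W+1 columns

y1 = scene[:, 1:] + bias               # frame 1
y2 = scene[:, :-1] + bias              # frame 2 = scene shifted one pixel right

b_est = np.zeros((H, W))
b_est[:, 0] = bias[:, 0]               # perimeter (first column) absolutely calibrated
for j in range(W - 1):                 # transport the calibration across each row
    b_est[:, j + 1] = b_est[:, j] + (y2[:, j + 1] - y1[:, j])

print("max bias estimation error:", np.max(np.abs(b_est - bias)))   # ~0 in this noise-free toy
```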
Five Different Types of Framing Effects in Medical Situation: A Preliminary Exploration
Peng, Jiaxi; Li, Hongzheng; Miao, Danmin; Feng, Xi; Xiao, Wei
2013-01-01
Background Considerable research has addressed the framing effect in medical situations, but few studies have explored the differences among the various kinds of framing effects. Objectives In the present study, five different types of framing effects were examined and their effect sizes were compared. Materials and Methods Medical decision-making problems concerning evaluation of a medicine's effect, patient compliance, and the selection of treatment and doctor options were constructed. All the problems were described in both positive and negative frames. Five hundred undergraduates were randomly divided into ten groups, and participants in each group were asked to complete one decision-making task. Results All the frames that were examined led to significant framing effects: when the Asian Disease Problem was described in a positive frame, participants preferred the conservative option to the risky one, while in a negative frame the preference reversed (P < 0.01). If the drug effect was described as “of 100 patients taking this kind of medicine, 70 patients became better”, people tended to make more positive evaluations than when it was described as “of 100 patients taking this kind of medicine, 30 patients didn’t become better” (P < 0.01). Doctors’ advice was described in either a baneful or a beneficial frame, and the former resulted in better compliance (P < 0.05). If treatment options were described with a survival rate, people tended to choose the risky option, while if described with a mortality rate, people tended to choose the conservative option (P < 0.05). The number-size framing effect was also significant (P < 0.01). The five types of framing effects ranged from small to large in effect size. Conclusions Medical decision making can be affected by frame descriptions. Attention should be paid to the standardization of descriptions in medical practice. PMID:23682330
ERIC Educational Resources Information Center
Howard, Elaine; Msengi, Clementine; Harris, Sandra
2017-01-01
Ten women superintendents in Texas were interviewed for this phenomenological narrative study to understand their mentoring experiences framed within transformational leadership theory. The research used a guided protocol to conduct face-to-face interviews. In this study, authors sought to answer questions about the influence of mentorship…
Talk as a Metacognitive Strategy during the Information Search Process of Adolescents
ERIC Educational Resources Information Center
Bowler, Leanne
2010-01-01
Introduction: This paper describes a metacognitive strategy related to the social dimension of the information search process of adolescents. Method: A case study that used naturalistic methods to explore the metacognitive thinking and associated emotions of ten adolescents. The study was framed by Kuhlthau's Information Search Process model and…
ERIC Educational Resources Information Center
Xu, Yonghong
2015-01-01
This study investigates the underrepresentation of women in science, technology, engineering, and mathematics (STEM) occupations from the aspect of earning differentials. Using a national data source that tracked college graduates' work experiences over a ten-year time frame post-bachelor's degree, this study examines longitudinally the…
Increasing Mathematical Computation Skills for Students with Physical and Health Disabilities
ERIC Educational Resources Information Center
Webb, Paula
2017-01-01
Students with physical and health disabilities struggle with basic mathematical concepts. The purpose of this research study was to increase the students' mathematical computation skills through implementing new strategies and/or methods. The strategies implemented with the students were utilizing ten-frame tiles and technology with the purpose…
High Speed White Dwarf Asteroseismology with the Herty Hall Cluster
NASA Astrophysics Data System (ADS)
Gray, Aaron; Kim, A.
2012-01-01
Asteroseismology is the process of using observed oscillations of stars to infer their interior structure. In high-speed asteroseismology, we accomplish this by quickly computing hundreds of thousands of models to match the observed period spectra. Each model on a single processor takes five to ten seconds to run. Therefore, we use a cluster of sixteen Dell workstations with dual-core processors. The computers use the Ubuntu operating system and Apache Hadoop software to manage workloads.
ERIC Educational Resources Information Center
Congress of the U.S., Washington, DC. House Committee on the Budget.
According to Congressman Charles E. Schumer in his opening statement, the decrease in Federal housing funds is inextricably linked to the increase in homelessness. Since 1981 the Reagan Administration has been systematically dismantling the nation's housing programs, leaving tens of thousands of low-income people homeless. In 1982 there were 1,088…
Vortex Advisory System. Volume I. Effectiveness for Selected Airports.
1980-05-01
analysis of tens of thousands of vortex tracks. Wind velocity was found to be the primary determinant of vortex behavior. The VAS uses wind-velocity...and the correlation of vortex behavior with the ambient winds. Analysis showed that a wind-rose criterion could be used to determine when interarrival...Washington DC. 2. Hallock, J.N., "Vortex Advisory System Safety Analysis, Vol. I: Analytical Model," FAA-RD-78-68,1, Sep. 1978, DOT/Transportation
A Public Trust: An Executive Summary of GREAT I,
1980-09-01
reconstruction and/or expansion of locks and dam 26 ... cargo in 1975. Commodities such as grains, fertilizer ... tens of thousands of species of plants ... open backwater areas to marshland ... as well as the many species of plants ... our residential, commercial, and industrial ... comparison of an 1895 sounding of ... work done to date indicates that recreational use of the river ... appropriate States and based on the GREAT I site-specific recommendations
Assessing the Impact of Social Media on the 25 January 2011 Egyptian Revolution
2012-03-01
Ahmed Maher to support workers of Mahalla al Kubra. The group used the Internet (social media and blogs), mobile telephones, and word of mouth to...especially the poor, were united by their collective struggle. As a result, the word of mouth went viral and brought tens of thousands of Egyptians...In the light of the dramatic events of the 25 January 2011
Genome-scale engineering of Saccharomyces cerevisiae with single-nucleotide precision.
Bao, Zehua; HamediRad, Mohammad; Xue, Pu; Xiao, Han; Tasan, Ipek; Chao, Ran; Liang, Jing; Zhao, Huimin
2018-07-01
We developed a CRISPR-Cas9- and homology-directed-repair-assisted genome-scale engineering method named CHAnGE that can rapidly output tens of thousands of specific genetic variants in yeast. More than 98% of target sequences were efficiently edited with an average frequency of 82%. We validate the single-nucleotide resolution genome-editing capability of this technology by creating a genome-wide gene disruption collection and apply our method to improve tolerance to growth inhibitors.
Development of mini VSAT system
NASA Astrophysics Data System (ADS)
Lu, Shyue-Ching; Chiu, Wu-Jhy; Lin, Hen-Dao; Shih, Mu-Piao
1992-03-01
This paper presents the mini VSAT (very small aperture terminal) system, a low-cost networking system providing economical alternatives for the business world's datacom needs. The system is designed to achieve the highest possible performance/price ratio for private VSAT networks that range from a few tens of remote terminals to large networks of several thousand remote terminals. The paper describes the system architecture, major features, hardware and software structure, access protocol, and performance of the developed system.
Dinov, Ivo D.; Petrosyan, Petros; Liu, Zhizhong; Eggert, Paul; Zamanyan, Alen; Torri, Federica; Macciardi, Fabio; Hobel, Sam; Moon, Seok Woo; Sung, Young Hee; Jiang, Zhiguo; Labus, Jennifer; Kurth, Florian; Ashe-McNalley, Cody; Mayer, Emeran; Vespa, Paul M.; Van Horn, John D.; Toga, Arthur W.
2013-01-01
The volume, diversity and velocity of biomedical data are exponentially increasing providing petabytes of new neuroimaging and genetics data every year. At the same time, tens-of-thousands of computational algorithms are developed and reported in the literature along with thousands of software tools and services. Users demand intuitive, quick and platform-agnostic access to data, software tools, and infrastructure from millions of hardware devices. This explosion of information, scientific techniques, computational models, and technological advances leads to enormous challenges in data analysis, evidence-based biomedical inference and reproducibility of findings. The Pipeline workflow environment provides a crowd-based distributed solution for consistent management of these heterogeneous resources. The Pipeline allows multiple (local) clients and (remote) servers to connect, exchange protocols, control the execution, monitor the states of different tools or hardware, and share complete protocols as portable XML workflows. In this paper, we demonstrate several advanced computational neuroimaging and genetics case-studies, and end-to-end pipeline solutions. These are implemented as graphical workflow protocols in the context of analyzing imaging (sMRI, fMRI, DTI), phenotypic (demographic, clinical), and genetic (SNP) data. PMID:23975276
Friend, Milton; Franson, J. Christian
1987-01-01
Individual disease outbreaks have killed many thousands of animals on numerous occasions. Tens of thousands of migratory birds have died in single die-offs with as many as 1,000 birds succumbing in 1 day. In mammals, individual disease outbreaks have killed hundreds to thousands of animals with, for example, hemorrhagic disease in white-tailed deer, distemper in raccoon, Errington's disease in muskrat, and sylvatic plague in wild rodents. The ability to successfully combat such explosive situations is highly dependent on the readiness of field personnel to deal with them. Because many disease agents can spread through wildlife populations very fast, advance preparation is essential in preventing infected animals from spreading disease to additional species and locations. Carefully thought-out disease contingency plans should be developed as practical working documents for field personnel and updated as necessary. Such well-designed plans can prove invaluable in minimizing wildlife losses and costs associated with disease control activities. Although requirements for disease control operations vary and must be tailored to each situation, all disease contingency planning involves general concepts and basic biological information. This chapter, intended as a practical guide, identifies the major activities and needs of disease control operations, and relates them to disease contingency planning.
Miller, C. Dan; Sushyar, R.; ,; Hamidi, S.
1983-01-01
The Dieng Mountains region consists of a complex of late Quaternary to recent volcanic stratocones, parasitic vents, and explosion craters. Six age groups of volcanic centers, eruptive products, and explosion craters are recognized in the region based on their morphology, degree of dissection, stratigraphic relationships, and degree of weathering. These features range in age from tens of thousands of years to events that have occurred this century. No magmatic eruptions have occurred in the Dieng Mountains region for at least several thousand years; volcanic activity during this time interval has consisted of phreatic eruptions and non-explosive hydrothermal activity. If future volcanic events are similar to those of the last few thousand years, they will consist of phreatic eruptions, associated small hot mudflows, emission of suffocating gases, and hydrothermal activity. Future phreatic eruptions may follow, or accompany, periods of increased earthquake activity; the epicenters for the seismicity may suggest where eruptive activity will occur. Under such circumstances, the populace within several kilometers of a potential eruption site should be warned of a possible eruption, given instructions about what to do in the event of an eruption, or temporarily evacuated to a safer location.
NASA Astrophysics Data System (ADS)
Atubga, David; Wu, Huijuan; Lu, Lidong; Sun, Xiaoyan
2017-02-01
Typical fully distributed optical fiber sensors (DOFS) spanning dozens of kilometers are equivalent to tens of thousands of point sensors along the whole monitoring line, which means tens of thousands of data points are generated for each pulse-launching period. Therefore, in all-day nonstop monitoring, large volumes of data are created, driving demand for large storage space and high-speed data transmission. In addition, when the monitoring length and the number of channels increase, the data volume also grows extensively. Mitigating the accumulation of large data volumes, the required storage capacity, and the burden of high-speed data transmission is therefore the aim of this paper. To demonstrate our idea, we carried out a comparative study of two lossless methods, Huffman and Lempel-Ziv-Welch (LZW), and a lossy data compression algorithm, the fast wavelet transform (FWT), on three distinctive DOFS data types: Φ-OTDR, P-OTDR, and B-OTDA. Our results demonstrated that FWT yielded the best compression ratio with good computation time, irrespective of the errors introduced in signal reconstruction for the three DOFS data types. These outcomes indicate the promising potential of FWT, making it suitable, reliable, and convenient for real-time compression of DOFS data. Finally, it was observed that differences in the DOFS data structure have some influence on both the compression ratio and the computational cost.
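A toy illustration of the comparison described above, on a synthetic trace: a lossless DEFLATE baseline (zlib couples LZ77 with Huffman coding, standing in for the separate Huffman and LZW tests) versus a simple lossy Haar fast wavelet transform with coefficient thresholding. The trace, quantization and threshold are illustrative choices, not the paper's settings.

```python
import numpy as np
import zlib

# Lossless vs lossy compression of a synthetic sensing trace.
rng = np.random.default_rng(5)
n = 2 ** 16
x = np.arange(n)
trace = (20.0 * np.sin(2 * np.pi * x / 5000.0)
         + 50.0 * np.exp(-((x - 30_000) / 500.0) ** 2)
         + rng.normal(scale=0.2, size=n))
raw = np.round(trace * 100).astype(np.int16)       # 16-bit quantized raw samples

compressed = zlib.compress(raw.tobytes(), 9)       # lossless baseline (LZ77 + Huffman)
print(f"DEFLATE ratio: {raw.nbytes / len(compressed):.2f}x (exact reconstruction)")

def haar_fwt(signal, levels):
    approx, details = signal.astype(float), []
    for _ in range(levels):
        even, odd = approx[0::2], approx[1::2]
        details.append((even - odd) / np.sqrt(2.0))
        approx = (even + odd) / np.sqrt(2.0)
    return approx, details

def haar_ifwt(approx, details):
    for detail in reversed(details):
        up = np.empty(2 * approx.size)
        up[0::2] = (approx + detail) / np.sqrt(2.0)
        up[1::2] = (approx - detail) / np.sqrt(2.0)
        approx = up
    return approx

approx, details = haar_fwt(raw, levels=6)
kept = approx.size
for d in details:
    d[np.abs(d) < 60.0] = 0.0                      # drop small detail coefficients (lossy)
    kept += np.count_nonzero(d)
recon = haar_ifwt(approx, details)
print(f"Haar FWT: kept {kept} of {raw.size} coefficients ({raw.size / kept:.1f}x), "
      f"max abs error = {np.max(np.abs(recon - raw)):.0f} counts")
```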
NASA Astrophysics Data System (ADS)
Barboni, Mélanie; Boehnke, Patrick; Schmitt, Axel K.; Harrison, T. Mark; Shane, Phil; Bouvier, Anne-Sophie; Baumgartner, Lukas
2016-12-01
Felsic magmatic systems represent the vast majority of volcanic activity that poses a threat to human life. The tempo and magnitude of these eruptions depends on the physical conditions under which magmas are retained within the crust. Recently the case has been made that volcanic reservoirs are rarely molten and only capable of eruption for durations as brief as 1,000 years following magma recharge. If the “cold storage” model is generally applicable, then geophysical detection of melt beneath volcanoes is likely a sign of imminent eruption. However, some arc volcanic centers have been active for tens of thousands of years and show evidence for the continual presence of melt. To address this seeming paradox, zircon geochronology and geochemistry from both the frozen lava and the cogenetic enclaves they host from the Soufrière Volcanic Center (SVC), a long-lived volcanic complex in the Lesser Antilles arc, were integrated to track the preeruptive thermal and chemical history of the magma reservoir. Our results show that the SVC reservoir was likely eruptible for periods of several tens of thousands of years or more with punctuated eruptions during these periods. These conclusions are consistent with results from other arc volcanic reservoirs and suggest that arc magmas are generally stored warm. Thus, the presence of intracrustal melt alone is insufficient as an indicator of imminent eruption, but instead represents the normal state of magma storage underneath dormant volcanoes.
Barboni, Mélanie; Schmitt, Axel K.; Harrison, T. Mark; Shane, Phil; Bouvier, Anne-Sophie; Baumgartner, Lukas
2016-01-01
Felsic magmatic systems represent the vast majority of volcanic activity that poses a threat to human life. The tempo and magnitude of these eruptions depends on the physical conditions under which magmas are retained within the crust. Recently the case has been made that volcanic reservoirs are rarely molten and only capable of eruption for durations as brief as 1,000 years following magma recharge. If the “cold storage” model is generally applicable, then geophysical detection of melt beneath volcanoes is likely a sign of imminent eruption. However, some arc volcanic centers have been active for tens of thousands of years and show evidence for the continual presence of melt. To address this seeming paradox, zircon geochronology and geochemistry from both the frozen lava and the cogenetic enclaves they host from the Soufrière Volcanic Center (SVC), a long-lived volcanic complex in the Lesser Antilles arc, were integrated to track the preeruptive thermal and chemical history of the magma reservoir. Our results show that the SVC reservoir was likely eruptible for periods of several tens of thousands of years or more with punctuated eruptions during these periods. These conclusions are consistent with results from other arc volcanic reservoirs and suggest that arc magmas are generally stored warm. Thus, the presence of intracrustal melt alone is insufficient as an indicator of imminent eruption, but instead represents the normal state of magma storage underneath dormant volcanoes. PMID:27799558
Rietsch, Katrin; Godina, Elena; Scheffler, Christiane
2013-01-01
Obesity and reduced physical activity are global developments. Physical activity affects external skeletal robustness, which has decreased in German children. It was assumed that this negative trend of decreased external skeletal robustness can be found in other countries. Therefore, anthropometric data of Russian and German children from the years 2000 and 2010 were compared. Russian (2000/2010 n = 1023/268) and German (2000/2010 n = 2103/1750) children aged 6-10 years were investigated. Height, BMI and external skeletal robustness (Frame-Index) were examined and compared between the years and the countries. Statistical analysis was performed with the Mann-Whitney test. Comparison of 2010 and 2000: In Russian children BMI was significantly higher; boys were significantly taller and exhibited a decreased Frame-Index (p = .002) in 2010. German boys showed a significantly higher BMI in 2010. In both sexes, Frame-Index was reduced in 2010 (p = .001). Comparison of Russian and German children in 2000: BMI, height and Frame-Index differed between Russian and German children. German children were significantly taller but exhibited a lower Frame-Index (p<.001). German girls also showed a significantly higher BMI. Comparison of Russian and German children in 2010: BMI and Frame-Index differed. Russian children displayed a higher Frame-Index (p<.001) compared with Germans. In Russian children BMI has increased in recent years. Frame-Index is still higher in Russian children compared with Germans; however, in Russian boys Frame-Index has decreased. This trend and physical activity should be monitored in the future.
Intellectual property analysis of holographic materials business
NASA Astrophysics Data System (ADS)
Reingand, Nadya; Hunt, David
2006-02-01
The paper presents an overview of intellectual property in the field of holographic photosensitive materials and highlights the possibilities offered by patent searching and analysis. Thousands of patent documents relevant to holographic materials were uncovered by the study. The search was performed in the following databases: U.S. Patent Office, European Patent Office, and Japanese Patent Office, for the time frame of 1971 through November 2005. The patent analysis unveiled trends in the temporal distribution of patents, leading IP portfolios, competition among companies within the holographic materials market, and other interesting insights.
Intellectual property (IP) analysis of embossed hologram business
NASA Astrophysics Data System (ADS)
Hunt, David; Reingand, Nadya; Cantrell, Robert
2006-02-01
This paper presents an overview of patents and patent applications on security embossed holograms, and highlights the possibilities offered by patent searching and analysis. Thousands of patent documents relevant to embossed holograms were uncovered by the study. The search was performed in the following databases: U.S. Patent Office, European Patent Office, Japanese Patent Office and Korean Patent Office, for the time frame from 1971 through November 2005. The patent analysis unveils trends in patent temporal distribution, patent family formation, significant technological coverage within the embossed holography market, and other interesting insights.
Intellectual property in holographic interferometry
NASA Astrophysics Data System (ADS)
Reingand, Nadya; Hunt, David
2006-08-01
This paper presents an overview of patents and patent applications on holographic interferometry, and highlights the possibilities offered by patent searching and analysis. Thousands of patent documents relevant to holographic interferometry were uncovered by the study. The search was performed in the following databases: U.S. Patent Office, European Patent Office, Japanese Patent Office and Korean Patent Office, for the time frame from 1971 through May 2006. The patent analysis unveils trends in patent temporal distribution, patent family formation, significant technological coverage within the market of systems that employ holographic interferometry, and other interesting insights.
WCE video segmentation using textons
NASA Astrophysics Data System (ADS)
Gallo, Giovanni; Granata, Eliana
2010-03-01
Wireless Capsule Endoscopy (WCE) integrates wireless transmission with image and video technology. It has been used to examine the small intestine non-invasively. Medical specialists look for significant events in the WCE video by direct visual inspection, manually labelling clinically relevant frames in tiring sessions of up to one hour; this limits the use of WCE. Automatically discriminating digestive organs such as the esophagus, stomach, small intestine and colon is therefore of great advantage. In this paper we propose to use textons for the automatic discrimination of abrupt changes within a video. In particular, we consider as features, for each frame, the hue, saturation, value, high-frequency energy content and the responses to a bank of Gabor filters. The experiments have been conducted on ten video segments extracted from WCE videos, in which the significant events had previously been labelled by experts. Results have shown that the proposed method may eliminate up to 70% of the frames from further investigation. The doctors' direct analysis may hence be concentrated only on eventful frames. A graphical tool showing sudden changes in the texton frequencies for each frame is also proposed as a visual aid to find clinically relevant segments of the video.
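A minimal sketch of the per-frame feature vector described above (hue, saturation, value, high-frequency energy, and Gabor-bank responses) could look like the following; the FFT-based energy measure and the particular Gabor frequency and orientations are assumptions for illustration, not the authors' exact settings.

```python
import numpy as np
from skimage.color import rgb2hsv
from skimage.filters import gabor


def frame_features(rgb_frame: np.ndarray) -> np.ndarray:
    """Per-frame feature vector: mean H, S, V, a high-frequency energy measure,
    and mean responses over a small Gabor filter bank (illustrative settings)."""
    hsv = rgb2hsv(rgb_frame)
    h, s, v = hsv[..., 0].mean(), hsv[..., 1].mean(), hsv[..., 2].mean()

    # High-frequency energy: spectral power outside the central low-frequency block
    spec = np.abs(np.fft.fftshift(np.fft.fft2(hsv[..., 2]))) ** 2
    cy, cx = spec.shape[0] // 2, spec.shape[1] // 2
    ry, rx = spec.shape[0] // 8, spec.shape[1] // 8
    low = spec[cy - ry:cy + ry, cx - rx:cx + rx].sum()
    hf_energy = (spec.sum() - low) / spec.sum()

    gabor_feats = []
    for theta in (0, np.pi / 4, np.pi / 2, 3 * np.pi / 4):
        real, _ = gabor(hsv[..., 2], frequency=0.2, theta=theta)
        gabor_feats.append(np.abs(real).mean())
    return np.array([h, s, v, hf_energy, *gabor_feats])
```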
Microarray-based cancer prediction using soft computing approach.
Wang, Xiaosheng; Gotoh, Osamu
2009-05-26
One of the difficulties in using gene expression profiles to predict cancer is how to effectively select a few informative genes to construct accurate prediction models from thousands or tens of thousands of genes. We screen highly discriminative genes and gene pairs to create simple prediction models involving single genes or gene pairs on the basis of a soft computing approach and rough set theory. Accurate cancer prediction is obtained when we apply the simple prediction models to four cancer gene expression datasets: CNS tumor, colon tumor, lung cancer and DLBCL. Some genes closely correlated with the pathogenesis of specific or general cancers are identified. In contrast with other models, our models are simple, effective and robust. Meanwhile, our models are interpretable, as they are based on decision rules. Our results demonstrate that very simple models may perform well on molecular prediction of cancer and that important gene markers of cancer can be detected if the gene selection approach is chosen reasonably.
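The single-gene screening step can be illustrated with a decision-stump score, used here only as a rough stand-in for the rough-set, decision-rule screening the authors describe; the scoring criterion and the toy data are assumptions.

```python
import numpy as np


def stump_accuracy(values: np.ndarray, labels: np.ndarray) -> float:
    """Best accuracy of a single-threshold rule ("IF gene > t THEN class 1")
    on one gene; labels are assumed to be 0/1."""
    best = max(np.mean(labels == 0), np.mean(labels == 1))  # majority-class baseline
    for t in np.unique(values):
        pred = (values > t).astype(int)
        acc = max(np.mean(pred == labels), np.mean((1 - pred) == labels))
        best = max(best, acc)
    return float(best)


def top_genes(X: np.ndarray, y: np.ndarray, k: int = 10) -> np.ndarray:
    """Indices of the k genes whose single-gene rules best separate the classes."""
    scores = np.array([stump_accuracy(X[:, j], y) for j in range(X.shape[1])])
    return np.argsort(scores)[::-1][:k]


# Toy data: 40 samples x 500 genes, with gene 0 made informative
rng = np.random.default_rng(0)
y = np.repeat([0, 1], 20)
X = rng.standard_normal((40, 500))
X[:, 0] += 2.0 * y
print(top_genes(X, y, k=3))
```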
Mechanisms and modelling of waste-cement and cement-host rock interactions
NASA Astrophysics Data System (ADS)
2017-06-01
Safe and sustainable disposal of hazardous and radioactive waste is a major concern in today's industrial societies. The hazardous waste forms originate from residues of thermal waste treatment, fossil fuel combustion and ferrous/non-ferrous metal smelting, the most important processes in terms of waste production. Low- and intermediate-level radioactive waste is produced in the course of nuclear applications in research and energy production. For both waste forms, encapsulation in alkaline, cement-based matrices is considered to ensure safe long-term disposal. Cementitious materials are in routine use as industrial materials and have mainly been studied with respect to their evolution over a typical service life of several decades. Use of these materials in waste management applications, however, requires assessments of their performance over much longer time periods, on the order of thousands to several tens of thousands of years.
NASA Astrophysics Data System (ADS)
Jia, Weile; Wang, Jue; Chi, Xuebin; Wang, Lin-Wang
2017-02-01
LS3DF, the linear-scaling three-dimensional fragment method, is an efficient linear-scaling ab initio total-energy electronic structure calculation code based on a divide-and-conquer strategy. In this paper, we present our GPU implementation of the LS3DF code. Our test results show that the GPU code can calculate systems of about ten thousand atoms fully self-consistently on the order of 10 minutes using thousands of computing nodes. This makes electronic structure calculations of 10,000-atom nanosystems routine work. This speed is 4.5-6 times faster than the CPU calculations using the same number of nodes on the Titan machine at the Oak Ridge Leadership Computing Facility (OLCF). Such speedup is achieved by (a) carefully redesigning the computationally heavy kernels and (b) redesigning the communication pattern for heterogeneous supercomputers.
A Compact, Flexible, High Channel Count DAQ Built From Off-the-Shelf Components
Heffner, M.; Riot, V.; Fabris, L.
2013-06-01
Medium to large channel count detectors are usually faced with a few unattractive options for data acquisition (DAQ). Small to medium sized TPC experiments, for example, can be too small to justify the high expense and long development time of application specific integrated circuit (ASIC) development. In some cases an experiment can piggy-back on a larger experiment and the associated ASIC development, but this puts the timeline of development out of the hands of the smaller experiment. Another option is to run perhaps thousands of cables to rack-mounted equipment, which is clearly undesirable. The development of commercial high-speed, high-density FPGAs and ADCs, combined with small discrete components and robotic assembly, opens a new option that scales to tens of thousands of channels and is only slightly larger than ASICs, using off-the-shelf components.
James, John S
2004-01-01
India changed its pharmaceutical patent law to conform to the U.S.-European system, just ahead of a Jan. 1 World Trade Organization deadline--meaning that most new medicines (patentable in 1995 or later) will be priced out of reach of the great majority of people in India--and in Africa and other poor regions as well. "The real issue for the multinational corporations is not the poor-country markets, which are financially small and unattractive, but the poor-country examples. How would thousands of people in rich countries, especially the U.S., be persuaded to accept death from cancer and other diseases because they cannot pay tens of thousands of dollars a year for a new generation of treatments that could save their lives--if companies in India could manufacture and sell the same treatments for a small fraction of the price?"
Zircon from historic eruptions in Iceland: reconstructing storage and evolution of silicic magmas
NASA Astrophysics Data System (ADS)
Carley, Tamara L.; Miller, Calvin F.; Wooden, Joseph L.; Bindeman, Ilya N.; Barth, Andrew P.
2011-10-01
Zoning patterns, U-Th disequilibria ages, and elemental compositions of zircon from eruptions of Askja (1875 AD), Hekla (1158 AD), Öræfajökull (1362 AD) and Torfajökull (1477 AD, 871 AD, 3100 BP, 7500 BP) provide insights into the complex, extended, histories of silicic magmatic systems in Iceland. Zircon compositions, which are correlated with proximity to the main axial rift, are distinct from those of mid-ocean ridge environments and fall at the low-Hf edge of the range of continental zircon. Morphology, zoning patterns, compositions, and U-Th ages all indicate growth and storage in subvolcanic silicic mushes or recently solidified rock at temperatures above the solidus but lower than that of the erupting magma. The eruptive products were likely ascending magmas that entrained a zircon "cargo" that formed thousands to tens of thousands of years prior to the eruptions.
Climate Change: Past, Present, and Future
NASA Astrophysics Data System (ADS)
Chapman, David S.; Davis, Michael G.
2010-09-01
Questions about global warming concern climate scientists and the general public alike. Specifically, what are the reliable surface temperature reconstructions over the past few centuries? And what are the best predictions of global temperature change the Earth might expect for the next century? Recent publications [National Research Council (NRC), 2006; Intergovernmental Panel on Climate Change (IPCC), 2007] permit these questions to be answered in a single informative illustration by assembling temperature reconstructions of the past thousand years with predictions for the next century. The result, shown in Figure 1, illustrates present and future warming in the context of natural variations in the past [see also Oldfield and Alverson, 2003]. To quote a Chinese proverb, “A picture's meaning can express ten thousand words.” Because it succinctly captures past inferences and future projections of climate, the illustration should be of interest to scientists, educators, policy makers, and the public.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shyer, E.B.
The New York State Department of Environmental Conservation's Division of Mineral Resources is responsible for regulating the oil and gas industry and receiving operators' annual well production reports. In production year 1970, New York State operators reported 627 active gas wells with production of 3 billion cubic feet. Ten years later, in 1980, production had more than tripled to 15.5 billion cubic feet and reported active gas wells increased to 1,966. During 1990, reported gas production was 25 billion cubic feet from 5,536 active gas wells. The average production per gas well in 1970 was 4,773 thousand cubic feet. Average gas production per well peaked in 1978, when 1,431 active gas wells reported production of 14 billion cubic feet, an average of 9,821 thousand cubic feet per well. By 1994 the average production per well had decreased to 3,800 thousand cubic feet, a decrease of approximately 60%. The decrease in average well production is more a reflection of the majority of older wells reaching the lower end of their decline curve than a decrease in overall per-well production. The number of completed gas wells increased following the rising price of gas. In 1970 gas was $0.30 per thousand cubic feet. By 1984 the price per thousand cubic feet had peaked at $4. After 1984 the price of gas started to decline while the number of active gas wells continued to increase. Sharp increases in gas production for certain counties, such as Steuben in 1972 and 1973 and Chautauqua in 1980-83, reflect the discoveries of new fields such as Adrian Reef and Bass Island, respectively. The Stagecoach Field, discovered in 1989 in Tioga County, is the newest high-producing field in New York State.
Olugasa, Babasola Oluseyi; Oshinowo, Oluwafunmilola Yemisi; Odigie, Eugene Amienwanlen
2015-01-01
Introduction As Ebola virus disease (EVD) continues to pose a public health challenge in West Africa, with attendant fears and socio-economic implications, it is compelling to estimate the social and preventive costs of EVD containment in a Nigerian city. Hence, this study aimed to determine the social and preventive cost implications of EVD among selected public institutions in Lagos, Nigeria, from July to December 2014. Methods Questionnaires and key-informant interviews were administered to respondents and administrators of selected hospitals, hotels and schools in Eti-Osa Local Government Area of Lagos State. Knowledge of disease transmission, mortality and protocols for prevention, including the cost of specific preventive measures adopted against EVD, was elicited from respondents. Descriptive statistics and categorical analysis were used to summarize and estimate the social and preventive costs incurred by the respective institutions. Results An estimated five million, nineteen thousand, three hundred and seventy-nine Naira and eighty kobo (N5,019,379.80) was observed as the direct and social cost of EVD prevention. This amount translated into a conservative estimate of one billion, twenty-seven million, ninety-four thousand, seven hundred and fifty-six Naira (N1,027,094,756.10) for a total of four thousand schools, two hundred and fifty-three hospitals and one thousand, four hundred and fifty-one hotels in Lagos during the period (July 20-November 20, 2014). Conclusion The high cost of EVD prevention within this short time frame indicates the high importance attached to a preventive policy against highly pathogenic zoonotic diseases in Nigeria. PMID:26740848
Mining the human phenome using allelic scores that index biological intermediates.
Evans, David M; Brion, Marie Jo A; Paternoster, Lavinia; Kemp, John P; McMahon, George; Munafò, Marcus; Whitfield, John B; Medland, Sarah E; Montgomery, Grant W; Timpson, Nicholas J; St Pourcain, Beate; Lawlor, Debbie A; Martin, Nicholas G; Dehghan, Abbas; Hirschhorn, Joel; Smith, George Davey
2013-10-01
It is common practice in genome-wide association studies (GWAS) to focus on the relationship between disease risk and genetic variants one marker at a time. When relevant genes are identified it is often possible to implicate biological intermediates and pathways likely to be involved in disease aetiology. However, single genetic variants typically explain small amounts of disease risk. Our idea is to construct allelic scores that explain greater proportions of the variance in biological intermediates, and subsequently use these scores to data mine GWAS. To investigate the approach's properties, we indexed three biological intermediates where the results of large GWAS meta-analyses were available: body mass index, C-reactive protein and low density lipoprotein levels. We generated allelic scores in the Avon Longitudinal Study of Parents and Children, and in publicly available data from the first Wellcome Trust Case Control Consortium. We compared the explanatory ability of allelic scores in terms of their capacity to proxy for the intermediate of interest, and the extent to which they associated with disease. We found that allelic scores derived from known variants and allelic scores derived from hundreds of thousands of genetic markers explained significant portions of the variance in biological intermediates of interest, and many of these scores showed expected correlations with disease. Genome-wide allelic scores however tended to lack specificity suggesting that they should be used with caution and perhaps only to proxy biological intermediates for which there are no known individual variants. Power calculations confirm the feasibility of extending our strategy to the analysis of tens of thousands of molecular phenotypes in large genome-wide meta-analyses. We conclude that our method represents a simple way in which potentially tens of thousands of molecular phenotypes could be screened for causal relationships with disease without having to expensively measure these variables in individual disease collections.
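The core construction is a weighted allelic score: for each individual, sum the counts of the effect allele at each SNP, weighted by effect sizes taken from an external GWAS meta-analysis. A minimal sketch with hypothetical genotypes and weights (the mean-imputation of missing calls is one common convention, not necessarily the one used in the study):

```python
import numpy as np


def allelic_score(genotypes: np.ndarray, weights: np.ndarray) -> np.ndarray:
    """Weighted allelic score per individual.

    genotypes: (n_individuals, n_snps) counts of the effect allele (0, 1, 2),
               with np.nan marking missing calls.
    weights:   (n_snps,) per-SNP effect sizes from an external meta-analysis.
    """
    g = np.ma.masked_invalid(genotypes)
    # Mean-impute missing genotypes SNP-by-SNP before weighting
    g = g.filled(g.mean(axis=0))
    return g @ weights


# Example: 5 individuals, 3 SNPs, hypothetical effect sizes
geno = np.array([[0, 1, 2], [2, 2, 0], [1, np.nan, 1], [0, 0, 0], [2, 1, 1]])
beta = np.array([0.12, -0.05, 0.30])
print(allelic_score(geno, beta))
```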
Spectral domain polarization-sensitive optical coherence tomography at 850 nm
NASA Astrophysics Data System (ADS)
Cense, Barry; Chen, Teresa C.; Mujat, Mircea; Joo, Chulmin; Akkin, Taner; Park, B. H.; Pierce, Mark C.; Yun, Andy; Bouma, Brett E.; Tearney, Guillermo J.; de Boer, Johannes F.
2005-04-01
Spectral-Domain Polarization-Sensitive Optical Coherence Tomography (SD-PS-OCT) is a technique developed to measure the thickness and birefringence of the nerve fiber layer in vivo as a tool for the early diagnosis of glaucoma. A clinical SD-PS-OCT system was developed and scans were made around the optic nerve head (ONH) using ten concentric circles of increasing diameter. One healthy volunteer was imaged. Retinal nerve fiber layer thickness and birefringence information was extracted from the data. Polarization-sensitive OCT images were acquired at video rate (29 frames per second (fps), 1000 A-lines per frame) and at 7 fps (1000 A-lines per frame). The latter setting improved the signal-to-noise ratio by approximately 6 dB. Birefringence measurements on the healthy volunteer gave results similar to values reported earlier that were obtained with a time-domain setup. The measurement time was reduced from more than a minute to less than a second.
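The reported improvement of roughly 6 dB at the lower frame rate is consistent with the roughly fourfold longer integration time per A-line, assuming shot-noise-limited detection (an assumption not stated in the abstract):

$$\Delta\mathrm{SNR} \approx 10\log_{10}\!\left(\frac{29\ \mathrm{fps}}{7\ \mathrm{fps}}\right) \approx 6.2\ \mathrm{dB}.$$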
Polarimetric Imaging using Two Photoelastic Modulators
NASA Technical Reports Server (NTRS)
Wang, Yu; Cunningham, Thomas; Diner, David; Davis, Edgar; Sun, Chao; Hancock, Bruce; Gutt, Gary; Zan, Jason; Raouf, Nasrat
2009-01-01
A method of polarimetric imaging, now undergoing development, involves the use of two photoelastic modulators in series, driven at equal amplitude but at different frequencies. The net effect on a beam of light is to cause (1) the direction of its polarization to rotate at the average of the two excitation frequencies and (2) the amplitude of its polarization to be modulated at the beat frequency (the difference between the two excitation frequencies). The resulting modulated optical beam is made to pass through a polarizing filter and is detected at the beat frequency, which can be chosen to equal the frame rate of an electronic camera or the rate of sampling the outputs of photodetectors in an array. The method was conceived to satisfy a need to perform highly accurate polarimetric imaging, without cross-talk between polarization channels, at frame rates of the order of tens of hertz. The use of electro-optical modulators is necessitated by a need to obtain accuracy greater than that attainable by use of static polarizing filters over separate fixed detectors. For imaging, photoelastic modulators are preferable to other electro-optical modulators such as Kerr cells and Pockels cells in that photoelastic modulators operate at lower voltages, have greater angular acceptances, and are easier to use. Prior to the conception of the present method, polarimetric imaging at frame rates of tens of hertz using photoelastic modulators was not possible because the resonance frequencies of photoelastic modulators usually lie in the range from about 20 to about 100 kHz.
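The rotation at the average frequency and the beat-frequency amplitude modulation described above mirror the elementary sum-to-product identity for two equal-amplitude oscillations, shown here only as an intuition aid (a full treatment of two photoelastic modulators in series requires the Mueller-matrix formalism):

$$\sin(\omega_1 t) + \sin(\omega_2 t) = 2\cos\!\left(\frac{(\omega_1-\omega_2)\,t}{2}\right)\sin\!\left(\frac{(\omega_1+\omega_2)\,t}{2}\right).$$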
Motion in Jupiter's Atmospheric Vortices (Near-infrared filters)
NASA Technical Reports Server (NTRS)
1997-01-01
Two frame 'movie' of a pair of vortices in Jupiter's southern hemisphere. The two frames are separated by ten hours. The right oval is rotating counterclockwise, like other anticyclonic bright vortices in Jupiter's atmosphere. The left vortex is a cyclonic (clockwise) vortex. The differences between them (their brightness, their symmetry, and their behavior) are clues to how Jupiter's atmosphere works. The frames span about fifteen degrees in latitude and longitude and are centered at 141 degrees west longitude and 36 degrees south planetocentric latitude. Both vortices are about 3500 kilometers in diameter in the north-south direction.
The images were taken in near infrared light at 756 nanometers and show clouds that are at a pressure level of about 1 bar in Jupiter's atmosphere. North is at the top. The smallest resolved features are tens of kilometers in size. These images were taken on May 7, 1997, at a range of 1.5 million kilometers by the Solid State Imaging system on NASA's Galileo spacecraft. The Jet Propulsion Laboratory, Pasadena, CA manages the Galileo mission for NASA's Office of Space Science, Washington, DC. JPL is an operating division of California Institute of Technology (Caltech). This image and other images and data received from Galileo are posted on the World Wide Web, on the Galileo mission home page at URL http://galileo.jpl.nasa.gov. Background information and educational context for the images can be found at URL http://www.jpl.nasa.gov/galileo/sepo
NASA Astrophysics Data System (ADS)
Wedeking, Gregory A.; Zierer, Joseph J.; Jackson, John R.
2010-07-01
The University of Texas Center for Electromechanics (UT-CEM) is making a major upgrade to the robotic tracking system on the Hobby-Eberly Telescope (HET) as part of the Wide Field Upgrade (WFU). The upgrade focuses on a seven-fold increase in payload and necessitated a complete redesign of all tracker supporting structure and motion control systems, including the tracker bridge, ten drive systems, carriage frames, a hexapod, and many other subsystems. The cost and sensitivity of the scientific payload, coupled with the tracker system mass increase, necessitated major upgrades to personnel and hardware safety systems. To optimize the kinematic design of the entire tracker, UT-CEM developed novel uses of constraints and drivers to interface with a commercially available CAD package (SolidWorks). For example, to optimize volume usage and minimize obscuration, the CAD software was exercised to accurately determine the tracker/hexapod operational space needed to meet science requirements. To verify hexapod controller models, actuator travel requirements were graphically measured and compared to well-defined equations of motion for Stewart platforms. To ensure critical hardware safety during various failure modes, UT-CEM engineers developed Visual Basic drivers to interface with the CAD software and quickly tabulate distance measurements between critical pieces of optical hardware and adjacent components for thousands of possible hexapod configurations. These advances and techniques, applicable to any challenging robotic system design, are documented and describe new ways to use commercially available software tools to more clearly define hardware requirements and help ensure safe operation.
A Hardware Model Validation Tool for Use in Complex Space Systems
NASA Technical Reports Server (NTRS)
Davies, Misty Dawn; Gundy-Burlet, Karen L.; Limes, Gregory L.
2010-01-01
One of the many technological hurdles that must be overcome in future missions is the challenge of validating as-built systems against the models used for design. We propose a technique composed of intelligent parameter exploration in concert with automated failure analysis as a scalable method for the validation of complex space systems. The technique is impervious to discontinuities and linear dependencies in the data, and can handle dimensionalities consisting of hundreds of variables over tens of thousands of experiments.
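A toy version of this workflow, pairing parameter exploration with a shallow decision tree (from scikit-learn) as the automated failure-analysis step, is sketched below; the parameter names and the pass/fail rule are hypothetical, and the actual tool's exploration strategy is more sophisticated than the plain random sampling shown here.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text  # scikit-learn assumed available

rng = np.random.default_rng(1)


def system_passes(params: np.ndarray) -> bool:
    """Hypothetical pass/fail check standing in for one run of the system model."""
    gain, delay, margin = params
    return (gain * margin > 0.5) and (delay < 0.2)


# Exploration reduced here to plain random sampling of 3 parameters
samples = rng.uniform(0.0, 1.0, size=(20_000, 3))
labels = np.array([system_passes(p) for p in samples])

# Automated failure analysis: a shallow tree summarizes which parameter regions fail
tree = DecisionTreeClassifier(max_depth=3).fit(samples, labels)
print(export_text(tree, feature_names=["gain", "delay", "margin"]))
```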
NASA Astrophysics Data System (ADS)
Caballero, J. A.
2012-05-01
In the last few years, there have been several projects involving astronomy and classical music. But has a rock band ever appeared at a science conference, or an astronomer at a rock concert? We present a project, Multiverso, in which we mix rock and astronomy, together with poetry and video art (Caballero, 2010). The project started in late 2009 and has already reached tens of thousands of people in Spain through the release of an album, several concert-talks, television, radio, newspapers and the internet.
A structural perspective of the flavivirus life cycle.
Mukhopadhyay, Suchetana; Kuhn, Richard J; Rossmann, Michael G
2005-01-01
Dengue, Japanese encephalitis, West Nile and yellow fever belong to the Flavivirus genus, which is a member of the Flaviviridae family. They are human pathogens that cause large epidemics and tens of thousands of deaths annually in many parts of the world. The structural organization of these viruses and their associated structural proteins has provided insight into the molecular transitions that occur during the viral life cycle, such as assembly, budding, maturation and fusion. This review focuses mainly on structural studies of dengue virus.
Model based verification of the Secure Socket Layer (SSL) Protocol for NASA systems
NASA Technical Reports Server (NTRS)
Powell, John D.; Gilliam, David
2004-01-01
The National Aeronautics and Space Administration (NASA) has tens of thousands of networked computer systems and applications. Software security vulnerabilities present risks such as lost or corrupted data, information theft, and unavailability of critical systems. These risks represent potentially enormous costs to NASA. The NASA Code Q research initiative 'Reducing Software Security Risk (RSSR) Through an Integrated Approach' offers formal verification of information technology (IT), through the creation of a Software Security Assessment Instrument (SSAI), to address software security risks.
A Website for Astronomy Education and Outreach
NASA Astrophysics Data System (ADS)
Impey, C.; Danehy, A.
2017-09-01
Teach Astronomy is a free, open access website designed for formal and informal learners of astronomy. The site features: an online textbook complete with quiz questions and a glossary; over ten thousand images; a curated collection of the astronomy articles in Wikipedia; a complete video lecture course; a video Frequently Asked Questions tool; and other materials provided by content partners. Clustering algorithms and an interactive visual interface allow users to browse related content. This article reviews the features of the website and how it can be used.
Silica waveguide devices and their applications
NASA Astrophysics Data System (ADS)
Sun, C. J.; Schmidt, Kevin M.; Lin, Wenhua
2005-03-01
Silica waveguide technology transitioned from laboratories to commercial use in the early 1990s. Since then, various applications have been exploited based on this technology. Tens of thousands of arrayed waveguide grating (AWG) devices have been installed worldwide for DWDM Mux and Demux. The recent FTTH push in Japan has renewed the significance of this technology for passive optical network (PON) applications. This paper reviews the past development of this technology, compares it with competing technologies, and outlines the future role of this technology in evolving optical communications.
North Polar Cap Layers and Ledges
2016-08-24
At the edge of Mars' permanent North Polar cap, we see an exposure of the internal layers, each with a different mix of water ice, dust and dirt. These layers are believed to correspond to different climate conditions over the past tens of thousands of years. When we zoom in closer, we see that the distinct layers erode differently. Some are stronger and more resistant to erosion, others only weakly cemented. The strong layers form ledges. http://photojournal.jpl.nasa.gov/catalog/PIA21022
NASA Astrophysics Data System (ADS)
Barnett, R. Michael
2013-02-01
After half a century of waiting, the drama was intense. Physicists slept overnight outside the auditorium to get seats for the seminar at the CERN lab in Geneva, Switzerland. Ten thousand miles away on the other side of the planet, at the world's most prestigious international particle physics conference, hundreds of physicists from every corner of the globe lined up to hear the seminar streamed live from Geneva (see Fig. 1). And in universities from North America to Asia, physicists and students gathered to watch the streaming talks.
Management of Ebola Virus Disease in Children.
Trehan, Indi; De Silva, Stephanie C
2018-03-01
The West African outbreak of 2013 to 2016 was the largest Ebola epidemic in history. With tens of thousands of patients treated during this outbreak, much was learned about how to optimize clinical care for children with Ebola. In anticipation of inevitable future outbreaks, a firsthand summary of the major aspects of pediatric Ebola case management in austere settings is presented. Emphasis is on early and aggressive critical care, including fluid resuscitation, electrolyte repletion, antimicrobial therapy, and nutritional supplementation. Copyright © 2017 Elsevier Inc. All rights reserved.
Medical device problem reporting for the betterment of healthcare.
1998-08-01
Given that there are nearly 5,000 individual classes of medical devices, tens of thousands of medical device suppliers, and millions of healthcare providers around the world, device-related problems are bound to happen. But effective problem reporting can help reduce or eliminate many of these problems--not only within an institution, but also potentially around the world. In this article, we trace the problem reporting process from its beginnings in the hospital to its global impact in making critical information available throughout the healthcare community.
Integral leadership and signal detection for high reliability organizing and learning
J. M. Saveland
2005-01-01
In the last ten years, the fire management community has made significant advances in firefighter safety and leadership development. Yet, there is no discernible downward trend in entrapment fatalities. While the complexity of the job and exposure of an increasing number of firefighters to increasingly severe situations has surely increased over that time frame, the...
ERIC Educational Resources Information Center
Reaves, Rosalind
2013-01-01
With Critical Race Theory (CRT) and social justice serving as complementary conceptual frames, this ethnographic study investigates the learning and living experiences of ten African American students of a predominantly White university in the Midwest. While several studies have investigated Black students' experiences at PWIs, most notably…
Automated Creation of Labeled Pointcloud Datasets in Support of Machine-Learning Based Perception
2017-12-01
computationally intensive 3D vector math and took more than ten seconds to segment a single LIDAR frame from the HDL-32e with the Dell XPS15 9650’s Intel...Core i7 CPU. Depth Clustering avoids the computationally intensive 3D vector math of Euclidean Clustering-based DON segmentation and, instead
28. VIEW TO NORTHEAST. VIEW OVER TOP OF TRUSS FROM ...
28. VIEW TO NORTHEAST. VIEW OVER TOP OF TRUSS FROM CONTROL CABIN DECK. Photographer unknown, August 1947 (Note that frame for electrical power cables is still in place, though the bridge was converted to hand operation almost ten years earlier.) - Gianella Bridge, Spanning Sacramento River at State Highway 32, Hamilton City, Glenn County, CA
ERIC Educational Resources Information Center
Hayes, Sharon; Koro-Ljungberg, Mirka
2011-01-01
This study, framed by social constructionism, investigated the dialogic exchanges and co-construction of knowledge among female graduate students, who met to discuss the ways in which the differences between mentors and mentees might be negotiated in order to develop and maintain mentoring relationships that benefit both partners. Ten female…
ERIC Educational Resources Information Center
Naigles, Letitia R.; Lehrer, Nadine
2002-01-01
This research investigates language-general and language-specific properties of the acquisition of argument structure. Ten French preschoolers enacted forty sentences containing motion verbs; sixteen sentences were ungrammatical in that the syntactic frame was incompatible with the standard argument structure for the verb (e.g. *"Le tigre va le…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Becker, Julian; Tate, Mark W.; Shanks, Katherine S.
Pixel Array Detectors (PADs) consist of an x-ray sensor layer bonded pixel-by-pixel to an underlying readout chip. This approach allows both the sensor and the custom pixel electronics to be tailored independently to best match the x-ray imaging requirements. Here we describe the hybridization of CdTe sensors to two different charge-integrating readout chips, the Keck PAD and the Mixed-Mode PAD (MM-PAD), both developed previously in our laboratory. The charge-integrating architecture of each of these PADs extends the instantaneous counting rate by many orders of magnitude beyond that obtainable with photon counting architectures. The Keck PAD chip consists of rapid, 8-frame, in-pixel storage elements with framing periods <150 ns. The second detector, the MM-PAD, has an extended dynamic range by utilizing an in-pixel overflow counter coupled with charge removal circuitry activated at each overflow. This allows the recording of signals from the single-photon level to tens of millions of x-rays/pixel/frame while framing at 1 kHz. Both detector chips consist of a 128×128 pixel array with (150 µm)² pixels.
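The MM-PAD readout described above reconstructs each pixel's total signal from the overflow count plus the residual charge digitized at readout; a minimal sketch, with an illustrative full-well value rather than the detector's actual calibration:

```python
def mm_pad_signal(overflow_count: int, residual_adu: float,
                  full_well_adu: float) -> float:
    """Reconstruct the total integrated signal for one MM-PAD-style pixel.

    Each overflow event removes one full well's worth of charge and increments
    the in-pixel counter; the residual is digitized at readout. The values used
    below are illustrative only.
    """
    return overflow_count * full_well_adu + residual_adu


# e.g. 37 overflows plus a residual of 0.42 full wells
print(mm_pad_signal(37, 0.42 * 5000.0, 5000.0))
```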
Brick, Cameron; McCully, Scout N.; Updegraff, John A.; Ehret, Phillip J.; Areguin, Maira A.; Sherman, David K.
2015-01-01
Background Health messages are more effective when framed to be congruent with recipient characteristics, and health practitioners can strategically decide on message features to promote adherence to recommended behaviors. We present exposure to United States (U.S.) culture as a moderator of the impact of gain- vs. loss-frame messages. Since U.S. culture emphasizes individualism and approach orientation, greater cultural exposure was expected to predict improved patient choices and memory for gain-framed messages, whereas individuals with less exposure to U.S. culture would show these advantages for loss-framed messages. Methods 223 participants viewed a written oral health message in one of three randomized conditions: gain-frame, loss-frame, or no-message control, and were given ten flosses. Cultural exposure was measured with the proportions of life spent and parents born in the U.S. At baseline and one week later, participants completed recall tests and reported recent flossing behavior. Results Message frame and cultural exposure interacted to predict improved patient decisions (increased flossing) and memory maintenance for the health message over one week. E.g., those with low cultural exposure who saw a loss-frame message flossed more. Incongruent messages led to the same flossing rates as no message. Memory retention did not explain the effect of message congruency on flossing. Limitations Flossing behavior was self-reported. Cultural exposure may only have practical application in either highly individualistic or collectivistic countries. Conclusions In healthcare settings where patients are urged to follow a behavior, asking basic demographic questions could allow medical practitioners to intentionally communicate in terms of gains or losses to improve patient decision making and treatment adherence. PMID:25654986
Temporary and definitive external fixation of war injuries: use of a French dedicated fixator.
Mathieu, Laurent; Ouattara, Naklan; Poichotte, Antoine; Saint-Macari, Erwan; Barbier, Olivier; Rongiéras, Fréderic; Rigal, Sylvain
2014-08-01
External fixation is the recommended stabilization method for both open and closed fractures of long bones in forward surgical hospitals. Specific combat surgical tactics are best performed using dedicated external fixators. The Percy Fx (Biomet) fixator was developed for this reason by the French Army Medical Service, and has been used in various theatres of operations for more than ten years. The tactics of Percy Fx (Biomet) fixator use were analysed in two different situations: the treatment of French soldiers wounded on several battlefields and then evacuated to France, and the management of local nationals in forward medical treatment facilities in Afghanistan and Chad. Overall, 48 external fixators were implanted in 37 French casualties; 28 frames were temporary and converted to definitive rigid frames or internal fixation after medical evacuation. The 77 Afghan patients totalled 85 external fixators, including 13 temporary frames applied in Forward Surgical Teams (FSTs) prior to their arrival at the Kabul combat support hospital. All of the 47 Chadian patients were treated in an FST with primary definitive frames because of delayed surgical management and the absence of a higher level of care in Chad. Temporary frames were mostly used for French soldiers to facilitate strategic air medical evacuation following trauma damage-control orthopaedic principles. Definitive rigid frames permitted the treatment of all types of war extremity injuries, even in poor conditions.
NASA Astrophysics Data System (ADS)
Pordes, Ruth; OSG Consortium; Petravick, Don; Kramer, Bill; Olson, Doug; Livny, Miron; Roy, Alain; Avery, Paul; Blackburn, Kent; Wenaus, Torre; Würthwein, Frank; Foster, Ian; Gardner, Rob; Wilde, Mike; Blatecky, Alan; McGee, John; Quick, Rob
2007-07-01
The Open Science Grid (OSG) provides a distributed facility where the Consortium members provide guaranteed and opportunistic access to shared computing and storage resources. OSG provides support for and evolution of the infrastructure through activities that cover operations, security, software, troubleshooting, addition of new capabilities, and support for existing and engagement with new communities. The OSG SciDAC-2 project provides specific activities to manage and evolve the distributed infrastructure and support its use. The innovative aspects of the project are the maintenance and performance of a collaborative (shared & common) petascale national facility over tens of autonomous computing sites, for many hundreds of users, transferring terabytes of data a day, executing tens of thousands of jobs a day, and providing robust and usable resources for scientific groups of all types and sizes. More information can be found at the OSG web site: www.opensciencegrid.org.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gaffney, Kelly
Movies have transformed our perception of the world. With slow motion photography, we can see a hummingbird flap its wings, and a bullet pierce an apple. The remarkably small and extremely fast molecular world that determines how your body functions cannot be captured with even the most sophisticated movie camera today. To see chemistry in real time requires a camera capable of seeing molecules that are one ten billionth of a foot with a frame rate of 10 trillion frames per second! SLAC has embarked on the construction of just such a camera. Please join me as I discuss how this molecular movie camera will work and how it will change our perception of the molecular world.
Enhanced Detection of Sea-Disposed Man-Made Objects in Backscatter Data
NASA Astrophysics Data System (ADS)
Edwards, M.; Davis, R. B.
2016-12-01
The Hawai'i Undersea Military Munitions Assessment (HUMMA) project developed software to increase data visualization capabilities applicable to seafloor reflectivity datasets acquired by a variety of bottom-mapping sonar systems. The purpose of these improvements is to detect different intensity values within an arbitrary amplitude range that may be associated with relative target reflectivity, as well as to extend the overall amplitude range across which detailed dynamic contrast may be effectively displayed. The backscatter dataset used to develop this software imaged tens of thousands of reflective targets resting on the seabed that were systematically sea-disposed south of Oahu, Hawaii, around the end of World War II in waters ranging from 300-600 meters depth. Human-occupied and remotely operated vehicles conducted ground-truth video and photographic reconnaissance of thousands of these reflective targets, documenting and geo-referencing long curvilinear trails of items including munitions, paint cans, airplane parts, scuttled ships, cars and bundled anti-submarine nets. Edwards et al. [2012] determined that most individual trails consist of objects of one particular type. The software described in this presentation, in combination with the ground-truth images, was developed to help recognize different types of objects based on reflectivity, size, and shape from altitudes of tens of meters above the seabed. The fundamental goal of the software is to facilitate rapid underway detection and geo-location of specific sea-disposed objects so that their impact on the environment can be assessed.
Schweiger, Regev; Fisher, Eyal; Rahmani, Elior; Shenhav, Liat; Rosset, Saharon; Halperin, Eran
2018-06-22
Estimation of heritability is an important task in genetics. The use of linear mixed models (LMMs) to determine narrow-sense single-nucleotide polymorphism (SNP) heritability and related quantities has received much recent attention, due to its ability to account for variants with small effect sizes. Typically, heritability estimation under LMMs uses the restricted maximum likelihood (REML) approach. The common way to report the uncertainty in REML estimation uses standard errors (SEs), which rely on asymptotic properties. However, these assumptions are often violated because of the bounded parameter space, statistical dependencies, and limited sample size, leading to biased estimates and inflated or deflated confidence intervals (CIs). In addition, for larger data sets (e.g., tens of thousands of individuals), the construction of SEs itself may require considerable time, as it requires expensive matrix inversions and multiplications. Here, we present FIESTA (Fast confidence IntErvals using STochastic Approximation), a method for constructing accurate CIs. FIESTA is based on parametric bootstrap sampling and, therefore, avoids unjustified assumptions on the distribution of the heritability estimator. FIESTA uses stochastic approximation techniques, which accelerate the construction of CIs by several orders of magnitude compared with previous approaches, as well as with the analytical approximation used by SEs. FIESTA builds accurate CIs rapidly, for example requiring only several seconds for data sets of tens of thousands of individuals, making FIESTA a very fast solution to the problem of building accurate CIs for heritability for all data set sizes.
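The underlying idea of a parametric-bootstrap confidence interval can be sketched generically as below. This shows only the basic resampling loop, not FIESTA's accelerated stochastic-approximation scheme, and the toy simulate/refit pair stands in for a full LMM/REML fit.

```python
import numpy as np


def parametric_bootstrap_ci(estimate, simulate, refit,
                            n_boot: int = 500, alpha: float = 0.05):
    """Generic parametric-bootstrap CI.

    estimate: point estimate (e.g. REML heritability) from the real data
    simulate: function(theta) -> dataset simulated under theta
    refit:    function(dataset) -> re-estimated theta
    """
    boots = np.array([refit(simulate(estimate)) for _ in range(n_boot)])
    lo, hi = np.quantile(boots, [alpha / 2, 1 - alpha / 2])
    return lo, hi


# Toy usage: CI for a variance parameter bounded below by zero, the same kind
# of boundary issue that undermines SE-based intervals for heritability
rng = np.random.default_rng(0)
simulate = lambda var, n=200: rng.normal(0.0, np.sqrt(var), n)
refit = lambda sample: float(np.var(sample, ddof=1))
print(parametric_bootstrap_ci(0.05, simulate, refit))
```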
Hollis, Geoff
2018-04-01
Best-worst scaling is a judgment format in which participants are presented with a set of items and have to choose the superior and inferior items in the set. Best-worst scaling generates a large quantity of information per judgment because each judgment allows for inferences about the rank value of all unjudged items. This property of best-worst scaling makes it a promising judgment format for research in psychology and natural language processing concerned with estimating the semantic properties of tens of thousands of words. A variety of different scoring algorithms have been devised in the previous literature on best-worst scaling. However, due to problems of computational efficiency, these scoring algorithms cannot be applied efficiently to cases in which thousands of items need to be scored. New algorithms are presented here for converting responses from best-worst scaling into item scores for thousands of items (many-item scoring problems). These scoring algorithms are validated through simulation and empirical experiments, and considerations related to noise, the underlying distribution of true values, and trial design are identified that can affect the relative quality of the derived item scores. The newly introduced scoring algorithms consistently outperformed scoring algorithms used in the previous literature on scoring many-item best-worst data.
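One of the simple counting scores from the earlier best-worst literature, the per-item best-minus-worst proportion, takes only a few lines; this illustrates the format's bookkeeping, not the new many-item scoring algorithms introduced in the paper.

```python
from collections import defaultdict


def best_minus_worst(trials):
    """Score items from best-worst trials.

    trials: iterable of (items_shown, best_item, worst_item) tuples.
    Returns item -> (#best - #worst) / #appearances, a simple counting score.
    """
    best, worst, shown = defaultdict(int), defaultdict(int), defaultdict(int)
    for items, b, w in trials:
        for it in items:
            shown[it] += 1
        best[b] += 1
        worst[w] += 1
    return {it: (best[it] - worst[it]) / shown[it] for it in shown}


trials = [(("calm", "happy", "angry", "bored"), "happy", "angry"),
          (("calm", "sad", "angry", "proud"), "proud", "sad")]
print(best_minus_worst(trials))
```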
Zircon reveals protracted magma storage and recycling beneath Mount St. Helens
Claiborne, L.L.; Miller, C.F.; Flanagan, D.M.; Clynne, M.A.; Wooden, J.L.
2010-01-01
Current data and models for Mount St. Helens volcano (Washington, United States) suggest relatively rapid transport from magma genesis to eruption, with no evidence for protracted storage or recycling of magmas. However, we show here that complex zircon age populations extending back hundreds of thousands of years from eruption age indicate that magmas regularly stall in the crust, cool and crystallize beneath the volcano, and are then rejuvenated and incorporated by hotter, young magmas on their way to the surface. Estimated dissolution times suggest that entrained zircon generally resided in rejuvenating magmas for no more than about a century. Zircon elemental compositions reflect the increasing influence of mafic input into the system through time, recording growth from hotter, less evolved magmas tens of thousands of years prior to the appearance of mafic magmas at the surface, or changes in whole-rock geochemistry and petrology, and providing a new, time-correlated record of this evolution independent of the eruption history. Zircon data thus reveal the history of the hidden, long-lived intrusive portion of the Mount St. Helens system, where melt and crystals are stored for as long as hundreds of thousands of years and interact with fresh influxes of magmas that traverse the intrusive reservoir before erupting. ?? 2010 Geological Society of America.
Shrinkage covariance matrix approach based on robust trimmed mean in gene sets detection
NASA Astrophysics Data System (ADS)
Karjanto, Suryaefiza; Ramli, Norazan Mohamed; Ghani, Nor Azura Md; Aripin, Rasimah; Yusop, Noorezatty Mohd
2015-02-01
A microarray consists of an orderly arrangement of thousands of gene sequences placed in a grid on a suitable surface. The technology has enabled novel discoveries since its development and has attracted increasing attention among researchers. The widespread adoption of microarray technology is largely due to its ability to perform simultaneous analysis of thousands of genes in a massively parallel manner in one experiment. Hence, it provides valuable knowledge on gene interaction and function. A microarray data set typically consists of tens of thousands of genes (variables) measured on just dozens of samples, owing to various constraints. Therefore, the sample covariance matrix in Hotelling's T2 statistic is not positive definite; it becomes singular and cannot be inverted. In this research, the Hotelling's T2 statistic is combined with a shrinkage approach as an alternative estimation of the covariance matrix in order to detect significant gene sets. The use of a shrinkage covariance matrix overcomes the singularity problem by converting an unbiased estimator into an improved, biased estimator of the covariance matrix. A robust trimmed mean is integrated into the shrinkage matrix to reduce the influence of outliers and consequently increase its efficiency. The performance of the proposed method is measured using several simulation designs. The results are expected to outperform existing techniques in many tested conditions.
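A minimal sketch of the proposed statistic, combining trimmed means with shrinkage of the covariance toward its diagonal so the matrix remains invertible when genes far outnumber samples; the fixed shrinkage intensity used here is purely illustrative.

```python
import numpy as np
from scipy.stats import trim_mean


def shrinkage_cov(X: np.ndarray, lam: float = 0.5) -> np.ndarray:
    """Shrink the sample covariance toward its diagonal; the result is positive
    definite even when genes >> samples. lam is a fixed, illustrative intensity."""
    S = np.cov(X, rowvar=False)
    return (1.0 - lam) * S + lam * np.diag(np.diag(S))


def hotelling_t2(X: np.ndarray, Y: np.ndarray, trim: float = 0.1,
                 lam: float = 0.5) -> float:
    """Two-sample Hotelling-type T2 using trimmed means and a shrunk covariance."""
    mx = trim_mean(X, trim, axis=0)
    my = trim_mean(Y, trim, axis=0)
    n1, n2 = X.shape[0], Y.shape[0]
    pooled = ((n1 - 1) * shrinkage_cov(X, lam)
              + (n2 - 1) * shrinkage_cov(Y, lam)) / (n1 + n2 - 2)
    d = mx - my
    return float((n1 * n2) / (n1 + n2) * d @ np.linalg.solve(pooled, d))


# Toy gene set: 200 genes, 15 vs 12 samples with a small mean shift
rng = np.random.default_rng(0)
X, Y = rng.standard_normal((15, 200)), rng.standard_normal((12, 200)) + 0.3
print(round(hotelling_t2(X, Y), 1))
```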
Resource inventory techniques used in the California Desert Conservation Area
NASA Technical Reports Server (NTRS)
Mcleod, R. G.; Johnson, H. B.
1981-01-01
A variety of conventional and remotely sensed data for the 25 million acre California Desert Conservation Area (CDCA) have been integrated and analyzed to estimate range carrying capacity. Multispectral classification was performed on a digital mosaic of ten Landsat frames. Multispectral classes were correlated with low level aerial photography, quantified and aggregated by grazing allotment, land ownership, and slope.
Emerging Conceptual Understanding of Complex Astronomical Phenomena by Using a Virtual Solar System
ERIC Educational Resources Information Center
Gazit, Elhanan; Yair, Yoav; Chen, David
2005-01-01
This study describes high school students' conceptual development of the basic astronomical phenomena during real-time interactions with a Virtual Solar System (VSS). The VSS is a non-immersive virtual environment which has a dynamic frame of reference that can be altered by the user. Ten 10th grade students were given tasks containing a set of…
Banerjee, Smita C.; Greene, Kathryn; Li, Yuelin; Ostroff, Jamie S.
2016-01-01
Objectives This study examined the effects of comparative framing (C-F; ads highlighting differences between the advertised product and conventional cigarettes and/or smokeless tobacco products) versus similarity framing (S-F; ads highlighting congruence with conventional cigarettes and/or smokeless tobacco products) in e-cigarette and snus ads on young adult smokers' and non-smokers' ad- and product-related perceptions. Methods One thousand fifty-one (1,051) young adults (18–24 years; 76% women; 50% smokers) from existing consumer panels were recruited in a within-subjects quasi-experiment. Each participant viewed 4 online advertisements, varied by tobacco product type (e-cigarette or snus) and ad framing (C-F or S-F). The dependent measures for this study were ad-related (ad perceptions, ad credibility) and product-related perceptions (absolute and comparative risk perceptions, product appeal, and product use intentions). Results Former and current smokers rated C-F ads as more persuasive than S-F ads, as evidenced by favorable ad perceptions and high product use intentions. Former and current smokers also rated e-cigarette ads with more favorable ad perceptions, lower absolute and comparative risk perceptions, higher product appeal, and higher product use intentions compared with snus ads. However, the effect sizes of the significant differences are less than .2, indicating a small magnitude of difference between the study variables. Conclusions Unless the FDA regulates e-cigarette and snus advertising, there is a potential for decreased risk perceptions and increased use of e-cigarettes among young adults. Further research on implicit/explicit comparative claims in e-cigarette and snus advertisements that encourage risk misperceptions is recommended. PMID:28042597
Banerjee, Smita C; Greene, Kathryn; Li, Yuelin; Ostroff, Jamie S
2016-07-01
This study examined the effects of comparative framing (C-F; ads highlighting differences between the advertised product and conventional cigarettes and/or smokeless tobacco products) versus similarity framing (S-F; ads highlighting congruence with conventional cigarettes and/or smokeless tobacco products) in e-cigarette and snus ads on young adult smokers' and non-smokers' ad- and product-related perceptions. One thousand fifty-one (1,051) young adults (18-24 years; 76% women; 50% smokers) from existing consumer panels were recruited in a within-subjects quasi-experiment. Each participant viewed 4 online advertisements, varied by tobacco product type (e-cigarette or snus) and ad framing (C-F or S-F). The dependent measures for this study were ad-related (ad perceptions, ad credibility) and product-related perceptions (absolute and comparative risk perceptions, product appeal, and product use intentions). Former and current smokers rated C-F ads as more persuasive than S-F ads, as evidenced by favorable ad perceptions and high product use intentions. Former and current smokers also rated e-cigarette ads with more favorable ad perceptions, lower absolute and comparative risk perceptions, higher product appeal, and higher product use intentions compared with snus ads. However, the effect sizes of the significant differences are less than .2, indicating a small magnitude of difference between the study variables. Unless the FDA regulates e-cigarette and snus advertising, there is a potential for decreased risk perceptions and increased use of e-cigarettes among young adults. Further research on implicit/explicit comparative claims in e-cigarette and snus advertisements that encourage risk misperceptions is recommended.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Popple, R; Bredel, M; Brezovich, I
Purpose: To compare the accuracy of CT-MR registration using a mutual information method with registration using a frame-based localizer box. Methods: Ten patients having the Leksell head frame and scanned with a modality-specific localizer box were imported into the treatment planning system. The fiducial rods of the localizer box were contoured on both the MR and CT scans. The skull was contoured on the CT images. The MR and CT images were registered by two methods. The frame-based method used the transformation that minimized the mean square distance of the centroids of the contours of the fiducial rods from a mathematical model of the localizer. The mutual information method used automated image registration tools in the TPS and was restricted to a volume-of-interest defined by the skull contours with a 5 mm margin. For each case, the two registrations were adjusted by two evaluation teams, each comprised of an experienced radiation oncologist and neurosurgeon, to optimize alignment in the region of the brainstem. The teams were blinded to the registration method. Results: The mean adjustment was 0.4 mm (range 0 to 2 mm) and 0.2 mm (range 0 to 1 mm) for the frame and mutual information methods, respectively. The median difference between the frame and mutual information registrations was 0.3 mm, but was not statistically significant using the Wilcoxon signed rank test (p=0.37). Conclusion: The difference between frame and mutual information registration techniques was neither statistically significant nor, for most applications, clinically important. These results suggest that mutual information is equivalent to frame-based image registration for radiosurgery. Work is ongoing to add additional evaluators and to assess the differences between evaluators.
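The paired, non-parametric comparison reported above can be reproduced in outline with SciPy's Wilcoxon signed-rank test; the adjustment values below are invented for illustration and are not the study's data.

```python
import numpy as np
from scipy.stats import wilcoxon

# Hypothetical per-patient residual adjustments (mm) after each registration
frame_adjust = np.array([0.0, 2.0, 0.5, 0.0, 1.0, 0.0, 0.5, 0.0, 0.0, 0.0])
mi_adjust    = np.array([0.0, 1.0, 0.0, 0.0, 0.5, 0.0, 0.0, 0.5, 0.0, 0.0])

# Paired, non-parametric comparison of the two methods
stat, p = wilcoxon(frame_adjust, mi_adjust)
print(f"Wilcoxon statistic={stat:.2f}, p={p:.2f}")
```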
NASA Astrophysics Data System (ADS)
Polyakhova, E. N.; Ovchinnikov, M. Yu.; Tikhonov, A. A.
2018-05-01
A short bibliographic review of several books on astrodynamics written in Russian is presented within the frame of the scientific legacy of Friedrich Tsander (1887-1933), an outstanding Soviet engineer, inventor, and scientist who lived in Moscow in the 1920s. Our review concerns only ten books edited and published in 2000-2017. We consider these books a necessary and honorable contribution to Tsander's memory on the 130th anniversary of his birthday. We outline several sections as follows: New Translations into Russian, Textbooks and Educational Issues, Scientific Monograph on Astrodynamics, Electrodynamics and Magnetism in Space, and Solar Sailing Dynamics.
Updated Starshade Technology Gap List
NASA Astrophysics Data System (ADS)
Crill, Brendan P.; Siegler, Nicholas
2017-01-01
NASA's Exoplanet Exploration Program (ExEP) guides the development of technology that enables the direct imaging and characterization of exo-Earths in the habitable zone of their stars for future space observatories. Here we present the Starshade portion of the 2017 ExEP Enabling Technology Gap List, an annual update to ExEP's list of technology to be advanced in the next 1-5 years. A Starshade is an external occulter on an independent spacecraft, allowing a space telescope to achieve exo-Earth imaging contrast requirements by blocking starlight before it enters the telescope. Building and operating a Starshade requires new technology: the occulter is a structure tens of meters in diameter that must be positioned precisely at a distance of tens of thousands of kilometers from the telescope. We review the current state-of-the-art performance and the performance level that must be achieved for a Starshade.
NASA Astrophysics Data System (ADS)
Yang, HongJiang; Wang, Enliang; Dong, WenXiu; Gong, Maomao; Shen, Zhenjie; Tang, Yaguo; Shan, Xu; Chen, Xiangjun
2018-05-01
The ab initio molecular dynamics (MD) simulations using an atom-centered density matrix propagation method have been carried out to investigate the fragmentation of the ground-state triply charged carbon dioxide, CO2(3+) → C+ + Oa+ + Ob+. Ten thousand trajectories have been simulated. By analyzing the momentum correlation of the final fragments, it is demonstrated that sequential fragmentation dominates the three-body dissociation, consistent with our experimental observations, which were performed by electron collision at an impact energy of 1500 eV. Furthermore, the MD simulations allow us to gain detailed insight into the ultrafast evolution of the molecular bond breakage at a very early stage, within several tens of femtoseconds, and the result shows that the initial nuclear vibrational mode plays a decisive role in switching the dissociation pathways.
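As a hedged illustration of the momentum-correlation analysis mentioned above (not the authors' code), the sketch below takes toy final-fragment momenta, obtains the C+ momentum from conservation, and histograms the angle between the two O+ momenta, a quantity commonly used to distinguish sequential from concerted three-body breakup; all arrays and values are invented.

```python
# Sketch of a momentum-correlation analysis for three-body Coulomb explosion,
# CO2(3+) -> C+ + O+ + O+. Toy momenta stand in for real trajectory output.
import numpy as np

rng = np.random.default_rng(1)
n = 10000
p_o_a = rng.normal(0.0, 50.0, size=(n, 3))   # momentum of first O+ (arbitrary units)
p_o_b = rng.normal(0.0, 50.0, size=(n, 3))   # momentum of second O+
p_c = -(p_o_a + p_o_b)                        # C+ momentum from momentum conservation

# Angle between the two O+ momenta: concerted breakup clusters near 180 degrees,
# sequential breakup gives a much broader distribution.
cos_theta = np.sum(p_o_a * p_o_b, axis=1) / (
    np.linalg.norm(p_o_a, axis=1) * np.linalg.norm(p_o_b, axis=1))
theta_deg = np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0)))

hist, edges = np.histogram(theta_deg, bins=36, range=(0.0, 180.0))
print(hist)
```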
NCBI GEO: mining tens of millions of expression profiles--database and tools update.
Barrett, Tanya; Troup, Dennis B; Wilhite, Stephen E; Ledoux, Pierre; Rudnev, Dmitry; Evangelista, Carlos; Kim, Irene F; Soboleva, Alexandra; Tomashevsky, Maxim; Edgar, Ron
2007-01-01
The Gene Expression Omnibus (GEO) repository at the National Center for Biotechnology Information (NCBI) archives and freely disseminates microarray and other forms of high-throughput data generated by the scientific community. The database has a minimum information about a microarray experiment (MIAME)-compliant infrastructure that captures fully annotated raw and processed data. Several data deposit options and formats are supported, including web forms, spreadsheets, XML and Simple Omnibus Format in Text (SOFT). In addition to data storage, a collection of user-friendly web-based interfaces and applications are available to help users effectively explore, visualize and download the thousands of experiments and tens of millions of gene expression patterns stored in GEO. This paper provides a summary of the GEO database structure and user facilities, and describes recent enhancements to database design, performance, submission format options, data query and retrieval utilities. GEO is accessible at http://www.ncbi.nlm.nih.gov/geo/
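For readers who want to reach GEO programmatically, the hedged sketch below queries the public NCBI E-utilities esearch endpoint against the GEO DataSets database (db=gds); the example search term and retmax value are arbitrary choices, and the exact fields returned may differ.

```python
# Minimal sketch: search GEO DataSets (db=gds) through NCBI E-utilities.
# The query string is an arbitrary example; check NCBI usage guidelines before
# sending many requests.
import json
import urllib.parse
import urllib.request

term = 'breast cancer AND "expression profiling by array"[DataSet Type]'
url = ("https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi?"
       + urllib.parse.urlencode({"db": "gds", "term": term,
                                 "retmode": "json", "retmax": 5}))

with urllib.request.urlopen(url) as resp:
    result = json.load(resp)["esearchresult"]

print("records found:", result["count"])
print("first IDs:", result["idlist"])
```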
Assessing, Modeling, and Monitoring the Impacts of Extreme Climate Events
NASA Astrophysics Data System (ADS)
Murnane, Richard J.; Diaz, Henry F.
2006-01-01
Extreme weather and climate events provide dramatic content for the news media, and the past few years have supplied plenty of material. The 2004 and 2005 Atlantic hurricane seasons were very active; the United States was struck repeatedly by landfalling major hurricanes. A five-year drought in the southwestern United States was punctuated in 2003 by wildfires in southern California that caused billions of dollars in losses. Ten cyclones of at least tropical storm strength struck Japan in 2004, easily breaking the 1990 and 1993 records of six cyclones each year. Hurricane Catarina was the first recorded hurricane in the South Atlantic. Europe's summer of 2003 saw record-breaking heat that caused tens of thousands of deaths. These events have all been widely publicized, and they naturally raise several questions: Is climate changing, and if so, why? What can we expect in the future? How can we better respond to climate variability regardless of its source?
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhao, Luning; Neuscamman, Eric
We present a modification to variational Monte Carlo’s linear method optimization scheme that addresses a critical memory bottleneck while maintaining compatibility with both the traditional ground state variational principle and our recently-introduced variational principle for excited states. For wave function ansatzes with tens of thousands of variables, our modification reduces the required memory per parallel process from tens of gigabytes to hundreds of megabytes, making the methodology a much better fit for modern supercomputer architectures in which data communication and per-process memory consumption are primary concerns. We verify the efficacy of the new optimization scheme in small molecule tests involving both the Hilbert space Jastrow antisymmetric geminal power ansatz and real space multi-Slater Jastrow expansions. Satisfied with its performance, we have added the optimizer to the QMCPACK software package, with which we demonstrate on a hydrogen ring a prototype approach for making systematically convergent, non-perturbative predictions of Mott insulators’ optical band gaps.
Hydrothermal plumes over spreading-center axes: Global distributions and geological inferences
NASA Astrophysics Data System (ADS)
Baker, Edward T.; German, Christopher R.; Elderfield, Henry
Seafloor hydrothermal circulation is the principal agent of energy and mass exchange between the ocean and the earth's crust. Discharging fluids cool hot rock, construct mineral deposits, nurture biological communities, alter deep-sea mixing and circulation patterns, and profoundly influence ocean chemistry and biology. Although the active discharge orifices themselves cover only a minuscule percentage of the ridge-axis seafloor, the investigation and quantification of their effects is enhanced as a consequence of the mixing process that forms hydrothermal plumes. Hydrothermal fluids discharged from vents are rapidly diluted with ambient seawater by factors of 10^4-10^5 [Lupton et al., 1985]. During dilution, the mixture rises tens to hundreds of meters to a level of neutral buoyancy, eventually spreading laterally as a distinct hydrographic and chemical layer with a spatial scale of tens to thousands of kilometers [e.g., Lupton and Craig, 1981; Baker and Massoth, 1987; Speer and Rona, 1989].
A Molecular Dynamic Modeling of Hemoglobin-Hemoglobin Interactions
NASA Astrophysics Data System (ADS)
Wu, Tao; Yang, Ye; Sheldon Wang, X.; Cohen, Barry; Ge, Hongya
2010-05-01
In this paper, we present a study of hemoglobin-hemoglobin interaction with model reduction methods. We begin with a simple spring-mass system with given parameters (mass and stiffness). With this known system, we compare the mode superposition method with Singular Value Decomposition (SVD) based Principal Component Analysis (PCA). Through PCA we are able to recover the principal direction of this system, namely the model direction. This model direction will be matched with the eigenvector derived from mode superposition analysis. The same technique will be implemented in a much more complicated hemoglobin-hemoglobin molecule interaction model, in which thousands of atoms in hemoglobin molecules are coupled with tens of thousands of T3 water molecule models. In this model, complex inter-atomic and inter-molecular potentials are replaced by nonlinear springs. We employ the same method to get the most significant modes and their frequencies of this complex dynamical system. More complex physical phenomena can then be further studied by these coarse grained models.
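A minimal sketch of the comparison described above, assuming a fixed-fixed chain of identical masses and springs with invented parameters: the eigenvectors of M^-1 K give the mode shapes, and SVD/PCA of simulated snapshots recovers the dominant direction, up to sign.

```python
# Sketch: compare the analytical mode shapes of a small fixed-fixed spring-mass
# chain with the principal direction recovered by SVD/PCA from a simulated
# trajectory. Masses, stiffnesses and initial conditions are illustrative only.
import numpy as np

n, m, kspr, dt, steps = 4, 1.0, 10.0, 0.01, 20000

K = 2 * kspr * np.eye(n) - kspr * (np.eye(n, k=1) + np.eye(n, k=-1))  # chain stiffness
M = m * np.eye(n)                                                     # diagonal mass matrix

# Mode superposition: eigenvectors of M^-1 K are the mode shapes
w2, modes = np.linalg.eigh(np.linalg.inv(M) @ K)

# Start close to the lowest mode (plus noise) and integrate with semi-implicit Euler
rng = np.random.default_rng(2)
x = modes[:, 0] + 0.1 * rng.normal(size=n)
v = np.zeros(n)
snapshots = np.empty((steps, n))
for i in range(steps):
    v += -np.linalg.solve(M, K @ x) * dt
    x += v * dt
    snapshots[i] = x

# PCA via SVD of the centered snapshot matrix; the leading principal direction
# should match the lowest mode shape up to sign.
U, s, Vt = np.linalg.svd(snapshots - snapshots.mean(axis=0), full_matrices=False)
print("first principal direction:", np.round(Vt[0], 3))
print("lowest mode shape:        ", np.round(modes[:, 0], 3))
```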
Droplet microfluidics--a tool for single-cell analysis.
Joensson, Haakan N; Andersson Svahn, Helene
2012-12-03
Droplet microfluidics allows the isolation of single cells and reagents in monodisperse picoliter liquid capsules and manipulations at a throughput of thousands of droplets per second. These qualities allow many of the challenges in single-cell analysis to be overcome. Monodispersity enables quantitative control of solute concentrations, while encapsulation in droplets provides an isolated compartment for the single cell and its immediate environment. The high throughput allows the processing and analysis of the tens of thousands to millions of cells that must be analyzed to accurately describe a heterogeneous cell population so as to find rare cell types or access sufficient biological space to find hits in a directed evolution experiment. The low volumes of the droplets make very large screens economically viable. This Review gives an overview of the current state of single-cell analysis involving droplet microfluidics and offers examples where droplet microfluidics can further biological understanding. Copyright © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Edge systems in the deep ocean
NASA Astrophysics Data System (ADS)
Coon, Andrew; Earp, Samuel L.
2010-04-01
DARPA has initiated a program to explore persistent presence in the deep ocean. The deep ocean is difficult to access and presents a hostile environment. Persistent operations in the deep ocean will require new technology for energy, communications and autonomous operations. Several fundamental characteristics of the deep ocean shape any potential system architecture. The deep sea presents acoustic sensing opportunities that may provide significantly enhanced sensing footprints relative to sensors deployed at traditional depths. Communication limitations drive solutions towards autonomous operation of the platforms and automation of data collection and processing. Access to the seabed presents an opportunity for fixed infrastructure with no important limitations on size and weight. Difficult access and persistence impose requirements for long-life energy sources and potentially energy harvesting. The ocean is immense, so there is a need to scale the system footprint for presence over tens of thousands and perhaps hundreds of thousands of square nautical miles. This paper focuses on the aspect of distributed sensing, and the engineering of networks of sensors to cover the required footprint.
Sriramoju, Manoj Kumar; Chen, Yen; Lee, Yun-Tzai Cloud; Hsu, Shang-Te Danny
2018-05-04
More than one thousand knotted protein structures have been identified so far, but the functional roles of these knots remain elusive. It has been postulated that backbone entanglement may provide additional mechanostability. Here, we employed a bacterial proteasome, ClpXP, to mechanically unfold 5_2-knotted human ubiquitin C-terminal hydrolase (UCH) paralogs from their C-termini, followed by processive translocation into the proteolytic chamber for degradation. Our results revealed unprecedentedly slow kinetics of ClpXP-mediated proteolysis for the proteasome-associated UCHL5: ten thousand times slower than that of a green fluorescent protein (GFP), which has a comparable size to the UCH domain but much higher chemical and thermal stabilities. The ClpXP-dependent mechanostability positively correlates with the intrinsic unfolding rates of the substrates, spanning several orders of magnitude for the UCHs. The broad range of mechanostability within the same protein family may be associated with the functional requirements for their differential malleabilities.
DataWarrior: an open-source program for chemistry aware data visualization and analysis.
Sander, Thomas; Freyss, Joel; von Korff, Modest; Rufener, Christian
2015-02-23
Drug discovery projects in the pharmaceutical industry accumulate thousands of chemical structures and tens of thousands of data points from a dozen or more biological and pharmacological assays. A sufficient interpretation of the data requires understanding which molecular families are present, which structural motifs correlate with measured properties, and which tiny structural changes cause large property changes. Data visualization and analysis software with sufficient chemical intelligence to support chemists in this task is rare. In an attempt to contribute to filling the gap, we released our in-house developed chemistry-aware data analysis program DataWarrior for free public use. This paper gives an overview of DataWarrior's functionality and architecture. As an example, a new unsupervised, 2-dimensional scaling algorithm is presented, which employs vector-based or nonvector-based descriptors to visualize the chemical or pharmacophore space of even large data sets. DataWarrior uses this method to interactively explore chemical space, activity landscapes, and activity cliffs.
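As a generic stand-in for the kind of 2-D scaling described above (explicitly not DataWarrior's own algorithm), the sketch below applies classical multidimensional scaling to Tanimoto distances between invented binary fingerprints.

```python
# Sketch: 2-D projection of binary molecular fingerprints by classical MDS on
# Tanimoto distances. A generic stand-in, not DataWarrior's algorithm.
import numpy as np

rng = np.random.default_rng(3)
fps = rng.integers(0, 2, size=(200, 512)).astype(float)   # toy fingerprints

# Tanimoto similarity and distance between all pairs
inter = fps @ fps.T
counts = fps.sum(axis=1)
union = counts[:, None] + counts[None, :] - inter
dist = 1.0 - inter / np.maximum(union, 1.0)

# Classical MDS: double-center the squared distances, then eigendecompose
n = dist.shape[0]
J = np.eye(n) - np.ones((n, n)) / n
B = -0.5 * J @ (dist ** 2) @ J
vals, vecs = np.linalg.eigh(B)
coords = vecs[:, -2:] * np.sqrt(np.maximum(vals[-2:], 0.0))   # top two components

print(coords[:5])
```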
Upadhyay, Amit A.; Fleetwood, Aaron D.; Adebali, Ogun; ...
2016-04-06
Cellular receptors usually contain a designated sensory domain that recognizes the signal. Per/Arnt/Sim (PAS) domains are ubiquitous sensors in thousands of species ranging from bacteria to humans. Although PAS domains were described as intracellular sensors, recent structural studies revealed PAS-like domains in extracytoplasmic regions in several transmembrane receptors. However, these structurally defined extracellular PAS-like domains do not match sequence-derived PAS domain models, and thus their distribution across the genomic landscape remains largely unknown. Here we show that structurally defined extracellular PAS-like domains belong to the Cache superfamily, which is homologous to, but distinct from the PAS superfamily. Our newly built computational models enabled identification of Cache domains in tens of thousands of signal transduction proteins including those from important pathogens and model organisms. Moreover, we show that Cache domains comprise the dominant mode of extracellular sensing in prokaryotes.
Exploring the Influence of Dynamic Disorder on Excitons in Solid Pentacene
NASA Astrophysics Data System (ADS)
Wang, Zhiping; Sharifzadeh, Sahar; Doak, Peter; Lu, Zhenfei; Neaton, Jeffrey
2014-03-01
A complete understanding of the spectroscopic and charge transport properties of organic semiconductors requires knowledge of the role of thermal fluctuations and dynamic disorder. We present a first-principles theoretical study aimed at understanding the degree to which dynamic disorder at room temperature results in energy level broadening and excited-state localization within bulk crystalline pentacene. Ab initio molecular dynamics simulations are well-equilibrated for 7-9 ps and tens of thousands of structural snapshots, taken at 0.5 fs intervals, provide input for many-body perturbation theory within the GW approximation and Bethe-Salpeter equation (BSE) approach. The GW-corrected density of states, including thousands of snapshots, indicates that thermal fluctuations significantly broaden the valence and conduction states by >0.2 eV. Additionally, we investigate the nature and energy of the lowest energy singlet and triplet excitons, computed for a set of uncorrelated and energetically preferred structures. This work supported by DOE; computational resources provided by NERSC.
Our Globally Changing Climate. Chapter 1
NASA Technical Reports Server (NTRS)
Wuebbles, D. J.; Easterling, D. R.; Hayhoe, K.; Knutson, T.; Kopp, R. E.; Kossin, J. P.; Kunkel, K. E.; LeGrande, A. N.; Mears, C.; Sweet, W. V.;
2017-01-01
Since the Third U.S. National Climate Assessment (NCA3) was published in May 2014, new observations along multiple lines of evidence have strengthened the conclusion that Earth's climate is changing at a pace and in a pattern not explainable by natural influences. While this report focuses especially on observed and projected future changes for the United States, it is important to understand those changes in the global context (this chapter). The world has warmed over the last 150 years, especially over the last six decades, and that warming has triggered many other changes to Earth's climate. Evidence for a changing climate abounds, from the top of the atmosphere to the depths of the oceans. Thousands of studies conducted by tens of thousands of scientists around the world have documented changes in surface, atmospheric, and oceanic temperatures; melting glaciers; disappearing snow cover; shrinking sea ice; rising sea level; and an increase in atmospheric water vapor. Rainfall patterns and storms are changing, and the occurrence of droughts is shifting.
Friend, M.; Franson, J.C.
1999-01-01
Individual disease outbreaks have killed many thousands of animals on numerous occasions. Tens of thousands of migratory birds have died in single die-offs with as many as 1,000 birds succumbing in 1 day. The ability to successfully combat such explosive situations is highly dependent on the readiness of field personnel to deal with them. Because many disease agents can spread through wildlife populations very quickly, advance preparation is essential for preventing infected animals from spreading disease to additional species and locations. Carefully thought-out disease contingency plans should be developed as practical working documents for field personnel and updated as necessary. Well-designed plans can prove invaluable in minimizing wildlife losses and the costs associated with disease control activities. Although requirements for disease control operations vary and must be tailored to each situation, all disease contingency planning involves general concepts and basic biological information. This chapter, which is intended to be a practical guide, identifies the major activities and needs of disease control operations, and relates them to disease contingency planning.
Unveiling adaptation using high-resolution lineage tracking
NASA Astrophysics Data System (ADS)
Blundell, Jamie; Levy, Sasha; Fisher, Daniel; Petrov, Dmitri; Sherlock, Gavin
2013-03-01
Human diseases such as cancer and microbial infections are adaptive processes inside the human body with enormous population sizes: between 10^6 and 10^12 cells. In spite of this, our understanding of adaptation in large populations is limited. The key problem is the difficulty of identifying anything more than a handful of rare, large-effect beneficial mutations. The development and use of molecular barcodes allows us to uniquely tag hundreds of thousands of cells and to track tens of thousands of adaptive mutations in large yeast populations. We use this system to test some of the key theories on which our understanding of adaptation in large populations is based. We (i) measure the fitness distribution in an evolving population at different times, (ii) identify when an appreciable fraction of clones in the population have at most a single adaptive mutation and isolate a large number of clones with independent single adaptive mutations, and (iii) use this clone collection to determine the distribution of fitness effects of single beneficial mutations.
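A minimal sketch of how lineage fitness can be read off barcode count trajectories, assuming roughly exponential dynamics; the time points and counts below are invented, and this is not the authors' inference pipeline.

```python
# Sketch: estimate per-lineage fitness from barcode count trajectories.
# Under exponential growth, log-frequency changes are roughly linear in time
# with slope equal to the fitness advantage. Counts are invented.
import numpy as np

timepoints = np.array([0.0, 8.0, 16.0, 24.0])          # generations (hypothetical)
counts = np.array([[120, 130, 150, 160],               # one row per barcode lineage
                   [100, 210, 430, 900],
                   [500, 480, 470, 455]], dtype=float)

freqs = counts / counts.sum(axis=0)                    # lineage frequencies per timepoint
logf = np.log(freqs)

# Least-squares slope of log-frequency vs time, reported relative to the mean lineage
slopes = np.polyfit(timepoints, logf.T, 1)[0]
fitness = slopes - slopes.mean()
print(np.round(fitness, 3))
```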
Huang, Rao; Lo, Li-Ta; Wen, Yuhua; Voter, Arthur F; Perez, Danny
2017-10-21
Modern molecular-dynamics-based techniques are extremely powerful to investigate the dynamical evolution of materials. With the increase in sophistication of the simulation techniques and the ubiquity of massively parallel computing platforms, atomistic simulations now generate very large amounts of data, which have to be carefully analyzed in order to reveal key features of the underlying trajectories, including the nature and characteristics of the relevant reaction pathways. We show that clustering algorithms, such as the Perron Cluster Cluster Analysis, can provide reduced representations that greatly facilitate the interpretation of complex trajectories. To illustrate this point, clustering tools are used to identify the key kinetic steps in complex accelerated molecular dynamics trajectories exhibiting shape fluctuations in Pt nanoclusters. This analysis provides an easily interpretable coarse representation of the reaction pathways in terms of a handful of clusters, in contrast to the raw trajectory that contains thousands of unique states and tens of thousands of transitions.
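As a hedged caricature of the clustering idea (not the Perron Cluster Cluster Analysis implementation used by the authors), the sketch below builds a transition matrix from a synthetic discrete-state trajectory and splits the states using the sign structure of the second, slowest non-trivial right eigenvector.

```python
# Sketch: coarse-grain a discrete-state trajectory by splitting states with the
# slow right eigenvector of its transition matrix (a caricature of Perron cluster
# analysis, not the authors' PCCA implementation). The trajectory is synthetic.
import numpy as np

rng = np.random.default_rng(4)
traj = [0]
for _ in range(20000):
    s = traj[-1]
    if rng.random() < 0.02:                         # rare hop to the other metastable group
        s = int(rng.integers(3, 6)) if s < 3 else int(rng.integers(0, 3))
    else:                                           # frequent moves within the same group
        s = int(rng.integers(0, 3)) if s < 3 else int(rng.integers(3, 6))
    traj.append(s)
traj = np.array(traj)

n_states = 6
C = np.zeros((n_states, n_states))
np.add.at(C, (traj[:-1], traj[1:]), 1.0)            # transition counts
T = C / C.sum(axis=1, keepdims=True)                # row-stochastic transition matrix

vals, vecs = np.linalg.eig(T)
order = np.argsort(-vals.real)
slow = vecs[:, order[1]].real                       # second-slowest process separates the groups
labels = (slow > 0).astype(int)
print("cluster label per state:", labels)           # expect {0,1,2} vs {3,4,5}
```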
2D photoacoustic scanning imaging with a single pulsed laser diode excitation
NASA Astrophysics Data System (ADS)
Chen, Xuegang; Li, Changwei; Zeng, Lvming; Liu, Guodong; Huang, Zhen; Ren, Zhong
2012-03-01
A portable near-infrared photoacoustic scanning imaging system has been developed with a single pulsed laser diode, which was integrated with an optical lens system to directly boost the laser energy density for photoacoustic generation. The 905 nm laser diode provides a maximum energy output of 14 μJ within a 100 ns pulse duration, and the pulse repetition frequency is 0.8 kHz. As a possible alternative light source, the preliminary 2D photoacoustic results correspond well with the test phantoms of umbonate extravasated gore and a knotted blood vessel network. The photoacoustic SNR can reach 20.6+/-1.2 dB while the number of averaged pulses is reduced to 128, from the thousands to tens of thousands typically required, and the signal acquisition time is reduced to less than 0.2 s per A-scan; moreover, the volume of the total radiation source is only 10 × 3 × 3 cm3. It demonstrated that the pulsed semiconductor laser could be a candidate photoacoustic source for daily clinical application.
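The averaging trade-off quoted above follows the usual square-root-of-N rule; the short simulation below, with an invented pulse shape and noise level, illustrates it.

```python
# Sketch: SNR gain from averaging repeated pulses scales roughly with sqrt(N).
# The waveform and noise level are invented for illustration.
import numpy as np

rng = np.random.default_rng(5)
t = np.linspace(0.0, 1.0, 500)
signal = np.exp(-((t - 0.5) / 0.02) ** 2)               # idealized photoacoustic pulse

def snr_db(n_avg, noise_sigma=2.0):
    shots = signal + rng.normal(0.0, noise_sigma, size=(n_avg, t.size))
    avg = shots.mean(axis=0)
    noise = avg - signal
    return 20.0 * np.log10(signal.max() / noise.std())

for n in (1, 128, 1024, 16384):
    print(n, round(snr_db(n), 1))
```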
Ultrapulse welding: A new joining technique. [for automotive industry
NASA Technical Reports Server (NTRS)
Anderson, D. G.
1972-01-01
The ultrapulse process is a resistance welding process that utilizes unidirectional current of high magnitude for a very short time with a precisely controlled dynamic force pulse. Peak currents of up to 220,000 amperes for two to ten milliseconds are used with synchronized force pulses of up to nine thousand pounds. The welding current passing through the relatively high resistance of the interface between the parts that are being joined results in highly localized heating. Described is the UPW process as it applies to the automotive industry.
Promising application of dynamic nuclear polarization for in vivo (13)C MR imaging.
Yen, Yi-Fen; Nagasawa, Kiyoshi; Nakada, Tsutomu
2011-01-01
Use of hyperpolarized (13)C in magnetic resonance (MR) imaging is a new technique that enhances signal tens of thousands-fold. Recent in vivo animal studies of metabolic imaging that used hyperpolarized (13)C demonstrated its potential in many applications for disease indication, metabolic profiling, and treatment monitoring. We review the basic physics for dynamic nuclear polarization (DNP) and in vivo studies reported in prostate cancer research, hepatocellular carcinoma research, diabetes and cardiac applications, brain metabolism, and treatment response as well as investigations of various DNP (13)C substrates.
Using VizieR/Aladin to Measure Neglected Double Stars
NASA Astrophysics Data System (ADS)
Harshaw, Richard
2013-04-01
The VizieR service of the Centre de Données astronomiques de Strasbourg (France) offers amateur astronomers a treasure trove of resources, including access to the most current version of the Washington Double Star Catalog (WDS) and links to tens of thousands of digitized sky survey plates via the Aladin Java applet. These plates allow the amateur to make accurate measurements of position angle and separation for many neglected pairs that fall within reasonable tolerances for the use of Aladin. This paper presents 428 measurements of 251 neglected pairs from the WDS.
Owen, Jesse; Imel, Zac E
2016-04-01
This article introduces the special section on utilizing large data sets to explore psychotherapy processes and outcomes. The increased use of technology has provided new opportunities for psychotherapy researchers. In particular, there is a rise in large databases of tens of thousands of clients. Additionally, there are new ways to pool valuable resources for meta-analytic processes. At the same time, these tools also come with limitations. These issues are introduced, along with a brief overview of the articles. (c) 2016 APA, all rights reserved.
Holographic Characterization of Colloidal Fractal Aggregates
NASA Astrophysics Data System (ADS)
Wang, Chen; Cheong, Fook Chiong; Ruffner, David B.; Zhong, Xiao; Ward, Michael D.; Grier, David G.
In-line holographic microscopy images of micrometer-scale fractal aggregates can be interpreted with the Lorenz-Mie theory of light scattering and an effective-sphere model to obtain each aggregate's size and the population-averaged fractal dimension. We demonstrate this technique experimentally using model fractal clusters of polystyrene nanoparticles and fractal protein aggregates composed of bovine serum albumin and bovine pancreas insulin. This technique can characterize several thousand aggregates in ten minutes and naturally distinguishes aggregates from contaminants such as silicone oil droplets. Work supported by the SBIR program of the NSF.
1994-03-01
reality the structure of even one individual aircraft consists of many batches and the tens of thousands of cars of one type manufactured in even... generated neural network power spectral densities of surface pressures are used to augment existing data and then load an elastic finite element... investigated for possible use in augmenting this information which is required for fatigue life calculations. Since empennage environments on fighter
Inductive System for Reliable Magnesium Level Detection in a Titanium Reduction Reactor
NASA Astrophysics Data System (ADS)
Krauter, Nico; Eckert, Sven; Gundrum, Thomas; Stefani, Frank; Wondrak, Thomas; Frick, Peter; Khalilov, Ruslan; Teimurazov, Andrei
2018-05-01
The determination of the Magnesium level in a Titanium reduction retort by inductive methods is often hampered by the formation of Titanium sponge rings which disturb the propagation of electromagnetic signals between excitation and receiver coils. We present a new method for the reliable identification of the Magnesium level which explicitly takes into account the presence of sponge rings with unknown geometry and conductivity. The inverse problem is solved by a look-up-table method, based on the solution of the inductive forward problems for several tens of thousands of parameter combinations.
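A minimal sketch of a look-up-table inversion under stated assumptions: a dummy forward model (standing in for the real electromagnetic simulation) is tabulated over a grid of (level, ring conductivity, ring height) values, and a noisy measurement is inverted by nearest-neighbour search; all names and numbers are hypothetical.

```python
# Sketch of a look-up-table inversion: tabulate a (dummy) forward model over a
# parameter grid, then invert a measurement by nearest-neighbour search.
# The forward model below is a placeholder, not the real inductive simulation.
import itertools
import numpy as np

def forward_model(level, ring_cond, ring_height, freqs=np.linspace(1.0, 10.0, 8)):
    # Placeholder "receiver signal" that depends smoothly on all three parameters.
    return level * np.sin(freqs) + ring_cond * np.exp(-freqs / (1.0 + ring_height))

levels = np.linspace(0.0, 1.0, 21)
conds = np.linspace(0.1, 2.0, 20)
heights = np.linspace(0.0, 0.5, 11)

params = np.array(list(itertools.product(levels, conds, heights)))
table = np.array([forward_model(*p) for p in params])      # a few thousand entries here

# "Measurement": forward model at a hidden parameter set plus noise
rng = np.random.default_rng(6)
measured = forward_model(0.63, 1.2, 0.25) + rng.normal(0.0, 0.01, size=8)

best = np.argmin(np.linalg.norm(table - measured, axis=1))
print("recovered (level, conductivity, height):", params[best])
```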
2006-11-01
A small problem domain can require millions of solution variables solved repeatedly for tens of thousands of time steps. Finally, the ... terms of vector and scalar potentials, A and ψ respectively: E = −(∂A/∂t + ∇ψ) = E_rot + E_irr (5). Since the curl of a gradient is always zero, ∇ψ
The Genetics of Canine Skull Shape Variation
Schoenebeck, Jeffrey J.; Ostrander, Elaine A.
2013-01-01
A dog’s craniofacial diversity is the result of continual human intervention in natural selection, a process that began tens of thousands of years ago. To date, we know little of the genetic underpinnings and developmental mechanisms that make dog skulls so morphologically plastic. In this Perspectives, we discuss the origins of dog skull shapes in terms of history and biology and highlight recent advances in understanding the genetics of canine skull shapes. Of particular interest are those molecular genetic changes that are associated with the development of distinct breeds. PMID:23396475
Twist-induced tuning in tapered fiber couplers.
Birks, T A
1989-10-01
The power-splitting ratio of fused tapered single-mode fiber couplers can be reversibly tuned by axial twisting without affecting loss. The twist-tuning behavior of a range of different tapered couplers is described. A simple expression for twist-tuning can be derived by representing the effects of twist by a change in the refractive index profile. Good agreement between this expression and experimental results is demonstrated. Repeated tuning over tens of thousands of cycles is found not to degrade coupler performance, and a number of practical applications, including a freely tunable tapered coupler, are described.
Preliminary Investigation into the Water Usage from Fracking in Drought Ridden California
NASA Astrophysics Data System (ADS)
Lew, S.; Wu, M.
2014-12-01
Hydraulic fracking is a common method used to obtain natural gas as well as oil from the ground. The process begins with drilling into the ground, which is then followed by thousands of gallons of fluid being pumped into the ground to break the shale rock and release natural gas. The job requires thousands of gallons of water, and chemicals are added to the water, often making it unusable for other purposes. The amount of water being used for fracking in California has recently been brought to attention because the state is currently facing a drought. California is currently experiencing its worst drought since the 1920s. In the 2013-2014 time frame, California rainfall was 50% below average, with 2013 being the driest year. The lack of rain is attributed to the Pacific Decadal Oscillation, which occurs every 20-30 years and causes the Pacific Ocean to cool, leading to less rain because storms are diverted to the north. As a result of the drought, food prices are expected to rise and farmers are pumping 75% of their water needs from reserved aquifers.
In-Frame Mutations in Exon 1 of SKI Cause Dominant Shprintzen-Goldberg Syndrome
Carmignac, Virginie; Thevenon, Julien; Adès, Lesley; Callewaert, Bert; Julia, Sophie; Thauvin-Robinet, Christel; Gueneau, Lucie; Courcet, Jean-Benoit; Lopez, Estelle; Holman, Katherine; Renard, Marjolijn; Plauchu, Henri; Plessis, Ghislaine; De Backer, Julie; Child, Anne; Arno, Gavin; Duplomb, Laurence; Callier, Patrick; Aral, Bernard; Vabres, Pierre; Gigot, Nadège; Arbustini, Eloisa; Grasso, Maurizia; Robinson, Peter N.; Goizet, Cyril; Baumann, Clarisse; Di Rocco, Maja; Sanchez Del Pozo, Jaime; Huet, Frédéric; Jondeau, Guillaume; Collod-Beroud, Gwenaëlle; Beroud, Christophe; Amiel, Jeanne; Cormier-Daire, Valérie; Rivière, Jean-Baptiste; Boileau, Catherine; De Paepe, Anne; Faivre, Laurence
2012-01-01
Shprintzen-Goldberg syndrome (SGS) is characterized by severe marfanoid habitus, intellectual disability, camptodactyly, typical facial dysmorphism, and craniosynostosis. Using family-based exome sequencing, we identified a dominantly inherited heterozygous in-frame deletion in exon 1 of SKI. Direct sequencing of SKI further identified one overlapping heterozygous in-frame deletion and ten heterozygous missense mutations affecting recurrent residues in 18 of the 19 individuals screened for SGS; these individuals included one family affected by somatic mosaicism. All mutations were located in a restricted area of exon 1, within the R-SMAD binding domain of SKI. No mutation was found in a cohort of 11 individuals with other marfanoid-craniosynostosis phenotypes. The interaction between SKI and Smad2/3 and Smad 4 regulates TGF-β signaling, and the pattern of anomalies in Ski-deficient mice corresponds to the clinical manifestations of SGS. These findings define SGS as a member of the family of diseases associated with the TGF-β-signaling pathway. PMID:23103230
2000-04-01
be an extension of Utah’s nascent Quarks system, oriented to closely coupled cluster environments. However, the grant did not actually begin until... Intel x86, implemented ten virtual machine monitors and servers, including a virtual memory manager, a checkpointer, a process manager, a file server...Fluke, we developed a novel hierarchical processor scheduling framework called CPU inheritance scheduling [5]. This is a framework for scheduling
Ultrafast ultrasound localization microscopy for deep super-resolution vascular imaging
NASA Astrophysics Data System (ADS)
Errico, Claudia; Pierre, Juliette; Pezet, Sophie; Desailly, Yann; Lenkei, Zsolt; Couture, Olivier; Tanter, Mickael
2015-11-01
Non-invasive imaging deep into organs at microscopic scales remains an open quest in biomedical imaging. Although optical microscopy is still limited to surface imaging owing to optical wave diffusion and fast decorrelation in tissue, revolutionary approaches such as fluorescence photo-activated localization microscopy led to a striking increase in resolution by more than an order of magnitude in the last decade. In contrast with optics, ultrasonic waves propagate deep into organs without losing their coherence and are much less affected by in vivo decorrelation processes. However, their resolution is impeded by the fundamental limits of diffraction, which impose a long-standing trade-off between resolution and penetration. This limits clinical and preclinical ultrasound imaging to a sub-millimetre scale. Here we demonstrate in vivo that ultrasound imaging at ultrafast frame rates (more than 500 frames per second) provides an analogue to optical localization microscopy by capturing the transient signal decorrelation of contrast agents—inert gas microbubbles. Ultrafast ultrasound localization microscopy allowed both non-invasive sub-wavelength structural imaging and haemodynamic quantification of rodent cerebral microvessels (less than ten micrometres in diameter) more than ten millimetres below the tissue surface, leading to transcranial whole-brain imaging within short acquisition times (tens of seconds). After intravenous injection, single echoes from individual microbubbles were detected through ultrafast imaging. Their localization, not limited by diffraction, was accumulated over 75,000 images, yielding 1,000,000 events per coronal plane and statistically independent pixels of ten micrometres in size. Precise temporal tracking of microbubble positions allowed us to extract accurately in-plane velocities of the blood flow with a large dynamic range (from one millimetre per second to several centimetres per second). These results pave the way for deep non-invasive microscopy in animals and humans using ultrasound. We anticipate that ultrafast ultrasound localization microscopy may become an invaluable tool for the fundamental understanding and diagnostics of various disease processes that modify the microvascular blood flow, such as cancer, stroke and arteriosclerosis.
NASA Technical Reports Server (NTRS)
Mena-Werth, Jose
1998-01-01
The Vulcan Photometric Planet Search is the ground-based counterpart of the Kepler Mission proposal. The Kepler proposal calls for the launch of a telescope to look intently at a small patch of sky for four years. The mission is designed to look for extra-solar planets that transit sun-like stars. The Kepler Mission should be able to detect Earth-size planets. This goal requires an instrument and software capable of detecting photometric changes of several parts per hundred thousand in the flux of a star. The goal also requires the continuous monitoring of about a hundred thousand stars. The Kepler Mission is a NASA Discovery-class proposal similar in cost to the Lunar Prospector. The Vulcan Search is also a NASA project but is based at Lick Observatory. A small wide-field telescope monitors various star fields successively during the year. Dozens of images, each containing tens of thousands of stars, are taken any night that weather permits. The images are then monitored for photometric changes of the order of one part in a thousand. Such changes would reveal the transit of an inner-orbit Jupiter-size planet similar to those discovered recently in spectroscopic searches. In order to achieve one part in one thousand photometric precision, even the choice of the filter used in taking an exposure can be critical. The ultimate purpose of a filter is to increase the signal-to-noise ratio (S/N) of one's observation. Ideally, filters reduce the sky glow caused by street lights and thereby make the star images more distinct. The higher the S/N, the higher the chance of observing a transit signal that indicates the presence of a new planet. It is, therefore, important to select the filter that maximizes the S/N.
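The precision requirements quoted above can be made concrete with a back-of-the-envelope transit-depth calculation, (Rp/Rs)^2; the radii below are standard values and the comparison is only illustrative.

```python
# Back-of-the-envelope transit depths, (Rp/Rs)^2, for a Sun-like star. An
# Earth-size transit is ~8e-5 deep, hence the parts-per-hundred-thousand
# requirement; a Jupiter-size transit (~1e-2) is detectable at the
# part-per-thousand precision quoted above. Radii are standard values.
R_SUN_KM = 695_700.0
R_EARTH_KM = 6_371.0
R_JUPITER_KM = 69_911.0

def transit_depth(r_planet_km, r_star_km=R_SUN_KM):
    return (r_planet_km / r_star_km) ** 2

print(f"Earth-size transit depth:   {transit_depth(R_EARTH_KM):.1e}")
print(f"Jupiter-size transit depth: {transit_depth(R_JUPITER_KM):.1e}")
```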
NASA Astrophysics Data System (ADS)
Lai, H.; Russell, C. T.; Wei, H.; Delzanno, G. L.; Connors, M. G.
2014-12-01
Near-Earth objects (NEOs) of tens of meters in diameter are difficult to detect by optical methods from the Earth but they result in the most damage per year. Many of these bodies are produced in non-destructive collisions with larger well-characterized NEOs. After generation, the debris spreads forward and backward in a cocoon around the orbit of the parent body. Thereafter, scattering will occur due to gravitational perturbations when the debris stream passes near a planet even when the parent body has no such close approaches. Therefore "safe" NEOs which have no close encounters with the Earth for thousands of years may be accompanied by potentially hazardous co-orbiting debris. We have developed a technique to identify co-orbiting debris by detecting the magnetic signature produced when some of the debris suffers destructive collisions with meteoroids, which are numerous and can be as small as tens of centimeters in diameter. Clouds of nanoscale dust/gas particles released in such collisions can interact coherently with the solar wind electromagnetically. The resultant magnetic perturbations are readily identified when they pass spacecraft equipped with magnetometers. We can use such observations to obtain the spatial and size distribution as well as temporal variation of the debris streams. A test of this technique has been performed and debris streams both leading and trailing asteroid 138175 have been identified. There is a finite spread across the original orbit and most of the co-orbitals were tens of meters in diameter before the disruptive collisions. We estimate that there were tens of thousands of such co-orbiting objects, comprising only 1% of the original mass of the parent asteroid but greatly increasing the impact hazard. A loss of the co-orbitals since the 1970s has been inferred from observations with a decay time consistent with that calculated from the existing collisional model [Grün et al., 1985]. Therefore disruptive collisions are the main loss mechanism of the co-orbiting debris associated with 138175. In summary, our technique helps us to identify which NEOs are accompanied by hazardous debris trails. Although our technique provides only the statistical properties, it indicates where high resolution optical surveys should be obtained in order to identify and track specific hazardous bodies.
NASA Astrophysics Data System (ADS)
Suciu, N.; Vamos, C.; Vereecken, H.; Vanderborght, J.; Hardelauf, H.
2003-04-01
When the small-scale transport is modeled by a Wiener process and the large-scale heterogeneity by a random velocity field, the effective coefficients, Deff, can be decomposed into the local coefficient, D, a contribution of the random advection, Dadv, and a contribution of the randomness of the trajectory of the plume center of mass, Dcm: Deff = D + Dadv - Dcm. The coefficient Dadv is similar to that introduced by Taylor in 1921, and more recent works associate it with thermodynamic equilibrium. The "ergodic hypothesis" says that over large time intervals Dcm vanishes and the effect of the heterogeneity is described by Dadv = Deff - D. In this work we investigate numerically the long-time behavior of the effective coefficients as well as the validity of the ergodic hypothesis. The transport in every realization of the velocity field is modeled with the Global Random Walk Algorithm, which is able to track as many particles as necessary to achieve a statistically reliable simulation of the process. Averages over realizations are further used to estimate mean coefficients and standard deviations. In order to remain in the frame of most of the theoretical approaches, the velocity field was generated in a linear approximation and the logarithm of the hydraulic conductivity was taken to have an exponentially decaying correlation with variance equal to 0.1. Our results show that even in these idealized conditions, the effective coefficients tend to asymptotic constant values only when the plume travels thousands of correlation lengths (while the first-order theories usually predict Fickian behavior after tens of correlation lengths) and that the ergodicity conditions are still far from being met.
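As a rough, hedged illustration of the variance bookkeeping behind Deff = D + Dadv - Dcm (not the Global Random Walk Algorithm itself), the sketch below evolves an ensemble of 1-D random-walk plumes in trivially simple random velocity fields and splits the total particle variance into the mean within-plume variance and the variance of the plume centres of mass; all parameters are invented and sign/naming conventions differ between papers.

```python
# Sketch: variance bookkeeping behind the decomposition of dispersion coefficients.
# Each realization gets one constant random drift (a stand-in for the correlated
# velocity fields of the study). Total particle variance = mean within-plume
# variance + variance of the plume centres of mass, so the single-plume coefficient
# equals the ensemble coefficient minus the centre-of-mass contribution.
import numpy as np

rng = np.random.default_rng(7)
n_real, n_part, n_steps, dt, d_local = 200, 2000, 400, 0.1, 0.05

u = rng.normal(0.0, 1.0, size=n_real)                  # random drift per realization
x = np.zeros((n_real, n_part))
for _ in range(n_steps):
    x += u[:, None] * dt + np.sqrt(2.0 * d_local * dt) * rng.normal(size=x.shape)

t_final = n_steps * dt
d_total = x.var() / (2.0 * t_final)                    # ensemble (all-particle) coefficient
d_within = x.var(axis=1).mean() / (2.0 * t_final)      # mean within-plume coefficient
d_cm = x.mean(axis=1).var() / (2.0 * t_final)          # centre-of-mass contribution
print(round(d_total, 3), round(d_within, 3), round(d_cm, 3), round(d_total - d_cm, 3))
# With a perfectly correlated drift the centre-of-mass term keeps growing with time,
# which is the non-ergodic behaviour the abstract is concerned with.
```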
4-D ultrafast shear-wave imaging.
Gennisson, Jean-Luc; Provost, Jean; Deffieux, Thomas; Papadacci, Clément; Imbault, Marion; Pernot, Mathieu; Tanter, Mickael
2015-06-01
Over the last ten years, shear wave elastography (SWE) has seen considerable development and is now routinely used in clinics to provide mechanical characterization of tissues to improve diagnosis. The most advanced technique relies on the use of an ultrafast scanner to generate and image shear waves in real time in a 2-D plane at several thousands of frames per second. We have recently introduced 3-D ultrafast ultrasound imaging to acquire with matrix probes the 3-D propagation of shear waves generated by a dedicated radiation pressure transducer in a single acquisition. In this study, we demonstrate 3-D SWE based on ultrafast volumetric imaging in a clinically applicable configuration. A 32 × 32 matrix phased array driven by a customized, programmable, 1024-channel ultrasound system was designed to perform 4-D shear-wave imaging. A matrix phased array was used to generate and control in 3-D the shear waves inside the medium using the acoustic radiation force. The same matrix array was used with 3-D coherent plane wave compounding to perform high-quality ultrafast imaging of the shear wave propagation. Volumetric ultrafast acquisitions were then beamformed in 3-D using a delay-and-sum algorithm. 3-D volumetric maps of the shear modulus were reconstructed using a time-of-flight algorithm based on local multiscale cross-correlation of shear wave profiles in the three main directions using directional filters. Results are first presented in an isotropic homogeneous and elastic breast phantom. Then, a full 3-D stiffness reconstruction of the breast was performed in vivo on healthy volunteers. This new full 3-D ultrafast ultrasound system paves the way toward real-time 3-D SWE.
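As a hedged illustration of the time-of-flight step described above (not the authors' beamforming or directional-filter pipeline), the sketch below cross-correlates synthetic displacement waveforms recorded at two lateral positions to estimate the shear-wave speed, then converts it to a shear modulus via mu = rho * c^2; the frame rate, spacing and density are assumptions.

```python
# Sketch of a time-of-flight shear-wave speed estimate: cross-correlate the
# displacement waveforms at two lateral positions, convert the lag to a speed,
# and map speed to shear modulus via mu = rho * c^2. Waveforms are synthetic.
import numpy as np

frame_rate = 8000.0                      # volumes per second (hypothetical)
dx = 1.0e-3                              # 1 mm between the two lateral positions
c_true = 2.0                             # true shear-wave speed, m/s

t = np.arange(0.0, 0.02, 1.0 / frame_rate)
pulse = lambda t0: np.exp(-((t - t0) / 0.0008) ** 2)
w1 = pulse(0.005)                        # waveform at position x
w2 = pulse(0.005 + dx / c_true)          # same pulse arriving later at x + dx

xcorr = np.correlate(w2 - w2.mean(), w1 - w1.mean(), mode="full")
lag = (np.argmax(xcorr) - (len(w1) - 1)) / frame_rate
c_est = dx / lag
mu = 1000.0 * c_est ** 2                 # shear modulus in Pa, assuming rho = 1000 kg/m^3
print(round(c_est, 2), "m/s", round(mu / 1000.0, 1), "kPa")
```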
Discriminant Features and Temporal Structure of Nonmanuals in American Sign Language
Benitez-Quiroz, C. Fabian; Gökgöz, Kadir; Wilbur, Ronnie B.; Martinez, Aleix M.
2014-01-01
To fully define the grammar of American Sign Language (ASL), a linguistic model of its nonmanuals needs to be constructed. While significant progress has been made to understand the features defining ASL manuals, after years of research, much still needs to be done to uncover the discriminant nonmanual components. The major barrier to achieving this goal is the difficulty in correlating facial features and linguistic features, especially since these correlations may be temporally defined. For example, a facial feature (e.g., head moves down) occurring at the end of the movement of another facial feature (e.g., brows move up) may specify a Hypothetical conditional, but only if this time relationship is maintained. In other instances, the single occurrence of a movement (e.g., brows move up) can be indicative of the same grammatical construction. In the present paper, we introduce a linguistic–computational approach to efficiently carry out this analysis. First, a linguistic model of the face is used to manually annotate a very large set of 2,347 videos of ASL nonmanuals (including tens of thousands of frames). Second, a computational approach is used to determine which features of the linguistic model are more informative of the grammatical rules under study. We used the proposed approach to study five types of sentences – Hypothetical conditionals, Yes/no questions, Wh-questions, Wh-questions postposed, and Assertions – plus their polarities – positive and negative. Our results verify several components of the standard model of ASL nonmanuals and, most importantly, identify several previously unreported features and their temporal relationship. Notably, our results uncovered a complex interaction between head position and mouth shape. These findings define some temporal structures of ASL nonmanuals not previously detected by other approaches. PMID:24516528
Ribosome profiling reveals the what, when, where and how of protein synthesis.
Brar, Gloria A; Weissman, Jonathan S
2015-11-01
Ribosome profiling, which involves the deep sequencing of ribosome-protected mRNA fragments, is a powerful tool for globally monitoring translation in vivo. The method has facilitated discovery of the regulation of gene expression underlying diverse and complex biological processes, of important aspects of the mechanism of protein synthesis, and even of new proteins, by providing a systematic approach for experimental annotation of coding regions. Here, we introduce the methodology of ribosome profiling and discuss examples in which this approach has been a key factor in guiding biological discovery, including its prominent role in identifying thousands of novel translated short open reading frames and alternative translation products.
Tian, Yue Yue; Zhang, Li Xia; Zhang, Zheng Qun; Qiao, Ming Ming; Fan, Yan Gen
2017-03-18
In order to determine a suitable shading model for the 'Huangjinya' tea plant in Shandong Province, black or blue shading nets at 55%, 70%, or 85% shading rates were selected to cover the tea garden in summer and autumn, and the micro-climate of the tea garden, leaf color, chlorophyll fluorescence parameters, growth status, and biochemical composition of tea shoots were investigated. The results showed that, compared with the control, the light intensity and air temperature in the tea garden and the leaf temperature of the tea plants under the different shading treatments decreased significantly, while the air humidity in the tea garden increased. The chlorophyll contents of the tea leaves increased markedly with increasing shading rate, which caused the leaf color to become greener. The yellowing characteristics and biochemical quality of the 'Huangjinya' tea plants were well maintained under the 55% shading treatments. Under the 70% shading treatments, the 'Huangjinya' tea plants grew better and gave higher yield with no photoinhibition. Compared with the blue shading treatments, the black shading treatments clearly promoted the growth of the 'Huangjinya' tea plants, maintained the yellowing characteristics, and improved quality. Therefore, the 70% black shading treatment (daily PAR values of 12,000-35,000 lx) was appropriate for promoting the growth of 'Huangjinya' tea plants at the seedling stage. For mature tea plants, the 55% black shading treatment (daily PAR values of 18,000-55,000 lx) could be used to maintain the yellowing characteristics and to improve biochemical quality effectively, so as to give full play to its variety characteristics and achieve the goal of high quality and high yield.
Slone, Daniel H.; Reid, James P.; Kenworthy, W. Judson
2013-01-01
Turbid water conditions make the delineation and characterization of benthic habitats difficult by traditional in situ and remote sensing methods. Here, we develop and validate modeling and sampling methodology for detecting and characterizing seagrass beds by analyzing GPS telemetry records from radio-tagged manatees. Between October 2002 and October 2005, 14 manatees were tracked in the Ten Thousand Islands (TTI) in southwest Florida (USA) using Global Positioning System (GPS) tags. High density manatee use areas were found to occur off each island facing the open, nearshore waters of the Gulf of Mexico. We implemented a spatially stratified random sampling plan and used a camera-based sampling technique to observe and record bottom observations of seagrass and macroalgae presence and abundance. Five species of seagrass were identified in our study area: Halodule wrightii, Thalassia testudinum, Syringodium filiforme, Halophila engelmannii, and Halophila decipiens. A Bayesian model was developed to choose and parameterize a spatial process function that would describe the observed patterns of seagrass and macroalgae. The seagrasses were found in depths <2 m and in the higher manatee use strata, whereas macroalgae was found at moderate densities at all sampled depths and manatee use strata. The manatee spatial data showed a strong association with seagrass beds, a relationship that increased seagrass sampling efficiency. Our camera-based field sampling proved to be effective for assessing seagrass density and spatial coverage under turbid water conditions, and would be an effective monitoring tool to detect changes in seagrass beds.
Ohmi, Masato; Wada, Yuki
2016-08-01
In this paper, we demonstrate dynamic analysis of mental sweating from a few tens of eccrine sweat glands in response to a sound stimulus, using time-sequential piled-up en face optical coherence tomography (OCT) images with a frame spacing of 3.3 s. In the experiment, the amount of excess sweat can be evaluated simultaneously for a few tens of sweat glands by piling up all the en face OCT images. Non-uniformity was observed in mental sweating, where the amount of sweat in response to the sound stimulus differed for each sweat gland. Furthermore, the amount of sweat increased significantly in proportion to the strength of the stimulus.
Grid site availability evaluation and monitoring at CMS
Lyons, Gaston; Maciulaitis, Rokas; Bagliesi, Giuseppe; ...
2017-10-01
The Compact Muon Solenoid (CMS) experiment at the Large Hadron Collider (LHC) uses distributed grid computing to store, process, and analyse the vast quantity of scientific data recorded every year. The computing resources are grouped into sites and organized in a tiered structure. Each site provides computing and storage to the CMS computing grid. Over a hundred sites worldwide contribute with resources from a hundred to well over ten thousand computing cores and storage from tens of TBytes to tens of PBytes. In such a large computing setup scheduled and unscheduled outages occur continually and are not allowed to significantly impact data handling, processing, and analysis. Unscheduled capacity and performance reductions need to be detected promptly and corrected. CMS developed a sophisticated site evaluation and monitoring system for Run 1 of the LHC based on tools of the Worldwide LHC Computing Grid. For Run 2 of the LHC the site evaluation and monitoring system is being overhauled to enable faster detection/reaction to failures and a more dynamic handling of computing resources. Furthermore, enhancements to better distinguish site from central service issues and to make evaluations more transparent and informative to site support staff are planned.
Grid site availability evaluation and monitoring at CMS
NASA Astrophysics Data System (ADS)
Lyons, Gaston; Maciulaitis, Rokas; Bagliesi, Giuseppe; Lammel, Stephan; Sciabà, Andrea
2017-10-01
The Compact Muon Solenoid (CMS) experiment at the Large Hadron Collider (LHC) uses distributed grid computing to store, process, and analyse the vast quantity of scientific data recorded every year. The computing resources are grouped into sites and organized in a tiered structure. Each site provides computing and storage to the CMS computing grid. Over a hundred sites worldwide contribute resources ranging from a hundred to well over ten thousand computing cores, and storage from tens of TBytes to tens of PBytes. In such a large computing setup, scheduled and unscheduled outages occur continually and must not significantly impact data handling, processing, and analysis. Unscheduled capacity and performance reductions need to be detected promptly and corrected. CMS developed a sophisticated site evaluation and monitoring system for Run 1 of the LHC based on tools of the Worldwide LHC Computing Grid. For Run 2 of the LHC, the site evaluation and monitoring system is being overhauled to enable faster detection of and reaction to failures and a more dynamic handling of computing resources. Enhancements to better distinguish site issues from central-service issues and to make evaluations more transparent and informative to site support staff are also planned.
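As a hedged illustration only (this is not the CMS monitoring framework or its APIs), the sketch below computes a per-site availability figure as the fraction of passing functional tests inside a sliding time window; the site name, probe cadence, and test records are invented.

```python
# Toy sketch (not the CMS tooling): estimate per-site availability as the
# fraction of passing functional tests inside a sliding time window.
# Site names, test records, and the 15-minute probe interval are invented.
from collections import defaultdict
from datetime import datetime, timedelta

def availability(test_results, window_end, window=timedelta(days=1)):
    """test_results: list of (site, timestamp, passed) tuples."""
    window_start = window_end - window
    counts = defaultdict(lambda: [0, 0])          # site -> [passed, total]
    for site, ts, passed in test_results:
        if window_start <= ts <= window_end:
            counts[site][1] += 1
            counts[site][0] += int(passed)
    return {site: p / t for site, (p, t) in counts.items() if t > 0}

if __name__ == "__main__":
    now = datetime(2017, 10, 1, 12, 0)
    probes = [("T2_Example_Site", now - timedelta(minutes=15 * i), i % 7 != 0)
              for i in range(96)]                 # one probe every 15 minutes
    print(availability(probes, now))              # e.g. {'T2_Example_Site': 0.854...}
```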
BlazeDEM3D-GPU: A Large-Scale DEM Simulation Code for GPUs
NASA Astrophysics Data System (ADS)
Govender, Nicolin; Wilke, Daniel; Pizette, Patrick; Khinast, Johannes
2017-06-01
Accurately predicting the dynamics of particulate materials is of importance to numerous scientific and industrial areas, with applications ranging across particle scales from powder flow to ore crushing. Computational discrete element simulations are a viable option to aid in the understanding of particulate dynamics and the design of devices such as mixers, silos, and ball mills, as laboratory-scale tests come at a significant cost. However, the computational time required for an industrial-scale simulation consisting of tens of millions of particles can be months on large CPU clusters, making the Discrete Element Method (DEM) infeasible for many industrial applications. Simulations are therefore typically restricted to tens of thousands of particles with highly detailed particle shapes, or a few million particles with often oversimplified particle shapes. However, a number of applications require accurate representation of the particle shape to capture the macroscopic behaviour of the particulate system. In this paper we give an overview of the recent extensions to the open-source GPU-based DEM code, BlazeDEM3D-GPU, which can simulate millions of polyhedra and tens of millions of spheres on a desktop computer with a single GPU or multiple GPUs.
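For readers unfamiliar with DEM, the following minimal sketch (Python with NumPy, not the BlazeDEM3D-GPU CUDA kernels) shows one explicit time step with a linear spring-dashpot contact model for spheres; the brute-force pair loop is exactly the cost that neighbour grids and GPU parallelism are designed to remove, and all material parameters are made-up demonstration values.

```python
# Minimal sketch of one explicit DEM time step for spheres with a linear
# spring-dashpot normal contact model (not the BlazeDEM3D-GPU polyhedral
# kernels; stiffness, damping, and particle properties are made up).
import numpy as np

def dem_step(pos, vel, radius, mass, dt, k_n=1e4, c_n=5.0, g=9.81):
    n = len(pos)
    force = np.zeros_like(pos)
    force[:, 2] -= mass * g                      # gravity
    for i in range(n):                           # brute-force pair loop; real codes
        for j in range(i + 1, n):                # use neighbour grids on the GPU
            d = pos[j] - pos[i]
            dist = np.linalg.norm(d)
            overlap = radius[i] + radius[j] - dist
            if overlap > 0.0:
                normal = d / dist
                rel_vn = np.dot(vel[j] - vel[i], normal)
                fn = (k_n * overlap - c_n * rel_vn) * normal
                force[i] -= fn                   # equal and opposite contact forces
                force[j] += fn
    vel += dt * force / mass[:, None]            # semi-implicit Euler update
    pos += dt * vel
    return pos, vel
```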
Hill, Jon; Davis, Katie E
2014-01-01
Building large supertrees involves the collection, storage, and processing of thousands of individual phylogenies to create large phylogenies with thousands to tens of thousands of taxa. Such large phylogenies are useful for macroevolutionary studies, comparative biology, and conservation and biodiversity work. No easy-to-use, fully integrated software package currently exists to carry out this task. Here, we present a new Python-based software package that uses well-defined XML schemas to manage both data and metadata. It builds on previous versions by 1) including new processing steps, such as Safe Taxonomic Reduction, 2) providing a user-friendly GUI that guides the user to complete at least the minimum information required and includes context-sensitive documentation, and 3) using a revised storage format that integrates both tree data and metadata into a single file. These data can then be manipulated according to a well-defined but flexible processing pipeline using either the GUI or a command-line tool. Processing steps include standardising names, deleting or replacing taxa, ensuring adequate taxonomic overlap, ensuring data independence, and safe taxonomic reduction. This software has been successfully used to store and process data consisting of over 1000 trees ready for analysis using standard supertree methods. This software makes large supertree creation a much easier task and provides far greater flexibility for further work.
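To illustrate two of the processing steps named above, here is a minimal, hypothetical sketch (not the package's actual API or data model): standardising taxon names via a synonym table and checking pairwise taxonomic overlap between source trees. The synonym mapping and the overlap threshold are invented.

```python
# Illustrative sketch only -- not the package's actual API.  Standardise
# taxon names (spaces to underscores, synonym mapping) and check that
# source trees share enough taxa to be combined into a supertree.
from itertools import combinations

SYNONYMS = {"Felis_concolor": "Puma_concolor"}   # hypothetical synonym table

def standardise(taxa):
    cleaned = {t.strip().replace(" ", "_") for t in taxa}
    return {SYNONYMS.get(t, t) for t in cleaned}

def sufficient_overlap(trees, min_shared=2):
    """trees: dict of tree name -> set of taxon labels."""
    std = {name: standardise(taxa) for name, taxa in trees.items()}
    return {(a, b): len(std[a] & std[b]) >= min_shared
            for a, b in combinations(std, 2)}

trees = {"tree1": {"Puma concolor", "Lynx lynx", "Felis catus"},
         "tree2": {"Felis concolor", "Lynx lynx", "Canis lupus"}}
print(sufficient_overlap(trees))   # {('tree1', 'tree2'): True} -- 2 shared taxa
```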
Opportunities and Challenges of Linking Scientific Core Samples to the Geoscience Data Ecosystem
NASA Astrophysics Data System (ADS)
Noren, A. J.
2016-12-01
Core samples generated in scientific drilling and coring are critical for the advancement of the Earth Sciences. The scientific themes enabled by analysis of these samples are diverse, and include plate tectonics, ocean circulation, Earth-life system interactions (paleoclimate, paleobiology, paleoanthropology), Critical Zone processes, geothermal systems, deep biosphere, and many others, and substantial resources are invested in their collection and analysis. Linking core samples to researchers, datasets, publications, and funding agencies through registration of globally unique identifiers such as International Geo Sample Numbers (IGSNs) offers great potential for advancing several frontiers. These include maximizing sample discoverability, access, reuse, and return on investment; a means for credit to researchers; and documentation of project outputs to funding agencies. Thousands of kilometers of core samples and billions of derivative subsamples have been generated through thousands of investigators' projects, yet the vast majority of these samples are curated at only a small number of facilities. These numbers, combined with the substantial similarity in sample types, make core samples a compelling target for IGSN implementation. However, differences between core sample communities and other geoscience disciplines continue to create barriers to implementation. Core samples involve parent-child relationships spanning 8 or more generations, an exponential increase in sample numbers between levels in the hierarchy, concepts related to depth/position in the sample, requirements for associating data derived from core scanning and lithologic description with data derived from subsample analysis, and publications based on tens of thousands of co-registered scan data points and thousands of analyses of subsamples. These characteristics require specialized resources for accurate and consistent assignment of IGSNs, and a community of practice to establish norms, workflows, and infrastructure to support implementation.
Evaluating the Effectiveness of IP Hopping via an Address Routing Gateway
2013-03-01
Through the use of a slightly intelligent DHCP (Dynamic Host Configuration Protocol) server that leases IPs for only a short time frame (on the order of tens of minutes) and only offers IPs that have not been used recently, most networks already using DHCP can quickly change to a randomized scheme.
Krychowiak, M; Adnan, A; Alonso, A; Andreeva, T; Baldzuhn, J; Barbui, T; Beurskens, M; Biel, W; Biedermann, C; Blackwell, B D; Bosch, H S; Bozhenkov, S; Brakel, R; Bräuer, T; Brotas de Carvalho, B; Burhenn, R; Buttenschön, B; Cappa, A; Cseh, G; Czarnecka, A; Dinklage, A; Drews, P; Dzikowicka, A; Effenberg, F; Endler, M; Erckmann, V; Estrada, T; Ford, O; Fornal, T; Frerichs, H; Fuchert, G; Geiger, J; Grulke, O; Harris, J H; Hartfuß, H J; Hartmann, D; Hathiramani, D; Hirsch, M; Höfel, U; Jabłoński, S; Jakubowski, M W; Kaczmarczyk, J; Klinger, T; Klose, S; Knauer, J; Kocsis, G; König, R; Kornejew, P; Krämer-Flecken, A; Krawczyk, N; Kremeyer, T; Książek, I; Kubkowska, M; Langenberg, A; Laqua, H P; Laux, M; Lazerson, S; Liang, Y; Liu, S C; Lorenz, A; Marchuk, A O; Marsen, S; Moncada, V; Naujoks, D; Neilson, H; Neubauer, O; Neuner, U; Niemann, H; Oosterbeek, J W; Otte, M; Pablant, N; Pasch, E; Sunn Pedersen, T; Pisano, F; Rahbarnia, K; Ryć, L; Schmitz, O; Schmuck, S; Schneider, W; Schröder, T; Schuhmacher, H; Schweer, B; Standley, B; Stange, T; Stephey, L; Svensson, J; Szabolics, T; Szepesi, T; Thomsen, H; Travere, J-M; Trimino Mora, H; Tsuchiya, H; Weir, G M; Wenzel, U; Werner, A; Wiegel, B; Windisch, T; Wolf, R; Wurden, G A; Zhang, D; Zimbal, A; Zoletnik, S
2016-11-01
Wendelstein 7-X, a superconducting optimized stellarator built in Greifswald, Germany, started its first plasmas with the last closed flux surface (LCFS) defined by 5 uncooled graphite limiters in December 2015. At the end of the 10-week experimental campaign (OP1.1), more than 20 independent diagnostic systems were in operation, allowing detailed studies of many interesting plasma phenomena. For example, fast neutral gas manometers supported by video cameras (including one fast-frame camera with frame rates of tens of kHz), as well as visible cameras with different interference filters whose fields of view cover all ten half-modules of the stellarator, discovered a MARFE-like radiation zone on the inboard side of machine module 4. This structure is presumably triggered by an inadvertent plasma-wall interaction in module 4 resulting in a high impurity influx that terminates some discharges by radiation cooling. The main plasma parameters achieved in OP1.1 exceeded predicted values in discharges of up to 6 s in length. Although OP1.1 is characterized by short pulses, many of the diagnostics are already designed for quasi-steady-state operation with 30 min discharges heated at 10 MW of ECRH. An overview of diagnostic performance for OP1.1 is given, including some highlights from the physics campaigns.
Perspectives on the Near-Earth Object Impact Hazard After Chelyabinsk
NASA Astrophysics Data System (ADS)
Chapman, C. R.
2013-12-01
Until this year, the NEO impact hazard had been regarded as a theoretical example of a very low probability high consequence natural disaster. There had been no confirmed examples of fatalities directly due to asteroid or meteoroid strikes. (There still aren't.) The several megaton Tunguska event in 1908 was in a remote, unpopulated place. So human beings have been witnessing only the tiniest analogs of asteroid strikes, the night-sky meteors and occasional bolides, which - on rare occasions - yield meteoritic fragments that puncture holes in roofs. Though the NEO impact hazard has occasionally been treated in the natural hazards literature, interest primarily remained in the planetary science and aerospace communities. The Chelyabinsk asteroid impact on 15 February 2013 was a real disaster, occurring near a city with a population exceeding a million. Well over a thousand people were injured, thousands of buildings suffered at least superficial damage (mainly to windows), schools and sports facilities were closed, and emergency responders swarmed across the city and surrounding rural areas. While the consequences were very small compared with larger natural disasters, which kill tens of thousands of people annually worldwide, this specific case - for the first time - has permitted a calibration of the consequences of the rare impacts asteroid astronomers have been predicting. There now are reasons to expect that impacts by bodies tens of meters in diameter are several times more frequent than had been thought and each impact is more damaging than previously estimated. The Chelyabinsk event, produced by a 20 meter diameter asteroid, specifically suggests that asteroids just 15 meters diameter, or even smaller, could be very dangerous and damaging; indeed, a more common steeper impact angle would have produced more consequential damage on the ground. This contrasts with estimates a decade earlier [NASA NEO Science Definition Team report, 2003] that asteroids smaller than 40 to 50 meters diameter would explode harmlessly in the upper atmosphere. Given the observed size-frequency relation for NEOs, this means that dangerous impacts could be many tens of times more frequent than had been thought. New observing campaigns (e.g. ATLAS) oriented towards finding roughly half of the frequent smaller impactors meters to tens of meters in size during their final days to weeks before impact will soon result in warnings every few years of a potentially dangerous impact, perhaps requiring evacuation or instructions to shelter-in-place, even though most will turn out to be essentially harmless events. Warnings may become even more frequent as prudent emergency managers take into account the large uncertainties in sizes and destructive potential of these 'final plungers.' So emergency management officials around the world should at least be aware of the potential for a NEO impact to produce a real, if generally minor and local, natural disaster. Fortunately, success of the Spaceguard search for civilization-threatening large NEOs (> 1 km diameter) over the last 15 years has nearly retired the risk of global calamity by impact. So attention turns to the much smaller impacts that are far less dangerous, but soon will be frequently predicted and so cannot be ignored.
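A rough back-of-envelope calculation helps fix the scale of such an event: taking the 20 m diameter quoted above and assuming a typical stony density and entry speed (both assumptions, not figures from this abstract), the kinetic energy comes out at roughly half a megaton of TNT, in line with published Chelyabinsk estimates.

```python
# Back-of-envelope kinetic energy of a ~20 m stony impactor.  The density
# (3300 kg/m^3) and entry speed (19 km/s) are assumed typical values, not
# figures from the abstract above.
import math

diameter = 20.0                  # m (from the Chelyabinsk estimate above)
rho = 3300.0                     # kg/m^3, assumed stony composition
v = 19e3                         # m/s, assumed entry speed

mass = rho * (4.0 / 3.0) * math.pi * (diameter / 2.0) ** 3
energy_j = 0.5 * mass * v ** 2
print(f"mass ~ {mass:.2e} kg, energy ~ {energy_j / 4.184e15:.2f} Mt TNT")
# -> roughly half a megaton, consistent with published Chelyabinsk estimates
```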
Hamad, Rita; Pomeranz, Jennifer L; Siddiqi, Arjumand; Basu, Sanjay
2015-02-01
Analyzing news media allows obesity policy researchers to understand popular conceptions about obesity, which is important for targeting health education and policies. A persistent dilemma is that investigators have to read and manually classify thousands of individual news articles to identify how obesity and obesity-related policy proposals may be described to the public in the media. A machine learning method called "automated content analysis" that permits researchers to train computers to "read" and classify massive volumes of documents was demonstrated. 14,302 newspaper articles that mentioned the word "obesity" during 2011-2012 were identified. Four states that vary in obesity prevalence and policy (Alabama, California, New Jersey, and North Carolina) were examined. The reliability of an automated program to categorize the media's framing of obesity as an individual-level problem (e.g., diet) and/or an environmental-level problem (e.g., obesogenic environment) was tested. The automated program performed similarly to human coders. The proportion of articles with individual-level framing (27.7-31.0%) was higher than the proportion with neutral (18.0-22.1%) or environmental-level framing (16.0-16.4%) across all states and over the entire study period (P<0.05). A novel approach to the study of how obesity concepts are communicated and propagated in news media was demonstrated. © 2014 The Obesity Society.
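As a hedged sketch of what supervised automated content analysis involves (not the authors' actual classifier or features), a bag-of-words model can be trained on a human-coded subset and applied to the remaining articles; the texts, labels, and scikit-learn pipeline below are illustrative placeholders.

```python
# Hedged sketch of supervised "automated content analysis": not the study's
# model, just a generic bag-of-words classifier trained on a hand-coded
# subset and applied to un-coded articles.  Texts and labels are toy data.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

coded_texts = ["obesity blamed on poor personal diet choices",
               "city zoning creates an obesogenic food environment"]
coded_labels = ["individual", "environmental"]          # human-coded frames

clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)),
                    LogisticRegression(max_iter=1000))
clf.fit(coded_texts, coded_labels)

uncoded = ["new bill targets soda marketing near schools"]
print(clf.predict(uncoded))        # frame assigned to an un-coded article
```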
Statistical photocalibration of photodetectors for radiometry without calibrated light sources
NASA Astrophysics Data System (ADS)
Yielding, Nicholas J.; Cain, Stephen C.; Seal, Michael D.
2018-01-01
Calibration of CCD arrays for identifying bad pixels and achieving nonuniformity correction is commonly accomplished using dark frames. This kind of calibration technique does not achieve radiometric calibration of the array since only the relative response of the detectors is computed. For this, a second calibration is sometimes utilized by looking at sources with known radiances. This process can be used to calibrate photodetectors as long as a calibration source is available and is well-characterized. A previous attempt at creating a procedure for calibrating a photodetector using the underlying Poisson nature of the photodetection required calculations of the skewness of the photodetector measurements. Reliance on the third moment of measurement meant that thousands of samples would be required in some cases to compute that moment. A photocalibration procedure is defined that requires only first and second moments of the measurements. The technique is applied to image data containing a known light source so that the accuracy of the technique can be surmised. It is shown that the algorithm can achieve accuracy of nearly 2.7% of the predicted number of photons using only 100 frames of image data.
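The moment relations behind such a procedure can be illustrated with a simplified sketch: for Poisson photon counts N and conversion gain g, the mean signal scales as gN while the variance scales as g^2 N, so g is approximately variance/mean and N is approximately mean^2/variance once bias and dark are removed and read noise is negligible. This is a simplification of, not a substitute for, the paper's estimator; the synthetic frame stack and parameter values below are invented.

```python
# Simplified first/second-moment calibration: for Poisson photon counts N
# with conversion gain g (DN per photon), mean(DN) = g*N and var(DN) ~ g^2*N,
# so g ~ var/mean and N ~ mean^2/var.  Assumes bias/dark subtracted and
# negligible read noise -- a simplification of the paper's estimator.
import numpy as np

rng = np.random.default_rng(0)
true_gain, true_photons = 2.5, 400.0
frames = true_gain * rng.poisson(true_photons, size=(100, 64, 64))  # synthetic stack

mean = frames.mean(axis=0)                 # per-pixel first moment
var = frames.var(axis=0, ddof=1)           # per-pixel second (central) moment
gain_est = var / mean
photons_est = mean ** 2 / var
print(gain_est.mean(), photons_est.mean()) # ~2.5 DN/photon, ~400 photons
```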
High throughput imaging cytometer with acoustic focussing.
Zmijan, Robert; Jonnalagadda, Umesh S; Carugo, Dario; Kochi, Yu; Lemm, Elizabeth; Packham, Graham; Hill, Martyn; Glynne-Jones, Peter
2015-10-31
We demonstrate an imaging flow cytometer that uses acoustic levitation to assemble cells and other particles into a sheet structure. This technique enables a high resolution, low noise CMOS camera to capture images of thousands of cells with each frame. While ultrasonic focussing has previously been demonstrated for 1D cytometry systems, extending the technology to a planar, much higher throughput format and integrating imaging is non-trivial, and represents a significant jump forward in capability, leading to diagnostic possibilities not achievable with current systems. A galvo mirror is used to track the images of the moving cells permitting exposure times of 10 ms at frame rates of 50 fps with motion blur of only a few pixels. At 80 fps, we demonstrate a throughput of 208 000 beads per second. We investigate the factors affecting motion blur and throughput, and demonstrate the system with fluorescent beads, leukaemia cells and a chondrocyte cell line. Cells require more time to reach the acoustic focus than beads, resulting in lower throughputs; however a longer device would remove this constraint.
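The throughput figures quoted above imply a simple relation, sketched below under the assumption that throughput is just objects per frame times frame rate (duty-cycle and tracking losses ignored): 208,000 beads per second at 80 fps corresponds to roughly 2,600 beads imaged in each frame.

```python
# Simple throughput arithmetic implied by the figures above: total rate is
# objects captured per frame times frame rate (losses ignored -- assumption).
def throughput(objects_per_frame, fps):
    return objects_per_frame * fps

print(208_000 / 80)            # -> 2600.0 beads per frame at the reported rate
print(throughput(2600, 80))    # -> 208000 beads per second
```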
NASA Astrophysics Data System (ADS)
Ciufolini, Ignazio; Paolozzi, Antonio; Pavlis, Erricos C.; Sindoni, Giampiero; Koenig, Rolf; Ries, John C.; Matzner, Richard; Gurzadyan, Vahe; Penrose, Roger; Rubincam, David; Paris, Claudio
2017-08-01
We introduce the LARES 2 space experiment recently approved by the Italian Space Agency (ASI). The LARES 2 satellite is planned for launch in 2019 with the new VEGA C launch vehicle of ASI, ESA and ELV. The orbital analysis of LARES 2 experiment will be carried out by our international science team of experts in General Relativity, theoretical physics, space geodesy and aerospace engineering. The main objectives of the LARES 2 experiment are gravitational and fundamental physics, including accurate measurements of General Relativity, in particular a test of frame-dragging aimed at achieving an accuracy of a few parts in a thousand, i.e., aimed at improving by about an order of magnitude the present state-of-the-art and forthcoming tests of this general relativistic phenomenon. LARES 2 will also achieve determinations in space geodesy. LARES 2 is an improved version of the LAGEOS 3 experiment, proposed in 1984 to measure frame-dragging and analyzed in 1989 by a joint ASI and NASA study.
Hololujah; A One Kilometre Art Hologram
NASA Astrophysics Data System (ADS)
Warren, David
2013-02-01
This paper will outline the production of the white light transmission achromatic image art hologram titled Hololujah displaying forward projecting real imagery of text and measuring 100,000 × 3.5 cm. Materials and methods: The paper will cover the use of Slavich VRP-M film 33.3 × 1.05 metres that was exposed and processed as thirty-three and a third 100 × 105 cm frames. These thirty-three and a third frames were subsequently cut into one thousand 100 × 3.5 cm strips with their ends cold laminated together to form the kilometre length hologram. This paper will expand on the use of a Coherent Compass 315M, 532 nm, 150 mW DPSS laser diode in a lensless setup, using a single beam through diffuse glass, no isolation systems and a two minute exposure time with the film lying flat on the floor. Lastly, this paper will illustrate how the hologram was produced in a 220 × 200 × 300 cm confined area of a suburban bedroom. Theme: This artwork is a comment on the social networking site Twitter.
Imaging intracellular protein dynamics by spinning disk confocal microscopy
Stehbens, Samantha; Pemble, Hayley; Murrow, Lindsay; Wittmann, Torsten
2012-01-01
The palette of fluorescent proteins has grown exponentially over the last decade, and as a result live imaging of cells expressing fluorescently tagged proteins is becoming more and more mainstream. Spinning disk confocal microscopy (SDC) is a high-speed optical sectioning technique and a method of choice for observing and analyzing intracellular fluorescent protein dynamics at high spatial and temporal resolution. In an SDC system, a rapidly rotating pinhole disk generates thousands of points of light that scan the specimen simultaneously, which allows direct capture of the confocal image with low-noise, scientific-grade cooled charge-coupled device (CCD) cameras and can achieve frame rates of up to 1,000 frames per second. In this chapter we describe important components of a state-of-the-art spinning disk system optimized for live cell microscopy and provide a rationale for specific design choices. We also give guidelines on how other imaging techniques, such as total internal reflection fluorescence (TIRF) microscopy or spatially controlled photoactivation, can be coupled with SDC imaging, and provide a short protocol on how to generate cell lines stably expressing fluorescently tagged proteins by lentivirus-mediated transduction. PMID:22264541
Deep learning massively accelerates super-resolution localization microscopy.
Ouyang, Wei; Aristov, Andrey; Lelek, Mickaël; Hao, Xian; Zimmer, Christophe
2018-06-01
The speed of super-resolution microscopy methods based on single-molecule localization, for example, PALM and STORM, is limited by the need to record many thousands of frames with a small number of observed molecules in each. Here, we present ANNA-PALM, a computational strategy that uses artificial neural networks to reconstruct super-resolution views from sparse, rapidly acquired localization images and/or widefield images. Simulations and experimental imaging of microtubules, nuclear pores, and mitochondria show that high-quality, super-resolution images can be reconstructed from up to two orders of magnitude fewer frames than usually needed, without compromising spatial resolution. Super-resolution reconstructions are even possible from widefield images alone, though adding localization data improves image quality. We demonstrate super-resolution imaging of >1,000 fields of view containing >1,000 cells in ∼3 h, yielding an image spanning spatial scales from ∼20 nm to ∼2 mm. The drastic reduction in acquisition time and sample irradiation afforded by ANNA-PALM enables faster and gentler high-throughput and live-cell super-resolution imaging.
2017-01-01
Abstract Tens of thousands of women were coercively sterilized in Czechoslovakia and its successor states. Romani women were particularly targeted for these measures. These practices stopped only in 2004, as a result of international pressure. Although some measures have been taken to ensure that these practices are not repeated, to date neither the Czech Republic nor Slovakia have completed the work of providing effective remedy to victims, as is their right. This article focusses on efforts in the Czech Republic. It concludes that, inter alia, an administrative mechanism is needed to provide financial compensation to victims, since the road to remedy via courts is effectively blocked. PMID:29302159
Dynamics of playa lakes in the Texas High Plains
NASA Technical Reports Server (NTRS)
Reeves, C. C., Jr. (Principal Investigator)
1972-01-01
The author has identified the following significant results. Regional viewing of ERTS-1 imagery around the test sites shows that storm paths can be accurately traced and a count made of the number of intermittent lake basins filled by the storm. Therefore, during wet years ERTS-type imagery can be used to conduct a reliable count of the tens of thousands of natural lake basins on the southern High Plains which contain water. This type of regional overview of water filled basins in the normally arid southern High Plains is illustrated by bands 6 and 7, ERTS E-1078-16524.
Thermonuclear runaways in thick hydrogen rich envelopes of neutron stars
NASA Technical Reports Server (NTRS)
Starrfield, S. G.; Kenyon, S.; Truran, J. W.; Sparks, W. M.
1981-01-01
A Lagrangian, fully implicit, one-dimensional hydrodynamic computer code was used to evolve thermonuclear runaways in the accreted hydrogen-rich envelopes of 1.0 solar-mass neutron stars with radii of 10 km and 20 km. The simulations produce outbursts lasting from about 750 seconds to about one week. Peak effective temperatures and luminosities were 26 million K and 80,000 L_sun for the 10 km case, and 5.3 million K and 600 L_sun for the 20 km case. Hydrodynamic expansion on the 10 km neutron star produced a precursor lasting about one ten-thousandth of a second.
Reliability, synchrony and noise
Ermentrout, G. Bard; Galán, Roberto F.; Urban, Nathaniel N.
2008-01-01
The brain is noisy. Neurons receive tens of thousands of highly fluctuating inputs and generate spike trains that appear highly irregular. Much of this activity is spontaneous—uncoupled to overt stimuli or motor outputs—leading to questions about the functional impact of this noise. Although noise is most often thought of as disrupting patterned activity and interfering with the encoding of stimuli, recent theoretical and experimental work has shown that noise can play a constructive role—leading to increased reliability or regularity of neuronal firing in single neurons and across populations. These results raise fundamental questions about how noise can influence neural function and computation. PMID:18603311
Precision Astrophysics Experiments with the Kepler Satellite
NASA Astrophysics Data System (ADS)
Jackiewicz, Jason
2012-10-01
Long photometric observations from space of tens of thousands of stars, such as those provided by Kepler, offer unique opportunities to carry out ensemble astrophysics as well as detailed studies of individual objects. One of the primary tools at our disposal for understanding pulsating stars is asteroseismology, which uses observed stellar oscillation frequencies to determine interior properties. This can provide very strict constraints on theories of stellar evolution, structure, and the population characteristics of stars in the Milky Way galaxy. This talk will focus on several of the exciting insights Kepler has enabled through asteroseismology of stars across the H-R diagram.
The role of the dentist in identifying missing and unidentified persons.
Riley, Amber D
2015-01-01
The longer a person is missing, the more profound the need for dental records becomes. In 2013, there were >84,000 missing persons and >8,000 unidentified persons registered in the National Crime Information Center (NCIC) database. Tens of thousands of families are left without answers or closure, always maintaining hope that their relative will be located. Law enforcement needs the cooperation of organized dentistry to procure dental records, translate their findings, and upload them into the NCIC database for cross-matching with unidentified person records created by medical examiner and coroner departments across the United States and Canada.
Vaccines and Immunization Practice.
Hogue, Michael D; Meador, Anna E
2016-03-01
Vaccines are among the most cost-effective public health strategies. Despite effective vaccines for many bacterial and viral illnesses, tens of thousands of adults and hundreds of children die each year in the United States from vaccine-preventable diseases. Underutilization of vaccines requires rethinking the approach to incorporating vaccines into practice. Arguably, immunizations could be a part of all health care encounters. Shared responsibility is paramount if deaths are to be reduced. This article reviews the vaccines available in the US market, as well as the practice recommendations of the Centers for Disease Control and Prevention's Advisory Committee on Immunization Practices. Copyright © 2016 Elsevier Inc. All rights reserved.
Algorithms for classification of astronomical object spectra
NASA Astrophysics Data System (ADS)
Wasiewicz, P.; Szuppe, J.; Hryniewicz, K.
2015-09-01
Obtaining interesting celestial objects from tens of thousands or even millions of recorded optical-ultraviolet spectra depends not only on the data quality but also on the accuracy of the spectra decomposition. Additionally, rapidly growing data volumes demand higher computing power and/or more efficient algorithm implementations. In this paper we speed up the process of subtracting iron transitions and fitting Gaussian functions to emission peaks, utilising C++ and OpenCL methods together with a NoSQL database. We also implemented typical astronomical methods of peak detection for comparison with our previous hybrid methods implemented with CUDA.
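The Gaussian-fitting step can be illustrated with a short CPU sketch using NumPy/SciPy rather than the paper's C++/OpenCL implementation; the synthetic emission line and its parameters are arbitrary.

```python
# Sketch of fitting a Gaussian to an emission peak (CPU, NumPy/SciPy; not the
# paper's C++/OpenCL code).  The synthetic "emission line" is arbitrary.
import numpy as np
from scipy.optimize import curve_fit

def gaussian(x, amp, mu, sigma, offset):
    return amp * np.exp(-0.5 * ((x - mu) / sigma) ** 2) + offset

wavelength = np.linspace(4800.0, 5200.0, 400)            # Angstrom grid
rng = np.random.default_rng(1)
flux = gaussian(wavelength, 12.0, 5007.0, 8.0, 1.0) + rng.normal(0, 0.3, wavelength.size)

p0 = [flux.max() - np.median(flux), wavelength[np.argmax(flux)], 10.0, np.median(flux)]
popt, pcov = curve_fit(gaussian, wavelength, flux, p0=p0)
print("amplitude, centre, sigma, continuum:", popt)
```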
Dissecting the genetics of complex traits using summary association statistics.
Pasaniuc, Bogdan; Price, Alkes L
2017-02-01
During the past decade, genome-wide association studies (GWAS) have been used to successfully identify tens of thousands of genetic variants associated with complex traits and diseases. These studies have produced extensive repositories of genetic variation and trait measurements across large numbers of individuals, providing tremendous opportunities for further analyses. However, privacy concerns and other logistical considerations often limit access to individual-level genetic data, motivating the development of methods that analyse summary association statistics. Here, we review recent progress on statistical methods that leverage summary association data to gain insights into the genetic basis of complex traits and diseases.
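As an elementary, hedged example of working directly with summary association statistics (not any specific method covered in the review), per-variant z-scores and p-values can be computed from effect sizes and standard errors, and studies can be combined with a fixed-effect inverse-variance meta-analysis; all numbers below are made up.

```python
# Elementary summary-statistics arithmetic (illustrative only): per-variant
# z-scores/p-values from betas and standard errors, plus a fixed-effect
# inverse-variance meta-analysis of two studies.  All numbers are made up.
import numpy as np
from scipy.stats import norm

beta1, se1 = np.array([0.12, -0.05]), np.array([0.03, 0.04])   # study 1
beta2, se2 = np.array([0.10, -0.02]), np.array([0.05, 0.03])   # study 2

z1 = beta1 / se1
p1 = 2 * norm.sf(np.abs(z1))
print("study 1 z:", z1, "p:", p1)

w1, w2 = 1 / se1**2, 1 / se2**2                 # inverse-variance weights
beta_meta = (w1 * beta1 + w2 * beta2) / (w1 + w2)
se_meta = np.sqrt(1 / (w1 + w2))
print("meta beta:", beta_meta, "meta se:", se_meta)
```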
ICPS Turnover GSDO Employee Event
2017-11-07
Kennedy Space Center Associate Director Kelvin Manning, right, speaks with a guest during a ceremony marking NASA's Spacecraft/Payload Integration and Evolution (SPIE) organization formally turning over processing of the Space Launch System (SLS) rocket's Interim Cryogenic Propulsion Stage (ICPS) to the center's Ground Systems Development and Operations (GSDO) Directorate. The ICPS is the first integrated piece of flight hardware to arrive in preparation for the uncrewed Exploration Mission-1. With the Orion attached, the ICPS sits atop the SLS rocket and will provide the spacecraft with the additional thrust needed to travel tens of thousands of miles beyond the Moon.
Interim Cryogenic Propulsion Stage (ICPS) Handover Signing
2017-10-26
Meeting in the Launch Control Center of NASA's Kennedy Space Center in Florida, officials of the agency's Spacecraft/Payload Integration and Evolution (SPIE) organization formally turn over processing of the Space Launch System (SLS) rocket's Interim Cryogenic Propulsion Stage (ICPS) to the center's Ground Systems Development and Operations (GSDO) directorate. The ICPS is the first integrated piece of flight hardware to arrive in preparation for the uncrewed Exploration Mission-1. With the Orion attached, the ICPS sits atop the SLS rocket and will provide the spacecraft with the additional thrust needed to travel tens of thousands of miles beyond the Moon.
Ibrahim, Khaled Z.; Madduri, Kamesh; Williams, Samuel; ...
2013-07-18
The Gyrokinetic Toroidal Code (GTC) uses the particle-in-cell method to efficiently simulate plasma microturbulence. This paper presents novel analysis and optimization techniques to enhance the performance of GTC on large-scale machines. We introduce cell access analysis to better manage locality vs. synchronization tradeoffs on CPU and GPU-based architectures. Finally, our optimized hybrid parallel implementation of GTC uses MPI, OpenMP, and NVIDIA CUDA, achieves up to a 2× speedup over the reference Fortran version on multiple parallel systems, and scales efficiently to tens of thousands of cores.
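One source of the locality-versus-synchronization tradeoff mentioned above is the particle-to-grid charge deposition (scatter) step of particle-in-cell codes. The sketch below is generic 1-D PIC deposition in NumPy, not GTC's gyrokinetic deposition; grid size and particle count are arbitrary, and the unordered scatter-add stands in for the atomic updates or per-thread private grids used on GPUs and with OpenMP.

```python
# Toy 1-D particle-in-cell charge deposition with linear weighting, to
# illustrate the scatter step whose cell-access pattern drives the
# locality-versus-synchronization tradeoff.  Generic PIC, not GTC.
import numpy as np

def deposit_charge(x, q, n_cells, length):
    dx = length / n_cells
    cell = np.floor(x / dx).astype(int) % n_cells
    frac = x / dx - np.floor(x / dx)
    rho = np.zeros(n_cells)
    # np.add.at is an unordered scatter-add -- the serial analogue of the
    # atomic updates (or per-thread private grids) needed on GPUs/OpenMP.
    np.add.at(rho, cell, q * (1.0 - frac))
    np.add.at(rho, (cell + 1) % n_cells, q * frac)
    return rho / dx

rng = np.random.default_rng(2)
x = rng.uniform(0.0, 1.0, 100_000)              # particle positions
rho = deposit_charge(x, q=1.0 / x.size, n_cells=64, length=1.0)
print(rho.sum() * (1.0 / 64))                   # total deposited charge ~ 1.0
```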
NASA Technical Reports Server (NTRS)
Phillips, Veronica J.
2017-01-01
STI is for a fact sheet on the Space Object Query Tool being created by the MDC. When planning launches, NASA must first factor in the tens of thousands of objects already in orbit around the Earth. The number of human-made objects, including nonfunctional spacecraft, abandoned launch vehicle stages, mission-related debris, and fragmentation debris orbiting Earth has grown steadily since Sputnik 1 was launched in 1957. Currently, the U.S. Department of Defense's Joint Space Operations Center, or JSpOC, tracks over 15,000 distinct objects and provides data for more than 40,000 objects via its Space-Track program, found at space-track.org.
Supersonic gas streams enhance the formation of massive black holes in the early universe
NASA Astrophysics Data System (ADS)
Hirano, Shingo; Hosokawa, Takashi; Yoshida, Naoki; Kuiper, Rolf
2017-09-01
Supermassive black holes existed less than a billion years after the Big Bang. Because black holes can grow at a maximum rate that depends on their current mass, it has been difficult to understand how such massive black holes could have formed so quickly. Hirano et al. developed simulations to show that streaming motions—velocity offsets between the gas and dark matter components—could have produced black holes with tens of thousands of solar masses in the early universe. That's big enough to grow into the supermassive black holes that we observe today.
Catching Cosmic Light with the Galileoscope
NASA Astrophysics Data System (ADS)
Fienberg, R. T.; Arion, D. N.
2015-09-01
Created for the 2009 International Year of Astronomy, the Galileoscope solved a long-standing problem: the lack of high-quality, low-cost telescope kits suitable for both optics education and celestial observation. Through an effort managed entirely by the volunteers who have authored this article, almost 240,000 Galileoscope kits have now been distributed in 106 countries across the globe, for use in science teaching and public outreach. The Galileoscope outreach programme for the 2015 International Year of Light is now in full swing, giving tens of thousands of students, teachers and parents their first telescopic look at the Moon's craters and Saturn's rings.
Guyatt, G H; Cook, D J; King, D; Norman, G R; Kane, S L; van Ineveld, C
1999-02-01
To determine whether framing questions positively or negatively influences residents' apparent satisfaction with their training. In 1993-94, 276 residents at five Canadian internal medicine residency programs responded to 53 Likert-scale items designed to determine sources of the residents' satisfaction and stress. Two versions of the questionnaire were randomly distributed: one in which half the items were stated positively and the other half negatively, the other version in which the items were stated in the opposite way. The residents scored 43 of the 53 items higher when stated positively and scored ten higher when stated negatively (p < .0001). When analyzed using an analysis-of-variance model, the effect of positive versus negative framing was highly significant (F = 129.81, p < .0001). While the interaction between item and framing was also significant, the effect was much less strong (F = 5.56, p < .0001). On a scale where 1 represented the lowest possible level of satisfaction and 7 the highest, the mean score of the positively stated items was 4.1 and that of the negatively stated items, 3.8, an effect of 0.3. These results suggest a significant "response acquiescence bias." To minimize this bias, questionnaires assessing attitudes toward educational programs should include a mix of positively and negatively stated items.
ERIC Educational Resources Information Center
Rosen, Ellen F.; Stolurow, Lawrence M.
In order to find a good predictor of empirical difficulty, an operational definition of step size, ten programer-judges rated change in complexity in two versions of a mathematics program, and these ratings were then compared with measures of empirical difficulty obtained from student response data. The two versions, a 54-frame booklet and a 35…
VizieR Online Data Catalog: BV(RI)c light curves of FF Vul (Samec+, 2016)
NASA Astrophysics Data System (ADS)
Samec, R. G.; Nyaude, R.; Caton, D.; van Hamme, W.
2017-02-01
The present BVRcIc light curves were taken by DC with the Dark Sky Observatory 0.81m reflector at Phillips Gap, North Carolina. These were taken on 2015 September 12, 13, 14 and 15, and October 15, with a thermoelectrically cooled (-40°C) 2K × 2K Apogee Alta camera. Additional observations were obtained remotely with the SARA North 0.91m reflector at KPNO on 2015 September 20 and October 11, with the ARC 2K × 2K camera cooled to -110°C. Individual observations were taken at both sites with standard Johnson-Cousins filters, and included 444 field images in B, 451 in V, 443 in Rc, and 445 in Ic. The standard error was ~7 mmag in each of B, V, Rc and Ic. Nightly images were calibrated with 25 bias frames, five flat frames in each filter, and ten 300 s dark frames. The exposure times were 40-50 s in B, 25-30 s in V, and 15-25 s in Rc and Ic. Our observations are listed in Table 1. (1 data file).
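For readers unfamiliar with the calibration arithmetic implied by those bias, flat, and dark frames, here is a generic sketch (not the reduction pipeline actually used for these data); the array sizes, counts, and exposure scaling are illustrative only.

```python
# Generic CCD calibration arithmetic (master bias, dark, and flat).  Array
# shapes and synthetic counts are illustrative, not the actual Apogee/ARC data.
import numpy as np

def master(frames):
    return np.median(np.stack(frames), axis=0)

def calibrate(raw, bias_frames, dark_frames, flat_frames, exp_ratio=1.0):
    bias = master(bias_frames)
    dark = master([d - bias for d in dark_frames]) * exp_ratio   # scale long darks
    flat = master([f - bias for f in flat_frames])
    flat /= np.median(flat)                                      # normalise flat
    return (raw - bias - dark) / flat

shape = (64, 64)                                  # small stand-in for a 2K x 2K chip
rng = np.random.default_rng(3)
biases = [rng.normal(100, 2, shape) for _ in range(25)]
darks  = [rng.normal(105, 2, shape) for _ in range(10)]
flats  = [rng.normal(20000, 100, shape) for _ in range(5)]
science = rng.normal(5000, 50, shape)
print(calibrate(science, biases, darks, flats, exp_ratio=40.0 / 300.0).mean())
```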
In-frame mutations in exon 1 of SKI cause dominant Shprintzen-Goldberg syndrome.
Carmignac, Virginie; Thevenon, Julien; Adès, Lesley; Callewaert, Bert; Julia, Sophie; Thauvin-Robinet, Christel; Gueneau, Lucie; Courcet, Jean-Benoit; Lopez, Estelle; Holman, Katherine; Renard, Marjolijn; Plauchu, Henri; Plessis, Ghislaine; De Backer, Julie; Child, Anne; Arno, Gavin; Duplomb, Laurence; Callier, Patrick; Aral, Bernard; Vabres, Pierre; Gigot, Nadège; Arbustini, Eloisa; Grasso, Maurizia; Robinson, Peter N; Goizet, Cyril; Baumann, Clarisse; Di Rocco, Maja; Sanchez Del Pozo, Jaime; Huet, Frédéric; Jondeau, Guillaume; Collod-Beroud, Gwenaëlle; Beroud, Christophe; Amiel, Jeanne; Cormier-Daire, Valérie; Rivière, Jean-Baptiste; Boileau, Catherine; De Paepe, Anne; Faivre, Laurence
2012-11-02
Shprintzen-Goldberg syndrome (SGS) is characterized by severe marfanoid habitus, intellectual disability, camptodactyly, typical facial dysmorphism, and craniosynostosis. Using family-based exome sequencing, we identified a dominantly inherited heterozygous in-frame deletion in exon 1 of SKI. Direct sequencing of SKI further identified one overlapping heterozygous in-frame deletion and ten heterozygous missense mutations affecting recurrent residues in 18 of the 19 individuals screened for SGS; these individuals included one family affected by somatic mosaicism. All mutations were located in a restricted area of exon 1, within the R-SMAD binding domain of SKI. No mutation was found in a cohort of 11 individuals with other marfanoid-craniosynostosis phenotypes. The interaction between SKI and Smad2/3 and Smad 4 regulates TGF-β signaling, and the pattern of anomalies in Ski-deficient mice corresponds to the clinical manifestations of SGS. These findings define SGS as a member of the family of diseases associated with the TGF-β-signaling pathway. Copyright © 2012 The American Society of Human Genetics. Published by Elsevier Inc. All rights reserved.
Fierstein, J.; Wilson, C.J.N.
2005-01-01
The 1912 Valley of Ten Thousand Smokes (VTTS) ignimbrite was constructed from 9 compositionally distinct, sequentially emplaced packages, each with distinct proportions of rhyolite (R), dacite (D), and andesite (A) pumices that permit us to map package boundaries and flow paths from vent to distal extents. Changing pumice proportions and interbedding relationships link ignimbrite formation to coeval fall deposition during the first ~16 h (Episode I) of the eruption. Pumice compositional proportions in the ignimbrite were estimated by counts on ~100 lapilli at multiple levels in vertical sections wherever accessible and more widely over most of the ignimbrite surface in the VTTS. The initial, 100% rhyolite ignimbrite package (equivalent to regional fall Layer A and occupying ~3.5 h) was followed by packages with increasing proportions of andesite, then dacite, emplaced over ~12.5 h and equivalent to regional fall Layers B1-B3. Coeval fall deposits are locally intercalated with the ignimbrite and show parallel changes in R:D (rhyolite:dacite) proportions, but lack significant amounts of andesite. Andesite was thus dominantly a low-fountaining component in the eruption column and is preferentially represented in packages filling the VTTS north of the vent. The most extensive packages (3 and 4) occur in B1 and early B2 times where flow mobility and volume were optimized; earlier all-rhyolite flows (Package 1) were highly energetic but less voluminous, while later packages (5-9) were both less voluminous and emplaced at lower velocities. Package boundaries are expressed as one or more of the following: sharp color changes corresponding to compositional variations; persistent finer-grained basal parts of flow units; compaction swales filled by later packages; erosional channels cut by the flows that fill them; lobate accumulations of one package; and (mostly south of the vent) intercalated fall deposit layers. Clear flow-unit boundaries are best developed between ignimbrite of non-successive packages, indicating time breaks of tens of minutes to hours. Less well-defined stratification may represent rapidly emplaced successive flow units but often changes over short distances and indicates variations in localized depositional conditions. © 2005 Geological Society of America.
Smoothed particle hydrodynamics with GRAPE-1A
NASA Technical Reports Server (NTRS)
Umemura, Masayuki; Fukushige, Toshiyuki; Makino, Junichiro; Ebisuzaki, Toshikazu; Sugimoto, Daiichiro; Turner, Edwin L.; Loeb, Abraham
1993-01-01
We describe the implementation of a smoothed particle hydrodynamics (SPH) scheme using GRAPE-1A, a special-purpose processor used for gravitational N-body simulations. GRAPE-1A calculates the gravitational force exerted on a particle by all other particles in a system, while simultaneously building a list of the nearest neighbors of the particle. It is found that GRAPE-1A accelerates direct-summation SPH calculations by about two orders of magnitude for a ten-thousand-particle simulation. The effective speed is 80 Mflops, which is about 30 percent of the peak speed of GRAPE-1A. In order to investigate the accuracy of GRAPE-SPH, several test simulations were executed. We found that the force and position errors are smaller than those due to representing a fluid by a finite number of particles. The total energy and momentum were conserved to within 0.2-0.4 percent and 2-5 × 10^-5, respectively, in simulations with several thousand particles. We conclude that GRAPE-SPH is quite effective and sufficiently accurate for self-gravitating hydrodynamics.
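The operation GRAPE-1A performs in hardware, a softened direct summation of gravity plus neighbour-list construction, can be sketched on a CPU as follows; the softening length, neighbour radius, and particle setup are arbitrary demonstration values.

```python
# CPU sketch of what GRAPE-1A does in hardware: for each particle, sum the
# softened gravitational acceleration from all others and collect a neighbour
# list within radius h (the list SPH then uses).  eps and h are arbitrary.
import numpy as np

def direct_sum(pos, mass, G=1.0, eps=1e-2, h=0.1):
    n = len(pos)
    acc = np.zeros_like(pos)
    neighbours = []
    for i in range(n):
        d = pos - pos[i]                               # vectors to all particles
        r2 = (d ** 2).sum(axis=1) + eps ** 2           # softened squared distance
        r2[i] = np.inf                                 # skip self-interaction
        acc[i] = G * (mass[:, None] * d / r2[:, None] ** 1.5).sum(axis=0)
        neighbours.append(np.flatnonzero(r2 < h ** 2))
    return acc, neighbours

rng = np.random.default_rng(4)
pos = rng.random((1000, 3))
mass = np.full(1000, 1.0 / 1000)
acc, nbrs = direct_sum(pos, mass)
print(acc.shape, len(nbrs[0]))
```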
Electroforming of optical tooling in high-strength Ni-Co alloy
NASA Astrophysics Data System (ADS)
Stein, Berl
2003-05-01
Plastic optics are often mass produced by injection, compression, or injection-compression molding. Optical-quality molds can be directly machined in appropriate materials (tool steels, electroless nickel, aluminum, etc.), but much greater cost efficiency can be achieved with electroformed mold inserts. Traditionally, electroforming of optical-quality mold inserts has been carried out in nickel, a material much softer than tool steels, which, when hardened to 45-50 HRc, usually exhibit high wear resistance and long service life (hundreds of thousands of impressions per mold). Because of their low hardness (<20 HRc), nickel molds can produce only tens of thousands of parts before they are scrapped due to wear or accidental damage. This drawback has prevented their wider usage in general plastic and optical mold making. Recently, NiCoForm has developed a proprietary Ni-Co electroforming bath combining the high strength and wear resistance of the alloy with the low stress and high replication fidelity typical of pure nickel electroforming. This paper will outline the approach to electroforming of optical-quality tooling in low-stress, high-strength Ni-Co alloy and present several examples of electroformed NiColoy mold inserts.
Genome-scale measurement of off-target activity using Cas9 toxicity in high-throughput screens.
Morgens, David W; Wainberg, Michael; Boyle, Evan A; Ursu, Oana; Araya, Carlos L; Tsui, C Kimberly; Haney, Michael S; Hess, Gaelen T; Han, Kyuho; Jeng, Edwin E; Li, Amy; Snyder, Michael P; Greenleaf, William J; Kundaje, Anshul; Bassik, Michael C
2017-05-05
CRISPR-Cas9 screens are powerful tools for high-throughput interrogation of genome function, but can be confounded by nuclease-induced toxicity at both on- and off-target sites, likely due to DNA damage. Here, to test potential solutions to this issue, we design and analyse a CRISPR-Cas9 library with 10 variable-length guides per gene and thousands of negative controls targeting non-functional, non-genic regions (termed safe-targeting guides), in addition to non-targeting controls. We find this library has excellent performance in identifying genes affecting growth and sensitivity to the ricin toxin. The safe-targeting guides allow for proper control of toxicity from on-target DNA damage. Using this toxicity as a proxy to measure off-target cutting, we demonstrate with tens of thousands of guides both the nucleotide position-dependent sensitivity to single mismatches and the reduction of off-target cutting using truncated guides. Our results demonstrate a simple strategy for high-throughput evaluation of target specificity and nuclease toxicity in Cas9 screens.
An efficient method to identify differentially expressed genes in microarray experiments
Qin, Huaizhen; Feng, Tao; Harding, Scott A.; Tsai, Chung-Jui; Zhang, Shuanglin
2013-01-01
Motivation Microarray experiments typically analyze thousands to tens of thousands of genes from small numbers of biological replicates. The fact that genes are normally expressed in functionally relevant patterns suggests that gene-expression data can be stratified and clustered into relatively homogenous groups. Cluster-wise dimensionality reduction should make it feasible to improve screening power while minimizing information loss. Results We propose a powerful and computationally simple method for finding differentially expressed genes in small microarray experiments. The method incorporates a novel stratification-based tight clustering algorithm, principal component analysis and information pooling. Comprehensive simulations show that our method is substantially more powerful than the popular SAM and eBayes approaches. We applied the method to three real microarray datasets: one from a Populus nitrogen stress experiment with 3 biological replicates; and two from public microarray datasets of human cancers with 10 to 40 biological replicates. In all three analyses, our method proved more robust than the popular alternatives for identification of differentially expressed genes. Availability The C++ code to implement the proposed method is available upon request for academic use. PMID:18453554
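As a loose illustration of the cluster-then-reduce idea (emphatically not the authors' stratification-based tight clustering algorithm, and in Python rather than their C++), one can cluster genes across samples, summarise each cluster by its first principal component, and test that summary between conditions; data sizes and thresholds below are arbitrary.

```python
# Loose illustration of cluster-then-reduce screening (NOT the authors'
# algorithm): cluster genes, summarise each cluster by its first principal
# component, and t-test that summary between the two conditions.
import numpy as np
from scipy.stats import ttest_ind
from sklearn.cluster import KMeans
from sklearn.decomposition import PCA

rng = np.random.default_rng(5)
n_genes, n_per_group = 2000, 3
expr = rng.normal(size=(n_genes, 2 * n_per_group))        # genes x samples
expr[:50, n_per_group:] += 2.0                            # spike a DE block
groups = np.array([0] * n_per_group + [1] * n_per_group)

clusters = KMeans(n_clusters=20, n_init=10, random_state=0).fit_predict(expr)
for c in range(20):
    members = expr[clusters == c]
    if len(members) < 2:
        continue
    pc1 = PCA(n_components=1).fit_transform(members.T).ravel()  # one value per sample
    t, p = ttest_ind(pc1[groups == 0], pc1[groups == 1])
    if p < 0.01:
        print(f"cluster {c}: {len(members)} genes, p = {p:.3g}")
```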
Lab-to-Lab Cooperative Threat Reduction
NASA Astrophysics Data System (ADS)
Hecker, Siegfried S.
2017-11-01
It is difficult to imagine today how dramatically global nuclear risks changed 25 years ago as the Soviet Union disintegrated. Instead of the threat of mutual nuclear annihilation, the world became concerned that Russia and the other 14 former Soviet states would lose control of their huge nuclear assets - tens of thousands of nuclear weapons, more than a million kilograms of fissile materials, hundreds of thousands of nuclear workers, and a huge nuclear complex. I will describe how scientists and engineers at the DOE laboratories, with a focus on Los Alamos, Lawrence Livermore and Sandia national laboratories, joined forces with those at the Russian nuclear weapon institutes for more than 20 years to avoid what looked like the perfect nuclear storm - a story told in the two-volume book Doomed to Cooperate, published in 2016.
Emerging Tools to Estimate and to Predict Exposures to ...
The timely assessment of the human and ecological risk posed by thousands of existing and emerging commercial chemicals is a critical challenge facing EPA in its mission to protect public health and the environment. The US EPA has been conducting research to enhance methods used to estimate and forecast exposures for tens of thousands of chemicals. This research is aimed at both assessing risks and supporting life cycle analysis by developing new models and tools for high-throughput exposure screening and prioritization, as well as databases that support these and other tools, especially regarding consumer products. The models and data address usage and take advantage of quantitative structure-activity relationships (QSARs) for both inherent chemical properties and function (why the chemical is a product ingredient). To make them more useful and widely available, the new tools, data, and models are designed to be: flexible; interoperable; modular (useful to more than one stand-alone application); and open (publicly available software). Presented at the Society for Risk Analysis Forum: Risk Governance for Key Enabling Technologies, Venice, Italy, March 1-3, 2017.
Manek, Nisha J
2012-01-01
A remarkable phenomenon is taking place around the globe, one that I have been fortunate enough to witness and in which to participate. The relics of the historical Buddha, also known as Siddhartha or Shakyamuni Buddha, still survive today over 2500 years since his enlightenment, and, for the first time in history, are traveling throughout the world. In common Buddhist practice, relics are highly venerated and treasured remains of realized Masters. It is very rare for relics to travel from city to city and be available for viewing by the general public. The Buddha relic tour is demonstrating that a direct experience of the spiritual state is not mysterious, nor is it for a select few. The spiritual state, here defined as a universal theme of unconditional love, is a component of human evolutionary unfoldment, a process through which thousands of human beings have passed, and through which thousands more will pass. We are "waking up" as a species. Consequently, more information is required about this transformation of human consciousness. The Buddha relics offer us a priceless means by which we can obtain a richer perspective about the nature of human consciousness, spiritual realities such as love, and ultimately understanding ourselves.
Scanning Apollo Flight Films and Reconstructing CSM Trajectories
NASA Astrophysics Data System (ADS)
Speyerer, E.; Robinson, M. S.; Grunsfeld, J. M.; Locke, S. D.; White, M.
2006-12-01
Over thirty years ago, the astronauts of the Apollo program made the journey from the Earth to the Moon and back. To record their historic voyages and collect scientific observations, many thousands of photographs were acquired with handheld and automated cameras. After returning to Earth, these films were developed and stored at the film archive at Johnson Space Center (JSC), where they still reside. Because of the historical significance of the original flight films, typically only duplicate (2nd- or 3rd-generation) film products are studied and used to make prints. To allow full access to the original flight films for both researchers and the general public, JSC and Arizona State University are scanning them and creating an online digital archive. A Leica photogrammetric scanner is being used to ensure geometric and radiometric fidelity. Scanning resolution will preserve the grain of the film. Color frames are being scanned and archived as 48-bit pixels to ensure capture of the full dynamic range of the film (16 bit for B&W). The raw scans will consist of 70 terabytes of data (approximately 10,000 B&W Hasselblad, 10,000 color Hasselblad, and 10,000 Metric frames, 4,500 Pan frames, and 620 35mm frames; counts are estimates). All the scanned films will be made available for download through a searchable database. Special tools are being developed to locate images based on various search parameters. To geolocate metric and panoramic frames acquired during Apollos 15-17, prototype SPICE kernels are being generated from existing photographic support data by entering state vectors and timestamps from multiple points throughout each orbit into the NAIF toolkit to create a type 9 Spacecraft and Planet Ephemeris Kernel (SPK), a nadir-pointing C-matrix Kernel (CK), and a Spacecraft Clock Kernel (SCLK). These SPICE kernels, in addition to the Instrument Kernel (IK) and Frames Kernel (FK) that are also under development, will be archived along with the scanned images. From the generated kernels, several IDL programs have been designed to display orbital tracks, produce footprint plots, and create image projections. Using the output from these SPICE-based programs enables accurate geolocation of SIM bay photography as well as providing potential data for lunar gravitational studies.
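Once such kernels exist, querying them is straightforward; the hedged sketch below uses SpiceyPy (the Python wrapper for the NAIF toolkit) rather than IDL, and the kernel file names and the CSM NAIF ID (-915) are placeholders, not values from the actual archive.

```python
# Sketch of querying kernels of the kind described above with SpiceyPy.
# The kernel file names and the CSM NAIF ID (-915) are placeholders.
import spiceypy as spice

KERNELS = ["naif0012.tls",        # leapseconds (standard NAIF kernel)
           "apollo15_csm.bsp",    # hypothetical type-9 SPK built from support data
           "apollo15_csm.tsc",    # hypothetical spacecraft clock kernel
           "apollo15_csm.tf"]     # hypothetical frames kernel

for k in KERNELS:
    spice.furnsh(k)

et = spice.str2et("1971-08-01T12:00:00")                 # epoch during Apollo 15 orbit
state, lt = spice.spkezr("-915", et, "IAU_MOON", "NONE", "MOON")
print("CSM position (km):", state[:3], "velocity (km/s):", state[3:])
spice.kclear()
```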
Fundraising for Accelerated Study for the PhD in Nursing: A Community Partnership.
Starck, Patricia L
2015-01-01
This article describes fundraising strategies by a School of Nursing to support a post-master's accelerated (3-year) PhD degree program. A sample proposal to solicit funds is included, as well as a contract that students sign before accepting the scholarship and agreeing to teach for 3 years or repay the money. The first campaign raised $2.3 million for ten students, and the second campaign raised $1.3 million for six students. One useful marketing strategy is to show the impact of an investment in educating ten doctoral students who will become faculty and teach 100 additional students per year, who will then become professionals caring for thousands of patients during their careers. Over a 10 year period, the impact of an accelerated program is enormous, with 660 students taught who in their lifetime will care for 2.4 million patients. The article also discusses motivation and mind sets for giving to promote success in fundraising. Copyright © 2015 Elsevier Inc. All rights reserved.