Large-scale Labeled Datasets to Fuel Earth Science Deep Learning Applications
NASA Astrophysics Data System (ADS)
Maskey, M.; Ramachandran, R.; Miller, J.
2017-12-01
Deep learning has revolutionized computer vision and natural language processing with various algorithms scaled using high-performance computing. However, generic large-scale labeled datasets such as ImageNet are the fuel that drives the impressive accuracy of deep learning results. Large-scale labeled datasets already exist in domains such as medical science, but creating them in the Earth science domain is a challenge. While there are ways to apply deep learning using limited labeled datasets, there is a need in the Earth sciences for creating large-scale labeled datasets for benchmarking and scaling deep learning applications. At the NASA Marshall Space Flight Center, we are using deep learning for a variety of Earth science applications where we have encountered the need for large-scale labeled datasets. We will discuss our approaches for creating such datasets and why these datasets are just as valuable as deep learning algorithms. We will also describe successful usage of these large-scale labeled datasets with our deep learning-based applications.
Stability of Rasch Scales over Time
ERIC Educational Resources Information Center
Taylor, Catherine S.; Lee, Yoonsun
2010-01-01
Item response theory (IRT) methods are generally used to create score scales for large-scale tests. Research has shown that IRT scales are stable across groups and over time. Most studies have focused on items that are dichotomously scored. Now Rasch and other IRT models are used to create scales for tests that include polytomously scored items.…
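For reference, the models named in this record in their standard textbook forms (these are the generic definitions, not the authors' specific parameterizations):

```latex
% Dichotomous Rasch model: person ability \theta_i, item difficulty b_j
P(X_{ij}=1 \mid \theta_i) = \frac{e^{\theta_i - b_j}}{1 + e^{\theta_i - b_j}}

% Partial credit extension for an item with m ordered score categories and
% step difficulties \delta_{jk} (with \delta_{j0} \equiv 0):
P(X_{ij}=x) = \frac{\exp \sum_{k=0}^{x} (\theta_i - \delta_{jk})}
                   {\sum_{h=0}^{m} \exp \sum_{k=0}^{h} (\theta_i - \delta_{jk})}
```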
How institutions shaped the last major evolutionary transition to large-scale human societies
Powers, Simon T.; van Schaik, Carel P.; Lehmann, Laurent
2016-01-01
What drove the transition from small-scale human societies centred on kinship and personal exchange, to large-scale societies comprising cooperation and division of labour among untold numbers of unrelated individuals? We propose that the unique human capacity to negotiate institutional rules that coordinate social actions was a key driver of this transition. By creating institutions, humans have been able to move from the default ‘Hobbesian’ rules of the ‘game of life’, determined by physical/environmental constraints, into self-created rules of social organization where cooperation can be individually advantageous even in large groups of unrelated individuals. Examples include rules of food sharing in hunter–gatherers, rules for the usage of irrigation systems in agriculturalists, property rights and systems for sharing reputation between mediaeval traders. Successful institutions create rules of interaction that are self-enforcing, providing direct benefits both to individuals that follow them, and to individuals that sanction rule breakers. Forming institutions requires shared intentionality, language and other cognitive abilities largely absent in other primates. We explain how cooperative breeding likely selected for these abilities early in the Homo lineage. This allowed anatomically modern humans to create institutions that transformed the self-reliance of our primate ancestors into the division of labour of large-scale human social organization. PMID:26729937
CFD Script for Rapid TPS Damage Assessment
NASA Technical Reports Server (NTRS)
McCloud, Peter
2013-01-01
This grid generation script creates unstructured CFD grids for rapid thermal protection system (TPS) damage aeroheating assessments. The existing manual solution is cumbersome, error-prone, and slow. The invention takes a large-scale geometry grid and its large-scale CFD solution, and creates an unstructured patch grid that models the TPS damage. The flow field boundary condition for the patch grid is then interpolated from the large-scale CFD solution. It speeds up the generation of CFD grids and solutions in the modeling of TPS damage and its aeroheating assessment. This process was successfully utilized during STS-134.
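The script itself is not published in this record; as a rough illustration of the interpolation step it describes (patch-grid boundary conditions sampled from the large-scale solution), a minimal Python sketch with assumed array names and shapes:

```python
# Minimal sketch: interpolate flow-field state from a large-scale CFD
# solution onto the boundary nodes of a TPS damage patch grid.
# Array names and shapes are assumptions, not the actual NASA script.
import numpy as np
from scipy.interpolate import LinearNDInterpolator, NearestNDInterpolator

def patch_boundary_conditions(global_xyz, global_state, patch_xyz):
    """Interpolate per-node state vectors onto patch boundary nodes."""
    linear = LinearNDInterpolator(global_xyz, global_state)
    nearest = NearestNDInterpolator(global_xyz, global_state)
    bc = linear(patch_xyz)
    outside = np.isnan(bc).any(axis=1)   # nodes outside the convex hull
    bc[outside] = nearest(patch_xyz[outside])  # nearest-neighbour fallback
    return bc

# Example with synthetic data:
rng = np.random.default_rng(0)
global_xyz = rng.uniform(0.0, 1.0, (5000, 3))    # large-scale grid nodes
global_state = rng.uniform(0.0, 1.0, (5000, 5))  # e.g. rho, u, v, w, T
patch_xyz = rng.uniform(0.25, 0.75, (200, 3))    # patch boundary nodes
bc = patch_boundary_conditions(global_xyz, global_state, patch_xyz)
```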
Lim, Chong Wee; Ohmori, Kenji; Petrov, Ivan Georgiev; Greene, Joseph E.
2004-07-13
A method for forming atomic-scale structures on a surface of a substrate on a large scale includes creating a predetermined amount of surface vacancies on the surface of the substrate by removing an amount of atoms on the surface of the material corresponding to the predetermined amount of the surface vacancies. Once the surface vacancies have been created, atoms of a desired structure material are deposited on the surface of the substrate to enable the surface vacancies and the atoms of the structure material to interact. The interaction causes the atoms of the structure material to form the atomic-scale structures.
A process for creating multimetric indices for large-scale aquatic surveys
Differences in sampling and laboratory protocols, differences in techniques used to evaluate metrics, and differing scales of calibration and application prohibit the use of many existing multimetric indices (MMIs) in large-scale bioassessments. We describe an approach to develop...
Generation of large-scale density fluctuations by buoyancy
NASA Technical Reports Server (NTRS)
Chasnov, J. R.; Rogallo, R. S.
1990-01-01
The generation of fluid motion from a state of rest by buoyancy forces acting on a homogeneous isotropic small-scale density field is considered. Nonlinear interactions between the generated fluid motion and the initial isotropic small-scale density field are found to create an anisotropic large-scale density field with spectrum proportional to k^4. This large-scale density field is observed to result in an increasing Reynolds number of the fluid turbulence in its final period of decay.
The large scale microelectronics Computer-Aided Design and Test (CADAT) system
NASA Technical Reports Server (NTRS)
Gould, J. M.
1978-01-01
The CADAT system consists of a number of computer programs written in FORTRAN that provide the capability to simulate, lay out, analyze, and create the artwork for large scale microelectronics. The function of each software component of the system is described with references to specific documentation for each software component.
Children's Understanding of Large-Scale Mapping Tasks: An Analysis of Talk, Drawings, and Gesture
ERIC Educational Resources Information Center
Kotsopoulos, Donna; Cordy, Michelle; Langemeyer, Melanie
2015-01-01
This research examined how children represent motion in large-scale mapping tasks that we referred to as "motion maps". The underlying mathematical content was transformational geometry. In total, 19 children, 8 to 10 years old, created motion maps and digitally captured them with accompanying verbal descriptions. Analysis of…
From Lobster Shells to Plastic Objects: A Bioplastics Activity
ERIC Educational Resources Information Center
Hudson, Reuben; Glaisher, Samuel; Bishop, Alexandra; Katz, Jeffrey L.
2015-01-01
A multiple day activity for students to create large-scale plastic objects from the biopolymer chitin (major component of lobster, crab, and shrimp shells) is described. The plastic objects created are durable and made from benign materials, making them suitable for students to take home to play with. Since the student-created plastic objects are…
1983-09-01
This thesis intends to create the basic...a need for a small-scale model which allows a student analyst of tactical air operations to create his own battles and to test his own strategies with...iconic model is a large- or small-scale representation of states, objects, or events. For example a scale model airplane resembles the system under the…
Cloud-enabled large-scale land surface model simulations with the NASA Land Information System
NASA Astrophysics Data System (ADS)
Duffy, D.; Vaughan, G.; Clark, M. P.; Peters-Lidard, C. D.; Nijssen, B.; Nearing, G. S.; Rheingrover, S.; Kumar, S.; Geiger, J. V.
2017-12-01
Developed by the Hydrological Sciences Laboratory at NASA Goddard Space Flight Center (GSFC), the Land Information System (LIS) is a high-performance software framework for terrestrial hydrology modeling and data assimilation. LIS provides the ability to integrate satellite and ground-based observational products and advanced modeling algorithms to extract land surface states and fluxes. Through a partnership with the National Center for Atmospheric Research (NCAR) and the University of Washington, the LIS model is currently being extended to include the Structure for Unifying Multiple Modeling Alternatives (SUMMA). With the addition of SUMMA in LIS, meaningful simulations containing a large multi-model ensemble will be enabled and can provide advanced probabilistic continental-domain modeling capabilities at spatial scales relevant for water managers. The resulting LIS/SUMMA application framework is difficult for non-experts to install due to the large number of dependencies on specific versions of operating systems, libraries, and compilers. This has created a significant barrier to entry for domain scientists who are interested in using the software on their own systems or in the cloud. In addition, the requirement to support multiple run-time environments across the LIS community has created a significant burden on the NASA team. To overcome these challenges, LIS/SUMMA has been deployed using Linux containers, which allow an entire software package along with all dependencies to be installed within a working runtime environment, and Kubernetes, which orchestrates the deployment of a cluster of containers. Within a cloud environment, users can now easily create a cluster of virtual machines and run large-scale LIS/SUMMA simulations. Installations that have taken weeks and months can now be performed in minutes. This presentation will discuss the steps required to create a cloud-enabled large-scale simulation, present examples of its use, and describe the potential deployment of this information technology with other NASA applications.
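The abstract names Linux containers and Kubernetes but gives no manifests; a minimal sketch of submitting such a containerized run as a Kubernetes Job via the official Python client, with the image name, command, and ensemble sizes as placeholder assumptions rather than the actual NASA deployment:

```python
# Hedged sketch: launch a containerized LIS/SUMMA ensemble as a Kubernetes
# Job. Image, command, and parallelism are illustrative placeholders.
from kubernetes import client, config

config.load_kube_config()  # or config.load_incluster_config() inside a pod

container = client.V1Container(
    name="lis-summa",
    image="example.registry/lis-summa:latest",  # hypothetical image
    command=["mpirun", "-np", "64", "./LIS"],   # hypothetical entry point
)
pod_spec = client.V1PodSpec(restart_policy="Never", containers=[container])
template = client.V1PodTemplateSpec(
    metadata=client.V1ObjectMeta(labels={"app": "lis-summa"}), spec=pod_spec)
job = client.V1Job(
    api_version="batch/v1",
    kind="Job",
    metadata=client.V1ObjectMeta(name="lis-summa-ensemble"),
    spec=client.V1JobSpec(template=template, completions=32, parallelism=8,
                          backoff_limit=0),
)
client.BatchV1Api().create_namespaced_job(namespace="default", body=job)
```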
ERIC Educational Resources Information Center
Fitzgerald, Michael; McKinnon, David H.; Danaia, Lena
2015-01-01
In this paper, we outline the theory behind the educational design used to implement a large-scale high school astronomy education project. This design was created in response to the realization of ineffective educational design in the initial early stages of the project. The new design follows an iterative improvement model where the materials…
Towards Large-area Field-scale Operational Evapotranspiration for Water Use Mapping
NASA Astrophysics Data System (ADS)
Senay, G. B.; Friedrichs, M.; Morton, C.; Huntington, J. L.; Verdin, J.
2017-12-01
Field-scale evapotranspiration (ET) estimates are needed for improving surface and groundwater use and water budget studies. Ideally, field-scale ET estimates would be at regional to national levels and cover long time periods. As a result of large data storage and computational requirements associated with processing field-scale satellite imagery such as Landsat, numerous challenges remain in developing operational ET estimates over large areas for detailed water use and availability studies. However, the combination of new science, data availability, and cloud computing technology is enabling unprecedented capabilities for ET mapping. To demonstrate this capability, we used Google's Earth Engine cloud computing platform to create nationwide annual ET estimates with 30-meter resolution Landsat (~16,000 images) and gridded weather data using the Operational Simplified Surface Energy Balance (SSEBop) model in support of the National Water Census, a USGS research program designed to build decision support capacity for water management agencies and other natural resource managers. By leveraging Google's Earth Engine Application Programming Interface (API) and developing software in a collaborative, open-platform environment, we rapidly advance from research towards applications for large-area field-scale ET mapping. Cloud computing of the Landsat image archive, combined with other satellite, climate, and weather data, is creating previously unimagined opportunities for assessing ET model behavior and uncertainty, and ultimately providing the ability for more robust operational monitoring and assessment of water use at field scales.
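The core of SSEBop is a simple ratio of land-surface temperature against hot and cold boundary conditions; a hedged Earth Engine Python sketch, with the boundary constants as placeholders (the operational model derives them from gridded weather data) rather than the actual USGS configuration:

```python
# Hedged sketch of the SSEBop ET fraction on the Earth Engine Python API.
# Band scaling is Landsat 8 Collection 2 Level 2; t_cold and dt below are
# placeholder constants, not the operational calibration.
import ee

ee.Initialize()

def ssebop_et_fraction(image, t_cold, dt):
    """ETf = (Th - Ts) / dT with Th = Tc + dT, clamped to [0, 1.05]."""
    ts = image.select("ST_B10").multiply(0.00341802).add(149.0)  # Kelvin
    t_hot = t_cold.add(dt)
    return t_hot.subtract(ts).divide(dt).clamp(0, 1.05).rename("ETf")

landsat = (ee.ImageCollection("LANDSAT/LC08/C02/T1_L2")
           .filterDate("2016-06-01", "2016-09-01")
           .filterBounds(ee.Geometry.Point([-98.5, 38.5])))
t_cold = ee.Image.constant(295.0)  # placeholder cold boundary (K)
dt = ee.Image.constant(12.0)       # placeholder hot/cold spread (K)
etf = ssebop_et_fraction(ee.Image(landsat.first()), t_cold, dt)
# Actual ET would follow as ETa = ETf * k * ETo with gridded reference ET.
```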
The co-evolution of social institutions, demography, and large-scale human cooperation.
Powers, Simon T; Lehmann, Laurent
2013-11-01
Human cooperation is typically coordinated by institutions, which determine the outcome structure of the social interactions individuals engage in. Explaining the Neolithic transition from small- to large-scale societies involves understanding how these institutions co-evolve with demography. We study this using a demographically explicit model of institution formation in a patch-structured population. Each patch supports both social and asocial niches. Social individuals create an institution, at a cost to themselves, by negotiating how much of the costly public good provided by cooperators is invested into sanctioning defectors. The remainder of their public good is invested in technology that increases carrying capacity, such as irrigation systems. We show that social individuals can invade a population of asocials, and form institutions that support high levels of cooperation. We then demonstrate conditions where the co-evolution of cooperation, institutions, and demographic carrying capacity creates a transition from small- to large-scale social groups. © 2013 John Wiley & Sons Ltd/CNRS.
ERIC Educational Resources Information Center
Cameron, David Hagen
2010-01-01
This article uses a cultural and political theoretical framework to examine the relationship between consultants and secondary school leaders within a large-scale consultancy-based reform, the Secondary National Strategy (SNS), in London UK. The SNS follows a cascade model of implementation, in which nationally created initiatives are introduced…
NASA Astrophysics Data System (ADS)
Harris, B.; McDougall, K.; Barry, M.
2012-07-01
Digital Elevation Models (DEMs) allow for the efficient and consistent creation of waterways and catchment boundaries over large areas. Studies of waterway delineation from DEMs are usually undertaken over small or single catchment areas due to the nature of the problems being investigated. Improvements in Geographic Information Systems (GIS) techniques, software, hardware and data allow for analysis of larger data sets and also facilitate a consistent tool for the creation and analysis of waterways over extensive areas. However, such analyses are rarely undertaken over large regional areas because of the lack of available raw data sets and the amount of work required to create the underlying DEMs. This paper examines the definition of waterways and catchments over an area of approximately 25,000 km2 to establish the optimal DEM scale required for waterway delineation over large regional projects. The comparative study analysed multi-scale DEMs over two test areas (the Wivenhoe catchment, 543 km2, and a detailed 13 km2 area within the Wivenhoe catchment), including various data types, scales, quality, and variable catchment input parameters. Historic and available DEM data were compared to high-resolution Lidar-based DEMs to assess variations in the formation of stream networks. The results identified that, particularly in areas of high elevation change, DEMs at 20 m cell size created from broad-scale 1:25,000 data (combined with more detailed data or manual delineation in flat areas) are adequate for the creation of waterways and catchments at a regional scale.
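Waterway delineation from a DEM typically starts from a flow-direction grid; a minimal D8 sketch at the 20 m cell size the study recommends (illustrative only; production tools also fill sinks and resolve flat areas before this step):

```python
# Minimal D8 flow-direction sketch: each cell drains to its steepest
# downslope neighbour, with diagonal drops distance-weighted.
import numpy as np

OFFSETS = [(-1, -1), (-1, 0), (-1, 1), (0, -1),
           (0, 1), (1, -1), (1, 0), (1, 1)]

def d8_flow_direction(dem, cell_size=20.0):
    """Return, per interior cell, the index (0-7) of the steepest
    downslope neighbour, or -1 for pits and flats."""
    rows, cols = dem.shape
    fdir = np.full((rows, cols), -1, dtype=int)
    for r in range(1, rows - 1):
        for c in range(1, cols - 1):
            best, best_k = 0.0, -1
            for k, (dr, dc) in enumerate(OFFSETS):
                dist = cell_size * (2 ** 0.5 if dr and dc else 1.0)
                slope = (dem[r, c] - dem[r + dr, c + dc]) / dist
                if slope > best:
                    best, best_k = slope, k
            fdir[r, c] = best_k
    return fdir

# Synthetic tilted plane: every interior cell drains to the lower-right.
dem = np.add.outer(np.arange(50, 0, -1.0), np.arange(50, 0, -1.0))
print(d8_flow_direction(dem)[1:4, 1:4])  # all 7s (the (1, 1) neighbour)
```

Accumulating flow along these directions and thresholding then yields the stream network, which is where DEM scale and quality matter most.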
Private School Chains in Chile: Do Better Schools Scale Up? Policy Analysis. No. 682
ERIC Educational Resources Information Center
Elacqua, Gregory; Contreras, Dante; Salazar, Felipe; Santos, Humberto
2011-01-01
There is a persistent debate over the role of scale of operations in education. Some argue that school franchises offer educational services more effectively than do small independent schools. Skeptics counter that large, centralized operations create hard-to-manage bureaucracies and foster diseconomies of scale and that small schools are more…
Isolating causal pathways between flow and fish in the regulated river hierarchy
Ryan McManamay; Donald J. Orth; Charles A. Dolloff; David C. Mathews
2015-01-01
Unregulated river systems are organized in a hierarchy in which large scale factors (i.e. landscape and segment scales) influence local habitats (i.e. reach, meso- and microhabitat scales), and both differentially exert selective pressures on biota. Dams, however, create discontinua in these processes and change the hierarchical structure. We examined the relative...
A leap forward in geographic scale for forest ectomycorrhizal fungi
Filipa Cox; Nadia Barsoum; Martin I. Bidartondo; Isabella Børja; Erik Lilleskov; Lars O. Nilsson; Pasi Rautio; Kath Tubby; Lars Vesterdal
2010-01-01
In this letter we propose a first large-scale assessment of mycorrhizas with a European-wide network of intensively monitored forest plots as a research platform. This effort would create a qualitative and quantitative shift in mycorrhizal research by delivering the first continental-scale map of mycorrhizal fungi. Readers may note that several excellent detailed...
A Thermal Technique of Fault Nucleation, Growth, and Slip
NASA Astrophysics Data System (ADS)
Garagash, D.; Germanovich, L. N.; Murdoch, L. C.; Martel, S. J.; Reches, Z.; Elsworth, D.; Onstott, T. C.
2009-12-01
Fractures and fluids influence virtually all mechanical processes in the crust, but many aspects of these processes remain poorly understood, largely because of a lack of controlled field experiments at appropriate scale. We have developed an in-situ experimental approach to create carefully controlled faults at a scale of ~10 meters using thermal techniques to modify in situ stresses to the point where the rock fails in shear. This approach extends experiments on fault nucleation and growth to length scales 2-3 orders of magnitude greater than are currently possible in the laboratory. The experiments could be done at depths where the modified in situ stresses are sufficient to drive faulting, obviating the need for unrealistically large loading frames. Such experiments require access to large rock volumes in the deep subsurface in a controlled setting. The Deep Underground Science and Engineering Laboratory (DUSEL), which is a research facility planned to occupy the workings of the former Homestake gold mine in the northern Black Hills, South Dakota, presents an opportunity for accessing locations with vertical stresses as large as 60 MPa (down to 2400 m depth), which is sufficient to create faults. One of the most promising methods we have evaluated for manipulating stresses to create faults involves drilling two parallel planar arrays of boreholes and circulating cold fluid (e.g., liquid nitrogen) to chill the region in the vicinity of the boreholes. Cooling a relatively small region around each borehole causes the rock to contract, reducing the normal compressive stress throughout a much larger region between the arrays of boreholes. This scheme was evaluated using both scaling analysis and a finite element code. Our results show that if the boreholes are spaced by ~1 m, in several days to weeks the normal compressive stress can be reduced by 10 MPa or more, and it is even possible to create net tension between the borehole arrays. According to the Mohr-Coulomb strength criterion with standard Byerlee parameters, a fault will initiate before the net tension occurs. After a new fault is created, hot fluid can be injected into the boreholes to increase the temperature and reverse the direction of fault slip. This process can be repeated to study the formation of gouge, and how the properties of gouge control fault slip and associated seismicity. Instrumenting the site with arrays of geophones, tiltmeters, strain gauges, and displacement transducers, as well as back mining - an opportunity provided by the DUSEL project - can reveal details of the fault geometry and gouge. We also expect to find small faults (with cm-scale displacement) during construction of DUSEL drifts. The same thermal technique can be used to induce slip on one of them and compare the “man-made” and natural gouges. The thermal technique appears to be a relatively simple way to rapidly change the stress field and either create slip on existing fractures or create new faults at scales up to 10 m or more.
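The quoted stress reduction is consistent with a back-of-envelope thermoelastic estimate; with illustrative rock properties (E = 50 GPa, α = 8×10⁻⁶ K⁻¹, ν = 0.25, not values from the study), a 50 K cooling gives:

```latex
% Plane-strain thermoelastic stress change for a constrained rock mass
% cooled by \Delta T (property values are illustrative assumptions):
\Delta\sigma = \frac{E\,\alpha\,\Delta T}{1-\nu}
\approx \frac{(50\ \text{GPa})\,(8\times 10^{-6}\ \text{K}^{-1})\,(50\ \text{K})}{1-0.25}
\approx 27\ \text{MPa}
```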
Multi-scale Modeling of Radiation Damage: Large Scale Data Analysis
NASA Astrophysics Data System (ADS)
Warrier, M.; Bhardwaj, U.; Bukkuru, S.
2016-10-01
Modification of materials in nuclear reactors due to neutron irradiation is a multiscale problem. These neutrons pass through materials creating several energetic primary knock-on atoms (PKA) which cause localized collision cascades creating damage tracks, defects (interstitials and vacancies) and defect clusters depending on the energy of the PKA. These defects diffuse and recombine throughout the whole duration of operation of the reactor, thereby changing the micro-structure of the material and its properties. It is therefore desirable to develop predictive computational tools to simulate the micro-structural changes of irradiated materials. In this paper we describe how statistical averages of the collision cascades from thousands of MD simulations are used to provide inputs to Kinetic Monte Carlo (KMC) simulations which can handle larger sizes, more defects and longer time durations. Use of unsupervised learning and graph optimization in handling and analyzing large scale MD data will be highlighted.
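A hedged sketch of the KMC stage described here, using the standard residence-time (BKL/Gillespie) algorithm with an illustrative Arrhenius hop rate; the barrier and attempt frequency are placeholders, not values fitted to any MD cascade database:

```python
# Residence-time kinetic Monte Carlo for thermally activated defect hops.
# Barrier, attempt frequency, and lattice geometry are illustrative.
import math
import random

KB = 8.617e-5  # Boltzmann constant, eV/K

def hop_rate(barrier_ev, temperature, attempt_freq=1e13):
    """Arrhenius rate (1/s) for a single defect-hop event."""
    return attempt_freq * math.exp(-barrier_ev / (KB * temperature))

def kmc_step(positions, temperature, barrier_ev=0.65):
    """Pick one hop with probability proportional to its rate and advance
    the clock by an exponentially distributed residence time."""
    rates = [hop_rate(barrier_ev, temperature) for _ in positions]
    total = sum(rates)
    i = random.choices(range(len(positions)), weights=rates)[0]
    dx = random.choice([(1, 0, 0), (-1, 0, 0), (0, 1, 0),
                        (0, -1, 0), (0, 0, 1), (0, 0, -1)])
    positions[i] = tuple(p + d for p, d in zip(positions[i], dx))
    return -math.log(random.random()) / total  # time increment

# Vacancy coordinates seeded from MD cascade statistics (synthetic here).
defects = [(0, 0, 0), (4, 1, 2), (7, 7, 3)]
t = 0.0
for _ in range(1000):
    t += kmc_step(defects, temperature=600.0)
```

Because each step costs only one rate evaluation per defect, this reaches the second-to-year timescales of reactor operation that MD cannot.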
Ecological fire use for ecological fire management: Managing large wildfires by design
Timothy Ingalsbee
2015-01-01
Past fire exclusion policies and fire suppression actions have led to a historic "fire deficit" on public wildlands. These sociocultural actions have led to unprecedented environmental changes that have created conditions conducive to more frequent large-scale wildfires. Politicians, the news media, and agency officials portray large wildland fires as...
Robust Coordination for Large Sets of Simple Rovers
NASA Technical Reports Server (NTRS)
Tumer, Kagan; Agogino, Adrian
2006-01-01
The ability to coordinate sets of rovers in an unknown environment is critical to the long-term success of many of NASA's exploration missions. Such coordination policies must have the ability to adapt in unmodeled or partially modeled domains and must be robust against environmental noise and rover failures. In addition, such coordination policies must accommodate a large number of rovers without excessive and burdensome hand-tuning. In this paper we present a distributed coordination method that addresses these issues in the domain of controlling a set of simple rovers. The application of these methods allows reliable and efficient robotic exploration in dangerous, dynamic, and previously unexplored domains. Most control policies for space missions are directly programmed by engineers or created through the use of planning tools, and are appropriate for single-rover missions or missions requiring the coordination of a small number of rovers. Such methods typically require significant amounts of domain knowledge, and are difficult to scale to large numbers of rovers. The method described in this article aims to address cases where a large number of rovers need to coordinate to solve a complex time-dependent problem in a noisy environment. In this approach, each rover decomposes a global utility, representing the overall goal of the system, into rover-specific utilities that properly assign credit to the rover's actions. Each rover then has the responsibility to create a control policy that maximizes its own rover-specific utility. We show a method of creating rover utilities that are "aligned" with the global utility, such that when the rovers maximize their own utility, they also maximize the global utility. In addition, we show that our method creates rover utilities that allow the rovers to create their control policies quickly and reliably. Our distributed learning method allows large sets of rovers to be used in unmodeled domains, while providing robustness against rover failures and changing environments. In experimental simulations we show that our method scales well with large numbers of rovers in addition to being robust against noisy sensor inputs and noisy servo control. The results show that our method is able to scale to large numbers of rovers and achieves up to 400% performance improvement over standard machine learning methods.
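One standard construction from this literature for rover utilities aligned with the global utility is the difference utility, which credits each rover with the change in global utility caused by its participation; a toy sketch (the names and the coverage utility are illustrative, not the paper's exact formulation):

```python
# Difference utility D_i = G(z) - G(z with rover i removed): any action
# that raises D_i also raises G, so per-rover learning stays aligned.
from typing import Callable, Sequence

def difference_utility(global_utility: Callable[[Sequence], float],
                       joint_actions: list, i: int, null_action=None) -> float:
    """Credit rover i with the change in G caused by its participation."""
    counterfactual = list(joint_actions)
    counterfactual[i] = null_action  # replace rover i's action with 'absent'
    return global_utility(joint_actions) - global_utility(counterfactual)

# Toy global utility: number of unique sites observed by the team.
def coverage(actions: Sequence) -> float:
    return float(len({a for a in actions if a is not None}))

actions = ["site_a", "site_a", "site_b"]          # rovers 0 and 1 overlap
print(difference_utility(coverage, actions, 0))   # 0.0: adds nothing unique
print(difference_utility(coverage, actions, 2))   # 1.0: adds a new site
```

The counterfactual term removes the noise contributed by other rovers' actions, which is why such utilities remain learnable as the team grows.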
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mendon, Vrushali V.; Taylor, Zachary T.
Recent advances in residential building energy efficiency and codes have resulted in increased interest in detailed residential building energy models using the latest energy simulation software. One of the challenges of developing residential building models to characterize new residential building stock is to allow for flexibility to address variability in house features such as geometry, configuration, and HVAC systems. Researchers solved this problem in a novel way by creating a simulation structure capable of creating fully functional EnergyPlus batch runs using a completely scalable residential EnergyPlus template system. This system was used to create a set of thirty-two residential prototype building models covering single- and multifamily buildings, four common foundation types, and four common heating system types found in the United States (US). A weighting scheme with detailed state-wise and national weighting factors was designed to supplement the residential prototype models. The complete set is designed to represent a majority of new residential construction stock. The entire structure consists of a system of utility programs developed around the core EnergyPlus simulation engine to automate the creation and management of large-scale simulation studies with minimal human effort. The simulation structure and the residential prototype building models have been used for numerous large-scale studies, one of which is briefly discussed in this paper.
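As a rough illustration of such a template system (the field names, labels, and IDF fragment are assumptions, not PNNL's actual schema), expanding one parameterized snippet over a 2 × 4 × 4 prototype matrix yields the thirty-two models described:

```python
# Hedged sketch: expand one parameterized IDF fragment over the prototype
# matrix of building types, foundations, and heating systems.
from itertools import product
from pathlib import Path
from string import Template

IDF_TEMPLATE = Template(
    "! Prototype: ${building_type} / ${foundation} / ${heating}\n"
    "Building, Res_${building_type}_${foundation}_${heating}, ...;\n"
)

BUILDING_TYPES = ["single_family", "multifamily"]
FOUNDATIONS = ["slab", "crawlspace", "heated_basement", "unheated_basement"]
HEATING = ["gas_furnace", "electric_furnace", "heat_pump", "oil_furnace"]

out = Path("prototypes")
out.mkdir(exist_ok=True)
for bt, fnd, ht in product(BUILDING_TYPES, FOUNDATIONS, HEATING):
    idf = IDF_TEMPLATE.substitute(building_type=bt, foundation=fnd, heating=ht)
    (out / f"{bt}_{fnd}_{ht}.idf").write_text(idf)
# 2 building types x 4 foundations x 4 heating systems = 32 prototype inputs.
```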
An economy of scale system's mensuration of large spacecraft
NASA Technical Reports Server (NTRS)
Deryder, L. J.
1981-01-01
The systems technology and cost particulars of using multipurpose platforms versus several sizes of bus-type free-flyer spacecraft to accomplish the same space experiment missions are examined. Computer models of these spacecraft bus designs were created to obtain data relative to size, weight, power, performance, and cost. To answer the question of whether or not large scale does produce economy, the dominant cost factors were determined and the programmatic effects on individual experiment costs were evaluated.
ERIC Educational Resources Information Center
Guth, Douglas J.
2017-01-01
A community college's success hinges in large part on the effectiveness of its teaching faculty, no more so than in times of major organizational change. However, any large-scale foundational shift requires institutional buy-in, with the onus on leadership to create an environment where everyone is working together toward the same endpoint.…
Validity of Scores for a Developmental Writing Scale Based on Automated Scoring
ERIC Educational Resources Information Center
Attali, Yigal; Powers, Donald
2009-01-01
A developmental writing scale for timed essay-writing performance was created on the basis of automatically computed indicators of writing fluency, word choice, and conventions of standard written English. In a large-scale data collection effort that involved a national sample of more than 12,000 students from 4th, 6th, 8th, 10th, and 12th grade,…
A modular approach to creating large engineered cartilage surfaces.
Ford, Audrey C; Chui, Wan Fung; Zeng, Anne Y; Nandy, Aditya; Liebenberg, Ellen; Carraro, Carlo; Kazakia, Galateia; Alliston, Tamara; O'Connell, Grace D
2018-01-23
Native articular cartilage has limited capacity to repair itself from focal defects or osteoarthritis. Tissue engineering has provided a promising biological treatment strategy that is currently being evaluated in clinical trials. However, translating these techniques to large engineered tissues remains a significant challenge. In this study, we present a method for developing large-scale engineered cartilage surfaces through modular fabrication. Modular Engineered Tissue Surfaces (METS) uses the well-known, but largely under-utilized self-adhesion properties of de novo tissue to create large scaffolds with nutrient channels. Compressive mechanical properties were evaluated throughout METS specimens, and the tensile mechanical strength of the bonds between attached constructs was evaluated over time. Raman spectroscopy, biochemical assays, and histology were performed to investigate matrix distribution. Results showed that by Day 14, stable connections had formed between the constructs in the METS samples. By Day 21, bonds were robust enough to form a rigid sheet and continued to increase in size and strength over time. Compressive mechanical properties and glycosaminoglycan (GAG) content of METS and individual constructs increased significantly over time. The METS technique builds on established tissue engineering accomplishments of developing constructs with GAG composition and compressive properties approaching native cartilage. This study demonstrated that modular fabrication is a viable technique for creating large-scale engineered cartilage, which can be broadly applied to many tissue engineering applications and construct geometries. Copyright © 2017 Elsevier Ltd. All rights reserved.
Cosmic microwave background probes models of inflation
NASA Technical Reports Server (NTRS)
Davis, Richard L.; Hodges, Hardy M.; Smoot, George F.; Steinhardt, Paul J.; Turner, Michael S.
1992-01-01
Inflation creates both scalar (density) and tensor (gravity wave) metric perturbations. We find that the tensor-mode contribution to the cosmic microwave background anisotropy on large angular scales can only exceed that of the scalar mode in models where the spectrum of perturbations deviates significantly from scale invariance. If the tensor mode dominates at large angular scales, then the value of ΔT/T predicted on 1° scales is less than if the scalar mode dominates, and, for cold-dark-matter models, bias factors greater than 1 can be made consistent with Cosmic Background Explorer (COBE) DMR results.
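In modern notation (single-field slow roll, not the paper's original conventions), the same constraint follows from the standard relations:

```latex
% Slow-roll relations: tensor-to-scalar ratio r, tensor tilt n_T,
% scalar tilt n_s, slow-roll parameters \epsilon and \eta
r = 16\,\epsilon, \qquad n_T = -2\,\epsilon \;\Rightarrow\; r = -8\,n_T,
\qquad n_s - 1 = 2\eta - 6\epsilon
```

A tensor contribution large enough to dominate requires a sizable ε, and hence a spectrum tilted away from scale invariance, matching the paper's conclusion.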
Using Microsoft Excel[R] to Calculate Descriptive Statistics and Create Graphs
ERIC Educational Resources Information Center
Carr, Nathan T.
2008-01-01
Descriptive statistics and appropriate visual representations of scores are important for all test developers, whether they are experienced testers working on large-scale projects, or novices working on small-scale local tests. Many teachers put in charge of testing projects do not know "why" they are important, however, and are utterly convinced…
Development and Examination of the Social Appearance Anxiety Scale
ERIC Educational Resources Information Center
Hart, Trevor A.; Flora, David B.; Palyo, Sarah A.; Fresco, David M.; Holle, Christian; Heimberg, Richard G.
2008-01-01
The Social Appearance Anxiety Scale (SAAS) was created to measure anxiety about being negatively evaluated by others because of one's overall appearance, including body shape. This study examined the psychometric properties of the SAAS in three large samples of undergraduate students (respective ns = 512, 853, and 541). The SAAS demonstrated a…
The Effectiveness of Private School Franchises in Chile's National Voucher Program
ERIC Educational Resources Information Center
Elacqua, Gregory; Contreras, Dante; Salazar, Felipe; Santos, Humberto
2011-01-01
There is persistent debate over the role of scale of operations in education. Some argue that school franchises offer educational services more effectively than small independent schools. Skeptics counter that large centralized operations create hard-to-manage bureaucracies and foster diseconomies of scale and that small schools are more effective…
Postinflationary Higgs relaxation and the origin of matter-antimatter asymmetry.
Kusenko, Alexander; Pearce, Lauren; Yang, Louis
2015-02-13
The recent measurement of the Higgs boson mass implies a relatively slow rise of the standard model Higgs potential at large scales, and a possible second minimum at even larger scales. Consequently, the Higgs field may develop a large vacuum expectation value during inflation. The relaxation of the Higgs field from its large postinflationary value to the minimum of the effective potential represents an important stage in the evolution of the Universe. During this epoch, the time-dependent Higgs condensate can create an effective chemical potential for the lepton number, leading to a generation of the lepton asymmetry in the presence of some large right-handed Majorana neutrino masses. The electroweak sphalerons redistribute this asymmetry between leptons and baryons. This Higgs relaxation leptogenesis can explain the observed matter-antimatter asymmetry of the Universe even if the standard model is valid up to the scale of inflation, and any new physics is suppressed by that high scale.
Derivation of large-scale cellular regulatory networks from biological time series data.
de Bivort, Benjamin L
2010-01-01
Pharmacological agents and other perturbants of cellular homeostasis appear to nearly universally affect the activity of many genes, proteins, and signaling pathways. While this is due in part to nonspecificity of action of the drug or cellular stress, the large-scale self-regulatory behavior of the cell may also be responsible, as this typically means that when a cell switches states, dozens or hundreds of genes will respond in concert. If many genes act collectively in the cell during state transitions, rather than every gene acting independently, models of the cell can be created that are comprehensive of the action of all genes, using existing data, provided that the functional units in the model are collections of genes. Techniques to develop these large-scale cellular-level models are provided in detail, along with methods of analyzing them, and a brief summary of major conclusions about large-scale cellular networks to date.
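A hedged sketch of the module-level idea: since genes respond in concert, cluster the expression time series first and infer a small network among cluster centroids rather than among thousands of individual genes (cluster count, lag, and threshold are illustrative choices, not the paper's):

```python
# Collapse genes into co-acting modules, then link modules by lagged
# correlation of their centroid time series.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
expression = rng.standard_normal((2000, 24))  # 2000 genes x 24 time points

# 1. Cluster genes into modules (the model's functional units).
km = KMeans(n_clusters=12, n_init=10, random_state=1).fit(expression)
centroids = km.cluster_centers_               # 12 module-level time series

# 2. Infer directed module-module links from lagged correlation.
def lagged_corr(a, b, lag=1):
    """Correlation between module a now and module b one step later."""
    return float(np.corrcoef(a[:-lag], b[lag:])[0, 1])

edges = [(i, j, lagged_corr(centroids[i], centroids[j]))
         for i in range(12) for j in range(12) if i != j]
network = [(i, j, w) for i, j, w in edges if abs(w) > 0.6]  # strong links
```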
Project Management Life Cycle Models to Improve Management in High-rise Construction
NASA Astrophysics Data System (ADS)
Burmistrov, Andrey; Siniavina, Maria; Iliashenko, Oksana
2018-03-01
The paper describes a possibility to improve project management in high-rise building construction through the use of various Project Management Life Cycle Models (PMLC models) based on traditional and agile project management approaches. Moreover, the paper describes how splitting a large-scale project into a "project chain" creates better manageability for large-scale building projects and increases the efficiency of the activities of all participants in such projects.
Effect of Carboxymethylation on the Rheological Properties of Hyaluronan
Wendling, Rian J.; Christensen, Amanda M.; Quast, Arthur D.; Atzet, Sarah K.; Mann, Brenda K.
2016-01-01
Chemical modifications made to hyaluronan to enable covalent crosslinking to form a hydrogel or to attach other molecules may also alter its physical properties, which have physiological importance. Here we created carboxymethyl hyaluronan (CMHA) with varied degrees of modification and investigated the effect on the viscosity of CMHA solutions. Viscosity decreased initially as modification increased, with a minimum viscosity at about 30–40% modification. This was followed by an increase in viscosity around 45–50% modification. The pH of the solution had a variable effect on viscosity, depending on the degree of carboxymethyl modification and buffer. The presence of phosphates in the buffer led to decreased viscosity. We also compared large-scale production lots of CMHA to lab-scale lots and found that large-scale production required extended reaction times to achieve the same degree of modification. Finally, thiolated CMHA was disulfide-crosslinked to create hydrogels with increased viscosity and shear-thinning behavior compared to CMHA solutions. PMID:27611817
Thompson, A J; Marks, L H; Goudie, M J; Rojas-Pena, A; Handa, H; Potkay, J A
2017-03-01
Artificial lungs have been used in the clinic for multiple decades to supplement patient pulmonary function. Recently, small-scale microfluidic artificial lungs (μAL) have been demonstrated with large surface-area-to-blood-volume ratios, biomimetic blood flow paths, and pressure drops compatible with pumpless operation. Initial small-scale microfluidic devices with blood flow rates in the μl/min to ml/min range have exhibited excellent gas transfer efficiencies; however, current manufacturing techniques may not be suitable for scaling up to human applications. Here, we present a new manufacturing technology for a microfluidic artificial lung in which the structure is assembled via a continuous "rolling" and bonding procedure from a single, patterned layer of polydimethylsiloxane (PDMS). This method is demonstrated in a small-scale four-layer device, but is expected to easily scale to larger area devices. The presented devices have a biomimetic branching blood flow network, 10 μm tall artificial capillaries, and a 66 μm thick gas transfer membrane. Gas transfer efficiency in blood was evaluated over a range of blood flow rates (0.1-1.25 ml/min) for two different sweep gases (pure O2, atmospheric air). The achieved gas transfer data closely follow predicted theoretical values for oxygenation and CO2 removal, while pressure drop is marginally higher than predicted. This work is the first step in developing a scalable method for creating large-area microfluidic artificial lungs. Although designed for microfluidic artificial lungs, the presented technique is expected to result in the first manufacturing method capable of simply and easily creating large-area microfluidic devices from PDMS.
Forced Alignment for Understudied Language Varieties: Testing Prosodylab-Aligner with Tongan Data
ERIC Educational Resources Information Center
Johnson, Lisa M.; Di Paolo, Marianna; Bell, Adrian
2018-01-01
Automated alignment of transcriptions to audio files expedites the process of preparing data for acoustic analysis. Unfortunately, the benefits of auto-alignment have generally been available only to researchers studying majority languages, for which large corpora exist and for which acoustic models have been created by large-scale research…
Recent developments in large-scale ozone generation with dielectric barrier discharges
NASA Astrophysics Data System (ADS)
Lopez, Jose L.
2014-10-01
Large-scale ozone generation for industrial applications has been entirely based on microplasmas or microdischarges created using dielectric barrier discharge (DBD) reactors. Although versions of DBD-generated ozone have been in continuous use for over a hundred years, especially in water treatment, recent changes in environmental awareness and sustainability have led to a surge of ozone-generating facilities throughout the world. As a result of this enhanced global usage of this environmental cleaning application, various new discoveries have emerged in the science and technology of ozone generation. This presentation will describe some of the most recent breakthrough developments in large-scale ozone generation while further addressing some of the current scientific and engineering challenges of this technology.
Building and measuring a high performance network architecture
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kramer, William T.C.; Toole, Timothy; Fisher, Chuck
2001-04-20
Once a year, the SC conferences present a unique opportunity to create and build one of the most complex and highest performance networks in the world. At SC2000, large-scale and complex local and wide area networking connections were demonstrated, including large-scale distributed applications running on different architectures. This project was designed to use the unique opportunity presented at SC2000 to create a testbed network environment and then use that network to demonstrate and evaluate high performance computational and communication applications. This testbed was designed to incorporate many interoperable systems and services and was designed for measurement from the very beginning. The end results were key insights into how to use novel, high performance networking technologies and to accumulate measurements that will give insights into the networks of the future.
Managing landscapes at multiple scales for sustainability of ecosystem functions (Preface)
R.A. Birdsey; R. Lucas; Y. Pan; G. Sun; E.J. Gustafson; A.H. Perera
2010-01-01
The science of landscape ecology is a rapidly evolving academic field with an emphasis on studying large-scale spatial heterogeneity created by natural influences and human activities. These advances have important implications for managing and conserving natural resources. At a September 2008 IUFRO conference in Chengdu, Sichuan, P.R. China, we highlighted both the...
Large-scale neuromorphic computing systems
NASA Astrophysics Data System (ADS)
Furber, Steve
2016-10-01
Neuromorphic computing covers a diverse range of approaches to information processing all of which demonstrate some degree of neurobiological inspiration that differentiates them from mainstream conventional computing systems. The philosophy behind neuromorphic computing has its origins in the seminal work carried out by Carver Mead at Caltech in the late 1980s. This early work influenced others to carry developments forward, and advances in VLSI technology supported steady growth in the scale and capability of neuromorphic devices. Recently, a number of large-scale neuromorphic projects have emerged, taking the approach to unprecedented scales and capabilities. These large-scale projects are associated with major new funding initiatives for brain-related research, creating a sense that the time and circumstances are right for progress in our understanding of information processing in the brain. In this review we present a brief history of neuromorphic engineering then focus on some of the principal current large-scale projects, their main features, how their approaches are complementary and distinct, their advantages and drawbacks, and highlight the sorts of capabilities that each can deliver to neural modellers.
ORIGAMI Automator Primer. Automated ORIGEN Source Terms and Spent Fuel Storage Pool Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wieselquist, William A.; Thompson, Adam B.; Bowman, Stephen M.
2016-04-01
Source terms and spent nuclear fuel (SNF) storage pool decay heat load analyses for operating nuclear power plants require a large number of Oak Ridge Isotope Generation and Depletion (ORIGEN) calculations. SNF source term calculations also require a significant amount of bookkeeping to track quantities such as core and assembly operating histories, spent fuel pool (SFP) residence times, heavy metal masses, and enrichments. The ORIGEN Assembly Isotopics (ORIGAMI) module in the SCALE code system provides a simple scheme for entering these data. However, given the large scope of the analysis, extensive scripting is necessary to convert formats and process data to create thousands of ORIGAMI input files (one per assembly) and to process the results into formats readily usable by follow-on analysis tools. This primer describes a project within the SCALE Fulcrum graphical user interface (GUI) called ORIGAMI Automator that was developed to automate the scripting and bookkeeping in large-scale source term analyses. The ORIGAMI Automator enables the analyst to (1) easily create, view, and edit the reactor site and assembly information, (2) automatically create and run ORIGAMI inputs, and (3) analyze the results from ORIGAMI. ORIGAMI Automator uses the standard ORIGEN binary concentrations files produced by ORIGAMI, with concentrations available at all time points in each assembly's life. The GUI plots results such as mass, concentration, activity, and decay heat using a powerful new ORIGEN Post-Processing Utility for SCALE (OPUS) GUI component. This document includes a description and user guide for the GUI, a step-by-step tutorial for a simplified scenario, and appendices that document the file structures used.
State of the Art in Large-Scale Soil Moisture Monitoring
NASA Technical Reports Server (NTRS)
Ochsner, Tyson E.; Cosh, Michael Harold; Cuenca, Richard H.; Dorigo, Wouter; Draper, Clara S.; Hagimoto, Yutaka; Kerr, Yan H.; Larson, Kristine M.; Njoku, Eni Gerald; Small, Eric E.;
2013-01-01
Soil moisture is an essential climate variable influencing land-atmosphere interactions, an essential hydrologic variable impacting rainfall-runoff processes, an essential ecological variable regulating net ecosystem exchange, and an essential agricultural variable constraining food security. Large-scale soil moisture monitoring has advanced in recent years, creating opportunities to transform scientific understanding of soil moisture and related processes. These advances are being driven by researchers from a broad range of disciplines, but this complicates collaboration and communication. For some applications, the science required to utilize large-scale soil moisture data is poorly developed. In this review, we describe the state of the art in large-scale soil moisture monitoring and identify some critical needs for research to optimize the use of increasingly available soil moisture data. We review representative examples of 1) emerging in situ and proximal sensing techniques, 2) dedicated soil moisture remote sensing missions, 3) soil moisture monitoring networks, and 4) applications of large-scale soil moisture measurements. Significant near-term progress seems possible in the use of large-scale soil moisture data for drought monitoring. Assimilation of soil moisture data for meteorological or hydrologic forecasting also shows promise, but significant challenges related to model structures and model errors remain. Little progress has been made yet in the use of large-scale soil moisture observations within the context of ecological or agricultural modeling. Opportunities abound to advance the science and practice of large-scale soil moisture monitoring for the sake of improved Earth system monitoring, modeling, and forecasting.
The Convergence of High Performance Computing and Large Scale Data Analytics
NASA Astrophysics Data System (ADS)
Duffy, D.; Bowen, M. K.; Thompson, J. H.; Yang, C. P.; Hu, F.; Wills, B.
2015-12-01
As the combinations of remote sensing observations and model outputs have grown, scientists are increasingly burdened with both the necessity and complexity of large-scale data analysis. Scientists are increasingly applying traditional high performance computing (HPC) solutions to solve their "Big Data" problems. While this approach has the benefit of limiting data movement, the HPC system is not optimized to run analytics, which can create problems that permeate throughout the HPC environment. To solve these issues and to alleviate some of the strain on the HPC environment, the NASA Center for Climate Simulation (NCCS) has created the Advanced Data Analytics Platform (ADAPT), which combines both HPC and cloud technologies to create an agile system designed for analytics. Large, commonly used data sets are stored in this system in a write once/read many file system, such as Landsat, MODIS, MERRA, and NGA. High performance virtual machines are deployed and scaled according to the individual scientist's requirements specifically for data analysis. On the software side, the NCCS and GMU are working with emerging commercial technologies and applying them to structured, binary scientific data in order to expose the data in new ways. Native NetCDF data is being stored within a Hadoop Distributed File System (HDFS) enabling storage-proximal processing through MapReduce while continuing to provide accessibility of the data to traditional applications. Once the data is stored within HDFS, an additional indexing scheme is built on top of the data and placed into a relational database. This spatiotemporal index enables extremely fast mappings of queries to data locations to dramatically speed up analytics. These are some of the first steps toward a single unified platform that optimizes for both HPC and large-scale data analysis, and this presentation will elucidate the resulting and necessary exascale architectures required for future systems.
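A hedged sketch of the spatiotemporal indexing idea: a small relational table mapping (variable, time range, bounding box) to block locations, so queries route straight to the right bytes (the schema and values are illustrative, not the NCCS implementation):

```python
# Toy spatiotemporal index over data blocks, held in a relational store.
import sqlite3

conn = sqlite3.connect("st_index.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS chunk_index (
        variable    TEXT,
        time_start  TEXT,
        time_end    TEXT,
        lat_min REAL, lat_max REAL,
        lon_min REAL, lon_max REAL,
        hdfs_path   TEXT,
        byte_offset INTEGER,
        byte_length INTEGER
    )""")
conn.execute(
    "INSERT INTO chunk_index VALUES (?,?,?,?,?,?,?,?,?,?)",
    ("T2M", "1980-01-01", "1980-01-31", -90, 90, -180, 180,
     "/data/merra/MERRA_1980_01.nc", 4096, 1048576))  # illustrative entry
conn.commit()

# Which blocks cover a point and date of interest?
rows = conn.execute("""
    SELECT hdfs_path, byte_offset, byte_length FROM chunk_index
    WHERE variable = ? AND time_start <= ? AND time_end >= ?
      AND lat_min <= ? AND lat_max >= ? AND lon_min <= ? AND lon_max >= ?""",
    ("T2M", "1980-01-15", "1980-01-15", 38.9, 38.9, -77.0, -77.0)).fetchall()
```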
Choosing the best partition of the output from a large-scale simulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Challacombe, Chelsea Jordan; Casleton, Emily Michele
Data partitioning becomes necessary when a large-scale simulation produces more data than can be feasibly stored. The goal is to partition the data, typically so that every element belongs to one and only one partition, and store summary information about the partition, either a representative value plus an estimate of the error or a distribution. Once the partitions are determined and the summary information stored, the raw data is discarded. This process can be performed in-situ, meaning while the simulation is running. When creating the partitions there are many decisions that researchers must make. For instance, how to determine when an adequate number of partitions has been created, how the partitions are created with respect to dividing the data, or how many variables should be considered simultaneously. In addition, decisions must be made for how to summarize the information within each partition. Because of the combinatorial number of possible ways to partition and summarize the data, a method of comparing the different possibilities will help guide researchers into choosing a good partitioning and summarization scheme for their application.
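A concrete toy version of the partition-and-summarize workflow (the partition shape and summary statistics are exactly the free choices the abstract discusses; this sketch fixes them arbitrarily):

```python
# Tile a 2-D field into blocks; keep a representative value plus an error
# estimate per block, then discard the raw data.
import numpy as np

def summarize_partitions(field, block=(64, 64)):
    """Store (mean, standard error) for each non-overlapping tile."""
    rows = field.shape[0] // block[0]
    cols = field.shape[1] // block[1]
    summary = np.empty((rows, cols, 2))
    for r in range(rows):
        for c in range(cols):
            tile = field[r*block[0]:(r+1)*block[0],
                         c*block[1]:(c+1)*block[1]]
            summary[r, c, 0] = tile.mean()
            summary[r, c, 1] = tile.std(ddof=1) / np.sqrt(tile.size)
    return summary  # the raw `field` can now be discarded

field = np.random.default_rng(2).standard_normal((1024, 1024))
s = summarize_partitions(field)
print(s.shape)  # (16, 16, 2): 512 stored values vs 1048576 raw, ~2000x less
```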
A Novel Multi-scale Simulation Strategy for Turbulent Reacting Flows
DOE Office of Scientific and Technical Information (OSTI.GOV)
James, Sutherland C.
In this project, a new methodology was proposed to bridge the gap between Direct Numerical Simulation (DNS) and Large Eddy Simulation (LES). This novel methodology, titled Lattice-Based Multiscale Simulation (LBMS), creates a lattice structure of One-Dimensional Turbulence (ODT) models. ODT has been shown to capture turbulent combustion with high fidelity by fully resolving interactions between turbulence and diffusion. By creating a lattice of ODT models, which are then coupled, LBMS overcomes the principal shortcoming of ODT, namely its inability to capture large-scale three-dimensional flow structures. However, by spacing these lattices significantly apart, LBMS can avoid the curse of dimensionality that creates untenable computational costs associated with DNS. This project has shown that LBMS is capable of reproducing statistics of isotropic turbulent flows while coarsening the spacing between lines significantly. It also investigates and resolves issues that arise when coupling ODT lines, such as flux reconstruction perpendicular to a given ODT line, preservation of conserved quantities when eddies cross a coarse cell volume, and boundary condition application.
Coupling large scale hydrologic-reservoir-hydraulic models for impact studies in data sparse regions
NASA Astrophysics Data System (ADS)
O'Loughlin, Fiachra; Neal, Jeff; Wagener, Thorsten; Bates, Paul; Freer, Jim; Woods, Ross; Pianosi, Francesca; Sheffied, Justin
2017-04-01
As hydraulic modelling moves to increasingly large spatial domains, it has become essential to take reservoirs and their operations into account. Large-scale hydrological models have included reservoirs for at least the past two decades, yet they cannot explicitly model the variations in spatial extent of reservoirs, and many reservoir operations in hydrological models are not undertaken during run time. This requires a hydraulic model, yet to date no continental-scale hydraulic model has directly simulated reservoirs and their operations. In addition to the need to include reservoirs and their operations in hydraulic models as they move to global coverage, there is also a need to link such models to large-scale hydrology models or land surface schemes. This is especially true for Africa, where the number of river gauges has consistently declined since the middle of the twentieth century. In this study we address these two major issues by developing: 1) a coupling methodology for the VIC large-scale hydrological model and the LISFLOOD-FP hydraulic model, and 2) a reservoir module for the LISFLOOD-FP model, which currently includes four sets of reservoir operating rules taken from the major large-scale hydrological models. The Volta Basin, West Africa, was chosen to demonstrate the capability of the modelling framework as it is a large river basin (~400,000 km2) and contains the largest man-made lake in terms of area (8,482 km2), Lake Volta, created by the Akosombo dam. Lake Volta also experiences a seasonal variation in water levels of between two and six metres that creates a dynamic shoreline. In this study, we first run our coupled VIC and LISFLOOD-FP model without explicitly modelling Lake Volta and then compare these results with those from model runs where the dam operations and Lake Volta are included. The results show that we are able to obtain the variation in Lake Volta water levels and that including the dam operations and Lake Volta has significant impacts on water levels across the domain.
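A hedged sketch of the kind of storage-based operating rule large-scale hydrological models embed (after Hanasaki-style rules; the coefficients and volumes are illustrative, not the calibrated Akosombo operations):

```python
# Simple storage-ratio release rule with mass balance and spill.
def reservoir_release(storage, capacity, inflow, mean_inflow, alpha=0.85):
    """Hanasaki-style rule for a non-irrigation reservoir: release the
    long-term mean inflow, scaled by how full the reservoir is."""
    krls = storage / (alpha * capacity)        # storage ratio
    release = krls * mean_inflow
    storage_new = storage + inflow - release   # mass balance
    spill = max(0.0, storage_new - capacity)   # spill anything over capacity
    return release + spill, min(storage_new, capacity)

# Illustrative monthly step at roughly Lake Volta scale (volumes in m3):
storage, capacity = 100.0e9, 148.0e9
release, storage = reservoir_release(storage, capacity,
                                     inflow=3.0e9, mean_inflow=3.2e9)
```

Coupling such a rule to the hydraulic model lets the simulated release, and hence the lake level and shoreline, evolve during run time rather than being prescribed.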
EPA Facilities and Regional Boundaries Service, US, 2012, US EPA, SEGS
This SEGS web service contains EPA facilities, EPA facilities labels, small- and large-scale versions of EPA region boundaries, and EPA region boundaries extended to the 200nm Exclusive Economic Zone (EEZ). Small-scale EPA boundaries and boundaries extended to the EEZ render at scales of less than 1:5 million; large-scale EPA boundaries draw at scales of 1:5 million or greater. EPA facilities labels draw at scales greater than 1:2 million. Data used to create this web service are available as a separate download at the Secondary Linkage listed above. Full FGDC metadata records for each layer may be found by clicking the layer name in the web service table of contents (available through the online link provided above) and viewing the layer description. This SEGS dataset was produced by EPA through the Office of Environmental Information.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Srivastava, A. K.; Goossens, M.
2013-11-01
We present rare observational evidence of vertical kink oscillations in a laminar and diffused large-scale plasma curtain as observed by the Atmospheric Imaging Assembly on board the Solar Dynamics Observatory. The X6.9-class flare in active region 11263 on 2011 August 9 induces a global large-scale disturbance that propagates in a narrow lane above the plasma curtain and creates a low density region that appears as a dimming in the observational image data. This large-scale propagating disturbance acts as a non-periodic driver that interacts asymmetrically and obliquely with the top of the plasma curtain and triggers the observed oscillations. In the deeper layers of the curtain, we find evidence of vertical kink oscillations with two periods (795 s and 530 s). On the magnetic surface of the curtain where the density is inhomogeneous due to coronal dimming, non-decaying vertical oscillations are also observed (period ≈ 763-896 s). We infer that the global large-scale disturbance triggers vertical kink oscillations in the deeper layers as well as on the surface of the large-scale plasma curtain. The properties of the excited waves strongly depend on the local plasma and magnetic field conditions.
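For context, the standard thin-tube coronal seismology estimate (a generic result, not fitted to this event) relates such periods to the structure's length and kink speed:

```latex
% Fundamental kink period of a magnetic structure of length L:
P = \frac{2L}{c_k}, \qquad
c_k = v_{Ai}\,\sqrt{\frac{2}{1 + \rho_e/\rho_i}}
```

Here v_Ai is the internal Alfvén speed and ρ_e/ρ_i the external-to-internal density ratio, so the two observed periods constrain combinations of length, field strength, and density contrast in the curtain.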
Creating Cultural Consumers: The Dynamics of Cultural Capital Acquisition
ERIC Educational Resources Information Center
Kisida, Brian; Greene, Jay P.; Bowen, Daniel H.
2014-01-01
The theories of cultural reproduction and cultural mobility have largely shaped the study of the effects of cultural capital on academic outcomes. Missing in this debate has been a rigorous examination of how children actually acquire cultural capital when it is not provided by their families. Drawing on data from a large-scale experimental study…
Academic-industrial partnerships in drug discovery in the age of genomics.
Harris, Tim; Papadopoulos, Stelios; Goldstein, David B
2015-06-01
Many US FDA-approved drugs have been developed through productive interactions between the biotechnology industry and academia. Technological breakthroughs in genomics, in particular large-scale sequencing of human genomes, is creating new opportunities to understand the biology of disease and to identify high-value targets relevant to a broad range of disorders. However, the scale of the work required to appropriately analyze large genomic and clinical data sets is challenging industry to develop a broader view of what areas of work constitute precompetitive research. Copyright © 2015 Elsevier Ltd. All rights reserved.
Using the Partial Credit Model to Evaluate the Student Engagement in Mathematics Scale
ERIC Educational Resources Information Center
Leis, Micela; Schmidt, Karen M.; Rimm-Kaufman, Sara E.
2015-01-01
The Student Engagement in Mathematics Scale (SEMS) is a self-report measure that was created to assess three dimensions of student engagement (social, emotional, and cognitive) in mathematics based on a single day of class. In the current study, the SEMS was administered to a sample of 360 fifth graders from a large Mid-Atlantic district. The…
ERIC Educational Resources Information Center
Binfet, John Tyler; Gadermann, Anne M.; Schonert-Reichl, Kimberly A.
2016-01-01
In this study, we sought to create and validate a brief measure to assess students' perceptions of kindness in school. Participants included 1,753 students in Grades 4 to 8 attending public schools in a large school district in southern British Columbia. The School Kindness Scale (SKS) demonstrated a unidimensional factor structure and adequate…
High performance cellular level agent-based simulation with FLAME for the GPU.
Richmond, Paul; Walker, Dawn; Coakley, Simon; Romano, Daniela
2010-05-01
Driven by the availability of experimental data and the ability to simulate a biological scale of immediate interest, the cellular scale is fast emerging as an ideal candidate for middle-out modelling. As with 'bottom-up' simulation approaches, cellular level simulations demand a high degree of computational power, which in large-scale simulations can only be achieved through parallel computing. The flexible large-scale agent modelling environment (FLAME) is a template-driven framework for agent-based modelling (ABM) on parallel architectures, ideally suited to the simulation of cellular systems. It is available for both high performance computing clusters (www.flame.ac.uk) and GPU hardware (www.flamegpu.com) and uses a formal specification technique that acts as a universal modelling format. This not only creates an abstraction from the underlying hardware architectures, but avoids the steep learning curve associated with programming them. In benchmarking tests and simulations of advanced cellular systems, FLAME GPU has demonstrated massive improvements in performance over more traditional ABM frameworks. This allows the time spent in the development and testing stages of modelling to be drastically reduced and creates the possibility of real-time visualisation for simple visual face-validation.
Global Carbon Dioxide Transport from AIRS Data, July 2009
2009-11-09
Created with data acquired by JPL's Atmospheric Infrared Sounder instrument during July 2009, this image shows large-scale patterns of carbon dioxide concentrations that are transported around Earth by the general circulation of the atmosphere.
Policy Driven Development: Flexible Policy Insertion for Large Scale Systems.
Demchak, Barry; Krüger, Ingolf
2012-07-01
The success of a software system depends critically on how well it reflects and adapts to stakeholder requirements. Traditional development methods often frustrate stakeholders by creating long latencies between requirement articulation and system deployment, especially in large-scale systems. One source of latency is the maintenance of policy decisions encoded directly into system workflows at development time, including those involving access control and feature set selection. We created the Policy Driven Development (PDD) methodology to address these development latencies by enabling the flexible injection of decision points into existing workflows at runtime, thus allowing policy composition that integrates requirements furnished by multiple, oblivious stakeholder groups. Using PDD, we designed and implemented a production cyberinfrastructure that demonstrates policy and workflow injection that quickly implements stakeholder requirements, including features not contemplated in the original system design. PDD provides a path to quickly and cost-effectively evolve such applications over a long lifetime.
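As a rough illustration of runtime decision-point injection (the class and API below are hypothetical, not the authors' PDD implementation), a workflow can expose named decision points whose policies are registered and composed at run time:

```python
from typing import Callable, Dict, List

class PolicyRegistry:
    """Toy decision-point registry: stakeholder policies are injected into
    named points of an existing workflow at run time and composed with AND
    semantics, so oblivious stakeholder groups can each add constraints."""

    def __init__(self) -> None:
        self._policies: Dict[str, List[Callable[[dict], bool]]] = {}

    def inject(self, point: str, policy: Callable[[dict], bool]) -> None:
        # Register a new policy without redeploying the workflow.
        self._policies.setdefault(point, []).append(policy)

    def decide(self, point: str, context: dict) -> bool:
        # An empty policy list defaults to allow (all() over nothing is True).
        return all(p(context) for p in self._policies.get(point, []))

registry = PolicyRegistry()
registry.inject("upload", lambda ctx: ctx.get("role") == "researcher")
print(registry.decide("upload", {"role": "researcher"}))  # True
```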
Porous microwells for geometry-selective, large-scale microparticle arrays
NASA Astrophysics Data System (ADS)
Kim, Jae Jung; Bong, Ki Wan; Reátegui, Eduardo; Irimia, Daniel; Doyle, Patrick S.
2017-01-01
Large-scale microparticle arrays (LSMAs) are key for material science and bioengineering applications. However, previous approaches suffer from trade-offs between scalability, precision, specificity and versatility. Here, we present a porous microwell-based approach to create large-scale microparticle arrays with complex motifs. Microparticles are guided to and pushed into microwells by fluid flow through small open pores at the bottom of the porous well arrays. A scaling theory allows for the rational design of LSMAs to sort and array particles on the basis of their size, shape, or modulus. Sequential particle assembly allows for proximal and nested particle arrangements, as well as particle recollection and pattern transfer. We demonstrate the capabilities of the approach by means of three applications: high-throughput single-cell arrays; microenvironment fabrication for neutrophil chemotaxis; and complex, covert tags by the transfer of an upconversion nanocrystal-laden LSMA.
Large-scale self-assembled zirconium phosphate smectic layers via a simple spray-coating process
NASA Astrophysics Data System (ADS)
Wong, Minhao; Ishige, Ryohei; White, Kevin L.; Li, Peng; Kim, Daehak; Krishnamoorti, Ramanan; Gunther, Robert; Higuchi, Takeshi; Jinnai, Hiroshi; Takahara, Atsushi; Nishimura, Riichi; Sue, Hung-Jue
2014-04-01
The large-scale assembly of asymmetric colloidal particles is used in creating high-performance fibres. A similar concept is extended to the manufacturing of thin films of self-assembled two-dimensional crystal-type materials with enhanced and tunable properties. Here we present a spray-coating method to manufacture thin, flexible and transparent epoxy films containing zirconium phosphate nanoplatelets self-assembled into a lamellar arrangement aligned parallel to the substrate. The self-assembled mesophase of zirconium phosphate nanoplatelets is stabilized by epoxy pre-polymer and exhibits rheology favourable towards large-scale manufacturing. The thermally cured film forms a mechanically robust coating and shows excellent gas barrier properties at both low- and high humidity levels as a result of the highly aligned and overlapping arrangement of nanoplatelets. This work shows that the large-scale ordering of high aspect ratio nanoplatelets is easier to achieve than previously thought and may have implications in the technological applications for similar materials.
Content Is King: Databases Preserve the Collective Information of Science.
Yates, John R
2018-04-01
Databases store sequence information experimentally gathered to create resources that further science. In the last 20 years databases have become critical components of fields like proteomics where they provide the basis for large-scale and high-throughput proteomic informatics. Amos Bairoch, winner of the Association of Biomolecular Resource Facilities Frederick Sanger Award, has created some of the important databases proteomic research depends upon for accurate interpretation of data.
Peace Operations in Mali: Theory into Practice Then Measuring Effectiveness
2017-06-09
community's response along two broad lines of effort (LOE): Creating a Safe and Secure Environment and promoting Stable Governance. When seeking to achieve a Safe and Secure Environment, two objectives were measured. Objective #1 sought the Cessation of Large Scale Violence. Success was attained, as...
Global Carbon Dioxide Transport from AIRS Data, July 2008
2008-09-24
This image was created with data acquired by JPL's Atmospheric Infrared Sounder during July 2008. The image shows large-scale patterns of carbon dioxide concentrations that are transported around the Earth by the general circulation of the atmosphere.
Imaging spectroscopy links aspen genotype with below-ground processes at landscape scales
Madritch, Michael D.; Kingdon, Clayton C.; Singh, Aditya; Mock, Karen E.; Lindroth, Richard L.; Townsend, Philip A.
2014-01-01
Fine-scale biodiversity is increasingly recognized as important to ecosystem-level processes. Remote sensing technologies have great potential to estimate both biodiversity and ecosystem function over large spatial scales. Here, we demonstrate the capacity of imaging spectroscopy to discriminate among genotypes of Populus tremuloides (trembling aspen), one of the most genetically diverse and widespread forest species in North America. We combine imaging spectroscopy (AVIRIS) data with genetic, phytochemical, microbial and biogeochemical data to determine how intraspecific plant genetic variation influences below-ground processes at landscape scales. We demonstrate that both canopy chemistry and below-ground processes vary over large spatial scales (continental) according to aspen genotype. Imaging spectrometer data distinguish aspen genotypes through variation in canopy spectral signature. In addition, foliar spectral variation correlates well with variation in canopy chemistry, especially condensed tannins. Variation in aspen canopy chemistry, in turn, is correlated with variation in below-ground processes. Variation in spectra also correlates well with variation in soil traits. These findings indicate that forest tree species can create spatial mosaics of ecosystem functioning across large spatial scales and that these patterns can be quantified via remote sensing techniques. Moreover, they demonstrate the utility of using optical properties as proxies for fine-scale measurements of biodiversity over large spatial scales. PMID:24733949
Zhang, Panpan; Huang, Ying; Lu, Xin; Zhang, Siyu; Li, Jingfeng; Wei, Gang; Su, Zhiqiang
2014-07-29
We demonstrated a facile one-step synthesis strategy for the preparation of a large-scale reduced graphene oxide multilayered film doped with gold nanoparticles (RGO/AuNP film) and applied this film as a functional nanomaterial for electrochemistry and Raman detection applications. The related applications of the fabricated RGO/AuNP film in an electrochemical nonenzymatic H2O2 biosensor, the electrochemical oxygen reduction reaction (ORR), and surface-enhanced Raman scattering (SERS) detection were investigated. Electrochemical data indicate that the H2O2 biosensor fabricated with the RGO/AuNP film shows a wide linear range, a low limit of detection, high selectivity, and long-term stability. In addition, it was proved that the created RGO/AuNP film also exhibits excellent ORR electrochemical catalysis performance. The created RGO/AuNP film, when serving as a SERS biodetection platform, presents outstanding performance in detecting 4-aminothiophenol, with an enhancement factor of approximately 5.6 × 10^5, as well as in sensing 2-thiouracil at concentrations down to 1 μM. It is expected that this facile strategy for fabricating large-scale graphene films doped with metallic nanoparticles will spark inspiration for preparing functional nanomaterials and further extend their applications in drug delivery, wastewater purification, and bioenergy.
Impact phenomena as factors in the evolution of the Earth
NASA Technical Reports Server (NTRS)
Grieve, R. A. F.; Parmentier, E. M.
1984-01-01
It is estimated that 30 to 200 large impact basins could have been formed on the early Earth. These large impacts may have resulted in extensive volcanism and enhanced endogenic geologic activity over large areas. Initial modelling of the thermal and subsidence history of large terrestrial basins indicates that they created geologic and thermal anomalies which lasted for geologically significant times. The role of large-scale impact in the biological evolution of the Earth has been highlighted by the discovery of siderophile anomalies at the Cretaceous-Tertiary boundary and associated with North American microtektites. Although in neither case has an associated crater been identified, the observations are consistent with the deposition of projectile-contaminated high-speed ejecta from major impact events. Consideration of impact processes reveals a number of mechanisms by which large-scale impact may induce extinctions.
Arciniega, Luis M; González, Luis; Soares, Vítor; Ciulli, Stefania; Giannini, Marco
2009-11-01
The Work Values Scale EVAT (based on its initials in Spanish: Escala de Valores hacia el Trabajo) was created in 2000 to measure values in the work context. The instrument operationalizes the four higher-order-values of the Schwartz Theory (1992) through sixteen items focused on work scenarios. The questionnaire has been used among large samples of Mexican and Spanish individuals reporting adequate psychometric properties. The instrument has recently been translated into Portuguese and Italian, and subsequently used in a large-scale study with nurses in Portugal and in a sample of various occupations in Italy. The purpose of this research was to demonstrate the cross-cultural validity of the Work Values Scale EVAT in Spanish, Portuguese, and Italian. Our results suggest that the original Spanish version of the EVAT scale and the new Portuguese and Italian versions are equivalent.
Effects of Pre-Existing Target Structure on the Formation of Large Craters
NASA Technical Reports Server (NTRS)
Barnouin-Jha, O. S.; Cintala, M. J.; Crawford, D. A.
2003-01-01
The shapes of large-scale craters and the mechanics responsible for melt generation are influenced by broad and small-scale structures present in a target prior to impact. For example, well-developed systems of fractures often create craters that appear square in outline, good examples being Meteor Crater, AZ and the square craters of 433 Eros. Pre-broken target material also affects melt generation. Kieffer has shown how the shock wave generated in Coconino sandstone at Meteor crater created reverberations which, in combination with the natural target heterogeneity present, created peaks and troughs in pressure and compressed density as individual grains collided to produce a range of shock mineralogies and melts within neighboring samples. In this study, we further explore how pre-existing target structure influences various aspects of the cratering process. We combine experimental and numerical techniques to explore the connection between the scales of the impact generated shock wave and the pre-existing target structure. We focus on the propagation of shock waves in coarse, granular media, emphasizing its consequences on excavation, crater growth, ejecta production, cratering efficiency, melt generation, and crater shape. As a baseline, we present a first series of results for idealized targets where the particles are all identical in size and possess the same shock impedance. We will also present a few results, whereby we increase the complexities of the target properties by varying the grain size, strength, impedance and frictional properties. In addition, we investigate the origin and implications of reverberations that are created by the presence of physical and chemical heterogeneity in a target.
Large-Eddy Simulation of Wind-Plant Aerodynamics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Churchfield, M. J.; Lee, S.; Moriarty, P. J.
In this work, we present results of a large-eddy simulation of the 48 multi-megawatt turbines composing the Lillgrund wind plant. Turbulent inflow wind is created by performing an atmospheric boundary layer precursor simulation, and turbines are modeled using a rotating, variable-speed actuator line representation. The motivation for this work is that few others have done large-eddy simulations of wind plants with a substantial number of turbines, and the methods for carrying out the simulations are varied. We wish to draw upon the strengths of the existing simulations and our growing atmospheric large-eddy simulation capability to create a sound methodology for performing this type of simulation. We used the OpenFOAM CFD toolbox to create our solver. The simulated time-averaged power production of the turbines in the plant agrees well with field observations, except for the sixth turbine and beyond in each wind-aligned column. The power produced by each of those turbines is overpredicted by 25-40%. A direct comparison between simulated and field data is difficult because we simulate one wind direction with a speed and turbulence intensity characteristic of Lillgrund, but the field observations were taken over a year of varying conditions. The simulation shows the significant 60-70% decrease in the performance of the turbines behind the front row in this plant that has a spacing of 4.3 rotor diameters in this direction. The overall plant efficiency is well predicted. This work shows the importance of using local grid refinement to simultaneously capture the meter-scale details of the turbine wake and the kilometer-scale turbulent atmospheric structures. Although this work illustrates the power of large-eddy simulation in producing a time-accurate solution, it required about one million processor-hours, showing the significant cost of large-eddy simulation.
Weinstein, Daniel; Launay, Jacques; Pearce, Eiluned; Dunbar, Robin I. M.; Stewart, Lauren
2016-01-01
Over our evolutionary history, humans have faced the problem of how to create and maintain social bonds in progressively larger groups compared to those of our primate ancestors. Evidence from historical and anthropological records suggests that group music-making might act as a mechanism by which this large-scale social bonding could occur. While previous research has shown effects of music making on social bonds in small group contexts, the question of whether this effect ‘scales up’ to larger groups is particularly important when considering the potential role of music for large-scale social bonding. The current study recruited individuals from a community choir that met in both small (n = 20 – 80) and large (a ‘megachoir’ combining individuals from the smaller subchoirs; n = 232) group contexts. Participants gave self-report measures (via a survey) of social bonding and had pain threshold measurements taken (as a proxy for endorphin release) before and after 90 minutes of singing. Results showed that feelings of inclusion, connectivity, positive affect, and measures of endorphin release all increased across singing rehearsals and that the influence of group singing was comparable for pain thresholds in the large versus small group context. Levels of social closeness were found to be greater at pre- and post-levels for the small choir condition. However, the large choir condition experienced a greater change in social closeness as compared to the small condition. The finding that singing together fosters social closeness – even in large contexts where individuals are not known to each other – is consistent with evolutionary accounts that emphasize the role of music in social bonding, particularly in the context of creating larger cohesive groups than other primates are able to manage. PMID:27158219
NASA Astrophysics Data System (ADS)
Saksena, S.; Merwade, V.; Singhofen, P.
2017-12-01
There is an increasing global trend towards developing large-scale flood models that account for spatial heterogeneity at watershed scales to drive future flood-risk planning. Integrated surface water-groundwater modeling procedures can elucidate all the hydrologic processes taking part during a flood event and provide accurate flood outputs. Even though the advantages of using integrated modeling are widely acknowledged, the complexity of integrated process representation, the computation time, and the number of input parameters required have deterred its application to flood inundation mapping, especially for large watersheds. This study presents a faster approach for creating watershed-scale flood models using a hybrid design that breaks the watershed into multiple regions of variable spatial resolution by prioritizing higher-order streams. The methodology involves creating a hybrid model for the Upper Wabash River Basin in Indiana using Interconnected Channel and Pond Routing (ICPR) and comparing its performance with a fully integrated 2D hydrodynamic model. The hybrid approach involves simplification procedures such as 1D channel-2D floodplain coupling; hydrologic basin (HUC-12) integration with 2D groundwater for rainfall-runoff routing; and varying the spatial resolution of 2D overland flow based on stream order. The results for a 50-year return period storm event show that the hybrid model (NSE=0.87) performs similarly to the 2D integrated model (NSE=0.88) while the computational time is cut in half. The results suggest that significant computational efficiency can be obtained while maintaining model accuracy for large-scale flood models by using hybrid approaches for model creation.
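The NSE values quoted above are Nash-Sutcliffe efficiencies; a minimal computation of this standard skill score (not the authors' code) is:

```python
import numpy as np

def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit, 0 is no better than
    predicting the observed mean, and negative values are worse than that."""
    obs, sim = np.asarray(observed, float), np.asarray(simulated, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)
```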
A biological rationale for musical scales.
Gill, Kamraan Z; Purves, Dale
2009-12-03
Scales are collections of tones that divide octaves into specific intervals used to create music. Since humans can distinguish about 240 different pitches over an octave in the mid-range of hearing, in principle a very large number of tone combinations could have been used for this purpose. Nonetheless, compositions in Western classical, folk and popular music as well as in many other musical traditions are based on a relatively small number of scales that typically comprise only five to seven tones. Why humans employ only a few of the enormous number of possible tone combinations to create music is not known. Here we show that the component intervals of the most widely used scales throughout history and across cultures are those with the greatest overall spectral similarity to a harmonic series. These findings suggest that humans prefer tone combinations that reflect the spectral characteristics of conspecific vocalizations. The analysis also highlights the spectral similarity among the scales used by different cultures.
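For a dyad whose frequency ratio reduces to x:y in lowest terms, the percentage similarity of its combined spectrum to the harmonic series of the implied fundamental takes a compact closed form; the sketch below is our reading of the metric (consult the paper for the exact definition):

```python
from math import gcd

def dyad_similarity(x: int, y: int) -> float:
    """Percentage similarity of a dyad's spectrum to the harmonic series of
    its implied fundamental, in the form (x + y - 1)/(x*y) for a frequency
    ratio x:y reduced to lowest terms (our reading of the paper's metric)."""
    g = gcd(x, y)
    x, y = x // g, y // g
    return 100.0 * (x + y - 1) / (x * y)

# A perfect fifth (3:2) scores far higher than a tritone (45:32):
print(dyad_similarity(3, 2))    # ~66.7
print(dyad_similarity(45, 32))  # ~5.3
```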
A multidisciplinary approach to the development of low-cost high-performance lightwave networks
NASA Technical Reports Server (NTRS)
Maitan, Jacek; Harwit, Alex
1991-01-01
Our research focuses on high-speed distributed systems. We anticipate that our results will allow the fabrication of low-cost networks employing multi-gigabit-per-second data links for space and military applications. The recent development of high-speed, low-cost photonic components and new generations of microprocessors creates an opportunity to develop advanced large-scale distributed information systems. These systems currently involve hundreds of thousands of nodes and are made up of components and communications links that may fail during operation. In order to realize these systems, research is needed into technologies that foster adaptability and scalability. Self-organizing mechanisms are needed to integrate a working fabric of large-scale distributed systems. The challenge is to fuse theory, technology, and development methodologies to construct a cost-effective, efficient, large-scale system.
Full-color large-scaled computer-generated holograms using RGB color filters.
Tsuchiyama, Yasuhiro; Matsushima, Kyoji
2017-02-06
A technique using RGB color filters is proposed for creating high-quality full-color computer-generated holograms (CGHs). The fringe of these CGHs is composed of more than a billion pixels. The CGHs reconstruct full-parallax three-dimensional color images with a deep sensation of depth caused by natural motion parallax. The simulation technique as well as the principle and challenges of high-quality full-color reconstruction are presented to address the design of filter properties suitable for large-scaled CGHs. Optical reconstructions of actual fabricated full-color CGHs are demonstrated in order to verify the proposed techniques.
REVIEWS OF TOPICAL PROBLEMS: Large-scale star formation in galaxies
NASA Astrophysics Data System (ADS)
Efremov, Yurii N.; Chernin, Artur D.
2003-01-01
A brief review is given of the history of modern ideas on the ongoing star formation process in the gaseous disks of galaxies. Recent studies demonstrate the key role of the interplay between the gas self-gravitation and its turbulent motions. The large scale supersonic gas flows create structures of enhanced density which then give rise to the gravitational condensation of gas into stars and star clusters. Formation of star clusters, associations and complexes is considered, as well as the possibility of isolated star formation. Special emphasis is placed on star formation under the action of ram pressure.
Small-scale heterogeneity spectra in the Earth mantle resolved by PKP-ab,-bc and -df waves
NASA Astrophysics Data System (ADS)
Zheng, Y.
2016-12-01
Plate tectonics creates heterogeneities at mid-ocean ridges and subducts them back into the mantle at subduction zones. Heterogeneities manifest themselves through different densities and seismic wave speeds. The length scales and spatial distribution of the heterogeneities measure the mixing mechanism of plate tectonics. This information can be mathematically captured as the heterogeneity spatial Fourier spectrum. Since most heterogeneities created are on the order of tens of km, global seismic tomography is not able to resolve them directly. Here, we use seismic P-waves that transmit through the outer core (phases PKP-ab and PKP-bc) and through the inner core (PKP-df) to probe lower-mantle heterogeneities. The differential traveltimes (PKP-ab versus PKP-df; PKP-bc versus PKP-df) are sensitive to lower-mantle structures. We have collected more than 10,000 PKP phases recorded by the Japan Hi-Net short-period seismic network. We found that the lower mantle is filled with seismic heterogeneities at scales from 20 km to 200 km. The heterogeneity spectrum is similar to an exponential distribution but is more enriched in small-scale heterogeneities at the high-wavenumber end. The spectrum is "red", meaning large scales have more power, and the heterogeneities show a multiscale nature: small-scale heterogeneities are embedded in large-scale ones. These small-scale heterogeneities cannot be of thermal origin; they must be compositional. If all these heterogeneities were located in the D" layer, statistically, it would have a root-mean-square P-wave velocity fluctuation of 1% (i.e., -3% to 3%).
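As an illustration of what a "heterogeneity spatial Fourier spectrum" means in practice (a generic sketch, not the authors' PKP processing chain), the power spectrum of a 1-D velocity-fluctuation profile can be computed as:

```python
import numpy as np

def fluctuation_spectrum(dv, dx):
    """Power spectrum of a 1-D fractional velocity-fluctuation profile dv
    sampled every dx km -- the kind of spatial Fourier spectrum used to
    characterize mantle heterogeneity (illustrative only)."""
    dv = np.asarray(dv, float)
    dv = dv - dv.mean()                    # remove the mean before the FFT
    power = np.abs(np.fft.rfft(dv)) ** 2 / dv.size
    k = np.fft.rfftfreq(dv.size, d=dx)     # wavenumber in cycles per km
    return k, power
```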
A GENERAL SIMULATION MODEL FOR INFORMATION SYSTEMS: A REPORT ON A MODELLING CONCEPT
The report is concerned with the design of large-scale management information systems (MIS). A special design methodology was created, along with a design model to complement it. The purpose of the paper is to present the model.
ERIC Educational Resources Information Center
Abbott, George L.; And Others
1987-01-01
This special feature focuses on recent developments in optical disk technology. Nine articles discuss current trends, large scale image processing, data structures for optical disks, the use of computer simulators to create optical disks, videodisk use in training, interactive audio video systems, impacts on federal information policy, and…
Outsourcing: a managerial competency for the 21st century.
Shaffer, F A
2000-01-01
The widespread application of outsourcing has been fueled by the changing nature of the work contract between employers and employees. The large-scale corporate downsizing that began in the late 1980s inspired a trend away from employer loyalty. This fact, coupled with today's tight labor market, has created a "guerrilla" work force comprised of deal-hungry professionals conditioned to signing bonuses, stock options, and higher-than-scale salaries.
Bioimmobilization of uranium-practical tools for field applications
NASA Astrophysics Data System (ADS)
Istok, J. D.
2011-12-01
Extensive laboratory and field research has conclusively demonstrated that it is possible to stimulate indigenous microbial activity and create conditions favorable for the reductive precipitation of uranium from groundwater, reducing aqueous U concentrations below regulatory levels. A wide variety of complex and coupled biogeochemical processes have been identified and specific reaction mechanisms and parameters have been quantified for a variety of experimental systems including pure, mixed, and natural microbial cultures, and single mineral, artificial, and natural sediments, and groundwater aquifers at scales ranging from very small (10s nm) to very large (10s m). Multicomponent coupled reactive transport models have also been developed to simulate various aspects of this process in 3D heterogeneous environments. Nevertheless, full-scale application of reductive bioimmobilization of uranium (and other radionuclides and metals) remains problematical because of the technical and logistical difficulties in creating and maintaining reducing environment in the many large U contaminated groundwater aquifers currently under aerobic and oxidizing conditions and often containing high concentrations of competing and more energetically favorable electron acceptors (esp. nitrate). This talk will discuss how simple tools, including small-scale in situ testing and geochemical reaction path modeling, can be used to quickly assess the feasibility of applying bioimmobilization to remediate U contaminated groundwater aquifers and provide data needed for full-scale design.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Keshner, M. S.; Arya, R.
2004-10-01
Hewlett Packard has created a design for a "Solar City" factory that will process 30 million sq. meters of glass panels per year and produce 2.1-3.6 GW of solar panels per year, 100x the volume of a typical thin-film solar panel manufacturer in 2004. We have shown that, with a reasonable selection of materials and conservative assumptions, this "Solar City" can produce solar panels and hit the price target of $1.00 per peak watt (6.5x-8.5x lower than prices in 2004) as the total price for a complete and installed rooftop (or ground-mounted) solar energy system. This breakthrough in the price of solar energy comes without the need for any significant new invention. It comes entirely from the manufacturing scale of a large plant and the cost savings inherent in operating at such a large manufacturing scale. We expect that further optimizations of these simple designs will lead to further improvements in cost. The manufacturing process and cost depend on the choice of the active layer that converts sunlight into electricity. The efficiency with which sunlight is converted into electricity can range from 7% to 15%; this parameter has a large effect on the overall price per watt. There are other impacts as well, and we have attempted to capture them without creating undue distractions. Our primary purpose is to demonstrate the impact of large-scale manufacturing. This impact is largely independent of the choice of active layer. It is not our purpose to compare the pros and cons of various types of active layers. Significant improvements in cost per watt can also come from scientific advances in active layers that lead to higher efficiency. But, again, our focus is on manufacturing gains and not on potential advances in the basic technology.
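The quoted output range follows from simple arithmetic, assuming the standard 1000 W/m^2 peak-sun rating used for panel wattage (the efficiency endpoints below are chosen to reproduce the 2.1-3.6 GW figure):

```python
# Back-of-envelope check of the plant's quoted output range.
area_m2 = 30e6                       # glass processed per year
for eff in (0.07, 0.12):             # efficiencies within the 7-15% range
    print(f"{eff:.0%}: {area_m2 * 1000 * eff / 1e9:.1f} GW per year")
# 7%: 2.1 GW per year; 12%: 3.6 GW per year
```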
2016-04-30
renewable energy projects with a focus on novel onshore/offshore and small/large scale wind turbine designs for expanding their operational range and...ROA to estimate the values of maintenance options created by the implementation of PHM in wind turbines. When an RUL is predicted for a subsystem...predicted for the system. The section titled Example—Wind Turbine With an Outcome-Based Contract presents a case study for a PHM-enabled wind
NASA Astrophysics Data System (ADS)
Pan, Zhenying; Yu, Ye Feng; Valuckas, Vytautas; Yap, Sherry L. K.; Vienne, Guillaume G.; Kuznetsov, Arseniy I.
2018-05-01
Cheap large-scale fabrication of ordered nanostructures is important for multiple applications in photonics and biomedicine including optical filters, solar cells, plasmonic biosensors, and DNA sequencing. Existing methods are either expensive or have strict limitations on the feature size and fabrication complexity. Here, we present a laser-based technique, plasmonic nanoparticle lithography, which is capable of rapid fabrication of large-scale arrays of sub-50 nm holes on various substrates. It is based on near-field enhancement and melting induced under ordered arrays of plasmonic nanoparticles, which are brought into contact or in close proximity to a desired material and acting as optical near-field lenses. The nanoparticles are arranged in ordered patterns on a flexible substrate and can be attached and removed from the patterned sample surface. At optimized laser fluence, the nanohole patterning process does not create any observable changes to the nanoparticles and they have been applied multiple times as reusable near-field masks. This resist-free nanolithography technique provides a simple and cheap solution for large-scale nanofabrication.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schanen, Michel; Marin, Oana; Zhang, Hong
Adjoints are an important computational tool for large-scale sensitivity evaluation, uncertainty quantification, and derivative-based optimization. An essential component of their performance is the storage/recomputation balance, in which efficient checkpointing methods play a key role. We introduce a novel asynchronous two-level adjoint checkpointing scheme for multistep numerical time discretizations targeted at large-scale numerical simulations. The checkpointing scheme combines bandwidth-limited disk checkpointing and binomial memory checkpointing. Based on assumptions about the target petascale systems, which we later demonstrate to be realistic on the IBM Blue Gene/Q system Mira, we create a model of the expected performance of our checkpointing approach and validate it using the highly scalable Navier-Stokes spectral-element solver Nek5000 on small to moderate subsystems of the Mira supercomputer. In turn, this allows us to predict optimal algorithmic choices when using all of Mira. We also demonstrate that two-level checkpointing is significantly superior to single-level checkpointing when adjoining a large number of time integration steps. To our knowledge, this is the first time two-level checkpointing has been designed, implemented, tuned, and demonstrated on fluid dynamics codes at a large scale of 50k+ cores.
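A stripped-down sketch of the two-level idea follows: disk checkpoints at segment boundaries plus in-memory replay within each segment. Here the paper's within-segment binomial schedule is replaced by storing the whole segment in memory for clarity, and all names are illustrative:

```python
def two_level_adjoint(x0, lam_final, step, adjoint_step, n_steps, seg=10):
    """Forward sweep writes disk-level checkpoints at segment boundaries;
    the reverse sweep restores each segment, recomputes its states into
    memory, and runs the adjoint backwards through them."""
    disk, x = {}, x0
    for i in range(n_steps):
        if i % seg == 0:
            disk[i] = x          # stand-in for a bandwidth-limited disk write
        x = step(x)
    lam = lam_final
    for s in range(((n_steps - 1) // seg) * seg, -1, -seg):
        states, x = [], disk[s]
        for i in range(s, min(s + seg, n_steps)):
            states.append(x)     # stand-in for memory-level checkpoints
            x = step(x)
        for x_i in reversed(states):
            lam = adjoint_step(x_i, lam)
    return lam
```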
Micron-scale lens array having diffracting structures
Goldberg, Kenneth A
2013-10-29
A novel micron-scale lens, a microlens, is engineered to concentrate light efficiently onto an area of interest, such as a small, light-sensitive detector element in an integrated electronic device. Existing microlens designs imitate the form of large-scale lenses and are less effective at small sizes. The microlenses described herein have been designed to accommodate diffraction effects, which dominate the behavior of light at small length scales. Thus a new class of light-concentrating optical elements with much higher relative performance has been created. Furthermore, the new designs are much easier to fabricate than previous designs.
Aerodynamic flow deflector to increase large scale wind turbine power generation by 10%.
DOT National Transportation Integrated Search
2015-11-01
The innovation proposed in this paper has the potential to address both the efficiency demands of wind farm owners as well as to provide a disruptive design innovation to turbine manufacturers. The aerodynamic deflector technology was created to impr...
Environmentalism, Globalization and National Economies, 1980-2000
ERIC Educational Resources Information Center
Schofer, Evan; Granados, Francisco J.
2006-01-01
It is commonly assumed that environmentalism harms national economies because environmental regulations constrain economic activity and create incentives for firms to move production and investment to other countries. We point out that global environmentalism involves large-scale institutional changes that: (1) encourage new kinds of economic…
Resource Recovery from Flooded Underground Mines
Butte, Montana has been the site of hard rock mining activities for over a century. Over 400 underground mines were developed and over 10,000 miles of underground mine workings were created. During active mining, groundwater was removed from the workings by large-scale pu...
Music in the moment? Revisiting the effect of large scale structures.
Lalitte, P; Bigand, E
2006-12-01
The psychological relevance of large-scale musical structures has been a matter of debate in the music community. This issue was investigated with a method that allows assessing listeners' detection of musical incoherencies in normal and scrambled versions of popular and contemporary music pieces. Musical excerpts were segmented into 28 or 29 chunks. In the scrambled version, the temporal order of these chunks was altered with the constraint that the transitions between two chunks never created local acoustical and musical disruptions. Participants were required (1) to detect on-line incoherent linking of chunks, (2) to rate aesthetic quality of pieces, and (3) to evaluate their overall coherence. The findings indicate a moderate sensitivity to large-scale musical structures for popular and contemporary music in both musically trained and untrained listeners. These data are discussed in light of current models of music cognition.
Barbera, J; Macintyre, A; Gostin, L; Inglesby, T; O'Toole, T; DeAtley, C; Tonat, K; Layton, M
2001-12-05
Concern for potential bioterrorist attacks causing mass casualties has increased recently. Particular attention has been paid to scenarios in which a biological agent capable of person-to-person transmission, such as smallpox, is intentionally released among civilians. Multiple public health interventions are possible to effect disease containment in this context. One disease control measure that has been regularly proposed in various settings is the imposition of large-scale or geographic quarantine on the potentially exposed population. Although large-scale quarantine has not been implemented in recent US history, it has been used on a small scale in biological hoaxes, and it has been invoked in federally sponsored bioterrorism exercises. This article reviews the scientific principles that are relevant to the likely effectiveness of quarantine, the logistic barriers to its implementation, legal issues that a large-scale quarantine raises, and possible adverse consequences that might result from quarantine action. Imposition of large-scale quarantine (compulsory sequestration of groups of possibly exposed persons, or confinement of people within certain geographic areas to prevent the spread of contagious disease) should not be considered a primary public health strategy in most imaginable circumstances. In the majority of contexts, other less extreme public health actions are likely to be more effective and to create fewer unintended adverse consequences than quarantine. Actions and areas for future research, policy development, and response planning efforts are provided.
NASA Astrophysics Data System (ADS)
Pfister, Olivier
2017-05-01
When it comes to practical quantum computing, the two main challenges are circumventing decoherence (devastating quantum errors due to interactions with the environmental bath) and achieving scalability (as many qubits as needed for a real-life, game-changing computation). We show that using, in lieu of qubits, the "qumodes" represented by the resonant fields of the quantum optical frequency comb of an optical parametric oscillator allows one to create bona fide, large-scale quantum computing processors, pre-entangled in a cluster state. We detail our recent demonstration of 60-qumode entanglement (out of an estimated 3000) and present an extension that combines this frequency-tagged entanglement with time-tagged entanglement, in order to generate an arbitrarily large, universal quantum computing processor.
Turbulent dusty boundary layer in an ANFO surface-burst explosion
NASA Astrophysics Data System (ADS)
Kuhl, A. L.; Ferguson, R. E.; Chien, K. Y.; Collins, J. P.
1992-01-01
This paper describes the results of numerical simulations of the dusty, turbulent boundary layer created by a surface-burst explosion. The blast wave was generated by the detonation of a 600-T hemisphere of ANFO, similar to those used in large-scale field tests. The surface was assumed to be ideally noncratering but contained an initial loose layer of dust. The dust-air mixture in this fluidized bed was modeled as a dense gas (i.e., an equilibrium model, valid for very small-diameter dust particles). The evolution of the flow was calculated by a high-order Godunov code that solves the nonsteady conservation laws. Shock interactions with the dense layer generated vorticity near the wall, a result that is similar to viscous, no-slip effects found in clean flows. The resulting wall shear layer was unstable and rolled up into large-scale rotational structures. These structures entrained dense material from the wall layer and created a chaotically striated flow. The boundary layer grew due to merging of the large-scale structures and due to local entrainment of dense material from the fluidized bed. The chaotic flow was averaged along similarity lines (i.e., lines of constant x = r/R_s and y = z/R_s, where R_s = ct^alpha) to establish the mean-flow profiles and the r.m.s. fluctuating-flow profiles of the boundary layer.
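For readers unfamiliar with Godunov-type schemes, a first-order example for linear advection captures the flavor: the exact Riemann solver at each cell face reduces to upwinding. This is a toy stand-in under that simplification, not the high-order solver used in the simulations:

```python
import numpy as np

def godunov_advection(u, a, dx, dt, n_steps):
    """First-order Godunov finite-volume update for u_t + a*u_x = 0 on a
    periodic grid; the face flux comes from the exact (upwind) Riemann
    solution for linear advection."""
    c = a * dt / dx
    assert abs(c) <= 1.0, "CFL condition violated"
    for _ in range(n_steps):
        if a >= 0:
            u = u - c * (u - np.roll(u, 1))    # upwind difference, a > 0
        else:
            u = u - c * (np.roll(u, -1) - u)   # upwind difference, a < 0
    return u
```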
ERIC Educational Resources Information Center
Cor, Ken; Alves, Cecilia; Gierl, Mark J.
2008-01-01
This review describes and evaluates a software add-in created by Frontline Systems, Inc., that can be used with Microsoft Excel 2007 to solve large, complex test assembly problems. The combination of Microsoft Excel 2007 with the Frontline Systems Premium Solver Platform is significant because Microsoft Excel is the most commonly used spreadsheet…
Extreme-Scale Bayesian Inference for Uncertainty Quantification of Complex Simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Biros, George
Uncertainty quantification (UQ)—that is, quantifying uncertainties in complex mathematical models and their large-scale computational implementations—is widely viewed as one of the outstanding challenges facing the field of CS&E over the coming decade. The EUREKA project set out to address the most difficult class of UQ problems: those for which both the underlying PDE model and the uncertain parameters are of extreme scale. In the project we worked on these extreme-scale challenges in the following four areas:
1. Scalable parallel algorithms for sampling and characterizing the posterior distribution that exploit the structure of the underlying PDEs and parameter-to-observable map. These include structure-exploiting versions of the randomized maximum likelihood method, which aims to overcome the intractability of employing conventional MCMC methods for solving extreme-scale Bayesian inversion problems by appealing to and adapting ideas from large-scale PDE-constrained optimization, which have been very successful at exploring high-dimensional spaces.
2. Scalable parallel algorithms for construction of prior and likelihood functions based on learning methods and non-parametric density estimation. Constructing problem-specific priors remains a critical challenge in Bayesian inference, and more so in high dimensions. Another challenge is the construction of likelihood functions that capture unmodeled couplings between observations and parameters. We created parallel algorithms for non-parametric density estimation using high-dimensional N-body methods and combined them with supervised learning techniques for the construction of priors and likelihood functions.
3. Bayesian inadequacy models, which augment physics models with stochastic models that represent their imperfections. The success of the Bayesian inference framework depends on the ability to represent the uncertainty due to imperfections of the mathematical model of the phenomena of interest. This is a central challenge in UQ, especially for large-scale models, and we developed the mathematical tools to address it in the context of extreme-scale problems.
4. Parallel scalable algorithms for Bayesian optimal experimental design (OED). Bayesian inversion yields quantified uncertainties in the model parameters, which can be propagated forward through the model to yield uncertainty in outputs of interest. This opens the way for designing new experiments to reduce the uncertainties in the model parameters and model predictions. Such experimental design problems have been intractable for large-scale problems using conventional methods; we created OED algorithms that exploit the structure of the PDE model and the parameter-to-output map to overcome these challenges.
Parallel algorithms for these four problems were created, analyzed, prototyped, implemented, tuned, and scaled up for leading-edge supercomputers, including UT-Austin's own 10-petaflops Stampede system, ANL's Mira system, and ORNL's Titan system. While our focus was on fundamental mathematical/computational methods and algorithms, we assessed our methods on model problems derived from several DOE mission applications, including multiscale mechanics and ice sheet dynamics.
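To make the randomized maximum likelihood idea concrete, here is a linear-Gaussian toy version: each sample solves a perturbed regularized least-squares problem, which for a linear forward map is an exact posterior draw. This is a minimal sketch of the idea the project scales up; the PDE-constrained setting replaces `lstsq` with large-scale optimization:

```python
import numpy as np

def rml_samples(G, d, m_prior, sigma_d, sigma_m, n_samples, rng=None):
    """Randomized maximum likelihood for a linear-Gaussian toy problem with
    scalar noise/prior standard deviations sigma_d and sigma_m."""
    if rng is None:
        rng = np.random.default_rng(0)
    n_obs, n_par = G.shape
    # Stacked least-squares system: data misfit rows plus prior rows.
    A = np.vstack([G / sigma_d, np.eye(n_par) / sigma_m])
    samples = []
    for _ in range(n_samples):
        d_pert = d + sigma_d * rng.standard_normal(n_obs)       # perturb data
        m_pert = m_prior + sigma_m * rng.standard_normal(n_par)  # perturb prior
        b = np.concatenate([d_pert / sigma_d, m_pert / sigma_m])
        samples.append(np.linalg.lstsq(A, b, rcond=None)[0])
    return np.array(samples)
```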
NASA Astrophysics Data System (ADS)
Darema, F.
2016-12-01
InfoSymbiotics/DDDAS embodies the power of Dynamic Data Driven Applications Systems (DDDAS), a concept whereby an executing application model is dynamically integrated, in a feedback loop, with the real-time data-acquisition and control components, as well as other data sources of the application system. Advanced capabilities can be created through such new computational approaches in modeling and simulations, and in instrumentation methods. These include enhancing the accuracy of the application model; speeding up the computation to allow faster and more comprehensive models of a system; and creating decision support systems with the accuracy of full-scale simulations. In addition, the notion of controlling instrumentation processes by the executing application results in more efficient management of application data and addresses the challenge of how to architect and dynamically manage large sets of heterogeneous sensors and controllers, an advance over the static and ad hoc ways of today; with DDDAS these sets of resources can be managed adaptively and in optimized ways. Large-Scale-Dynamic-Data encompasses the next wave of Big Data, namely dynamic data arising from ubiquitous sensing and control in engineered, natural, and societal systems, through multitudes of heterogeneous sensors and controllers instrumenting these systems; opportunities and challenges at these "large scales" relate not only to data size but to heterogeneity in data, data collection modalities, fidelities, and timescales, ranging from real-time data to archival data. In tandem with this important dimension of dynamic data, there is an extended view of Big Computing, which includes the collective computing of networked assemblies of multitudes of sensors and controllers, ranging from the high end to the real time, seamlessly integrated and unified into Large-Scale-Big-Computing. InfoSymbiotics/DDDAS engenders transformative impact in many application domains, ranging from the nano-scale to the terra-scale and to the extra-terra-scale. The talk will address opportunities for new capabilities together with corresponding research challenges, with illustrative examples from several application areas including environmental sciences, geosciences, and space sciences.
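The feedback loop at the heart of DDDAS can be sketched in a few lines; all names here are illustrative, `observe` stands in for the real-time data-acquisition component, and the gain term is a crude stand-in for proper data assimilation:

```python
def dddas_loop(model_step, observe, x0, n_steps, gain=0.3):
    """Toy DDDAS feedback loop: the executing model is nudged toward
    streaming sensor data at each step."""
    x = x0
    for _ in range(n_steps):
        x = model_step(x)           # advance the application model
        y = observe()               # real-time measurement of the system
        x = x + gain * (y - x)      # assimilate: correct the model state
        # A full DDDAS system would also retask sensors based on |y - x|.
    return x
```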
HRLSim: a high performance spiking neural network simulator for GPGPU clusters.
Minkovich, Kirill; Thibeault, Corey M; O'Brien, Michael John; Nogin, Aleksey; Cho, Youngkwan; Srinivasa, Narayan
2014-02-01
Modeling of large-scale spiking neural models is an important tool in the quest to understand brain function and subsequently create real-world applications. This paper describes a spiking neural network simulator environment called HRL Spiking Simulator (HRLSim). This simulator is suitable for implementation on a cluster of general purpose graphical processing units (GPGPUs). Novel aspects of HRLSim are described and an analysis of its performance is provided for various configurations of the cluster. With the advent of inexpensive GPGPU cards and compute power, HRLSim offers an affordable and scalable tool for design, real-time simulation, and analysis of large-scale spiking neural networks.
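The per-timestep kernel that such simulators parallelize looks, in essence, like a leaky integrate-and-fire update; this is a generic LIF sketch, not HRLSim's actual model or API:

```python
import numpy as np

def lif_step(v, spikes, W, I_ext, dt=1.0, tau=20.0, v_th=1.0, v_reset=0.0):
    """One leaky integrate-and-fire update over a population of neurons:
    integrate synaptic and external input, threshold, and reset."""
    I_syn = W @ spikes                      # synaptic input from last spikes
    v = v + dt * (-(v / tau) + I_syn + I_ext)
    spikes = (v >= v_th).astype(float)      # threshold crossing emits a spike
    v = np.where(spikes > 0, v_reset, v)    # reset the neurons that fired
    return v, spikes
```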
Autonomous Energy Grids | Grid Modernization | NREL
Autonomous energy grids control themselves using advanced machine learning and simulation to create resilient, reliable, and affordable optimized energy systems. Current frameworks to monitor, control, and optimize large-scale energy systems draw on optimization theory, control theory, big data analytics, and complex system theory and modeling.
Monitoring aquatic resources for regional assessments requires an accurate and comprehensive inventory of the resource and a useful classification of ecosystem similarities. Our research effort to create an electronic database and work with various ways to classify coastal wetlands...
Leveraging Web-Based Environments for Mass Atrocity Prevention
ERIC Educational Resources Information Center
Harding, Tucker B.; Whitlock, Mark A.
2013-01-01
A growing literature exploring large-scale, identity-based political violence, including mass killing and genocide, debates the plausibility of, and prospects for, early warning and prevention. An extension of the debate involves the prospects for creating educational experiences that result in more sophisticated analytical products that enhance…
The biogeochemical heterogeneity of tropical forests.
Townsend, Alan R; Asner, Gregory P; Cleveland, Cory C
2008-08-01
Tropical forests are renowned for their biological diversity, but also harbor variable combinations of soil age, chemistry and susceptibility to erosion or tectonic uplift. Here we contend that the combined effects of this biotic and abiotic diversity promote exceptional biogeochemical heterogeneity at multiple scales. At local levels, high plant diversity creates variation in chemical and structural traits that affect plant production, decomposition and nutrient cycling. At regional levels, myriad combinations of soil age, soil chemistry and landscape dynamics create variation and uncertainty in limiting nutrients that do not exist at higher latitudes. The effects of such heterogeneity are not well captured in large-scale estimates of tropical ecosystem function, but we suggest new developments in remote sensing can help bridge the gap.
Aqueous Two-Phase Systems at Large Scale: Challenges and Opportunities.
Torres-Acosta, Mario A; Mayolo-Deloisa, Karla; González-Valdez, José; Rito-Palomares, Marco
2018-06-07
Aqueous two-phase systems (ATPS) have proved to be an efficient and integrative operation for enhancing the recovery of industrially relevant bioproducts. Since the discovery of ATPS, a variety of works have been published regarding their scaling from 10 to 1,000 L. Although ATPS have achieved high recovery and purity yields, there is still a gap between their bench-scale use and potential industrial applications. In this context, this review paper critically analyzes ATPS scale-up strategies to enhance the potential for industrial adoption. In particular, it discusses large-scale operation considerations, different phase-separation procedures, the available optimization techniques (univariate, response surface methodology, and genetic algorithms) to maximize recovery and purity, and economic modeling to predict large-scale costs. ATPS intensification to increase the amount of sample processed per system, the development of recycling strategies, and the creation of highly efficient predictive models are still areas of great significance that can be further exploited with the use of high-throughput techniques. Moreover, the development of novel ATPS can maximize their specificity, increasing the possibilities for future industrial adoption of ATPS. This review attempts to present the areas of opportunity to increase the attractiveness of ATPS at industrial levels. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
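Of the optimization techniques mentioned, response surface methodology is the easiest to sketch: fit a quadratic model of recovery against two factors and search it. The factor names and setup below are illustrative, not drawn from any particular ATPS study:

```python
import numpy as np

def fit_quadratic_surface(X, y):
    """Least-squares fit of recovery ~ full quadratic in two ATPS factors
    (e.g., PEG and salt concentrations); returns six coefficients."""
    x1, x2 = X[:, 0], X[:, 1]
    A = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef

def predict(coef, x1, x2):
    # Evaluate the fitted response surface at a candidate operating point.
    return (coef[0] + coef[1] * x1 + coef[2] * x2 +
            coef[3] * x1 * x2 + coef[4] * x1**2 + coef[5] * x2**2)
```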
Do we understand what creates 150-km echoes and gives them their distinct structure?
NASA Astrophysics Data System (ADS)
Oppenheim, M. M.; Kudeki, E.; Salas Reyes, P.; Dimant, Y. S.
2017-12-01
Researchers first discovered 150-km echoes over 50 years ago using the first large VHF radars near the geomagnetic equator. However, the underlying mechanism that creates and modulates them remains largely a mystery. Despite this lack of understanding, the aeronomy community uses them to monitor daytime vertical plasma drifts between 130 and 160 km altitude. In a 2016 paper, Oppenheim and Dimant used simulations to show that photoelectrons can generate the type of echoes seen by the radars, but this theory does not explain any of the detailed structures. This paper will present modern observations of 150-km echoes made with simultaneous radar and ionosonde measurements. It will then describe the latest analysis attempting to explain these features using large-scale kinetic simulations of photoelectrons interacting with the ambient ionospheric plasma under a range of conditions.
Hydropower and sustainability: resilience and vulnerability in China's powersheds.
McNally, Amy; Magee, Darrin; Wolf, Aaron T
2009-07-01
Large dams represent a whole complex of social, economic and ecological processes, perhaps more than any other large infrastructure project. Today, countries with rapidly developing economies are constructing new dams to provide energy and flood control to growing populations in riparian and distant urban communities. If the system lacks the institutional capacity to absorb these physical and institutional changes, there is potential for conflict, thereby threatening human security. In this paper, we propose analyzing sustainability (political, socioeconomic, and ecological) in terms of resilience versus vulnerability, framed within the spatial abstraction of a powershed. The powershed framework facilitates multi-scalar and transboundary analysis while remaining focused on the questions of resilience and vulnerability relating to hydropower dams. Focusing on examples from China, this paper describes the complex nature of dams using the sustainability and powershed frameworks. We then analyze the roles of institutions in China to understand the relationships between power, human security and the socio-ecological system. To inform the study of conflicts over dams, China is a particularly useful case study because we can examine what happens at the international, national and local scales. The powershed perspective allows us to examine resilience and vulnerability across political boundaries from a dynamic, process-defined analytical scale while remaining focused on a host of questions relating to hydro-development that invoke drivers and impacts on national and sub-national scales. The ability to disaggregate the effects of hydropower dam construction from political boundaries allows for a deeper analysis of resilience and vulnerability. From our analysis we find that reforms in China's hydropower sector since 1996 have been motivated by the need to create stability at the national scale rather than by resilient solutions to China's growing demand for energy and water resource control at the local and international scales. Some measures, such as improved economic development through the market economy and a combination of dam construction and institutional reform, may indeed improve hydro-political resilience at a single scale. However, if China does not address large-scale hydropower construction's potential to create multi-scale geopolitical tensions, it may be vulnerable to conflict - though not necessarily violent conflict - in domestic and international political arenas. We conclude with a look toward a resilient basin institution for the Nu/Salween River, the site of a proposed large-scale hydropower development effort in China and Myanmar.
Properties and spatial distribution of galaxy superclusters
NASA Astrophysics Data System (ADS)
Liivamägi, Lauri Juhan
2017-01-01
Astronomy is a science that can offer plenty of unforgettable imagery, and the large-scale distribution of galaxies is no exception. Among the first features the viewer's eye is likely to be drawn to are large concentrations of galaxies - galaxy superclusters - contrasting with the seemingly empty regions beside them. Superclusters can extend from tens to over a hundred megaparsecs; they contain hundreds to thousands of galaxies and many galaxy groups and clusters. Unlike galaxy clusters, superclusters are clearly unrelaxed systems: they are not gravitationally bound, as crossing times exceed the age of the universe, and they show little to no radial symmetry. Superclusters, as part of the large-scale structure, are sensitive to the initial power spectrum and the subsequent evolution. They are massive enough to leave an imprint on the cosmic microwave background radiation. Superclusters can also provide a unique environment for their constituent galaxies and galaxy clusters. In this study we used two observational galaxy samples and one simulated sample to create several catalogues of structures that, we think, correspond to what are generally considered galaxy superclusters. Superclusters were delineated as continuous over-dense regions in galaxy luminosity density fields. When calculating density fields, several corrections were applied to remove small-scale redshift distortions and distance-dependent selection effects. The resulting catalogues of objects display robust statistical properties, showing that flux-limited galaxy samples can be used to create nearly volume-limited catalogues of superstructures. Generally, large superclusters can be regarded as massive, often branching filamentary structures that are mainly characterised by their length. Smaller superclusters, on the other hand, can display a variety of shapes. The spatial distribution of superclusters shows large-scale variations, with high-density concentrations often found in semi-regularly spaced groups. Future studies are needed to quantify the relations between superclusters and finer details of the galaxy distribution. Supercluster catalogues from this thesis have already been used in numerous other studies.
Haugum, Mona; Danielsen, Kirsten; Iversen, Hilde Hestad; Bjertnaes, Oyvind
2014-12-01
An important goal for national and large-scale surveys of user experiences is quality improvement. However, large-scale surveys are normally conducted by a professional external surveyor, creating an institutionalized division between the measurement of user experiences and the quality work that is performed locally. The aim of this study was to identify and describe scientific studies related to the use of national and large-scale surveys of user experiences in local quality work. Data sources were Ovid EMBASE, Ovid MEDLINE, Ovid PsycINFO and the Cochrane Database of Systematic Reviews. We included scientific publications about user experiences and satisfaction that address the extent to which data from national and other large-scale user-experience surveys are used for local quality work in the health services. Themes of interest were identified and a narrative analysis was undertaken. Thirteen publications were included; they differed substantially in several characteristics. The results show that large-scale surveys of user experiences are used in local quality work. The types of follow-up activity varied considerably, from conducting a follow-up analysis of user experience survey data to information sharing and more systematic efforts to use the data as a basis for improving the quality of care. This review shows that large-scale surveys of user experiences are used in local quality work. However, there is a need for more, better and standardized research in this field. The considerable variation in follow-up activities points to the need for systematic guidance on how to use data in local quality work. © The Author 2014. Published by Oxford University Press in association with the International Society for Quality in Health Care; all rights reserved.
NASA Astrophysics Data System (ADS)
Mei, D.-M.; Wang, G.-J.; Mei, H.; Yang, G.; Liu, J.; Wagner, M.; Panth, R.; Kooi, K.; Yang, Y.-Y.; Wei, W.-Z.
2018-03-01
Light, MeV-scale dark matter (DM) is an exciting DM candidate that is undetectable by current experiments. A germanium (Ge) detector utilizing internal charge amplification for the charge carriers created by the ionization of impurities is a promising new technology with experimental sensitivity for detecting MeV-scale DM. We analyze the physics mechanisms of the signal formation, charge creation, charge internal amplification, and the projected sensitivity for directly detecting MeV-scale DM particles. We present a design for a novel Ge detector at helium temperature (˜ 4 K) enabling ionization of impurities from DM impacts. With large localized E-fields, the ionized excitations can be accelerated to kinetic energies larger than the Ge bandgap, at which point they can create additional electron-hole pairs, producing intrinsic amplification to achieve an ultra-low energy threshold of ˜ 0.1 eV for detecting low-mass DM particles in the MeV scale. Correspondingly, such a Ge detector with 1 kg-year exposure will have high sensitivity to a DM-nucleon cross section of ˜ 5 × 10^{-45} cm^2 at a DM mass of ˜ 10 MeV/c^2 and a DM-electron cross section of ˜ 5 × 10^{-46} cm^2 at a DM mass of ˜ 1 MeV/c^2.
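As a rough back-of-envelope companion to the amplification condition described above (the mean free path ℓ is an assumed illustrative value, not a figure from the paper; the ˜0.74 eV low-temperature Ge bandgap is standard):

e E \ell \gtrsim E_{\mathrm{gap}}
\quad\Rightarrow\quad
E \gtrsim \frac{E_{\mathrm{gap}}}{e\,\ell}
        = \frac{0.74\ \mathrm{eV}}{e \times 100\ \mathrm{nm}}
        \approx 7.4 \times 10^{4}\ \mathrm{V/cm},

i.e., a carrier must gain at least the gap energy between collisions before it can create an additional electron-hole pair.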
Microfluidic large-scale integration: the evolution of design rules for biological automation.
Melin, Jessica; Quake, Stephen R
2007-01-01
Microfluidic large-scale integration (mLSI) refers to the development of microfluidic chips with thousands of integrated micromechanical valves and control components. This technology is utilized in many areas of biology and chemistry and is a candidate to replace today's conventional automation paradigm, which consists of fluid-handling robots. We review the basic development of mLSI and then discuss design principles of mLSI to assess the capabilities and limitations of the current state of the art and to facilitate the application of mLSI to areas of biology. Many design and practical issues, including economies of scale, parallelization strategies, multiplexing, and multistep biochemical processing, are discussed. Several microfluidic components used as building blocks to create effective, complex, and highly integrated microfluidic networks are also highlighted.
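A quick numeric aside on one celebrated design rule from the mLSI multiplexing literature the review discusses: a binary multiplexer addresses n flow channels with roughly 2·log2(n) control lines, because each address bit needs a complementary pair of valves. A minimal sketch (the function name is ours):

import math

def control_lines_needed(n_flow_channels: int) -> int:
    """Control lines for a binary microfluidic multiplexer:
    each address bit requires a complementary pair of valves."""
    bits = math.ceil(math.log2(n_flow_channels))
    return 2 * bits

for n in (8, 256, 1024):
    print(n, "flow channels ->", control_lines_needed(n), "control lines")

This logarithmic scaling is what makes thousands of on-chip valves practical with only a handful of control ports.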
Schiffels, Daniel; Szalai, Veronika A; Liddle, J Alexander
2017-07-25
Robust self-assembly across length scales is a ubiquitous feature of biological systems but remains challenging for synthetic structures. Taking a cue from biology - where disparate molecules work together to produce large, functional assemblies - we demonstrate how to engineer microscale structures with nanoscale features: Our self-assembly approach begins by using DNA polymerase to controllably create double-stranded DNA (dsDNA) sections on a single-stranded template. The single-stranded DNA (ssDNA) sections are then folded into a mechanically flexible skeleton by the origami method. This process simultaneously shapes the structure at the nanoscale and directs the large-scale geometry. The DNA skeleton guides the assembly of RecA protein filaments, which provide rigidity at the micrometer scale. We use our modular design strategy to assemble tetrahedral, rectangular, and linear shapes of defined dimensions. This method enables the robust construction of complex assemblies, greatly extending the range of DNA-based self-assembly methods.
The Prophecy of the Kallikak Family: "Disability" in Postmodern Schools.
ERIC Educational Resources Information Center
Karagiannis, Anastasios
1999-01-01
Due to disappearing jobs and large-scale automated production of goods and services, schools face a choice between "kallikakization" (creating and perpetuating a split between the comfortably endowed and the deprived) and achieving a new balance in social regulation/control arrangements. Postmodern schools can avoid kallikakization by…
ERIC Educational Resources Information Center
Schaffhauser, Dian
2013-01-01
With so many disruptive forces at work in higher education, colleges and universities are faced with the imperative to change not just technologies and processes, but behaviors and mindsets. In part one of a two-part series, change-management experts share six ways to foster large-scale transformations on campus. "Campus Technology"…
Linking Assessment and School Success.
ERIC Educational Resources Information Center
Raham, Helen
1999-01-01
School systems have recently experienced a dramatic shift toward the use of large-scale assessment to improve school performance. Discusses the ways in which external assessment may benefit education, the need for multiple measures of various dimensions of school success, and guidelines for using assessment to create a dynamic cycle of continuous…
ERIC Educational Resources Information Center
Lott, Debra
2011-01-01
Visual art has been defined as a vehicle for expression or communication of emotions and ideas. Leo Tolstoy identified art as a use of indirect means to communicate from one person to another. Contemporary artist HA Schult is an internationally renowned German artist who takes unwanted trash and transforms it into figures, creating large-scale art…
Language and Literacy Shifts in Refugee Populations.
ERIC Educational Resources Information Center
Long, Lynellyn D.
The large scale movements of refugees in many areas of the world are having dramatic impacts on indigenous cultures, languages, and literacies. Both anecdotal evidence and research suggest that the experience of uprooting and displacement creates an increased demand for literacy, new forms of literate expression, and more multilingual…
Fundraising in Community College Foundations. ERIC Digest.
ERIC Educational Resources Information Center
Schuyler, Gwyer
In response to declining local and state appropriations for public education, community colleges have taken steps to formalize fundraising efforts by creating institutional foundations as recipients of tax-deductible contributions. Large-scale external fundraising at community colleges began as a result of the 1965 Higher Education Act and the…
Automated geographic registration and radiometric correction for UAV-based mosaics
USDA-ARS?s Scientific Manuscript database
Texas A&M University has been operating a large-scale, UAV-based, agricultural remote-sensing research project since 2015. To use UAV-based images in agricultural production, many high-resolution images must be mosaicked together to create an image of an agricultural field. Two key difficulties to s...
Grossman, Robert L.; Heath, Allison; Murphy, Mark; Patterson, Maria; Wells, Walt
2017-01-01
Data commons collocate data, storage, and computing infrastructure with core services and commonly used tools and applications for managing, analyzing, and sharing data to create an interoperable resource for the research community. An architecture for data commons is described, as well as some lessons learned from operating several large-scale data commons. PMID:29033693
To the Cloud! A Grassroots Proposal to Accelerate Brain Science Discovery
Vogelstein, Joshua T.; Mensh, Brett; Hausser, Michael; Spruston, Nelson; Evans, Alan; Kording, Konrad; Amunts, Katrin; Ebell, Christoph; Muller, Jeff; Telefont, Martin; Hill, Sean; Koushika, Sandhya P.; Cali, Corrado; Valdés-Sosa, Pedro Antonio; Littlewood, Peter; Koch, Christof; Saalfeld, Stephan; Kepecs, Adam; Peng, Hanchuan; Halchenko, Yaroslav O.; Kiar, Gregory; Poo, Mu-Ming; Poline, Jean-Baptiste; Milham, Michael P.; Schaffer, Alyssa Picchini; Gidron, Rafi; Okano, Hideyuki; Calhoun, Vince D; Chun, Miyoung; Kleissas, Dean M.; Vogelstein, R. Jacob; Perlman, Eric; Burns, Randal; Huganir, Richard; Miller, Michael I.
2018-01-01
The revolution in neuroscientific data acquisition is creating an analysis challenge. We propose leveraging cloud-computing technologies to enable large-scale neurodata storing, exploring, analyzing, and modeling. This utility will empower scientists globally to generate and test theories of brain function and dysfunction. PMID:27810005
Method for replicating an array of nucleic acid probes
Cantor, Charles R.; Przetakiewicz, Marek; Smith, Cassandra L.; Sano, Takeshi
1998-01-01
The invention relates to the replication of probe arrays and methods for replicating arrays of probes which are useful for the large scale manufacture of diagnostic aids used to screen biological samples for specific target sequences. Arrays created using PCR technology may comprise probes with 5'- and/or 3'-overhangs.
NASA Astrophysics Data System (ADS)
McFarland, Jacob A.; Reilly, David; Black, Wolfgang; Greenough, Jeffrey A.; Ranjan, Devesh
2015-07-01
The interaction of a small-wavelength multimodal perturbation with a large-wavelength inclined interface perturbation is investigated for the reshocked Richtmyer-Meshkov instability using three-dimensional simulations. The ARES code, developed at Lawrence Livermore National Laboratory, was used for these simulations, and a detailed comparison of simulation results and experiments performed at the Georgia Tech Shock Tube facility is presented first for code validation. Simulation results are presented for four cases that vary in large-wavelength perturbation amplitude and the presence of secondary small-wavelength multimode perturbations. Previously developed measures of mixing and turbulence quantities are presented that highlight the large variation in perturbation length scales created by the inclined interface and the multimode complex perturbation. Measures of entrainment and turbulence anisotropy are developed that help to identify the effects of, and competition between, each perturbation type. It is shown through multiple measures that before reshock the flow possesses a distinct memory of the initial conditions that is present in both large-scale-driven entrainment measures and small-scale-driven mixing measures. After reshock the flow develops to a turbulent-like state that retains a memory of high-amplitude but not low-amplitude large-wavelength perturbations. It is also shown that the high-amplitude large-wavelength perturbation is capable of producing small-scale mixing and turbulent features similar to the small-wavelength multimode perturbations.
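For readers wanting a concrete handle on "measures of mixing": below is a minimal sketch of one standard diagnostic from this literature, the molecular mixing parameter Θ = <f(1-f)>/(<f><1-f>), computed on a toy volume-fraction field (the paper's exact measures may differ).

import numpy as np

def mixedness(f):
    """Molecular mixing parameter Theta for a volume-fraction field
    f in [0, 1]: Theta = <f(1-f)> / (<f><1-f>). Theta -> 0 for a
    segregated (unmixed) field and -> 1 for perfectly mixed fluid."""
    f = np.asarray(f, dtype=float)
    return (f * (1 - f)).mean() / (f.mean() * (1 - f).mean())

# toy example: sharp interface (unmixed) vs uniform blend (mixed)
x = np.linspace(0, 1, 1000)
print(mixedness(np.where(x < 0.5, 0.0, 1.0)))   # ~0: segregated
print(mixedness(np.full_like(x, 0.5)))          # 1: fully mixed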
2009-09-30
airborne radar images; develop an analysis scheme for the monsoon and storm-scale circulation features that would: a. Define large-scale context...Doppler radar observations of TC mesoscale observations. The TCS-08 field program provided unique aircraft reconnaissance (recon) data that will be...system for WC-130J, as well as developed a new system for recording airborne radar video for the first time. 3. Created an archive of all WC-130J
Heterogeneity and scale of sustainable development in cities.
Brelsford, Christa; Lobo, José; Hand, Joe; Bettencourt, Luís M A
2017-08-22
Rapid worldwide urbanization is at once the main cause and, potentially, the main solution to global sustainable development challenges. The growth of cities is typically associated with increases in socioeconomic productivity, but it also creates strong inequalities. Despite a growing body of evidence characterizing these heterogeneities in developed urban areas, not much is known systematically about their most extreme forms in developing cities and their consequences for sustainability. Here, we characterize the general patterns of income and access to services in a large number of developing cities, with an emphasis on an extensive, high-resolution analysis of the urban areas of Brazil and South Africa. We use detailed census data to construct sustainable development indices in hundreds of thousands of neighborhoods and show that their statistics are scale-dependent and point to the critical role of large cities in creating higher average incomes and greater access to services within their national context. We then quantify the general statistical trajectory toward universal basic service provision at different scales to show that it is characterized by varying levels of inequality, with initial increases in access being typically accompanied by growing disparities over characteristic spatial scales. These results demonstrate how extensions of these methods to other goals and data can be used over time and space to produce a simple but general quantitative assessment of progress toward internationally agreed sustainable development goals.
Future aircraft networks and schedules
NASA Astrophysics Data System (ADS)
Shu, Yan
2011-07-01
Because of the importance of air transportation scheduling, the emergence of small aircraft and the vision of future fuel-efficient aircraft, this thesis has focused on the study of aircraft scheduling and network design involving multiple types of aircraft and flight services. It develops models and solution algorithms for the schedule design problem and analyzes the computational results. First, based on the current development of small aircraft and on-demand flight services, this thesis expands a business model for integrating on-demand flight services with the traditional scheduled flight services. This thesis proposes a three-step approach to the design of aircraft schedules and networks from scratch under the model. In the first step, both a frequency assignment model for scheduled flights that incorporates a passenger path choice model and a frequency assignment model for on-demand flights that incorporates a passenger mode choice model are created. In the second step, a rough fleet assignment model that determines a set of flight legs, each of which is assigned an aircraft type and a rough departure time, is constructed. In the third step, a timetable model that determines an exact departure time for each flight leg is developed. Based on the models proposed in the three steps, this thesis creates schedule design instances that involve almost all the major airports and markets in the United States. The instances of the frequency assignment model created in this thesis are large-scale non-convex mixed-integer programming problems, and this dissertation develops an overall network structure and proposes iterative algorithms for solving these instances. The instances of both the rough fleet assignment model and the timetable model created in this thesis are large-scale mixed-integer programming problems, and this dissertation develops subproblem schemes for solving these instances. Based on these solution algorithms, this dissertation also presents computational results of these large-scale instances. To validate the models and solution algorithms developed, this thesis also compares the daily flight schedules that it designs with the schedules of the existing airlines. Furthermore, it creates instances that represent different economic and fuel-price conditions and derives schedules under these different conditions. In addition, it discusses the implications of using new aircraft in future flight schedules. Finally, future research is proposed in three areas: models, computational methods, and simulation for validation.
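As a hedged, toy illustration of the frequency-assignment flavor of mixed-integer programming described above (route data, capacity, and the fleet-hour budget are invented; the thesis's actual instances incorporate passenger choice models and are vastly larger), using the PuLP modeling library:

import pulp  # pip install pulp

# toy frequency assignment: choose integer daily frequencies per route
routes = {"JFK-ORD": (900, 5.0), "JFK-LAX": (700, 13.0)}  # (daily demand, round-trip block hours)
seats, fleet_hours = 150, 60  # aircraft capacity and daily fleet-hour budget (assumed)

prob = pulp.LpProblem("frequency_assignment", pulp.LpMaximize)
freq = {r: pulp.LpVariable(f"f_{r}", lowBound=0, upBound=10, cat="Integer") for r in routes}
pax = {r: pulp.LpVariable(f"pax_{r}", lowBound=0) for r in routes}

for r, (demand, _) in routes.items():
    prob += pax[r] <= demand            # cannot carry more than demand
    prob += pax[r] <= seats * freq[r]   # cannot carry more than offered seats
prob += pulp.lpSum(pax.values())        # objective: maximize carried passengers

# the fleet-hour budget couples the routes
prob += pulp.lpSum(routes[r][1] * freq[r] for r in routes) <= fleet_hours

prob.solve(pulp.PULP_CBC_CMD(msg=False))
for r in routes:
    print(r, int(freq[r].value()), "flights/day,", pax[r].value(), "passengers")

The real models replace the fixed demand caps with path- and mode-choice behavior, which is what makes the instances non-convex and large.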
Ultrasonic Recovery and Modification of Food Ingredients
NASA Astrophysics Data System (ADS)
Vilkhu, Kamaljit; Manasseh, Richard; Mawson, Raymond; Ashokkumar, Muthupandian
There are two general classes of effects that sound, and ultrasound in particular, can have on a fluid. First, very significant modifications to the nature of food and food ingredients can be due to the phenomena of bubble acoustics and cavitation. The applied sound oscillates bubbles in the fluid, creating intense forces at microscopic scales, thus driving chemical changes. Second, the sound itself can cause the fluid to flow vigorously, both on a large scale and on a microscopic scale; furthermore, the sound can cause particles in the fluid to move relative to the fluid. These streaming phenomena can redistribute materials within food and food ingredients at both microscopic and macroscopic scales.
Scaling differences between large interplate and intraplate earthquakes
NASA Technical Reports Server (NTRS)
Scholz, C. H.; Aviles, C. A.; Wesnousky, S. G.
1985-01-01
A study of large intraplate earthquakes with well-determined source parameters shows that these earthquakes obey a scaling law similar to large interplate earthquakes, in which M_0 varies as L^2, or u = αL, where L is rupture length and u is slip. In contrast to interplate earthquakes, for which α ≈ 1 × 10^{-5}, the intraplate events give α ≈ 6 × 10^{-5}, which implies that these earthquakes have stress drops about 6 times higher than interplate events. This result is independent of focal mechanism type. This implies that intraplate faults have a higher frictional strength than plate boundaries, and hence, that faults are velocity or slip weakening in their behavior. This factor may be important in producing the concentrated deformation that creates and maintains plate boundaries.
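Restated compactly, with the α values as reconstructed above (the 100-km slip example is our illustration, not from the abstract):

M_0 \propto L^{2}, \qquad u = \alpha L, \qquad
\Delta\sigma \propto \frac{u}{L} = \alpha
\;\Rightarrow\;
\frac{\Delta\sigma_{\text{intra}}}{\Delta\sigma_{\text{inter}}}
\approx \frac{6\times10^{-5}}{1\times10^{-5}} = 6.

For a 100-km rupture this gives u ≈ 1 m for an interplate event versus ≈ 6 m for an intraplate one.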
Bio-inspired scale-like surface textures and their tribological properties.
Greiner, Christian; Schäfer, Michael
2015-06-30
Friction, wear and the associated energy dissipation are major challenges in all systems containing moving parts. Examples range from nanoelectromechanical systems through hip prostheses to offshore wind turbines. Bionic approaches have proven to be very successful in many engineering problems, while investigating the potential of a bio-inspired approach to creating morphological surface textures is a relatively new field of research. Here, we developed laser-created textures inspired by the scales found on the skin of snakes and certain lizards. We show that this bio-inspired surface morphology reduced dry sliding friction forces by more than 40%. In lubricated contacts the same morphology increased friction by a factor of three. Two different kinds of morphologies, one with completely overlapping scales and one with the scales arranged in individual rows, were chosen. In lubricated as well as unlubricated contacts, the surface texture with the scales in rows showed lower friction forces than the completely overlapping ones. We anticipate that these results could have significant impact on all dry sliding contacts, ranging from nanoelectromechanical and micro-positioning systems up to large-scale tribological contacts that cannot be lubricated, e.g. because they are employed in a vacuum environment.
Magnetic pattern at supergranulation scale: the void size distribution
NASA Astrophysics Data System (ADS)
Berrilli, F.; Scardigli, S.; Del Moro, D.
2014-08-01
The large-scale magnetic pattern observed in the photosphere of the quiet Sun is dominated by the magnetic network. This network, created by photospheric magnetic fields swept into convective downflows, delineates the boundaries of large-scale cells of overturning plasma and exhibits "voids" in magnetic organization. These voids include internetwork fields, which are mixed-polarity sparse magnetic fields that populate the inner part of network cells. To single out voids and to quantify their intrinsic pattern, we applied a fast circle-packing-based algorithm to 511 SOHO/MDI high-resolution magnetograms acquired during the unusually long solar activity minimum between cycles 23 and 24. The computed void distribution function shows quasi-exponential decay behavior in the range 10-60 Mm. The lack of distinct flow scales in this range corroborates the hypothesis of multi-scale motion flows at the solar surface. In addition to the quasi-exponential decay, we have found that the voids depart from a simple exponential decay at about 35 Mm.
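A minimal sketch of the greedy largest-empty-circle idea behind such void detection, using a distance transform as a stand-in for the paper's fast circle-packing algorithm (the threshold, grid, and data are invented):

import numpy as np
from scipy.ndimage import distance_transform_edt

def find_voids(magnetogram, flux_threshold, n_voids=100):
    """Greedily extract void radii: repeatedly locate the pixel
    farthest from any magnetic element, record that largest empty
    circle, mask it, and repeat."""
    occupied = np.abs(magnetogram) > flux_threshold
    radii = []
    for _ in range(n_voids):
        dist = distance_transform_edt(~occupied)  # distance to nearest element
        r = dist.max()
        if r < 1:
            break
        cy, cx = np.unravel_index(dist.argmax(), dist.shape)
        radii.append(r)
        yy, xx = np.ogrid[:dist.shape[0], :dist.shape[1]]
        occupied |= (yy - cy) ** 2 + (xx - cx) ** 2 <= r ** 2  # mask the found void
    return np.array(radii)  # histogram of radii -> void size distribution

# toy use: random "network" elements scattered on a 512x512 grid
rng = np.random.default_rng(0)
mag = np.zeros((512, 512))
mag[tuple(rng.integers(0, 512, (2, 300)))] = 1.0
print(find_voids(mag, 0.5, n_voids=10))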
NASA Astrophysics Data System (ADS)
Vogl, Raimund
2001-08-01
In 1997, a large PACS was first introduced at Innsbruck University Hospital in the context of a new traumatology centre. In the subsequent years, this initial PACS setting covering only one department was expanded to most of the hospital campus, with currently some 250 viewing stations attached. Constantly connecting new modalities and viewing stations created the demand for several redesigns of the original PACS configuration to cope with the increasing data load. We give an account of these changes necessary to develop a multi-hospital PACS and the considerations that led us there. Issues of personnel for running a large-scale PACS are discussed, and we give an outlook on the new information systems currently under development for archiving and communication of general medical imaging data and for simple telemedicine networking between several large university hospitals.
Transforming a Liability Into An Asset - Creating a Market for CO2-based Products
NASA Astrophysics Data System (ADS)
David, B. J.
2016-12-01
This session will discuss converting CO2 from a liability into an asset. It will specifically discuss how at least 25 products can be created using CO2 as a feedstock and deployed in the market at large scale. Focus will be on products that can both achieve scale from a market standpoint as well as climate significance in use of CO2 as a feedstock. The session will describe the market drivers supporting and inhibiting commercial deployment of CO2-based products. It will list key barriers and risks in the various CO2-based product segments. These barriers/risks could occur across technology, policy, institutional, economic, and other dimensions. The means to mitigate each barrier and the likelihood for such means to be deployed will be discussed.
Barrett, Lisa Feldman; Satpute, Ajay
2013-01-01
Understanding how a human brain creates a human mind ultimately depends on mapping psychological categories and concepts to physical measurements of neural response. Although it has long been assumed that emotional, social, and cognitive phenomena are realized in the operations of separate brain regions or brain networks, we demonstrate that it is possible to understand the body of neuroimaging evidence using a framework that relies on domain general, distributed structure-function mappings. We review current research in affective and social neuroscience and argue that the emerging science of large-scale intrinsic brain networks provides a coherent framework for a domain-general functional architecture of the human brain. PMID:23352202
Shock propagation in locally driven granular systems
NASA Astrophysics Data System (ADS)
Joy, Jilmy P.; Pathak, Sudhir N.; Das, Dibyendu; Rajesh, R.
2017-09-01
We study shock propagation in a system of initially stationary hard spheres that is driven by a continuous injection of particles at the origin. The disturbance created by the injection of energy spreads radially outward through collisions between particles. Using scaling arguments, we determine the exponent characterizing the power-law growth of this disturbance in all dimensions. The scaling functions describing the various physical quantities are determined using large-scale event-driven simulations in two and three dimensions for both elastic and inelastic systems. The results are shown to describe well the data from two different experiments on granular systems that are similarly driven.
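A hedged dimensional-analysis sketch of how such power-law exponents arise, for the energy-conserving (elastic) limit with energy injected at a constant rate J (the paper's inelastic cases obey different conservation laws):

E(t) = J\,t \;\sim\; \rho\,R^{d}\,\dot{R}^{2}
\;\Rightarrow\; R^{d+2} \propto t^{3}
\;\Rightarrow\; R(t) \propto t^{3/(d+2)},

giving R ∝ t^{3/4} in two dimensions and R ∝ t^{3/5} in three.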
Structural design approaches for creating fat droplet and starch granule mimetics.
McClements, David Julian; Chung, Cheryl; Wu, Bi-Cheng
2017-02-22
This article focuses on hydrogel-based strategies for creating reduced calorie foods with desirable physicochemical, sensory, and nutritional properties. Initially, the role of fat droplets and starch granules in foods is discussed, and then different methods for fabricating hydrogel beads are reviewed, including phase separation, antisolvent precipitation, injection, and emulsion template methods. Finally, the potential application of hydrogel beads as fat droplet and starch granule replacements is discussed. There is still a need for large-scale, high-throughput, and economical methods of fabricating hydrogel beads suitable for utilization within the food industry.
Participatory visualization with Wordle.
Viégas, Fernanda B; Wattenberg, Martin; Feinberg, Jonathan
2009-01-01
We discuss the design and usage of "Wordle," a web-based tool for visualizing text. Wordle creates tag-cloud-like displays that give careful attention to typography, color, and composition. We describe the algorithms used to balance various aesthetic criteria and create the distinctive Wordle layouts. We then present the results of a study of Wordle usage, based both on spontaneous behaviour observed in the wild and on a large-scale survey of Wordle users. The results suggest that Wordles have become a kind of medium of expression, and that a "participatory culture" has arisen around them.
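The paper details Wordle's actual algorithms; below is a minimal sketch of the widely described greedy spiral-placement idea only (rectangle collision and the crude text metrics are our simplifications; the real tool uses glyph-level hierarchical bounding boxes):

import math

def wordle_layout(weights, width=800, height=600):
    """Greedy Wordle-style layout sketch: place words largest-first,
    spiralling outward from the centre until a word's bounding box no
    longer overlaps any already-placed box."""
    placed = []  # (word, x, y, w, h)
    for word, wt in sorted(weights.items(), key=lambda kv: -kv[1]):
        fs = 10 + 4 * wt                 # font size grows with weight
        w, h = 0.6 * fs * len(word), fs  # crude text metrics (assumed)
        theta = 0.0
        while theta < 200.0:
            r = 2.0 * theta              # Archimedean spiral r = a*theta
            x = width / 2 + r * math.cos(theta) - w / 2
            y = height / 2 + r * math.sin(theta) - h / 2
            clear = all(x + w <= px or px + pw <= x or
                        y + h <= py or py + ph <= y
                        for _, px, py, pw, ph in placed)
            if clear:
                placed.append((word, x, y, w, h))
                break
            theta += 0.1
    return placed

print(wordle_layout({"data": 9, "wordle": 7, "text": 5, "cloud": 4}))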
Designing artificial 2D crystals with site and size controlled quantum dots.
Xie, Xuejun; Kang, Jiahao; Cao, Wei; Chu, Jae Hwan; Gong, Yongji; Ajayan, Pulickel M; Banerjee, Kaustav
2017-08-30
Ordered arrays of quantum dots in two-dimensional (2D) materials would make promising optical materials, but their assembly could prove challenging. Here we demonstrate a scalable, site and size controlled fabrication of quantum dots in monolayer molybdenum disulfide (MoS2), and quantum dot arrays with nanometer-scale spatial density by focused electron beam irradiation induced local 2H to 1T phase change in MoS2. By designing the quantum dots in a 2D superlattice, we show that new energy bands form where the new band gap can be controlled by the size and pitch of the quantum dots in the superlattice. The band gap can be tuned from 1.81 eV to 1.42 eV without loss of its photoluminescence performance, which provides new directions for fabricating lasers with designed wavelengths. Our work constitutes a photoresist-free, top-down method to create large-area quantum dot arrays with nanometer-scale spatial density that allow the quantum dots to interfere with each other and create artificial crystals. This technique opens up new pathways for fabricating light emitting devices with 2D materials at desired wavelengths. This demonstration can also enable the assembly of large scale quantum information systems and open up new avenues for the design of artificial 2D materials.
Investigating a link between large and small-scale chaos features on Europa
NASA Astrophysics Data System (ADS)
Tognetti, L.; Rhoden, A.; Nelson, D. M.
2017-12-01
Chaos is one of the most recognizable, and studied, features on Europa's surface. Most models of chaos formation invoke liquid water at shallow depths within the ice shell; the liquid destabilizes the overlying ice layer, breaking it into mobile rafts and destroying pre-existing terrain. This class of model has been applied to both large-scale chaos like Conamara and small-scale features (i.e. microchaos), which are typically <10 km in diameter. Currently unknown, however, is whether both large-scale and small-scale features are produced together, e.g. through a network of smaller sills linked to a larger liquid water pocket. If microchaos features do form as satellites of large-scale chaos features, we would expect a drop-off in the number density of microchaos with increasing distance from the large chaos feature; the trend should not be observed in regions without large-scale chaos features. Here, we test the hypothesis that large chaos features create "satellite" systems of smaller chaos features. Either outcome will help us better understand the relationship between large-scale chaos and microchaos. We focus first on regions surrounding the large chaos features Conamara and Murias (e.g. the Mitten). We map all chaos features within a ~90,000 sq km region around the main chaos feature and assign each one a ranking (High Confidence, Probable, or Low Confidence) based on the observed characteristics of each feature. In particular, we look for a distinct boundary, loss of preexisting terrain, the existence of rafts or blocks, and the overall smoothness of the feature. We also note features that are chaos-like but lack sufficient characteristics to be classified as chaos. We then apply the same criteria to map microchaos features in regions of similar area (~90,000 sq km) that lack large chaos features. By plotting the distribution of microchaos with distance from the center point of the large chaos feature or the mapping region (for the cases without a large feature), we determine whether there is a distinct signature linking large-scale chaos features with nearby microchaos. We discuss the implications of these results on the process of chaos formation and the extent of liquid water within Europa's ice shell.
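A minimal sketch of the distance-binning test described above (coordinates are invented; in practice the distances would come from the mapped feature locations):

import numpy as np

def surface_density_profile(feature_xy, center_xy, r_max=300.0, n_bins=10):
    """Number density of mapped microchaos features in equal-width
    annuli around a reference point (units: km); a declining profile
    would support the 'satellite' hypothesis, a flat one would not."""
    d = np.hypot(*(np.asarray(feature_xy, float) - center_xy).T)
    edges = np.linspace(0.0, r_max, n_bins + 1)
    counts, _ = np.histogram(d, bins=edges)
    areas = np.pi * (edges[1:] ** 2 - edges[:-1] ** 2)  # annulus areas
    return edges, counts / areas  # features per km^2

# toy use with invented feature coordinates (km) around a large chaos feature
rng = np.random.default_rng(7)
pts = rng.uniform(-300, 300, size=(500, 2))
edges, dens = surface_density_profile(pts, center_xy=np.array([0.0, 0.0]))
print(dens)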
Symmetric large momentum transfer for atom interferometry with BECs
NASA Astrophysics Data System (ADS)
Abend, Sven; Gebbe, Martina; Gersemann, Matthias; Rasel, Ernst M.; Quantus Collaboration
2017-04-01
We develop and demonstrate a novel scheme for a symmetric large momentum transfer beam splitter for interferometry with Bose-Einstein condensates. Large momentum transfer beam splitters are a key technique to enhance the scaling factor and sensitivity of an atom interferometer and to create largely delocalized superposition states. To realize the beam splitter, double Bragg diffraction is used to create a superposition of two symmetric momentum states. Afterwards, both momentum states are loaded into a retro-reflected optical lattice and accelerated by Bloch oscillations in opposite directions, keeping the initial symmetry. The favorable scaling behavior of this symmetric acceleration allows us to transfer more than 1000 ℏk of total differential splitting in a single acceleration sequence of 6 ms duration while still maintaining a fraction of approx. 25% of the initial atom number. As a proof of the coherence of this beam splitter, contrast in a closed Mach-Zehnder atom interferometer has been observed with up to 208 ℏk of momentum separation, which equals a differential wave-packet velocity of approx. 1.1 m/s for 87Rb. The presented work is supported by the CRC 1128 geo-Q and the DLR with funds provided by the Federal Ministry of Economic Affairs and Energy (BMWi) due to an enactment of the German Bundestag under Grant No. DLR 50WM1552-1557 (QUANTUS-IV-Fallturm).
Polarization of the prompt gamma-ray emission from the gamma-ray burst of 6 December 2002.
Coburn, Wayne; Boggs, Steven E
2003-05-22
Observations of the afterglows of gamma-ray bursts (GRBs) have revealed that they lie at cosmological distances, and so correspond to the release of an enormous amount of energy. The nature of the central engine that powers these events and the prompt gamma-ray emission mechanism itself remain enigmatic because, once a relativistic fireball is created, the physics of the afterglow is insensitive to the nature of the progenitor. Here we report the discovery of linear polarization in the prompt gamma-ray emission from GRB021206, which indicates that it is synchrotron emission from relativistic electrons in a strong magnetic field. The polarization is at the theoretical maximum, which requires a uniform, large-scale magnetic field over the gamma-ray emission region. A large-scale magnetic field constrains possible progenitors to those either having or producing organized fields. We suggest that the large magnetic energy densities in the progenitor environment (comparable to the kinetic energy densities of the fireball), combined with the large-scale structure of the field, indicate that magnetic fields drive the GRB explosion.
Method for replicating an array of nucleic acid probes
Cantor, C.R.; Przetakiewicz, M.; Smith, C.L.; Sano, T.
1998-08-18
The invention relates to the replication of probe arrays and methods for replicating arrays of probes which are useful for the large scale manufacture of diagnostic aids used to screen biological samples for specific target sequences. Arrays created using PCR technology may comprise probes with 5'- and/or 3'-overhangs. 16 figs.
Comparing Learner Community Behavior in Multiple Presentations of a Massive Open Online Course
ERIC Educational Resources Information Center
Gallagher, Silvia Elena; Savage, Timothy
2015-01-01
Massive Online Open Courses (MOOCs) can create large scale communities of learners who collaborate, interact and discuss learning materials and activities. MOOCs are often delivered multiple times with similar content to different cohorts of learners. However, research into the differences of learner communication, behavior and expectation between…
ERIC Educational Resources Information Center
Xiong, Yao; Suen, Hoi K.
2018-01-01
The development of massive open online courses (MOOCs) has launched an era of large-scale interactive participation in education. While massive open enrolment and the advances of learning technology are creating exciting potentials for lifelong learning in formal and informal ways, the implementation of efficient and effective assessment is still…
Markets vs. Monopolies in Education: A Global Review of the Evidence. Policy Analysis. No. 620
ERIC Educational Resources Information Center
Coulson, Andrew J.
2008-01-01
Would large-scale, free-market reforms improve educational outcomes for American children? That question cannot be answered by looking at domestic evidence alone. Though innumerable "school choice" programs have been implemented around the United States, none has created a truly free and competitive education marketplace. Existing…
USDA-ARS?s Scientific Manuscript database
Russian knapweed is an outcrossing perennial invasive weed in North America that can spread by both seed and horizontal rhizome growth leading to new shoots. The predominant mode of spread at the local and long-distance scales has not been quantitatively researched. We used Amplified Fragment Length...
Program Development: Identification and Formulation of Desirable Educational Goals.
ERIC Educational Resources Information Center
Goodlad, John I.
In this speech, the author suggests that the success of public schools depends heavily on commitment to and large-scale agreement on educational goals. He examines the difficulty in creating rational programs to carry out specific behavioral goals and the more remote ends usually stated for educational systems. The author then discusses the…
New Ways of Classroom Assessment. Revised
ERIC Educational Resources Information Center
Brown, J. D., Ed.
2013-01-01
In this revised edition in the popular New Ways Series, teachers have once again been given an opportunity to show how they do assessment in their classrooms on an everyday basis. Often feeling helpless when confronted with large-scale standardized testing practices, teachers here offer classroom testing created with the direct aim of helping…
Multiscale socioeconomic assessment across large ecosystems: lessons from practice
Rebecca J. McLain; Ellen M. Donoghue; Jonathan Kusel; Lita Buttolph; Susan Charnley
2008-01-01
Implementation of ecosystem management projects has created a demand for socioeconomic assessments to predict or evaluate the impacts of ecosystem policies. Social scientists for these assessments face challenges that, although not unique to such projects, are more likely to arise than in smaller scale ones. This article summarizes lessons from our experiences with...
ERIC Educational Resources Information Center
Abedi, Jamal
This policy brief addresses the inclusion of English language learners (ELLs) in large-scale assessments and ELL assessment accommodations. The inclusion of ELL students creates specific accountability policy challenges. States differ in the students they include and their inclusion policies and accommodation practices, and, at present, inclusion…
Floating Data and the Problem with Illustrating Multiple Regression.
ERIC Educational Resources Information Center
Sachau, Daniel A.
2000-01-01
Discusses how to introduce basic concepts of multiple regression by creating a large-scale, three-dimensional regression model using the classroom walls and floor. Addresses teaching points that should be covered and reveals student reaction to the model. Finds that the greatest benefit of the model is the low fear, walk-through, nonmathematical…
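A small numeric companion to the walls-and-floor idea (data invented): with two predictors the fitted least-squares surface is a plane, y = b0 + b1·x1 + b2·x2, which is exactly what the classroom model makes tangible.

import numpy as np

# two predictors (x1, x2) span the "floor"; y is height above it
rng = np.random.default_rng(0)
x1, x2 = rng.uniform(0, 10, 50), rng.uniform(0, 10, 50)
y = 2.0 + 0.8 * x1 - 0.5 * x2 + rng.normal(0, 0.5, 50)

# least-squares fit of the plane y = b0 + b1*x1 + b2*x2
X = np.column_stack([np.ones_like(x1), x1, x2])
b, *_ = np.linalg.lstsq(X, y, rcond=None)
print("intercept and slopes:", b)  # recovers roughly [2.0, 0.8, -0.5]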
Informal Nature Experience on the School Playground
ERIC Educational Resources Information Center
Raith, Andreas
2015-01-01
In Germany, all-day care and all-day schooling are currently increasing on a large-scale. The extended time children spend in educational institutions could potentially result in limited access to nature experience for children. On the other hand, it could equally create opportunities for informal nature experience if school playgrounds have a…
Creating a Culture of Success for Staff: Five Lessons
ERIC Educational Resources Information Center
Scully, Lisa Mosele
2011-01-01
After finishing her undergraduate degree in English, the author's first employment was in an academic department at her alma mater. At a large public state institution, the salary scale for non-academic appointees was--and remains--woefully low. Knowing that monetary compensation wasn't the source of staff satisfaction, her chairman made a…
Mental Health Workforce Change through Social Work Education: A California Case Study
ERIC Educational Resources Information Center
Foster, Gwen; Morris, Meghan Brenna; Sirojudin, Sirojudin
2013-01-01
The 2004 California Mental Health Services Act requires large-scale system change in the public mental health system through a shift to recovery-oriented services for diverse populations. This article describes an innovative strategy for workforce recruitment and retention to create and sustain these systemic changes. The California Social Work…
3D Visible-Light Invisibility Cloak.
Zheng, Bin; Zhu, Rongrong; Jing, Liqiao; Yang, Yihao; Shen, Lian; Wang, Huaping; Wang, Zuojia; Zhang, Xianmin; Liu, Xu; Li, Erping; Chen, Hongsheng
2018-06-01
The concept of an invisibility cloak is a fixture of science fiction, fantasy, and the collective imagination. However, a real device that can hide an object from sight in visible light from absolutely any viewpoint would be extremely challenging to build. The main obstacle to creating such a cloak is the coupling of the electromagnetic components of light, which would necessitate the use of complex materials with specific permittivity and permeability tensors. Previous cloaking solutions have involved circumventing this obstacle by functioning either in static (or quasistatic) fields where these electromagnetic components are uncoupled or in diffusive light scattering media where complex materials are not required. In this paper, concealing a large-scale spherical object from human sight from three orthogonal directions is reported. This result is achieved by developing a 3D homogeneous polyhedral transformation and a spatially invariant refractive index discretization that considerably reduce the coupling of the electromagnetic components of visible light. This approach allows for a major simplification in the design of 3D invisibility cloaks, which can now be created at a large scale using homogeneous and isotropic materials.
Transitioning to a new nursing home: one organization's experience.
O'Brien, Kelli; Welsh, Darlene; Lundrigan, Elaine; Doyle, Anne
2013-01-01
Restructuring of long-term care in Western Health, a regional health authority within Newfoundland and Labrador, created a unique opportunity to study the widespread impacts of the transition. Staff and long-term-care residents were relocated from a variety of settings to a newly constructed facility. A plan was developed to assess the impact of relocation on staff, residents, and families. Indicators included fall rates, medication errors, complaints, media database, sick leave, overtime, injuries, and staff and family satisfaction. This article reports on the findings and lessons learned, from an organizational perspective, from such a large-scale transition. Some of the key findings included the necessity of pre-move and post-move strategies to minimize negative impacts, ongoing communication and involvement in decision making during transitions, tracking of key indicators, recognition from management regarding increased workload and stress experienced by staff, engagement of residents and families throughout the transition, and assessing the timing of large-scale relocations. These findings would be of interest to health care managers and leadership teams in organizations planning large-scale changes.
NASA Astrophysics Data System (ADS)
Fitzgerald, Michael; McKinnon, David H.; Danaia, Lena
2015-12-01
In this paper, we outline the theory behind the educational design used to implement a large-scale high school astronomy education project. This design was created in response to the realization of ineffective educational design in the early stages of the project. The new design follows an iterative improvement model in which the materials and general approach can evolve in response to solicited feedback. The improvement cycle avoids overly positive self-evaluation, addresses relevant external school and community factors, and concentrates on backward mapping from clearly set goals. Limiting factors, including time, resources, support and the potential for failure in the classroom, are dealt with as much as possible in the large-scale design, allowing teachers the best chance of successful implementation in their real-world classrooms. The actual approach adopted following the principles of this design is also outlined; it has seen success in bringing real astronomical data and access to telescopes into the high school classroom.
In Defense of the National Labs and Big-Budget Science
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goodwin, J R
2008-07-29
The purpose of this paper is to present the unofficial and unsanctioned opinions of a Visiting Scientist at Lawrence Livermore National Laboratory on the values of LLNL and the other National Labs. The basic founding value and goal of the National Labs is big-budget scientific research, along with smaller-budget scientific research that cannot easily be done elsewhere. The most important example in the latter category is classified defense-related research. The historical guiding light here is the Manhattan Project. This endeavor was unique in human history, and might remain so. The scientific expertise and wealth of an entire nation was tapped in a project that was huge beyond reckoning, with no advance guarantee of success. It was in many respects a clash of scientific titans, with a large supporting cast, collaborating toward a single well-defined goal. Never had scientists received so much respect, so much money, and so much intellectual freedom to pursue scientific progress. And never was the gap between theory and implementation so rapidly narrowed, with results that changed the world, completely. Enormous resources are spent at the national or international level on large-scale scientific projects. LLNL has the most powerful computer in the world, Blue Gene/L. (Oops, Los Alamos just seized the title with Roadrunner; such titles regularly change hands.) LLNL also has the largest laser in the world, the National Ignition Facility (NIF). Lawrence Berkeley National Lab (LBNL) has the most powerful microscope in the world. Not only is it beyond the resources of most large corporations to make such expenditures, but the risk exceeds the possible rewards for those corporations that could. Nor can most small countries afford to finance large scientific projects, and not even the richest can afford largess, especially if Congress is under major budget pressure. Some big-budget research efforts are funded by international consortiums, such as the Large Hadron Collider (LHC) at CERN, and the International Thermonuclear Experimental Reactor (ITER) in Cadarache, France, a magnetic-confinement fusion research project. The post-WWII histories of particle and fusion physics contain remarkable examples of both international competition, with an emphasis on secrecy, and international cooperation, with an emphasis on shared knowledge and resources. Initiatives to share sometimes came from surprising directions. Most large-scale scientific projects have potential defense applications. NIF certainly does; it is primarily designed to create small-scale fusion explosions. Blue Gene/L operates in part in service to NIF, and in part to various defense projects. The most important defense projects include stewardship of the national nuclear weapons stockpile, and the proposed redesign and replacement of those weapons with fewer, safer, more reliable, longer-lived, and less apocalyptic warheads. Many well-meaning people will consider the optimal lifetime of a nuclear weapon to be zero, but most thoughtful people, when asked how much longer they think this nation will require them, will ask for some time to think. NIF is also designed to create exothermic small-scale fusion explosions. The malapropos 'exothermic' here is a convenience to cover a profusion of complexities, but the basic idea is that the explosions will create more recoverable energy than was used to create them.
One can hope that the primary future benefits of success for NIF will be in cost-effective generation of electrical power through controlled small-scale fusion reactions, rather than in improved large-scale fusion explosions. Blue Gene/L also services climate research, genomic research, materials research, and a myriad of other computational problems that become more feasible, reliable, and precise the larger the number of computational nodes employed. Blue Gene/L has to be sited within a security complex for obvious reasons, but its value extends to the nation and the world. There is a duality here between large-scale scientific research machines and the supercomputers used to model them. An astounding example is illustrated in a graph released by EFDA-JET, at Oxfordshire, UK, presently the largest operating magnetic-confinement fusion experiment. The graph shows plasma confinement times (an essential performance parameter) for all the major tokamaks in the international fusion program, over their existing lifetimes. The remarkable thing about the data is not so much confinement-time versus date or scale, but the fact that the data are given for both the computer model predictions and the actual experimental measurements, and the two are in phenomenal agreement over the extended range of scales. Supercomputer models, sometimes operating with the intricacy of Schroedinger's equation at quantum physical scales, have become a costly but enormously cost-saving tool.
NASA Astrophysics Data System (ADS)
Beers, A.; Ray, C.
2015-12-01
Climate change is likely to affect mountainous areas unevenly due to the complex interactions between topography, vegetation, and the accumulation of snow and ice. This heterogeneity will complicate relationships between species presence and large-scale drivers such as precipitation and make predicting habitat extent and connectivity much more difficult. We studied the potential for fine-scale variation in climate and habitat use throughout the year in the American pika (Ochotona princeps), a talus specialist of mountainous western North America known for strong microhabitat affiliation. Not all areas of talus are likely to be equally hospitable, which may reduce connectivity more than predicted by large-scale occupancy drivers. We used high-resolution remotely sensed data to create metrics of the terrain and land cover in the Niwot Ridge (NWT) LTER site in Colorado. We hypothesized that pikas preferentially use heterogeneous terrain, as it might foster greater snow accumulation, and used radio telemetry to test this with radio-collared pikas. Pikas use heterogeneous terrain during snow-covered periods and less heterogeneous area during the summer. This suggests that not all areas of talus habitat are equally suitable as shelter from extreme conditions but that pikas need more than just shelter from winter cold. With those results, we created a predictive map using the same habitat metrics to model the extent of suitable habitat across the NWT area. These strong effects of terrain on pika habitat use and territory occupancy show the great utility that high-resolution remotely sensed data can have in ecological applications. With increasing effects of climate change in mountainous regions, this modeling approach is crucial for quantifying habitat connectivity at both small and large scales and to identify potential refugia for threatened or isolated species.
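As a hedged illustration of the kind of terrain-heterogeneity metric such studies derive from elevation rasters (the terrain ruggedness index of Riley et al.; the paper's exact metrics are not specified here, and the DEM is simulated):

import numpy as np

def terrain_ruggedness_index(dem):
    """TRI (Riley et al., 1999): for each interior cell, the
    root-sum-of-squares of elevation differences to its 8 neighbours."""
    c = dem[1:-1, 1:-1]
    sq = np.zeros_like(c, dtype=float)
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            if dy == 0 and dx == 0:
                continue
            sq += (dem[1 + dy:dem.shape[0] - 1 + dy,
                       1 + dx:dem.shape[1] - 1 + dx] - c) ** 2
    return np.sqrt(sq)

# toy DEM (m): a smooth north-south slope plus talus-like roughness
rng = np.random.default_rng(3)
dem = np.add.outer(np.linspace(3400, 3500, 64), np.zeros(64)) + rng.normal(0, 2, (64, 64))
print("mean TRI:", terrain_ruggedness_index(dem).mean())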
A Commercialization Roadmap for Carbon-Negative Energy Systems
NASA Astrophysics Data System (ADS)
Sanchez, D.
2016-12-01
The Intergovernmental Panel on Climate Change (IPCC) envisages the need for large-scale deployment of net-negative CO2 emissions technologies by mid-century to meet stringent climate mitigation goals and yield a net drawdown of atmospheric carbon. Yet there are few commercial deployments of BECCS outside of niche markets, creating uncertainty about commercialization pathways and sustainability impacts at scale. This uncertainty is exacerbated by the absence of a strong policy framework, such as high carbon prices and research coordination. Here, we propose a strategy for the potential commercial deployment of BECCS. This roadmap proceeds in three steps: (1) capture and utilization of biogenic CO2 from existing bioenergy facilities, notably ethanol fermentation; (2) thermochemical co-conversion of biomass and fossil fuels, particularly coal; and (3) dedicated, large-scale BECCS. Although biochemical conversion is a proven first market for BECCS, this trajectory alone is unlikely to drive commercialization of BECCS at the gigatonne scale. In contrast to biochemical conversion, thermochemical conversion of coal and biomass enables large-scale production of fuels and electricity with a wide range of carbon intensities, process efficiencies and process scales. Aside from systems integration, the barriers to large-scale biomass logistics, gasification and gas cleaning are primarily technical. Key uncertainties around large-scale BECCS deployment are not limited to commercialization pathways; rather, they include physical constraints on biomass cultivation or CO2 storage, as well as social barriers, including public acceptance of new technologies and conceptions of renewable and fossil energy, which co-conversion systems confound. Despite sustainability risks, this commercialization strategy presents a pathway whereby energy suppliers, manufacturers and governments could transition from laggards to leaders in climate change mitigation efforts.
NASA Astrophysics Data System (ADS)
Siwak, Michal; Rucinski, Slavek M.; Matthews, Jaymie M.; Guenther, David B.; Kuschnig, Rainer; Moffat, Anthony F. J.; Rowe, Jason F.; Sasselov, Dimitar; Weiss, Werner W.
2014-10-01
We present an analysis of the 2011 photometric observations of TW Hya by the MOST satellite; this is the fourth continuous series of this type. The large-scale light variations are dominated by a strong, quasi-periodic 4.18-d oscillation with superimposed, apparently chaotic flaring activity. The former is probably produced by stellar rotation with one large hotspot created by a stable accretion funnel, while the latter may be produced by small hotspots, created at moderate latitudes by unstable accretion tongues. A new, previously unnoticed feature is a series of semiperiodic, well-defined brightness dips of unknown nature, of which 19 were observed during 43 d of our nearly continuous observations. Re-analysis of the 2009 MOST light-curve revealed the presence of three similar dips. On the basis of recent theoretical results, we tentatively conclude that the dips may represent occultations of the small hotspots created by unstable accretion tongues by hypothetical optically thick clumps of dust.
3D Visualization of Earthquake Focal Mechanisms Using ArcScene
Labay, Keith A.; Haeussler, Peter J.
2007-01-01
In addition to the default settings, there are several other options in 3DFM that can be adjusted. The appearance of the symbols can be changed by (1) creating rings around the fault planes that are colored based on magnitude, (2) showing only the fault planes instead of a sphere, (3) drawing a flat disc that identifies the primary nodal plane, or (4) displaying the null, pressure, and tension axes. The size of the symbols can be changed by adjusting their diameter, scaling them based on the magnitude of the earthquake, or scaling them by the estimated size of the rupture patch based on earthquake magnitude. It is also possible to filter the data using any combination of the strike, dip, rake, magnitude, depth, null axis plunge, pressure axis plunge, tension axis plunge, or fault type values of the points. For a large dataset, these filters can be used to create different subsets of symbols. Symbols created by 3DFM are stored in graphics layers that appear in the ArcScene® table of contents. Multiple graphics layers can be created and saved to preserve the output from different symbol options.
Too many swipes for today: The development of the Problematic Tinder Use Scale (PTUS)
Orosz, Gábor; Tóth-Király, István; Bőthe, Beáta; Melher, Dóra
2016-01-01
Background and aims: Tinder is a very popular smartphone-based, geolocated dating application. The goal of the present study was to create a short Problematic Tinder Use Scale (PTUS). Methods: Griffiths' (2005) six-component model was implemented to cover all components of problematic Tinder use. Confirmatory factor analyses were carried out on a Tinder user sample (N = 430). Results: Both the 12- and the 6-item versions were tested. The 6-item unidimensional structure has appropriate reliability and factor structure. No salient demography-related differences were found. Users have similar scores on the PTUS irrespective of their relationship status. Discussion: Considering their large proportion among smartphone users, Tinder users deserve the attention of scientific examination, especially given the emerging trend of geolocated online dating applications. Conclusions: Before the PTUS, no scale had been created to measure problematic Tinder use. The PTUS is a suitable and reliable measure for assessing problematic Tinder use. PMID:27415602
Development of EnergyPlus Utility to Batch Simulate Building Energy Performance on a National Scale
DOE Office of Scientific and Technical Information (OSTI.GOV)
Valencia, Jayson F.; Dirks, James A.
2008-08-29
EnergyPlus is a simulation program that requires a large number of details to fully define and model a building. Depending on the size of the building, hundreds or even thousands of lines in a text file are needed to run an EnergyPlus simulation. Manually creating these files is a time-consuming process that would not be practical when trying to create input files for the thousands of buildings needed to simulate national building energy performance. To streamline the creation of EnergyPlus input files, two methods were created to work in conjunction with the National Renewable Energy Laboratory (NREL) Preprocessor; this reduced the hundreds of inputs needed to define a building in EnergyPlus to a small set of high-level parameters. The first method uses Java routines to perform all of the preprocessing on a Windows machine, while the second method carries out all of the preprocessing on a Linux cluster using an in-house utility called Generalized Parametrics (GPARM). A comma-delimited (CSV) input file is created to define the high-level parameters for any number of buildings. Each method then takes this CSV file and uses the data entered for each parameter to populate an extensible markup language (XML) file used by the NREL Preprocessor to automatically prepare EnergyPlus input data files (idf) using automatic building routines and macro templates. Using the Linux utility "make", the idf files can then be automatically run through the Linux cluster and the desired data from each building aggregated into one table for analysis. Creating a large number of EnergyPlus input files makes it possible to batch-simulate building energy performance and scale the results to national energy consumption estimates.
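As a rough illustration of the CSV-to-XML preprocessing step described above, the sketch below reads one row of high-level parameters per building and writes one XML file per building. The column names, XML tags, and file paths are invented for illustration; they are not the actual GPARM or NREL Preprocessor schema.

```python
# Hypothetical sketch of the CSV -> per-building XML step. Column names
# and XML tags are illustrative assumptions, not the real schema.
import csv
import xml.etree.ElementTree as ET

def csv_to_building_xml(csv_path: str, out_dir: str) -> None:
    """Write one XML parameter file per building row in the CSV."""
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            root = ET.Element("building", id=row["building_id"])
            # Copy every high-level parameter through as a child element.
            for key, value in row.items():
                if key != "building_id":
                    ET.SubElement(root, key).text = value
            ET.ElementTree(root).write(
                f"{out_dir}/{row['building_id']}.xml",
                encoding="utf-8", xml_declaration=True)

csv_to_building_xml("buildings.csv", "xml_inputs")
```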
Creating a biopower agenda through grassroots organizing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hauter, W.
1995-11-01
Biomass electricity provides both opportunities for strengthening the rural economy and advancing environmental goals. However, while large scale biomass development can be done in a manner that both furthers economic development and helps prevent environmental degradation, its commercialization requires a complex coordination of activities between utilities and farmers. Inherent problems exist in creating parallel development of a resource base and technological advancements. In fact, an understanding of the anthropology of biopower is necessary in order to advance it on a large scale. The Union of Concerned Scientists (UCS) published a report on renewable electricity, released in March 1992, that has been used as a foundation for state-based work promoting renewables. In several Midwestern states, such as Nebraska, Minnesota, and Wisconsin, we have used classic grassroots organizing skills to educate the public and key constituencies about the benefits of biomass. Besides working directly with utilities to promote biomass development, we also have a legislative agenda that helps create a climate favorable to biopower. This paper will focus on the grassroots aspect of our campaigns. It will also include an overview of some anthropological work that the author has done in communities with farmers. The main tool for this has been focus groups. We have found that people can be organized around biomass issues and that a grassroots base furthers biomass development.
Job Management and Task Bundling
NASA Astrophysics Data System (ADS)
Berkowitz, Evan; Jansen, Gustav R.; McElvain, Kenneth; Walker-Loud, André
2018-03-01
High Performance Computing is often performed on scarce and shared computing resources. To ensure computers are used to their full capacity, administrators often incentivize large workloads that are not possible on smaller systems. Measurements in Lattice QCD frequently do not scale to machine-size workloads. By bundling tasks together we can create large jobs suitable for gigantic partitions. We discuss METAQ and mpi_jm, software developed to dynamically group computational tasks together, that can intelligently backfill to consume idle time without substantial changes to users' current workflows or executables.
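The bundling idea can be illustrated with a toy greedy packer that groups small tasks into job-sized bundles. This is a hedged sketch of the general concept only; it is not the actual METAQ or mpi_jm scheduling logic, and the Task fields are assumptions.

```python
# Toy illustration of task bundling: greedily pack small tasks so a
# large partition stays busy. Not the METAQ/mpi_jm implementation.
from dataclasses import dataclass

@dataclass
class Task:
    name: str
    nodes: int      # nodes the task needs
    minutes: int    # expected runtime

def bundle(tasks, job_nodes, job_minutes):
    """Return lists of tasks that together fit inside one large job."""
    bundles, current, used = [], [], 0
    for t in sorted(tasks, key=lambda t: t.nodes, reverse=True):
        if t.minutes > job_minutes:
            continue  # task cannot fit in the job's walltime at all
        if used + t.nodes > job_nodes:
            bundles.append(current)
            current, used = [], 0
        current.append(t)
        used += t.nodes
    if current:
        bundles.append(current)
    return bundles

jobs = bundle([Task("cfg_a", 32, 60), Task("cfg_b", 64, 55),
               Task("cfg_c", 16, 30)], job_nodes=96, job_minutes=120)
print([[t.name for t in b] for b in jobs])  # [['cfg_b', 'cfg_a'], ['cfg_c']]
```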
Techniques for automatic large scale change analysis of temporal multispectral imagery
NASA Astrophysics Data System (ADS)
Mercovich, Ryan A.
Change detection in remotely sensed imagery is a multi-faceted problem with a wide variety of desired solutions. Automatic change detection and analysis to assist in the coverage of large areas at high resolution is a popular area of research in the remote sensing community. Beyond basic change detection, the analysis of change is essential to provide results that positively impact an image analyst's job when examining potentially changed areas. Present change detection algorithms are geared toward low resolution imagery, and require analyst input to provide anything more than a simple pixel level map of the magnitude of change that has occurred. One major problem with this approach is that change occurs in such large volume at small spatial scales that a simple change map is no longer useful. This research strives to create an algorithm based on a set of metrics that performs a large area search for change in high resolution multispectral image sequences and utilizes a variety of methods to identify different types of change. Rather than simply mapping the magnitude of any change in the scene, the goal of this research is to create a useful display of the different types of change in the image. The techniques presented in this dissertation are used to interpret large area images and provide useful information to an analyst about small regions that have undergone specific types of change while retaining image context to make further manual interpretation easier. This analyst cueing to reduce information overload in a large area search environment will have an impact in the areas of disaster recovery, search and rescue situations, and land use surveys among others. By utilizing a feature based approach founded on applying existing statistical methods and new and existing topological methods to high resolution temporal multispectral imagery, a novel change detection methodology is produced that can automatically provide useful information about the change occurring in large area and high resolution image sequences. The change detection and analysis algorithm developed could be adapted to many potential image change scenarios to perform automatic large scale analysis of change.
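A minimal sketch of the tile-based flavor of such an analysis is shown below: a generic spectral-angle metric is computed per tile of two co-registered multispectral images, so that small regions of candidate change can be flagged instead of producing a raw pixel-level magnitude map. The metric and array layout are stand-in assumptions, not the dissertation's metric set.

```python
# Tile-based change metric on two co-registered multispectral images
# (numpy arrays of shape [bands, rows, cols]). Generic stand-in metric.
import numpy as np

def tile_change_map(img1, img2, tile=64, eps=1e-8):
    """Mean spectral angle per tile; larger values flag candidate change."""
    bands, rows, cols = img1.shape
    ny, nx = rows // tile, cols // tile
    change = np.zeros((ny, nx))
    for i in range(ny):
        for j in range(nx):
            a = img1[:, i*tile:(i+1)*tile, j*tile:(j+1)*tile].reshape(bands, -1)
            b = img2[:, i*tile:(i+1)*tile, j*tile:(j+1)*tile].reshape(bands, -1)
            cos = (a * b).sum(0) / (np.linalg.norm(a, axis=0) *
                                    np.linalg.norm(b, axis=0) + eps)
            change[i, j] = np.arccos(np.clip(cos, -1.0, 1.0)).mean()
    return change

rng = np.random.default_rng(0)
before = rng.random((4, 256, 256))
after = before.copy()
after[:, :64, :64] += 0.5          # inject a synthetic change
print(tile_change_map(before, after).round(2))
```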
Bioremediation at a global scale: from the test tube to planet Earth.
de Lorenzo, Víctor; Marlière, Philippe; Solé, Ricard
2016-09-01
Planet Earth's biosphere has evolved over billions of years as a balanced bio-geological system ultimately sustained by solar power and the large-scale cycling of elements largely run by the global environmental microbiome. Humans have been part of this picture for much of their existence. But the industrial revolution that started in the 19th century and the subsequent advances in medicine, chemistry, agriculture and communications have impacted such balances to an unprecedented degree, and the problem has only been exacerbated in the last 20 years. Human overpopulation and industrial growth, along with the unsustainable use of natural resources, have driven many sites, and perhaps the planetary ecosystem as a whole, beyond recovery by spontaneous natural means, even if the immediate causes could be stopped. The most conspicuous indications of such a state of affairs include the massive change in land use, the accelerated increase in the levels of greenhouse gases, the frequent natural disasters associated with climate change and the growing non-recyclable waste (e.g. plastics and recalcitrant chemicals) that we release to the environment. While the whole planet is afflicted at a global scale by chemical pollution and anthropogenic emissions, the ongoing development of systems and synthetic biology, metagenomics, modern chemistry and some key concepts from ecological theory allow us to tackle this phenomenal challenge and propose large-scale interventions aimed at reversing and even improving the situation. This involves (i) identification of key reactions or processes that need to be re-established (or altogether created) for ecosystem reinstallation, (ii) implementation of such reactions in natural or designer hosts able to self-replicate and deliver the corresponding activities when/where needed in a fashion guided by sound ecological modelling, (iii) dispersal of niche-creating agents at a global scale and (iv) containment, monitoring and risk assessment of the whole process. © 2016 The Authors. Microbial Biotechnology published by John Wiley & Sons Ltd and Society for Applied Microbiology.
Conceptual hierarchical modeling to describe wetland plant community organization
Little, A.M.; Guntenspergen, G.R.; Allen, T.F.H.
2010-01-01
Using multivariate analysis, we created a hierarchical modeling process that describes how differently-scaled environmental factors interact to affect wetland-scale plant community organization in a system of small, isolated wetlands on Mount Desert Island, Maine. We followed the procedure: 1) delineate wetland groups using cluster analysis, 2) identify differently scaled environmental gradients using non-metric multidimensional scaling, 3) order gradient hierarchical levels according to spatiotemporal scale of fluctuation, and 4) assemble hierarchical model using group relationships with ordination axes and post-hoc tests of environmental differences. Using this process, we determined 1) large wetland size and poor surface water chemistry led to the development of shrub fen wetland vegetation, 2) Sphagnum and water chemistry differences affected fen vs. marsh/sedge meadow status within small wetlands, and 3) small-scale hydrologic differences explained transitions between forested vs. non-forested and marsh vs. sedge meadow vegetation. This hierarchical modeling process can help explain how upper level contextual processes constrain biotic community response to lower-level environmental changes. It creates models with more nuanced spatiotemporal complexity than classification and regression tree procedures. Using this process, wetland scientists will be able to generate more generalizable theories of plant community organization, and useful management models. © Society of Wetland Scientists 2009.
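Steps 1 and 2 of this procedure can be sketched with common Python tools: hierarchical clustering on Bray-Curtis dissimilarities, followed by non-metric multidimensional scaling. The synthetic abundance data and the choice of scipy/scikit-learn are assumptions for illustration; the authors' exact software and settings are not specified here.

```python
# Sketch of steps 1-2: cluster wetlands, then ordinate with non-metric
# MDS. Synthetic site-by-species data; library choices are assumptions.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist, squareform
from sklearn.manifold import MDS

rng = np.random.default_rng(1)
abundance = rng.poisson(2.0, size=(30, 12))        # 30 wetlands x 12 species

# Step 1: delineate wetland groups with hierarchical clustering.
dist = pdist(abundance, metric="braycurtis")
groups = fcluster(linkage(dist, method="average"), t=4, criterion="maxclust")

# Step 2: non-metric multidimensional scaling on the same dissimilarities.
nmds = MDS(n_components=2, metric=False, dissimilarity="precomputed",
           random_state=1)
scores = nmds.fit_transform(squareform(dist))
print(groups, scores[:3].round(2))
```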
The statistical power to detect cross-scale interactions at macroscales
Wagner, Tyler; Fergus, C. Emi; Stow, Craig A.; Cheruvelil, Kendra S.; Soranno, Patricia A.
2016-01-01
Macroscale studies of ecological phenomena are increasingly common because stressors such as climate and land-use change operate at large spatial and temporal scales. Cross-scale interactions (CSIs), where ecological processes operating at one spatial or temporal scale interact with processes operating at another scale, have been documented in a variety of ecosystems and contribute to complex system dynamics. However, studies investigating CSIs are often dependent on compiling multiple data sets from different sources to create multithematic, multiscaled data sets, which results in structurally complex, and sometimes incomplete, data sets. The statistical power to detect CSIs needs to be evaluated because of their importance and the challenge of quantifying CSIs using data sets with complex structures and missing observations. We studied this problem using a spatially hierarchical model that measures CSIs between regional agriculture and the relationship between lake nutrients and lake productivity. We used an existing large multithematic, multiscaled database, the LAke multi-scaled GeOSpatial and temporal database (LAGOS), to parameterize the power analysis simulations. We found that the power to detect CSIs was more strongly related to the number of regions in the study than to the number of lakes nested within each region. CSI power analyses will not only help ecologists design large-scale studies aimed at detecting CSIs, but will also focus attention on CSI effect sizes and the degree to which they are ecologically relevant and detectable with large data sets.
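The simulation-based logic of such a power analysis can be sketched as follows: generate lakes nested in regions, let a region-level covariate modify the lake-level slope (the cross-scale interaction), fit a mixed model, and count detections. Every parameter value below is invented for illustration and does not come from LAGOS.

```python
# Power-by-simulation sketch for a cross-scale interaction. All effect
# sizes and sample sizes are invented, not LAGOS-derived values.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

def one_sim(n_regions, lakes_per_region, csi=0.3, rng=None):
    if rng is None:
        rng = np.random.default_rng()
    region = np.repeat(np.arange(n_regions), lakes_per_region)
    ag = np.repeat(rng.normal(size=n_regions), lakes_per_region)  # regional
    nutrients = rng.normal(size=region.size)                      # lake level
    u = np.repeat(rng.normal(0, 0.5, n_regions), lakes_per_region)
    y = (1.0 + 0.5 * nutrients + 0.2 * ag + csi * nutrients * ag
         + u + rng.normal(0, 1.0, region.size))
    df = pd.DataFrame(dict(y=y, nutrients=nutrients, ag=ag, region=region))
    fit = smf.mixedlm("y ~ nutrients * ag", df, groups=df["region"]).fit()
    return fit.pvalues["nutrients:ag"] < 0.05   # interaction detected?

rng = np.random.default_rng(42)
power = np.mean([one_sim(40, 15, rng=rng) for _ in range(100)])
print(f"estimated power: {power:.2f}")
```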
Creating Community from the Inside Out: A Concentric Perspective on Collective Artmaking
ERIC Educational Resources Information Center
Blatt-Gross, Carolina
2017-01-01
As the original antidote to social alienation (and its tragic repercussions), community art through its large-scale, public, and collaborative nature has the potential to rebuild a sense of community from the inside out. This concentric notion of community art is addressed at both the individual and classroom level, exploring the origins of…
Effects of a Data-Driven District-Level Reform Model
ERIC Educational Resources Information Center
Slavin, Robert E.; Holmes, GwenCarol; Madden, Nancy A.; Chamberlain, Anne; Cheung, Alan
2010-01-01
Despite a quarter-century of reform, US schools serving students in poverty continue to lag far behind other schools. There are proven programs, but these are not widely used. This large-scale experiment evaluated a district-level reform model created by the Center for Data-Driven Reform in Education (CDDRE). The CDDRE model provided consultation…
Large-scale monitoring of air pollution in remote and ecologically important areas
Andrzej Bytnerowicz; Witold Fraczek
2013-01-01
New advances in air quality monitoring techniques, such as passive samplers for nitrogenous (N) or sulphurous (S) pollutants and ozone (O3), have allowed for an improved understanding of concentrations of these pollutants in remote areas. Mountains create special problems with regard to the feasibility of establishing and maintaining air pollution monitoring networks,...
Computer-based tools for decision support in agroforestry: Current state and future needs
E.A. Ellis; G. Bentrup; Michelle M. Schoeneberger
2004-01-01
Successful design of agroforestry practices hinges on the ability to pull together very diverse and sometimes large sets of information (i.e., biophysical, economic and social factors), and then implementing the synthesis of this information across several spatial scales from site to landscape. Agroforestry, by its very nature, creates complex systems with impacts...
Beyond the Classroom: Creating and Implementing New Models for Teaching.
ERIC Educational Resources Information Center
Free, Coen; Moerman, Yvonne
In the Netherlands, educational innovation related to community colleges has mostly emphasized enlarging the scale of the colleges, resulting in large regional educational centers serving 10,000 students or more. New demand on the educational process has also resulted in a growing examination of the role of the teacher in the classroom. At King…
ERIC Educational Resources Information Center
Costley, Debra
2007-01-01
This article explores the possibilities and opportunities created by large-scale property developers for new ways of learning and working in master-planned communities. The discussion is based on the findings from research of one developer's innovative solutions to learning in newly developed communities and specifically draws on data from one…
USDA-ARS?s Scientific Manuscript database
Genotyping-by-sequencing allows for large-scale genetic analyses in plant species with no reference genome, creating the challenge of sound inference in the presence of uncertain genotypes. Here we report an imputation-based genome-wide association study (GWAS) in reed canarygrass (Phalaris arundina...
What Students Say about Bullying
ERIC Educational Resources Information Center
Davis, Stan; Nixon, Charisse
2011-01-01
Educators striving to create safe, respectful, bully-free school climates have many programs and approaches to choose from--but it's difficult to know which will work best. The experiences of students who have been bullied can help educators decide what works and what doesn't. The authors conducted a large-scale survey of students, and asked 3,000…
The Promise Neighborhoods Movement: Creating Communities of Opportunity from Cradle to Career
ERIC Educational Resources Information Center
McAfee, Michael; Torre, Mauricio
2015-01-01
In this article, Michael McAfee and Mauricio Torre reflect on the successes and challenges of the Promise Neighborhoods movement as it works toward education equity, and on what it takes to effect large-scale, sustainable change for low-income communities and communities of color. Together they discuss the Chula Vista Promise Neighborhood project…
Saltcedar control and water salvage on the Pecos River, Texas, 1999 to 2003
Charles R. Hart; Larry D. White; Alyson McDonald; Zhuping Sheng
2007-01-01
A large scale ecosystem restoration program was initiated in 1997 on the Pecos River in western Texas. Saltcedar (Tamarix spp.), a non-native invasive tree, had created a near monoculture along the banks of the river by replacing most native vegetation. Local irrigation districts, private landowners, federal and state agencies, and private industry...
E-Mentoring for Social Equity: Review of Research to Inform Program Development
ERIC Educational Resources Information Center
Single, Peg Boyle; Single, Richard M.
2005-01-01
The advent of user-friendly email programs and web browsers created possibilities for widespread use of e-mentoring programs. In this review of the research, we presented the history of e-mentoring programs and defined e-mentoring and structured e-mentoring programs, focusing on large-scale e-mentoring programs that addressed issues of social…
Changing Schools from the inside out: Small Wins in Hard Times. Third Edition
ERIC Educational Resources Information Center
Larson, Robert
2011-01-01
At any time, public schools labor under great economic, political, and social pressures that make it difficult to create large-scale, "whole school" change. But current top-down mandates require that schools close achievement gaps while teaching more problem solving, inquiry, and research skills--with fewer resources. Failure to meet test-based…
Information Tailoring Enhancements for Large Scale Social Data
2016-03-15
Work performed within this reporting period: implemented temporal analysis algorithms for advanced analytics in Scraawl, including a backend web service design for the temporal analysis and a prototype GUI web service for the Scraawl analytics dashboard; upgraded the Scraawl computational framework to increase…
Creating Grander Families: Older Adults Adopting Younger Kin and Nonkin
ERIC Educational Resources Information Center
Hinterlong, James; Ryan, Scott
2008-01-01
Purpose: There is a dearth of research on older adoptive parents caring for minor children, despite a growing number of such adoptions finalized each year. This study offers a large-scale investigation of adoptive families headed by older parents. We describe these families and explore how preadoptive kinship between the adoptive parent and the…
ERIC Educational Resources Information Center
Weissman, Evan; O'Connell, Jesse
2016-01-01
"Aid Like A Paycheck" is a large-scale pilot evaluation of whether an innovative approach to disbursing financial aid can improve academic and financial outcomes for low-income community college students. Lessons from the pilot evaluation were used to create and fine-tune a logic model depicting activities, outputs, mediators, and…
Dilemmas of Leading National Curriculum Reform in a Global Era: A Chinese Perspective
ERIC Educational Resources Information Center
Yin, Hongbiao; Lee, John Chi-Kin; Wang, Wenlan
2014-01-01
Since the mid-1980s, a global resurgence of large-scale reform in the field of education has been witnessed. Implementing these reforms has created many dilemmas for change leaders. Following a three-year qualitative research project, the present study explores the dilemmas leaders faced during the implementation of the national curriculum reform…
The Common Core State Standards: School Reform at Three Suburban Middle Schools
ERIC Educational Resources Information Center
Morante-Brock, Sandra
2014-01-01
A growing body of research supports the idea that large scale school reform efforts often fail to create sustained change within the public school sector. Proponents of school reform argue that implementing school reform, effectively and with fidelity, can work to ensure the success of reform initiatives in public education. When implementing deep…
Ryan A. McManamay; Donald J. Orth; Charles A. Dolloff; David C. Mathews
2013-01-01
In order for habitat restoration in regulated rivers to be effective at large scales, broadly applicable frameworks are needed that provide measurable objectives and contexts for management. The Ecological Limits of Hydrologic Alteration (ELOHA) framework was created as a template to assess hydrologic alterations, develop relationships between altered streamflow and...
The EpiSLI Database: A Publicly Available Database on Speech and Language
ERIC Educational Resources Information Center
Tomblin, J. Bruce
2010-01-01
Purpose: This article describes a database that was created in the process of conducting a large-scale epidemiologic study of specific language impairment (SLI). As such, this database will be referred to as the EpiSLI database. Children with SLI have unexpected and unexplained difficulties learning and using spoken language. Although there is no…
Multi-resource and multi-scale approaches for meeting the challenge of managing multiple species
Frank R. Thompson; Deborah M. Finch; John R. Probst; Glen D. Gaines; David S. Dobkin
1999-01-01
The large number of Neotropical migratory bird (NTMB) species and their diverse habitat requirements create conflicts and difficulties for land managers and conservationists. We provide examples of assessments or conservation efforts that attempt to address the problem of managing for multiple NTMB species. We advocate approaches at a variety of spatial and geographic...
Bioinspired large-scale aligned porous materials assembled with dual temperature gradients
Bai, Hao; Chen, Yuan; Delattre, Benjamin; Tomsia, Antoni P.; Ritchie, Robert O.
2015-01-01
Natural materials, such as bone, teeth, shells, and wood, exhibit outstanding properties despite being porous and made of weak constituents. Frequently, they represent a source of inspiration to design strong, tough, and lightweight materials. Although many techniques have been introduced to create such structures, a long-range order of the porosity as well as a precise control of the final architecture remain difficult to achieve. These limitations severely hinder the scale-up fabrication of layered structures aimed for larger applications. We report on a bidirectional freezing technique to successfully assemble ceramic particles into scaffolds with large-scale aligned, lamellar, porous, nacre-like structure and long-range order at the centimeter scale. This is achieved by modifying the cold finger with a polydimethylsiloxane (PDMS) wedge to control the nucleation and growth of ice crystals under dual temperature gradients. Our approach could provide an effective way of manufacturing novel bioinspired structural materials, in particular advanced materials such as composites, where a higher level of control over the structure is required. PMID:26824062
NASA Astrophysics Data System (ADS)
Sawira, S.; Rahman, T.
2018-05-01
Self-organized settlements are formed within the limited capacity of their inhabitants, with or without government intervention. This pattern is mostly found in informal settlements, where the occupants are the planners, guided by their needs, limited resources and vernacular knowledge of place making. Understanding the process of their development and transformation could be a way of unfolding the complexity they offer to a formal urban setting. Patterns of adaptation can be identified by studying morphological elements (i.e., house forms and streets). A case study of an informal settlement (the kampung of Tamansari, Bandung, Indonesia) was undertaken to dissect these elements. Two important components of the study area, house forms and streets, created the first layer of urban fabric. High population density demanded layers of needs and activities, which eventually guided the multifunctional character of streets and house forms. Thus, the dialogue that streets create with the complex built forms, often known as the interface, is the key element for understanding the underlying order of Tamansari. Interfaces can be divided into two categories depending on their scale: small and large. Small-scale interfaces comprise small elements such as extended platforms, fences, steps, low walls, blank walls, and house-form elements that are set above, set forth or set over. These components help to create and define semipublic spaces in the settlement. These spaces can be visually and physically interactive or non-interactive, resulting in active or inactive spaces respectively. Small-scale interfaces are common features of the settlement, whereas large-scale interfaces are placed at strategic locations and act as active spaces. Connecting bridges, open spaces and contours often create a special dialogue within and beyond the study area. Interfaces cater to diversity in the settlement by creating a hierarchy of spaces. The inhabitants' sense of belonging and scope for personalization are integral parts of the alleyways, and thus they create a complex yet coherent urban fabric. Apart from its physical elements, the settlement embodies intangible assets such as social bonding, trust, kinship, empathy and sense of belonging that add value to the spatial quality, a distinctive character of the Tamansari kampung. Informal settlements are certainly complex in nature, as they are the outcome of many people working to accommodate multidimensional needs. In a formal system, by contrast, needs are addressed through rules developed by professionals, which often ends in prototypes created irrespective of necessity, affordability and cultural diversity. Cities throughout the world are experiencing rapid urbanization, creating a range of urban issues. It is therefore necessary to address users' differing needs and affordability and to devise suitable urban solutions. Understanding the Tamansari kampung as an informal settlement will enrich the knowledge and expertise needed to work in complex urban settings.
Global Scale Solar Disturbances
NASA Astrophysics Data System (ADS)
Title, A. M.; Schrijver, C. J.; DeRosa, M. L.
2013-12-01
The combination of the STEREO and SDO missions has allowed, for the first time, imagery of the entire Sun. This, coupled with the high cadence, broad thermal coverage, and large dynamic range of the Atmospheric Imaging Assembly on SDO, has allowed the discovery of impulsive solar disturbances that can significantly affect a hemisphere or more of the solar volume. Such events are often, but not always, associated with M- and X-class flares. GOES C- and even B-class flares are also associated with these large-scale disturbances. Key to the recognition of the large-scale disturbances was the creation of log-difference movies: by taking the logarithm of images before differencing, events in the corona become much more evident. Because such events cover so large a portion of the solar volume, their passage can affect the dynamics of the entire corona as it adjusts to and recovers from them. In some cases this may lead to another flare or filament ejection, but in general direct causal evidence of 'sympathetic' behavior is lacking. However, evidence is accumulating that these large-scale events create an environment that encourages other solar instabilities to occur. Understanding the source of these events and how the energy that drives them is built up, stored, and suddenly released is critical to understanding the origins of space weather. Example events and comments on their relevance will be presented.
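The log-difference operation itself is simple, as the hedged sketch below shows on synthetic frames; taking the logarithm before differencing compresses the huge dynamic range of coronal images so faint large-scale disturbances stand out. This is an illustration only, not the AIA processing pipeline.

```python
# Minimal log-difference sketch on synthetic frames.
import numpy as np

def log_difference(frame1, frame2, floor=1.0):
    """Return log(frame2) - log(frame1), clipping to avoid log(0)."""
    f1 = np.clip(frame1, floor, None)
    f2 = np.clip(frame2, floor, None)
    return np.log(f2) - np.log(f1)

rng = np.random.default_rng(0)
base = rng.uniform(10, 1e4, size=(512, 512))   # bright, high-dynamic-range
disturbed = base * 1.05                        # a faint 5% brightening
print(np.abs(log_difference(base, disturbed)).mean())  # ~log(1.05) ~ 0.049
```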
Impurity engineering of Czochralski silicon used for ultra large-scaled-integrated circuits
NASA Astrophysics Data System (ADS)
Yang, Deren; Chen, Jiahe; Ma, Xiangyang; Que, Duanlin
2009-01-01
Impurities in Czochralski silicon (Cz-Si) used for ultra-large-scale-integrated (ULSI) circuits have been believed to deteriorate the performance of devices. In this paper, we review recent progress from our investigation of internal gettering in Cz-Si wafers doped with nitrogen, germanium and/or a high content of carbon. It has been suggested that these impurities enhance oxygen precipitation, creating both denser bulk microdefects and a denuded zone of sufficient width, which benefits the internal gettering of metal contamination. Based on the experimental facts, a potential mechanism for the effect of impurity doping on the internal gettering structure is interpreted, and a new concept of 'impurity engineering' for Cz-Si used for ULSI is proposed.
Photogrammetry of a Hypersonic Inflatable Aerodynamic Decelerator
NASA Technical Reports Server (NTRS)
Kushner, Laura Kathryn; Littell, Justin D.; Cassell, Alan M.
2013-01-01
In 2012, two large-scale models of a Hypersonic Inflatable Aerodynamic Decelerator were tested in the National Full-Scale Aerodynamic Complex at NASA Ames Research Center. One of the objectives of this test was to measure model deflections under aerodynamic loading that approximated expected flight conditions. The measurements were acquired using stereo photogrammetry. Four pairs of stereo cameras were mounted inside the NFAC test section, each imaging a particular section of the HIAD. The views were then stitched together post-test to create a surface deformation profile. The data from the photogrammetry system will largely be used for comparisons to and refinement of Fluid Structure Interaction models. This paper describes how a commercial photogrammetry system was adapted to make the measurements and presents some preliminary results.
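The core operation behind stereo photogrammetry is triangulating a 3D point from its pixel coordinates in two calibrated cameras. The generic linear (DLT) sketch below illustrates this; the projection matrices are invented and are unrelated to the NFAC camera calibration or the commercial system used in the test.

```python
# Generic linear (DLT) triangulation: recover a 3D point from its pixel
# locations in two calibrated cameras. Cameras here are invented.
import numpy as np

def triangulate(P1, P2, x1, x2):
    """P1, P2: 3x4 projection matrices; x1, x2: (u, v) pixel coords."""
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]                                   # dehomogenize

P1 = np.hstack([np.eye(3), np.zeros((3, 1))])             # camera at origin
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0], [0]])]) # 1 m baseline
point = np.array([0.2, 0.1, 5.0, 1.0])                    # known 3D point
x1 = (P1 @ point)[:2] / (P1 @ point)[2]
x2 = (P2 @ point)[:2] / (P2 @ point)[2]
print(triangulate(P1, P2, x1, x2))                        # ~[0.2, 0.1, 5.0]
```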
Advances in the manufacture of MIP nanoparticles.
Poma, Alessandro; Turner, Anthony P F; Piletsky, Sergey A
2010-12-01
Molecularly imprinted polymers (MIPs) are prepared by creating a three-dimensional polymeric matrix around a template molecule. After the matrix is removed, complementary cavities with respect to shape and functional groups remain. MIPs have been produced for applications in in vitro diagnostics, therapeutics and separations. However, this promising technology still lacks widespread application because of issues related to large-scale production and optimization of the synthesis. Recent developments in the area of MIP nanoparticles might offer solutions to several problems associated with performance and application. This review discusses various approaches used in the preparation of MIP nanoparticles, focusing in particular on the issues associated with large-scale manufacture and implications for the performance of synthesized nanomaterials. Copyright © 2010 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
McGranaghan, Ryan M.; Mannucci, Anthony J.; Forsyth, Colin
2017-12-01
We explore the characteristics, controlling parameters, and relationships of multiscale field-aligned currents (FACs) using a rigorous, comprehensive, and cross-platform analysis. Our unique approach combines FAC data from the Swarm satellites and the Advanced Magnetosphere and Planetary Electrodynamics Response Experiment (AMPERE) to create a database of small-scale (~10-150 km, <1° latitudinal width), mesoscale (~150-250 km, 1-2° latitudinal width), and large-scale (>250 km) FACs. We examine these data for the repeatable behavior of FACs across scales (i.e., the characteristics), the dependence on the interplanetary magnetic field orientation, and the degree to which each scale "departs" from nominal large-scale specification. We retrieve new information by utilizing magnetic latitude and local time dependence, correlation analyses, and quantification of the departure of smaller from larger scales. We find that (1) FAC characteristics and dependence on controlling parameters do not map between scales in a straightforward manner, (2) relationships between FAC scales exhibit local time dependence, and (3) the dayside high-latitude region is characterized by remarkably distinct FAC behavior when analyzed at different scales, and the locations of distinction correspond to "anomalous" ionosphere-thermosphere behavior. Comparing with nominal large-scale FACs, we find that differences are characterized by a horseshoe shape, maximizing across dayside local times, and that difference magnitudes increase when smaller-scale observed FACs are considered. We suggest that both new physics and increased resolution of models are required to address the multiscale complexities. We include a summary table of our findings to provide a quick reference for differences between multiscale FACs.
NASA Astrophysics Data System (ADS)
Alberts, Samantha J.
The investigation of microgravity fluid dynamics emerged out of necessity with the advent of space exploration. In particular, capillary research took a leap forward in the 1960s with regard to liquid settling and interfacial dynamics. Due to inherent temperature variations in large spacecraft liquid systems, such as fuel tanks, forces develop on gas-liquid interfaces which induce thermocapillary flows. To date, thermocapillary flows have been studied in small, idealized research geometries, usually under terrestrial conditions. The 1 to 3 m lengths in current and future large tanks and hardware are designed based on hardware rather than research, which leaves spaceflight systems designers without the technological tools to effectively create safe and efficient designs. This thesis focused on the design and feasibility of a large length-scale thermocapillary flow experiment, which utilizes temperature variations to drive a flow. The design of a helical channel geometry ranging from 1 to 2.5 m in length permits a large length-scale thermocapillary flow experiment to fit in a seemingly small International Space Station (ISS) facility such as the Fluids Integrated Rack (FIR). An initial investigation determined that the proposed experiment produces measurable data while adhering to the FIR facility limitations. The computational portion of this thesis focused on the investigation of functional geometries of fuel tanks and depots using Surface Evolver. This work outlines the design of a large length-scale thermocapillary flow experiment for the ISS FIR. The results from this work improve the understanding of thermocapillary flows and thus the technological tools for predicting heat and mass transfer in large length-scale thermocapillary flows. Without the tools to understand thermocapillary flows in these systems, engineers are forced to design larger, heavier vehicles to assure safety and mission success.
Eigenvalue Solvers for Modeling Nuclear Reactors on Leadership Class Machines
Slaybaugh, R. N.; Ramirez-Zweiger, M.; Pandya, Tara; ...
2018-02-20
In this paper, three complementary methods have been implemented in the code Denovo that accelerate neutral particle transport calculations with methods that use leadership-class computers fully and effectively: a multigroup block (MG) Krylov solver, a Rayleigh quotient iteration (RQI) eigenvalue solver, and a multigrid in energy (MGE) preconditioner. The MG Krylov solver converges more quickly than Gauss Seidel and enables energy decomposition such that Denovo can scale to hundreds of thousands of cores. RQI should converge in fewer iterations than power iteration (PI) for large and challenging problems. RQI creates shifted systems that would not be tractable without the MG Krylov solver. It also creates ill-conditioned matrices. The MGE preconditioner reduces iteration count significantly when used with RQI and takes advantage of the new energy decomposition such that it can scale efficiently. Each individual method has been described before, but this is the first time they have been demonstrated to work together effectively. The combination of solvers enables the RQI eigenvalue solver to work better than the other available solvers for large reactor problems on leadership-class machines. Using these methods together, RQI converged in fewer iterations and in less time than PI for a full pressurized water reactor core. These solvers also performed better than an Arnoldi eigenvalue solver for a reactor benchmark problem when energy decomposition is needed. The MG Krylov, MGE preconditioner, and RQI solver combination also scales well in energy. Finally, this solver set is a strong choice for very large and challenging problems.
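Textbook Rayleigh quotient iteration can be sketched in a few lines, with a dense numpy solve standing in for the MG Krylov solver that makes the shifted systems tractable in Denovo; this illustrates only the algorithm, not the production implementation.

```python
# Textbook Rayleigh quotient iteration on a small dense symmetric matrix.
import numpy as np

def rqi(A, v0, tol=1e-10, max_iter=50):
    v = v0 / np.linalg.norm(v0)
    for _ in range(max_iter):
        rho = v @ A @ v                       # Rayleigh quotient (shift)
        try:
            w = np.linalg.solve(A - rho * np.eye(len(v)), v)
        except np.linalg.LinAlgError:
            return rho, v                     # shift hit an exact eigenvalue
        v = w / np.linalg.norm(w)
        if np.linalg.norm(A @ v - (v @ A @ v) * v) < tol:
            break
    return v @ A @ v, v

rng = np.random.default_rng(0)
M = rng.random((6, 6))
A = (M + M.T) / 2                             # symmetric test matrix
value, vector = rqi(A, rng.random(6))
print(value, np.linalg.eigvalsh(A))           # value appears in the spectrum
```

Note that RQI converges to whichever eigenpair lies nearest the initial shift, which is why robust shifted solves (and good preconditioning) matter for the reactor problems described above.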
(abstract) Scaling Nominal Solar Cell Impedances for Array Design
NASA Technical Reports Server (NTRS)
Mueller, Robert L; Wallace, Matthew T.; Iles, Peter
1994-01-01
This paper discusses a task whose objective is to characterize solar cell array AC impedance and develop scaling rules for the impedance characterization of large arrays by testing single solar cells and small arrays. This effort is aimed at formulating a methodology for estimating the AC impedance of the Mars Pathfinder (MPF) cruise and lander solar arrays based upon testing single cells and small solar cell arrays, and at creating a basis for the design of a single shunt limiter for MPF power control of flight solar arrays having very different impedances.
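As a first-order illustration of such a scaling rule, series impedances add along a string and parallel strings divide the total. This simple series-parallel assumption is shown below for illustration only; it is not the MPF-specific methodology developed from the measurements.

```python
# First-order series-parallel impedance scaling: an assumption for
# illustration, not the paper's measurement-derived scaling rules.
def array_impedance(z_cell: complex, n_series: int, n_parallel: int) -> complex:
    """Estimate array impedance from one cell's impedance at one frequency."""
    return z_cell * n_series / n_parallel

z_cell = 0.8 + 0.3j   # ohms, hypothetical cell impedance at some frequency
print(array_impedance(z_cell, n_series=18, n_parallel=4))  # (3.6+1.35j) ohms
```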
Lampa, Samuel; Alvarsson, Jonathan; Spjuth, Ola
2016-01-01
Predictive modelling in drug discovery is challenging to automate as it often contains multiple analysis steps and might involve cross-validation and parameter tuning that create complex dependencies between tasks. With large-scale data or when using computationally demanding modelling methods, e-infrastructures such as high-performance or cloud computing are required, adding to the existing challenges of fault-tolerant automation. Workflow management systems can aid in many of these challenges, but the currently available systems are lacking in the functionality needed to enable agile and flexible predictive modelling. We here present an approach inspired by elements of the flow-based programming paradigm, implemented as an extension of the Luigi system, which we name SciLuigi. We also discuss the experiences from using the approach when modelling a large set of biochemical interactions using a shared computer cluster.
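The task-dependency style that SciLuigi builds on can be shown with a minimal plain-Luigi pipeline; SciLuigi adds flow-based, named in/out ports on top of this standard API. The file names and task bodies below are illustrative assumptions, not from the paper.

```python
# Minimal plain-Luigi pipeline illustrating task dependencies.
import luigi

class PrepareData(luigi.Task):
    def output(self):
        return luigi.LocalTarget("data/prepared.csv")

    def run(self):
        with self.output().open("w") as f:
            f.write("feature,label\n1.0,0\n2.0,1\n")

class TrainModel(luigi.Task):
    def requires(self):
        return PrepareData()          # dependency: PrepareData runs first

    def output(self):
        return luigi.LocalTarget("data/model.txt")

    def run(self):
        with self.input().open() as f:
            n_rows = sum(1 for _ in f) - 1
        with self.output().open("w") as f:
            f.write(f"trained on {n_rows} rows\n")

if __name__ == "__main__":
    luigi.build([TrainModel()], local_scheduler=True)
```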
Skyscape Archaeology: an emerging interdiscipline for archaeoastronomers and archaeologists
NASA Astrophysics Data System (ADS)
Henty, Liz
2016-02-01
For historical reasons archaeoastronomy and archaeology differ in their approach to prehistoric monuments and this has created a divide between the disciplines which adopt seemingly incompatible methodologies. The reasons behind the impasse will be explored to show how these different approaches gave rise to their respective methods. Archaeology investigations tend to concentrate on single site analysis whereas archaeoastronomical surveys tend to be data driven from the examination of a large number of similar sets. A comparison will be made between traditional archaeoastronomical data gathering and an emerging methodology which looks at sites on a small scale and combines archaeology and astronomy. Silva's recent research in Portugal and this author's survey in Scotland have explored this methodology and termed it skyscape archaeology. This paper argues that this type of phenomenological skyscape archaeology offers an alternative to large scale statistical studies which analyse astronomical data obtained from a large number of superficially similar archaeological sites.
Electromagnetic Waves and Bursty Electron Acceleration: Implications from Freja
NASA Technical Reports Server (NTRS)
Andersson, Laila; Ivchenko, N.; Wahlund, J.-E.; Clemmons, J.; Gustavsson, B.; Eliasson, L.
2000-01-01
Dispersive Alfven wave activity is identified in four dayside auroral oval events measured by the Freja satellite. The events are characterized by ion injection, bursty electron precipitation below about 1 keV, transverse ion heating and broadband extremely low frequency (ELF) emissions below the lower hybrid cutoff frequency (a few kHz). The broadband emissions are observed to become more electrostatic towards higher frequencies. Large-scale density depletions/cavities, as determined by the Langmuir probe measurements, and strong electrostatic emissions are often observed simultaneously. A correlation study has been carried out between the E- and B-field fluctuations below 64 Hz (the dc instrument's upper threshold) and the characteristics of the precipitating electrons. This study revealed that the energization of electrons is indeed related to the broadband ELF emissions and that the electrostatic component plays a predominant role during very active magnetospheric conditions. Furthermore, the effect of the ELF electromagnetic emissions on the larger scale field-aligned current systems has been investigated, and it is found that such an effect cannot be detected. Instead, the Alfvenic activity creates a local region of field-aligned currents. It is suggested that dispersive Alfven waves set up these local field-aligned current regions and in turn trigger more electrostatic emissions during certain conditions. In these regions ions are transversely heated, and large-scale density depletions/cavities may be created during especially active periods.
Non-linear scaling of a musculoskeletal model of the lower limb using statistical shape models.
Nolte, Daniel; Tsang, Chui Kit; Zhang, Kai Yu; Ding, Ziyun; Kedgley, Angela E; Bull, Anthony M J
2016-10-03
Accurate muscle geometry for musculoskeletal models is important to enable accurate subject-specific simulations. Commonly, linear scaling is used to obtain individualised muscle geometry. More advanced methods include non-linear scaling using segmented bone surfaces and manual or semi-automatic digitisation of muscle paths from medical images. In this study, a new scaling method combining non-linear scaling with reconstructions of bone surfaces using statistical shape modelling is presented. Statistical Shape Models (SSMs) of the femur and tibia/fibula were used to reconstruct bone surfaces of nine subjects. Reference models were created by morphing manually digitised muscle paths to mean shapes of the SSMs using non-linear transformations and inter-subject variability was calculated. Subject-specific models of muscle attachment and via points were created from three reference models. The accuracy was evaluated by calculating the differences between the scaled and manually digitised models. The points defining the muscle paths showed large inter-subject variability at the thigh and shank, up to 26 mm; this was found to limit the accuracy of all studied scaling methods. Errors for the subject-specific muscle point reconstructions of the thigh could be decreased by 9% to 20% by using the non-linear scaling compared to a typical linear scaling method. We conclude that the proposed non-linear scaling method is more accurate than linear scaling methods. Thus, when combined with the ability to reconstruct bone surfaces from incomplete or scattered geometry data using statistical shape models, our proposed method is an alternative to linear scaling methods. Copyright © 2016 The Author. Published by Elsevier Ltd. All rights reserved.
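The non-linear scaling idea can be sketched with scipy's RBFInterpolator: landmarks on a template bone and on a subject-specific (e.g., SSM-reconstructed) bone define a smooth warp, which is then applied to template muscle points. The landmark coordinates below are invented, and the exact transformation used in the study may differ.

```python
# Landmark-driven non-linear warp of muscle points, sketched with a
# radial basis function interpolator. All coordinates are synthetic.
import numpy as np
from scipy.interpolate import RBFInterpolator

rng = np.random.default_rng(3)
template_landmarks = rng.uniform(-50, 50, size=(40, 3))   # mm, template bone
# Pretend the subject bone is a scaled, shifted, slightly bent template.
subject_landmarks = template_landmarks * 1.08 + np.array([2.0, -3.0, 5.0])
subject_landmarks[:, 1] += 0.002 * template_landmarks[:, 2] ** 2

# Fit a smooth 3D -> 3D mapping from template space to subject space.
warp = RBFInterpolator(template_landmarks, subject_landmarks,
                       kernel="thin_plate_spline")

template_muscle_points = rng.uniform(-40, 40, size=(10, 3))
subject_muscle_points = warp(template_muscle_points)
print(subject_muscle_points.round(1))
```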
Pynamic: the Python Dynamic Benchmark
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, G L; Ahn, D H; de Supinksi, B R
2007-07-10
Python is widely used in scientific computing to facilitate application development and to support features such as computational steering. Making full use of some of Python's popular features, which improve programmer productivity, leads to applications that access extremely high numbers of dynamically linked libraries (DLLs). As a result, some important Python-based applications severely stress a system's dynamic linking and loading capabilities and also cause significant difficulties for most development environment tools, such as debuggers. Furthermore, using the Python paradigm for large scale MPI-based applications can create significant file IO and further stress tools and operating systems. In this paper, we present Pynamic, the first benchmark program to support configurable emulation of a wide range of the DLL usage of Python-based applications for large scale systems. Pynamic has already accurately reproduced system software and tool issues encountered by important large Python-based scientific applications on our supercomputers. Pynamic provided insight for our system software and tool vendors, and our application developers, into the impact of several design decisions. As we describe the Pynamic benchmark, we will highlight some of the issues discovered in our large scale system software and tools using Pynamic.
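The stress pattern Pynamic targets can be mimicked, very loosely, by generating and importing many small modules. Real Pynamic builds genuine shared libraries, so the pure-Python sketch below is only a self-contained stand-in for the idea.

```python
# Toy emulation of heavy dynamic-import load: generate many modules,
# then import them all and time it. Not the actual Pynamic benchmark.
import importlib
import pathlib
import sys
import time

def make_modules(directory: str, count: int) -> None:
    path = pathlib.Path(directory)
    path.mkdir(exist_ok=True)
    for i in range(count):
        (path / f"genmod_{i}.py").write_text(f"VALUE = {i}\n")

make_modules("genmods", 200)
sys.path.insert(0, "genmods")

start = time.perf_counter()
modules = [importlib.import_module(f"genmod_{i}") for i in range(200)]
elapsed = time.perf_counter() - start
print(f"imported {len(modules)} modules in {elapsed:.3f}s")
```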
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cree, Johnathan Vee; Delgado-Frias, Jose
Large scale wireless sensor networks have been proposed for applications ranging from anomaly detection in an environment to vehicle tracking. Many of these applications require the networks to be distributed across a large geographic area while supporting three to five year network lifetimes. In order to support these requirements, large scale wireless sensor networks of duty-cycled devices need a method of efficient and effective autonomous configuration/maintenance. This method should gracefully handle the synchronization tasks of duty-cycled networks. Further, an effective configuration solution needs to recognize that in-network data aggregation and analysis presents significant benefits to wireless sensor networks and should configure the network in a way such that said higher level functions benefit from the logically imposed structure. NOA, the proposed configuration and maintenance protocol, provides a multi-parent hierarchical logical structure for the network that reduces the synchronization workload. It also provides higher level functions with significant inherent benefits, such as but not limited to: removing network divisions that are created by single-parent hierarchies, guarantees for when data will be compared in the hierarchy, and redundancies for communication as well as in-network data aggregation/analysis/storage.
NASA Astrophysics Data System (ADS)
Beichner, Robert
2015-03-01
The Student Centered Active Learning Environment with Upside-down Pedagogies (SCALE-UP) project was developed nearly 20 years ago as an economical way to provide collaborative, interactive instruction even for large enrollment classes. Nearly all research-based pedagogies have been designed with fairly high faculty-student ratios. The economics of introductory courses at large universities often precludes that situation, so SCALE-UP was created as a way to facilitate highly collaborative active learning with large numbers of students served by only a few faculty and assistants. It enables those students to learn and succeed not only in acquiring content, but also to practice important 21st century skills like problem solving, communication, and teamsmanship. The approach was initially targeted at undergraduate science and engineering students taking introductory physics courses in large enrollment sections. It has since expanded to multiple content areas, including chemistry, math, engineering, biology, business, nursing, and even the humanities. Class sizes range from 24 to over 600. Data collected from multiple sites around the world indicates highly successful implementation at more than 250 institutions. NSF support was critical for initial development and dissemination efforts. Generously supported by NSF (9752313, 9981107) and FIPSE (P116B971905, P116B000659).
Robopedia: Leveraging Sensorpedia for Web-Enabled Robot Control
DOE Office of Scientific and Technical Information (OSTI.GOV)
Resseguie, David R
There is a growing interest in building Internet-scale sensor networks that integrate sensors from around the world into a single unified system. In contrast, robotics application development has primarily focused on building specialized systems. These specialized systems take scalability and reliability into consideration, but generally neglect exploring the key components required to build a large scale system. Integrating robotic applications with Internet-scale sensor networks will unify specialized robotics applications and provide answers to large scale implementation concerns. We focus on utilizing Internet-scale sensor network technology to construct a framework for unifying robotic systems. Our framework web-enables a surveillance robot's sensor observations and provides a web interface to the robot's actuators. This lets robots seamlessly integrate into web applications. In addition, the framework eliminates most prerequisite robotics knowledge, allowing for the creation of general web-based robotics applications. The framework also provides mechanisms to create applications that can interface with any robot. Frameworks such as this one are key to solving large scale mobile robotics implementation problems. We provide an overview of previous Internet-scale sensor networks, Sensorpedia (an ad-hoc Internet-scale sensor network), our framework for integrating robots with Sensorpedia, two applications which illustrate our framework's ability to support general web-based robotic control, and offer experimental results that illustrate our framework's scalability, feasibility, and resource requirements.
Matsushima, Kyoji; Nakahara, Sumio
2009-12-01
A large-scale full-parallax computer-generated hologram (CGH) with four billion (2^16 x 2^16) pixels is created to reconstruct a fine true 3D image of a scene with occlusions. The polygon-based method numerically generates the object field of a surface object, whose shape is provided by a set of vertex data of polygonal facets, while the silhouette method makes it possible to reconstruct the occluded scene. A novel technique using a segmented frame buffer is presented for handling and propagating large wave fields even in the case where the whole wave field cannot be stored in memory. We demonstrate that the full-parallax CGH, calculated by the proposed method and fabricated by a laser lithography system, reconstructs a fine 3D image accompanied by a strong sensation of depth.
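The numerical core of propagating such wave fields is the angular spectrum method, sketched below with NumPy. The grid size, wavelength, and pixel pitch are arbitrary choices, and the paper's segmented-frame-buffer bookkeeping for fields too large for memory is omitted.

```python
# Band-limited angular-spectrum propagation of a sampled wave field,
# the basic numerical step behind CGH synthesis. Parameters are illustrative.
import numpy as np

def angular_spectrum(field, wavelength, pitch, distance):
    n = field.shape[0]
    fx = np.fft.fftfreq(n, d=pitch)            # spatial frequencies (cycles/m)
    FX, FY = np.meshgrid(fx, fx)
    arg = 1.0 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2
    kz = 2j * np.pi / wavelength * np.sqrt(np.maximum(arg, 0.0))
    H = np.exp(kz * distance)                  # free-space transfer function
    H[arg < 0] = 0.0                           # drop evanescent components
    return np.fft.ifft2(np.fft.fft2(field) * H)

aperture = np.zeros((512, 512), dtype=complex)
aperture[240:272, 240:272] = 1.0               # square source patch
u = angular_spectrum(aperture, 633e-9, 8e-6, 0.05)
print(np.abs(u).max())
```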
Monitoring Million Trees LA: Tree performance during the early years and future benefits
E. Gregory McPherson
2014-01-01
Million Trees LA (MTLA) is one of several large-scale mayoral tree planting initiatives striving to create more livable cities through urban forestry. This study combined field sampling of tree survival and growth with numerical modeling of future benefits to assess performance of MTLA plantings. From 2006 to 2010 MTLA planted a diverse mix of 91,786 trees....
Growth responses of mature loblolly pine to dead wood manipulations
Michael D. Ulyshen; Scott Horn; James L. Hanula
2012-01-01
Large-scale manipulations of dead wood in mature Pinus taeda L. stands in the southeastern United States included a major one-time input of logs (fivefold increase in log volume) created by felling trees onsite, annual removals of all dead wood ≥10 cm in diameter and ≥60 cm in length, and a reference in which no...
Multiple Shells Around Wolf-Rayet Stars: Space Based Astrometric Observing
NASA Technical Reports Server (NTRS)
Marston, Anthony P.
1995-01-01
The completion of a complementary optical emission-line survey of the nebulae associated with Wolf-Rayet stars in the southern sky is reported, along with the completion of a survey of the large-scale environments of Wolf-Rayet stars using IRAS Skyflux data. HIRES IRAS maps in the four IRAS wavebands for approximately half of all galactic Wolf-Rayet stars are created.
Roy Mann
1979-01-01
Drilling rigs, confined dredged-material disposal sites, power and sewage treatment facilities, and other built objects on or near shorelines have often created appreciable impacts on the aesthetic perceptions of residents and recreational users. Techniques for assessing such impacts that are reviewed in this paper include viewscape analysis for large-scale shore...
Daniel Kashian; Gregory Corace; Lindsey Shartell; Deahn M. Donner; Philip Huber
2011-01-01
Stand-replacing wildfires have historically shaped the forest structure of dry, sandy jack pine-dominated ecosystems at stand and landscape scales in northern Lower Michigan. Unique fire behavior during large wildfire events often preserves long strips of unburned trees arranged perpendicular to the direction of fire spread. These biological legacies create...
ERIC Educational Resources Information Center
Behizadeh, Nadia; Engelhard, George, Jr.
2015-01-01
In his focus article, Koretz (this issue) argues that accountability has become the primary function of large-scale testing in the United States. He then points out that tests being used for accountability purposes are flawed and that the high-stakes nature of these tests creates a context that encourages score inflation. Koretz is concerned about…
ERIC Educational Resources Information Center
Unlu, Ali; Schurig, Michael
2015-01-01
Recently, performance profiles in reading, mathematics and science were created using the data collectively available in the Trends in International Mathematics and Science Study (TIMSS) and the Progress in International Reading Literacy Study (PIRLS) 2011. In addition, a classification of children to the end of their primary school years was…
ITC Guidelines on Quality Control in Scoring, Test Analysis, and Reporting of Test Scores
ERIC Educational Resources Information Center
Allalouf, Avi
2014-01-01
The Quality Control (QC) Guidelines are intended to increase the efficiency, precision, and accuracy of the scoring, analysis, and reporting process of testing. The QC Guidelines focus on large-scale testing operations where multiple forms of tests are created for use on set dates. However, they may also be used for a wide variety of other testing…
Children as Artists: The Preschool as a Community of Creative Practice
ERIC Educational Resources Information Center
Cutcher, Alexandra; Boyd, Wendy
2016-01-01
Picasso once famously said "All children are artists. The problem is how to remain an artist once he grows up." This visual inquiry is engaged through a community of creative practice in two rural children's centers where the researchers along with 4- and 5-year-old children collaborated to create a large-scale canvas and several smaller…
ArcFuels: an ArcMap toolbar for fuel treatment planning and wildfire risk assessment
Nicole M. Vaillant; Alan A. Ager
2014-01-01
Fire behavior modeling and geospatial analysis can provide tremendous insight to land managers in defining both the benefits and potential impacts of fuel treatments in the context of land management goals and public expectations. ArcFuels is a streamlined fuel management planning and wildfire risk assessment system that creates a trans-scale (stand to large landscape...
Studies of Global Solar Magnetic Field Patterns Using a Newly Digitized Archive
NASA Astrophysics Data System (ADS)
Hewins, I.; Webb, D. F.; Gibson, S. E.; McFadden, R.; Emery, B. A.; Malanushenko, A. V.
2017-12-01
The McIntosh Archive consists of a set of hand-drawn solar Carrington maps created by Patrick McIntosh from 1964 to 2009. McIntosh used mainly Hα, He 10830 Å and photospheric magnetic measurements from both ground-based and NASA satellite observations. With these he traced polarity inversion lines (PILs), filaments, sunspots and plage and, later, coronal holes, yielding a unique 45-year record of features associated with the large-scale organization of the solar magnetic field. We discuss our efforts to preserve and digitize this archive; the original hand-drawn maps have been scanned, a method for processing these scans into a digital, searchable format has been developed, and a website and an archival repository at NOAA's National Centers for Environmental Information (NCEI) have been created. The archive is complete for SC 23 and partially complete for SCs 21 and 22. In this paper we show examples of how the database can be utilized for scientific applications. We compare the evolution of the areas and boundaries of CHs with other recent results, and we use the maps to track the global, solar-cycle evolution of filaments, large-scale positive and negative polarity regions, PILs and sunspots.
A Scalable Framework For Segmenting Magnetic Resonance Images
Hore, Prodip; Goldgof, Dmitry B.; Gu, Yuhua; Maudsley, Andrew A.; Darkazanli, Ammar
2009-01-01
A fast, accurate and fully automatic method of segmenting magnetic resonance images of the human brain is introduced. The approach scales well allowing fast segmentations of fine resolution images. The approach is based on modifications of the soft clustering algorithm, fuzzy c-means, that enable it to scale to large data sets. Two types of modifications to create incremental versions of fuzzy c-means are discussed. They are much faster when compared to fuzzy c-means for medium to extremely large data sets because they work on successive subsets of the data. They are comparable in quality to application of fuzzy c-means to all of the data. The clustering algorithms coupled with inhomogeneity correction and smoothing are used to create a framework for automatically segmenting magnetic resonance images of the human brain. The framework is applied to a set of normal human brain volumes acquired from different magnetic resonance scanners using different head coils, acquisition parameters and field strengths. Results are compared to those from two widely used magnetic resonance image segmentation programs, Statistical Parametric Mapping and the FMRIB Software Library (FSL). The results are comparable to FSL while providing significant speed-up and better scalability to larger volumes of data. PMID:20046893
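A minimal sketch of the chunked, single-pass idea: run a weighted fuzzy c-means on each successive subset, carrying the previous centroids forward as weighted summary points. The merge rule and the fuzzifier m = 2 follow the common weighted-FCM formulation and may differ in detail from the paper's two variants.

```python
# Sketch of single-pass, chunked fuzzy c-means (weighted-FCM style merge rule).
import numpy as np

def fcm(X, w, c, m=2.0, iters=50, seed=0):
    rng = np.random.default_rng(seed)
    V = X[rng.choice(len(X), c, replace=False)]          # initial centers
    for _ in range(iters):
        d = np.linalg.norm(X[:, None, :] - V[None], axis=2) + 1e-12
        # u_ik = d_ik^(-2/(m-1)) / sum_j d_jk^(-2/(m-1))
        U = 1.0 / (d ** (2 / (m - 1)) *
                   np.sum(d ** (-2 / (m - 1)), axis=1, keepdims=True))
        Um = (U ** m) * w[:, None]
        V = (Um.T @ X) / Um.sum(axis=0)[:, None]         # weighted center update
    return V, U

def single_pass_fcm(chunks, c):
    V, mass = None, None
    for X in chunks:
        w = np.ones(len(X))
        if V is not None:                                # carry summary points
            X = np.vstack([V, X])
            w = np.concatenate([mass, w])
        V, U = fcm(X, w, c)
        mass = ((U ** 2) * w[:, None]).sum(axis=0)       # per-cluster weight (m=2)
    return V

rng = np.random.default_rng(1)
data = np.vstack([rng.normal(0, 1, (3000, 2)), rng.normal(6, 1, (3000, 2))])
print(single_pass_fcm(np.array_split(data, 6), c=2))     # ~ (0,0) and (6,6)
```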
NASA Astrophysics Data System (ADS)
Bates, P. D.; Quinn, N.; Sampson, C. C.; Smith, A.; Wing, O.; Neal, J. C.
2017-12-01
Remotely sensed data have transformed the field of large-scale hydraulic modelling. New digital elevation, hydrography and river width data have allowed such models to be created for the first time, and remotely sensed observations of water height, slope and water extent have allowed them to be calibrated and tested. As a result, we are now able to conduct flood risk analyses at national, continental or even global scales. However, continental-scale analyses have significant additional complexity compared to typical flood risk modelling approaches. Traditional flood risk assessment uses frequency curves to define the magnitude of extreme flows at gauging stations. The flow values for given design events, such as the 1-in-100-year return period flow, are then used to drive hydraulic models in order to produce maps of flood hazard. Such an approach works well for single gauge locations and local models because over relatively short river reaches (say 10-60 km) one can assume that the return period of an event does not vary. At regional to national scales and across multiple river catchments this assumption breaks down, and for a given flood event the return period will be different at different gauging stations, a pattern known as the event 'footprint'. Despite this, many national-scale risk analyses still use 'constant in space' return period hazard layers (e.g. the FEMA Special Flood Hazard Areas) in their calculations. Such an approach can estimate potential exposure, but will over-estimate risk and cannot determine likely flood losses over a whole region or country. We address this problem by using a stochastic model to simulate many realistic extreme event footprints based on observed gauged flows and the statistics of gauge-to-gauge correlations. We take the entire USGS gauge data catalogue for sites with > 45 years of record and use a conditional approach for multivariate extreme values to generate sets of flood events with realistic return period variation in space. We undertake a number of quality checks of the stochastic model and compare real and simulated footprints to show that the method is able to re-create realistic patterns even at continental scales where there is large variation in flood generating mechanisms. We then show how these patterns can be used to drive a large-scale 2D hydraulic model to predict regional-scale flooding.
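A toy version of the footprint-generation step is sketched below: sample correlated uniforms across gauges with a Gaussian copula, then read each off as a per-gauge return period or a GEV quantile. The correlation matrix and GEV parameters are invented placeholders for the quantities the authors estimate from the USGS records, and their conditional multivariate extremes model is replaced by the simpler copula for brevity.

```python
# Toy synthetic event "footprints": spatially correlated extremes across gauges.
# Correlation matrix and GEV margins are made-up placeholders.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
R = np.array([[1.0, 0.7, 0.4],          # assumed gauge-to-gauge correlation
              [0.7, 1.0, 0.5],
              [0.4, 0.5, 1.0]])
L = np.linalg.cholesky(R)
z = (L @ rng.standard_normal((3, 10000))).T
u = stats.norm.cdf(z)                    # Gaussian copula: correlated uniforms
return_period = 1.0 / (1.0 - u)          # per-gauge return period of each event
peak_flow = stats.genextreme.ppf(u, c=-0.1, loc=500, scale=150)  # toy GEV margins
print(return_period[0], peak_flow[0])    # one synthetic footprint
```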
geoknife: Reproducible web-processing of large gridded datasets
Read, Jordan S.; Walker, Jordan I.; Appling, Alison P.; Blodgett, David L.; Read, Emily K.; Winslow, Luke A.
2016-01-01
Geoprocessing of large gridded data according to overlap with irregular landscape features is common to many large-scale ecological analyses. The geoknife R package was created to facilitate reproducible analyses of gridded datasets found on the U.S. Geological Survey Geo Data Portal web application or elsewhere, using a web-enabled workflow that eliminates the need to download and store large datasets that are reliably hosted on the Internet. The package provides access to several data subset and summarization algorithms that are available on remote web processing servers. Outputs from geoknife include spatial and temporal data subsets, spatially-averaged time series values filtered by user-specified areas of interest, and categorical coverage fractions for various land-use types.
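The core operation geoknife delegates to remote servers is easy to picture: summarize a gridded variable over an irregular feature. The NumPy sketch below does the in-memory equivalent, with a boolean mask standing in for polygon overlap weights; geoknife itself is an R package, so this is the concept only, not its API.

```python
# Concept sketch: areal-average a gridded variable over a feature footprint.
# Synthetic data; a boolean mask stands in for polygon overlap weights.
import numpy as np

grid = np.random.default_rng(2).random((12, 40, 50))   # time x lat x lon
mask = np.zeros((40, 50), dtype=bool)
mask[10:25, 5:30] = True                               # "watershed" footprint

areal_mean = grid[:, mask].mean(axis=1)                # spatially averaged series
print(areal_mean.shape)                                # (12,) one value per step
```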
Pen, Ue-Li; Turok, Neil
2016-09-23
We point out a surprising consequence of the usually assumed initial conditions for cosmological perturbations. Namely, a spectrum of Gaussian, linear, adiabatic, scalar, growing mode perturbations not only creates acoustic oscillations of the kind observed on very large scales today, it also leads to the production of shocks in the radiation fluid of the very early Universe. Shocks cause departures from local thermal equilibrium as well as create vorticity and gravitational waves. For a scale-invariant spectrum and standard model physics, shocks form for temperatures 1 GeV
Satellite radar altimetry over ice. Volume 4: Users' guide for Antarctica elevation data from Seasat
NASA Technical Reports Server (NTRS)
Zwally, H. Jay; Major, Judith A.; Brenner, Anita C.; Bindschadler, Robert A.; Martin, Thomas V.
1990-01-01
A gridded surface-elevation data set and a geo-referenced data base for the Seasat radar altimeter data over Greenland are described. This is a user guide to accompany the data provided to data centers and other users. The grid points are on a polar stereographic projection with a nominal spacing of 20 km. The gridded elevations are derived from the elevation data in the geo-referenced data base by a weighted fitting of a surface in the neighborhood of each grid point. The gridded elevations are useful for creating large-scale contour maps, and the geo-referenced data base is useful for regridding, creating smaller-scale contour maps, and examining individual elevation measurements in specific geographic areas. Tape formats are described, and a FORTRAN program for reading the data tape is listed and provided on the tape.
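One plausible reading of "a weighted fitting of a surface in the neighborhood of each grid point" is an inverse-distance-weighted plane fit to nearby measurements, sketched below. The 20 km search radius echoes the nominal grid spacing, but the weighting power and cutoff are guesses rather than the report's actual parameters.

```python
# Sketch: inverse-distance-weighted local plane fit at one grid point.
# Radius and weighting power are assumptions, not the report's parameters.
import numpy as np

def grid_elevation(px, py, pz, gx, gy, radius=20e3, power=2):
    d = np.hypot(px - gx, py - gy)
    near = d < radius
    if near.sum() < 3:
        return np.nan                              # not enough data to fit
    w = 1.0 / (d[near] ** power + 1.0)
    A = np.column_stack([px[near] - gx, py[near] - gy, np.ones(near.sum())])
    Aw = A * w[:, None]                            # weighted least squares
    coef, *_ = np.linalg.lstsq(Aw, pz[near] * w, rcond=None)
    return coef[2]                                 # plane value at the grid point

rng = np.random.default_rng(3)
x, y = rng.uniform(0, 100e3, 500), rng.uniform(0, 100e3, 500)
z = 1000 + 0.01 * x + rng.normal(0, 5, 500)        # synthetic sloping surface
print(grid_elevation(x, y, z, 50e3, 50e3))         # ~ 1500 m
```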
The complexity and robustness of metro networks
NASA Astrophysics Data System (ADS)
Derrible, Sybil; Kennedy, Christopher
2010-09-01
Transportation systems, being real-life examples of networks, are particularly interesting to analyze from the viewpoint of the new and rapidly emerging field of network science. Two concepts seem particularly relevant: scale-free patterns and small-worlds. By looking at 33 metro systems in the world, this paper adapts network science methodologies to the transportation literature, and offers one application to the robustness of metros; here, metro refers to urban rail transit with exclusive right-of-way, whether it is underground, at grade or elevated. We find that most metros are indeed scale-free (with scaling factors ranging from 2.10 to 5.52) and small-worlds; they show atypical behaviors, however, with increasing size. In particular, the presence of transfer hubs (stations hosting more than three lines) results in relatively large scaling factors. The analysis provides insights/recommendations for increasing the robustness of metro networks. Smaller networks should focus on creating transfer stations, thus generating cycles to offer alternative routes. In larger networks, a few stations tend to hold a near-monopoly on transferring; it is therefore important to create additional transfers, possibly at the periphery of city centers. The Tokyo system seems to incorporate these properties remarkably well.
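Scaling factors like those quoted above can be estimated by fitting the degree distribution on log-log axes. The sketch below does this for a synthetic graph with networkx, together with the path-length and clustering numbers used to diagnose small-world behavior; the paper's exact estimator may differ.

```python
# Sketch: estimate a power-law scaling factor from a graph's degree CCDF.
# A random Barabasi-Albert graph stands in for a real metro topology.
import numpy as np
import networkx as nx

G = nx.barabasi_albert_graph(200, 2, seed=4)       # stand-in network
deg = np.array(sorted((d for _, d in G.degree()), reverse=True))
ccdf = np.arange(1, len(deg) + 1) / len(deg)       # empirical P(D >= d)

slope, intercept = np.polyfit(np.log(deg), np.log(ccdf), 1)
gamma = 1.0 - slope                                # CCDF slope equals 1 - gamma
print(f"estimated scaling factor: {gamma:.2f}")
# small-world diagnostics: short paths plus clustering
print(nx.average_shortest_path_length(G), nx.average_clustering(G))
```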
NASA Astrophysics Data System (ADS)
Fort, Monique
2016-04-01
Hillslope geomorphology results from a large range of denudational processes mainly controlled by relief, structure, lithology, climate, land-cover and land use. In most areas of the world, the "critical zone" concept is a good integrator of the denudation that operates on a long-term scale. However, in large and high mountain areas, short-timescale factors often play a significant role in the denudational pattern, accelerating and/or delaying the transfer of denudation products and fluxes, and creating specific, spatially limited disturbances. We focus on the Nepal Himalayas, where the wide altitudinal range of bio-climatic zones and the intense geodynamic activity create a complex mosaic of landforms, as expressed by the present geomorphology of mountain slopes. On the basis of examples selected in the different Himalayan mountain belts (Siwalik hills, middle mountains, High Himalaya), we illustrate different types of slopes and disturbances induced by active tectonics, climate extremes, and climate warming trends. Special attention is paid to recent events, such as landslide damming, triggered by either intense rainfall (Kali Gandaki and Sun Kosi valleys) or the April-May 2015 Gorkha seismic sequence (southern Khumbu). Lastly, references to older, larger events show that despite the highly dynamic environment, landforms caused by large-magnitude disturbances may persist in the landscape over the long term.
X-ray techniques for innovation in industry
Lawniczak-Jablonska, Krystyna; Cutler, Jeffrey
2014-01-01
The smart specialization declared in the European program Horizon 2020, and the increasing cooperation between research and development groups in companies and researchers at universities and research institutions, have created a new paradigm where many calls for proposals require participation and funding from public and private entities. This has created a unique opportunity for large-scale facilities, such as synchrotron research laboratories, to participate in and support applied research programs. Scientific staff at synchrotron facilities have developed many advanced tools that make optimal use of the characteristics of the light generated by the storage ring. These tools have been exceptionally valuable for materials characterization, including X-ray absorption spectroscopy, diffraction, tomography and scattering, and have been key in solving many research and development issues. Progress in optics and detectors, as well as a large effort put into the improvement of data analysis codes, has resulted in the development of reliable and reproducible procedures for materials characterization. Research with photons has contributed to the development of a wide variety of products such as plastics, cosmetics, chemicals, building materials, packaging materials and pharmaceuticals. In this review, a few examples are highlighted of successful cooperation leading to solutions of a variety of industrial technological problems which have been exploited by industry, including lessons learned from the Science Link project, supported by the European Commission, as a new approach to increase the number of commercial users at large-scale research infrastructures. PMID:25485139
Development of a Two-Stage Microalgae Dewatering Process – A Life Cycle Assessment Approach
Soomro, Rizwan R.; Zeng, Xianhai; Lu, Yinghua; Lin, Lu; Danquah, Michael K.
2016-01-01
Even though microalgal biomass is leading third-generation biofuel research, significant effort is required to establish an economically viable commercial-scale microalgal biofuel production system. Whilst a significant amount of work has been reported on large-scale cultivation of microalgae using photo-bioreactors and pond systems, research focus on establishing high-performance downstream dewatering operations for large-scale processing under optimal economy is limited. The enormous amount of energy and associated cost required for dewatering large-volume microalgal cultures has been the primary hindrance to the development of the needed biomass quantity for industrial-scale microalgal biofuels production. The extremely dilute nature of large-volume microalgal suspension and the small size of microalgae cells in suspension create a significant processing cost during dewatering, and this has raised major concerns towards the economic success of commercial-scale microalgal biofuel production as an alternative to conventional petroleum fuels. This article reports an effective framework to assess the performance of different dewatering technologies as the basis to establish an effective two-stage dewatering system. Bioflocculation coupled with tangential flow filtration (TFF) emerged as a promising technique, with a total energy input of 0.041 kWh, 0.05 kg CO2 emissions and a cost of $0.0043 for producing 1 kg of microalgae biomass. A streamlined process for operational analysis of the two-stage microalgae dewatering technique, encompassing energy input, carbon dioxide emission, and process cost, is presented. PMID:26904075
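The assessment framework reduces, at its simplest, to tallying energy, emissions, and cost across the two stages, as in the sketch below. The per-stage splits are invented placeholders chosen only so that the totals match the figures quoted above.

```python
# Back-of-envelope two-stage tally. Per-stage figures are placeholders;
# only the totals (0.041 kWh, 0.05 kg CO2, $0.0043 per kg) come from the abstract.
def two_stage_totals(stages):
    keys = ("energy_kwh", "co2_kg", "cost_usd")
    return {k: sum(s[k] for s in stages) for k in keys}

bioflocculation = {"energy_kwh": 0.010, "co2_kg": 0.012, "cost_usd": 0.0010}
tff             = {"energy_kwh": 0.031, "co2_kg": 0.038, "cost_usd": 0.0033}
print(two_stage_totals([bioflocculation, tff]))   # per kg of biomass produced
```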
Computer-generated forces in distributed interactive simulation
NASA Astrophysics Data System (ADS)
Petty, Mikel D.
1995-04-01
Distributed Interactive Simulation (DIS) is an architecture for building large-scale simulation models from a set of independent simulator nodes communicating via a common network protocol. DIS is most often used to create a simulated battlefield for military training. Computer Generated Forces (CGF) systems control large numbers of autonomous battlefield entities in a DIS simulation using computer equipment and software rather than humans in simulators. CGF entities serve as both enemy forces and supplemental friendly forces in a DIS exercise. Research into various aspects of CGF systems is ongoing. Several CGF systems have been implemented.
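One mechanism that makes large DIS exercises feasible is dead reckoning: each node extrapolates remote entities between state updates rather than receiving every position. The sketch below shows the idea with a simplified state record; real DIS Entity State PDUs carry a far richer field layout that is not reproduced here.

```python
# Sketch of dead reckoning between entity-state updates.
# A simplified record stands in for a real DIS Entity State PDU.
import time
from dataclasses import dataclass

@dataclass
class EntityState:
    x: float
    y: float
    vx: float
    vy: float
    stamp: float          # time the last update was issued

def dead_reckon(state, now):
    """Extrapolate position linearly from the last received state."""
    dt = now - state.stamp
    return state.x + state.vx * dt, state.y + state.vy * dt

tank = EntityState(x=100.0, y=50.0, vx=4.0, vy=0.0, stamp=time.time())
time.sleep(0.1)
print(dead_reckon(tank, time.time()))   # extrapolated position between updates
```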
Effects of tidal current phase at the junction of two straits
Warner, J.; Schoellhamer, D.; Burau, J.; Schladow, G.
2002-01-01
Estuaries typically have a monotonic increase in salinity from freshwater at the head of the estuary to ocean water at the mouth, creating a consistent direction for the longitudinal baroclinic pressure gradient. However, Mare Island Strait in San Francisco Bay has a local salinity minimum created by the phasing of the currents at the junction of Mare Island and Carquinez Straits. The salinity minimum creates converging baroclinic pressure gradients in Mare Island Strait. Equipment was deployed at four stations in the straits for 6 months from September 1997 to March 1998 to measure tidal variability of velocity, conductivity, temperature, depth, and suspended sediment concentration. Analysis of the measured time series shows that on a tidal time scale in Mare Island Strait, the landward and seaward baroclinic pressure gradients in the local salinity minimum interact with the barotropic gradient, creating regions of enhanced shear in the water column during the flood and reduced shear during the ebb. On a tidally averaged time scale, baroclinic pressure gradients converge on the tidally averaged salinity minimum and drive a converging near-bed and diverging surface current circulation pattern, forming a "baroclinic convergence zone" in Mare Island Strait. Historically large sedimentation rates in this area are attributed to the convergence zone.
Chemical Warfare and Medical Response During World War I
Fitzgerald, Gerard J.
2008-01-01
The first large-scale use of a traditional weapon of mass destruction (chemical, biological, or nuclear) involved the successful deployment of chemical weapons during World War I (1914–1918). Historians now refer to the Great War as the chemist’s war because of the scientific and engineering mobilization efforts by the major belligerents. The development, production, and deployment of war gases such as chlorine, phosgene, and mustard created a new and complex public health threat that endangered not only soldiers and civilians on the battlefield but also chemical workers on the home front involved in the large-scale manufacturing processes. The story of chemical weapons research and development during that war provides useful insights for current public health practitioners faced with a possible chemical weapons attack against civilian or military populations. PMID:18356568
Chemical warfare and medical response during World War I.
Fitzgerald, Gerard J
2008-04-01
The first large-scale use of a traditional weapon of mass destruction (chemical, biological, or nuclear) involved the successful deployment of chemical weapons during World War I (1914-1918). Historians now refer to the Great War as the chemist's war because of the scientific and engineering mobilization efforts by the major belligerents. The development, production, and deployment of war gases such as chlorine, phosgene, and mustard created a new and complex public health threat that endangered not only soldiers and civilians on the battlefield but also chemical workers on the home front involved in the large-scale manufacturing processes. The story of chemical weapons research and development during that war provides useful insights for current public health practitioners faced with a possible chemical weapons attack against civilian or military populations.
Smith, Andrew B; Lloyd, Graeme T; McGowan, Alistair J
2012-11-07
Sampling bias created by a heterogeneous rock record can seriously distort estimates of marine diversity and makes a direct reading of the fossil record unreliable. Here we compare two independent estimates of Phanerozoic marine diversity that explicitly take account of variation in sampling-a subsampling approach that standardizes for differences in fossil collection intensity, and a rock area modelling approach that takes account of differences in rock availability. Using the fossil records of North America and Western Europe, we demonstrate that a modelling approach applied to the combined data produces results that are significantly correlated with those derived from subsampling. This concordance between independent approaches argues strongly for the reality of the large-scale trends in diversity we identify from both approaches.
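The subsampling side of the comparison can be pictured with classical-rarefaction-style resampling: draw a fixed quota of occurrences per interval so that richness counts are standardized for collection intensity. The sketch below uses invented occurrence lists; the authors' actual standardization protocol may differ in detail.

```python
# Sketch: rarefaction-style subsampling to standardize richness for sampling effort.
import numpy as np

def rarefied_richness(occurrences, quota, trials=500, seed=5):
    rng = np.random.default_rng(seed)
    if len(occurrences) < quota:
        return np.nan                      # interval too poorly sampled
    draws = [len(set(rng.choice(occurrences, quota, replace=False)))
             for _ in range(trials)]
    return float(np.mean(draws))           # mean taxa per standardized draw

interval_a = ["sp%d" % i for i in range(40) for _ in range(3)]   # 40 taxa, even
interval_b = ["sp0"] * 80 + ["sp%d" % i for i in range(1, 41)]   # 41 taxa, skewed
print(rarefied_richness(interval_a, 60), rarefied_richness(interval_b, 60))
```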
NASA Astrophysics Data System (ADS)
Menezes, Shannon John
Nanoimprint Lithography (NIL) has existed since the mid-1990s as a proven concept for creating micro- and nanostructures using direct mechanical pattern transfer. Initially seen as a viable option to replace conventional lithography methods, the lack of technology to support large-scale manufacturing using NIL has motivated researchers to explore applications of NIL that create a better, more cost-efficient process with the ability to integrate NIL into a mass-manufacturing system. One such method is the roll-to-roll process, similar to that used in printing presses for newspapers and plastics. This thesis investigates the characterization of polymer deposition using a piezoelectric jetting head and attempts to create micro- and nanostructures on the polymer using the roll-to-roll NIL (R2RNIL) technique.
Computational biology in the cloud: methods and new insights from computing at scale.
Kasson, Peter M
2013-01-01
The past few years have seen both explosions in the size of biological data sets and the proliferation of new, highly flexible on-demand computing capabilities. The sheer amount of information available from genomic and metagenomic sequencing, high-throughput proteomics, and experimental and simulation datasets on molecular structure and dynamics affords an opportunity for greatly expanded insight, but it creates new challenges of scale for the computation, storage, and interpretation of petascale data. Cloud computing resources have the potential to help solve these problems by offering a utility model of computing and storage: near-unlimited capacity, the ability to burst usage, and cheap and flexible payment models. Effective use of cloud computing on large biological datasets requires dealing with non-trivial problems of scale and robustness, since performance-limiting factors can change substantially when a dataset grows by a factor of 10,000 or more. New computing paradigms are thus often needed. The use of cloud platforms also creates new opportunities to share data, reduce duplication, and provide easy reproducibility by making the datasets and computational methods easily available.
Crowdsourcing biomedical research: leveraging communities as innovation engines
Saez-Rodriguez, Julio; Costello, James C.; Friend, Stephen H.; Kellen, Michael R.; Mangravite, Lara; Meyer, Pablo; Norman, Thea; Stolovitzky, Gustavo
2018-01-01
The generation of large-scale biomedical data is creating unprecedented opportunities for basic and translational science. Typically, the data producers perform initial analyses, but it is very likely that the most informative methods may reside with other groups. Crowdsourcing the analysis of complex and massive data has emerged as a framework to find robust methodologies. When the crowdsourcing is done in the form of collaborative scientific competitions, known as Challenges, the validation of the methods is inherently addressed. Challenges also encourage open innovation, create collaborative communities to solve diverse and important biomedical problems, and foster the creation and dissemination of well-curated data repositories. PMID:27418159
Crowdsourcing biomedical research: leveraging communities as innovation engines.
Saez-Rodriguez, Julio; Costello, James C; Friend, Stephen H; Kellen, Michael R; Mangravite, Lara; Meyer, Pablo; Norman, Thea; Stolovitzky, Gustavo
2016-07-15
The generation of large-scale biomedical data is creating unprecedented opportunities for basic and translational science. Typically, the data producers perform initial analyses, but it is very likely that the most informative methods may reside with other groups. Crowdsourcing the analysis of complex and massive data has emerged as a framework to find robust methodologies. When the crowdsourcing is done in the form of collaborative scientific competitions, known as Challenges, the validation of the methods is inherently addressed. Challenges also encourage open innovation, create collaborative communities to solve diverse and important biomedical problems, and foster the creation and dissemination of well-curated data repositories.
NASA Astrophysics Data System (ADS)
Niwa, Masaki; Takashina, Shoichi; Mori, Yojiro; Hasegawa, Hiroshi; Sato, Ken-ichi; Watanabe, Toshio
2015-01-01
With the continuous increase in Internet traffic, reconfigurable optical add-drop multiplexers (ROADMs) have been widely adopted in the core and metro core networks. Current ROADMs, however, allow only static operation. To realize future dynamic optical-network services, and to minimize any human intervention in network operation, the optical signal add/drop part should have colorless/directionless/contentionless (C/D/C) capabilities. This is possible with matrix switches or a combination of splitter-switches and optical tunable filters. The scale of the matrix switch increases with the square of the number of supported channels, and hence, the matrix-switch-based architecture is not suitable for creating future large-scale ROADMs. In contrast, the numbers of splitter ports, switches, and tunable filters increase linearly with the number of supported channels, and hence the tunable-filter-based architecture will support all future traffic. So far, we have succeeded in fabricating a compact tunable filter that consists of multi-stage cyclic arrayed-waveguide gratings (AWGs) and switches by using planar-lightwave-circuit (PLC) technologies. However, this multistage configuration suffers from large insertion loss and filter narrowing. Moreover, power-consuming temperature control is necessary since it is difficult to make cyclic AWGs athermal. We propose here novel tunable-filter architecture that sandwiches a single-stage non-cyclic athermal AWG having flatter-topped passbands between small-scale switches. With this configuration, the optical tunable filter attains low insertion loss, large passband bandwidths, low power consumption, compactness, and high cost-effectiveness. A prototype is monolithically fabricated with PLC technologies and its excellent performance is experimentally confirmed utilizing 80-channel 30-GBaud dual-polarization quadrature phase-shift-keying (QPSK) signals.
Interagency Collaborative Team Model for Capacity Building to Scale-Up Evidence-Based Practice
Hurlburt, Michael; Aarons, Gregory A; Fettes, Danielle; Willging, Cathleen; Gunderson, Lara; Chaffin, Mark J
2015-01-01
Background: System-wide scale-up of evidence-based practice (EBP) is a complex process. Yet, few strategic approaches exist to support EBP implementation and sustainment across a service system. Building on the Exploration, Preparation, Implementation, and Sustainment (EPIS) implementation framework, we developed and are testing the Interagency Collaborative Team (ICT) process model to implement an evidence-based child neglect intervention (i.e., SafeCare®) within a large children's service system. The ICT model emphasizes the role of local agency collaborations in creating structural supports for successful implementation. Methods: We describe the ICT model and present preliminary qualitative results from use of the implementation model in one large-scale EBP implementation. Qualitative interviews were conducted to assess challenges in building system, organization, and home visitor collaboration and capacity to implement the EBP. Data collection and analysis centered on EBP implementation issues, as well as the experiences of home visitors under the ICT model. Results: Six notable issues relating to the implementation process emerged from participant interviews, including: (a) initial commitment and collaboration among stakeholders, (b) leadership, (c) communication, (d) practice fit with local context, (e) ongoing negotiation and problem solving, and (f) early successes. These issues highlight strengths and areas for development in the ICT model. Conclusions: Use of the ICT model led to sustained and widespread use of SafeCare in one large county. Although some aspects of the implementation model may benefit from enhancement, qualitative findings suggest that the ICT process generates strong structural supports for implementation and creates conditions in which tensions between EBP structure and local contextual variations can be resolved in ways that support the expansion and maintenance of an EBP while preserving potential for public health benefit. PMID:27512239
Benchmark of Client and Server-Side Catchment Delineation Approaches on Web-Based Systems
NASA Astrophysics Data System (ADS)
Demir, I.; Sermet, M. Y.; Sit, M. A.
2016-12-01
Recent advances in internet and cyberinfrastructure technologies have provided the capability to acquire large-scale spatial data from various gauges and sensor networks. The collection of environmental data has increased demand for applications which are capable of managing and processing large-scale and high-resolution data sets. With the amount and resolution of data sets provided, one of the challenging tasks for organizing and customizing hydrological data sets is delineation of watersheds on demand. Watershed delineation is a process for creating a boundary that represents the contributing area for a specific control point or water outlet, with the intent of characterizing and analyzing portions of a study area. Although many GIS tools and software for watershed analysis are available on desktop systems, there is a need for web-based and client-side techniques for creating a dynamic and interactive environment for exploring hydrological data. In this project, we demonstrated several watershed delineation techniques on the web, implemented on the client side using JavaScript and WebGL, and on the server side using Python and C++. We also developed a client-side GPGPU (General Purpose Graphical Processing Unit) algorithm to analyze high-resolution terrain data for watershed delineation, which allows parallelization using the GPU. The web-based real-time analysis of watershed segmentation can be helpful for decision-makers and interested stakeholders, while eliminating the need to install complex software packages and deal with large-scale data sets. Utilizing client-side hardware resources also reduces the need for dedicated servers, since computation is crowdsourced to users' machines. Our goal for future work is to improve other hydrologic analysis methods, such as rain flow tracking, by adapting the presented approaches.
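The serial reference point for such benchmarks is the textbook D8 delineation: each cell drains to its steepest downhill neighbor, and the watershed of an outlet is every cell whose flow path reaches it. A pure-Python sketch follows; the client-side WebGL/GPGPU versions discussed above parallelize the same recurrence.

```python
# Sketch: miniature D8 watershed delineation on a tiny DEM.
import numpy as np

def d8_watershed(dem, outlet):
    rows, cols = dem.shape
    downhill = {}
    for r in range(rows):
        for c in range(cols):
            nbrs = [(r + dr, c + dc) for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                    if (dr or dc) and 0 <= r + dr < rows and 0 <= c + dc < cols]
            steepest = min(nbrs, key=lambda p: dem[p])
            if dem[steepest] < dem[r, c]:
                downhill[(r, c)] = steepest          # D8: one receiver per cell
    basin = set()
    for cell in downhill:                            # trace each flow path
        path, cur = [cell], cell
        while cur in downhill and cur != outlet:
            cur = downhill[cur]
            path.append(cur)
        if cur == outlet:
            basin.update(path)
    basin.add(outlet)
    return basin

dem = np.array([[5, 4, 3], [4, 2, 1], [5, 3, 0]], dtype=float)
print(sorted(d8_watershed(dem, (2, 2))))             # all cells drain to corner
```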
Interagency Collaborative Team Model for Capacity Building to Scale-Up Evidence-Based Practice.
Hurlburt, Michael; Aarons, Gregory A; Fettes, Danielle; Willging, Cathleen; Gunderson, Lara; Chaffin, Mark J
2014-04-01
System-wide scale up of evidence-based practice (EBP) is a complex process. Yet, few strategic approaches exist to support EBP implementation and sustainment across a service system. Building on the Exploration, Preparation, Implementation, and Sustainment (EPIS) implementation framework, we developed and are testing the Interagency Collaborative Team (ICT) process model to implement an evidence-based child neglect intervention (i.e., SafeCare®) within a large children's service system. The ICT model emphasizes the role of local agency collaborations in creating structural supports for successful implementation. We describe the ICT model and present preliminary qualitative results from use of the implementation model in one large scale EBP implementation. Qualitative interviews were conducted to assess challenges in building system, organization, and home visitor collaboration and capacity to implement the EBP. Data collection and analysis centered on EBP implementation issues, as well as the experiences of home visitors under the ICT model. Six notable issues relating to implementation process emerged from participant interviews, including: (a) initial commitment and collaboration among stakeholders, (b) leadership, (c) communication, (d) practice fit with local context, (e) ongoing negotiation and problem solving, and (f) early successes. These issues highlight strengths and areas for development in the ICT model. Use of the ICT model led to sustained and widespread use of SafeCare in one large county. Although some aspects of the implementation model may benefit from enhancement, qualitative findings suggest that the ICT process generates strong structural supports for implementation and creates conditions in which tensions between EBP structure and local contextual variations can be resolved in ways that support the expansion and maintenance of an EBP while preserving potential for public health benefit.
NASA Astrophysics Data System (ADS)
Alexander, L.; Hupp, C. R.; Forman, R. T.
2002-12-01
Many geodisturbances occur across large spatial scales, spanning entire landscapes and creating ecological phenomena in their wake. Ecological study at large scales poses special problems: (1) large-scale studies require large-scale resources, and (2) sampling is not always feasible at the appropriate scale, so researchers rely on data collected at smaller scales to interpret patterns across broad regions. A criticism of landscape ecology is that findings at small spatial scales are "scaled up" and applied indiscriminately across larger spatial scales. In this research, landscape scaling is addressed through process-pattern relationships between hydrogeomorphic processes and patterns of plant diversity in forested wetlands. The research addresses: (1) whether patterns and relationships between hydrogeomorphic, vegetation, and spatial variables can transcend scale; and (2) whether data collected at small spatial scales can be used to describe patterns and relationships across larger spatial scales. Field measurements of hydrologic, geomorphic, spatial, and vegetation data were collected or calculated for 15 1-ha sites on forested floodplains of six (6) Chesapeake Bay Coastal Plain streams over a total area of about 20,000 km2. Hydroperiod (day/yr), floodplain surface elevation range (m), discharge (m3/s), stream power (kg-m/s2), sediment deposition (mm/yr), relative position downstream and other variables were used in multivariate analyses to explain differences in species richness, tree diversity (Shannon-Wiener Diversity Index H'), and plant community composition at four spatial scales. Data collected at the plot (400-m2) and site (c. 1-ha) scales are applied to and tested at the river watershed and regional spatial scales. Results indicate that plant species richness and tree diversity (Shannon-Wiener diversity index H') can be described by hydrogeomorphic conditions at all scales, but are best described at the site scale. Data collected at plot and site scales are tested for spatial heterogeneity across the Chesapeake Bay Coastal Plain using a geostatistical variogram, and multiple regression analysis is used to relate plant diversity, spatial, and hydrogeomorphic variables across Coastal Plain regions and hydrologic regimes. Results indicate that relationships between hydrogeomorphic processes and patterns of plant diversity at finer scales can serve as proxies for relationships at coarser scales in some, but not all, cases. Findings also suggest that data collected at small scales can be used to describe trends across broader scales under limited conditions.
The status, recent progress and promise of superconducting materials for practical applications
NASA Astrophysics Data System (ADS)
Rowell, J. M.
1989-03-01
The author summarizes the progress in materials science and engineering that created today's superconducting technology. He reviews the state of the technology with conventional materials by looking at two particular applications: large-scale applications involving conductors, for example magnets; and electronics and instrumentation applications. The state of the art is contrasted with the present understanding of the high-Tc oxide materials.
Optical and Radio Remote Sensing of Space Plasma Turbulence
2008-03-31
Helbert, Guilhelm Moreaux, Pierre-Emmanuel Godet (2006), Ground-based GPS tomography of ionospheric post-seismic signal, Planet. Space Sci., 54... naturally occurring and radio wave-induced ionospheric plasma turbulence. The intriguing phenomena reported here include large-scale turbulence created by tsunami... in Puerto Rico [Labno et al., J. Geophys. Res., 2007]. Presented are ionospheric measurements using the Arecibo 430 MHz radar supported by data from...
Brian S. Hughett; Wayne K. Clatterbuck
2014-01-01
Differences in composition, structure, and growth under canopy gaps created by the mortality of a single stem were analyzed using analysis of variance under two scenarios, with stem removed or with stem left as a standing snag. There were no significant differences in composition and structure of large diameter residual stems within upper canopy strata. Some...
NASA Technical Reports Server (NTRS)
Curreri, Peter A.; Detweiler, Michael
2010-01-01
Creating large space habitats by launching all materials from Earth is prohibitively expensive. Using space resources and space-based labor to build space solar power satellites can yield extraordinary profits after a few decades. The economic viability of this program depends on the use of space resources and space labor. To maximize the return on the investment, the early use of high-density bolo habitats is required. Other shapes do not allow for the small initial scale required for a quick population increase in space. This study found that "5 Man Year", or 384-person, high-density bolo habitats will be the most economically feasible for a program started in 2010: such a program yields a profit by year 24, puts over 45,000 people into space, and creates a large system of space infrastructure for the further exploration and development of space.
Genome-scale approaches to the epigenetics of common human disease
2011-01-01
Traditionally, the pathology of human disease has been focused on microscopic examination of affected tissues, chemical and biochemical analysis of biopsy samples, other available samples of convenience, such as blood, and noninvasive or invasive imaging of varying complexity, in order to classify disease and illuminate its mechanistic basis. The molecular age has complemented this armamentarium with gene expression arrays and selective analysis of individual genes. However, we are entering a new era of epigenomic profiling, i.e., genome-scale analysis of cell-heritable nonsequence genetic change, such as DNA methylation. The epigenome offers access to stable measurements of cellular state and to biobanked material for large-scale epidemiological studies. Some of these genome-scale technologies are beginning to be applied to create the new field of epigenetic epidemiology. PMID:19844740
Study of alumina-trichite reinforcement of a nickel-based matrix by means of powder metallurgy
NASA Technical Reports Server (NTRS)
Walder, A.; Hivert, A.
1982-01-01
Research was conducted on reinforcing nickel based matrices with alumina trichites by using powder metallurgy. Alumina trichites previously coated with nickel are magnetically aligned. The felt obtained is then sintered under a light pressure at a temperature just below the melting point of nickel. The halogenated atmosphere technique makes it possible to incorporate a large number of additive elements such as chromium, titanium, zirconium, tantalum, niobium, aluminum, etc. It does not appear that going from laboratory scale to a semi-industrial scale in production would create any major problems.
Podolak, Charles J.
2013-01-01
An ensemble of rule-based models was constructed to assess possible future braided river planform configurations for the Toklat River in Denali National Park and Preserve, Alaska. This approach combined an analysis of large-scale influences on stability with several reduced-complexity models to produce the predictions at a practical level for managers concerned about the persistence of bank erosion while acknowledging the great uncertainty in any landscape prediction. First, a model of confluence angles reproduced observed angles of a major confluence, but showed limited susceptibility to a major rearrangement of the channel planform downstream. Second, a probabilistic map of channel locations was created with a two-parameter channel avulsion model. The predicted channel belt location was concentrated in the same area as the current channel belt. Finally, a suite of valley-scale channel and braid plain characteristics were extracted from a light detection and ranging (LiDAR)-derived surface. The characteristics demonstrated large-scale stabilizing topographic influences on channel planform. The combination of independent analyses increased confidence in the conclusion that the Toklat River braided planform is a dynamically stable system due to large and persistent valley-scale influences, and that a range of avulsive perturbations are likely to result in a relatively unchanged planform configuration in the short term.
Optimisation of a Magnetostrictive Wave Energy Converter
NASA Astrophysics Data System (ADS)
Mundon, T. R.; Nair, B.
2014-12-01
Oscilla Power, Inc. (OPI) is developing a patented magnetostrictive wave energy converter aimed at reducing the cost of grid-scale electricity from ocean waves. Designed to operate cost-effectively across a wide range of wave conditions, this will be the first use of reverse magnetostriction for large-scale energy production. The device architecture is a straightforward two-body, point absorbing system that has been studied at length by various researchers. A large surface float is anchored to a submerged heave (reaction) plate by multiple taut tethers that are largely made up of discrete, robust power takeoff modules that house the magnetostrictive generators. The unique generators developed by OPI utilize the phenomenon of reverse magnetostriction, which through the application of load to a specific low cost alloy, can generate significant magnetic flux changes, and thus create power through electromagnetic induction. Unlike traditional generators, the mode of operation is low-displacement, high-force, high damping which in combination with the specific multi-tether configuration creates some unique effects and interesting optimization challenges. Using an empirical approach with a combination of numerical tools, such as ORCAFLEX, and physical models, we investigated the properties and sensitivities of this system arrangement, including various heave plate geometries, with the overall goal of identifying the mass and hydrodynamic parameters required for optimum performance. Furthermore, through a detailed physical model test program at the University of New Hampshire, we were able to study in more detail how the heave plate geometry affects the drag and added mass coefficients. In presenting this work we will discuss how alternate geometries could be used to optimize the hydrodynamic parameters of the heave plate, allowing maximum inertial forces in operational conditions, while simultaneously minimizing the forces generated in extreme waves. This presentation will cover the significant findings from this research, including physical model results and identified sensitivity parameters. In addition, we will discuss some preliminary results from our large-scale ocean trial conducted in August & September of this year.
Formulating a subgrid-scale breakup model for microbubble generation from interfacial collisions
NASA Astrophysics Data System (ADS)
Chan, Wai Hong Ronald; Mirjalili, Shahab; Urzay, Javier; Mani, Ali; Moin, Parviz
2017-11-01
Multiphase flows often involve impact events that engender important effects like the generation of a myriad of tiny bubbles that are subsequently transported in large liquid bodies. These impact events are created by large-scale phenomena like breaking waves on ocean surfaces, and often involve the relative approach of liquid surfaces. This relative motion generates continuously shrinking length scales as the entrapped gas layer thins and eventually breaks up into microbubbles. The treatment of this disparity in length scales is computationally challenging. In this presentation, a framework is presented that addresses a subgrid-scale (SGS) model aimed at capturing the process of microbubble generation. This work sets up the components in an overarching volume-of-fluid (VoF) toolset and investigates the analytical foundations of an SGS model for describing the breakup of a thin air film trapped between two approaching water bodies in a physical regime corresponding to Mesler entrainment. Constituents of the SGS model, such as the identification of impact events and the accurate computation of the local characteristic curvature in a VoF-based architecture, and the treatment of the air layer breakup, are discussed and illustrated in simplified scenarios. Supported by Office of Naval Research (ONR)/A*STAR (Singapore).
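One ingredient named above, the local characteristic curvature, can be estimated directly from a volume-fraction field as kappa = -div(grad(alpha)/|grad(alpha)|). The NumPy sketch below does this for a circular blob; the smoothing width is an arbitrary choice, and the model's impact-detection logic is omitted.

```python
# Sketch: finite-difference curvature of a 2D volume-fraction field,
# kappa = -div( grad(alpha) / |grad(alpha)| ). Smoothing width is arbitrary.
import numpy as np
from scipy.ndimage import gaussian_filter

def vof_curvature(alpha, h):
    gy, gx = np.gradient(alpha, h)                 # np.gradient: axis 0 (y) first
    mag = np.sqrt(gx ** 2 + gy ** 2) + 1e-12
    ny_y, _ = np.gradient(gy / mag, h)
    _, nx_x = np.gradient(gx / mag, h)
    return -(nx_x + ny_y)                          # minus divergence of unit normal

n = 256
h = 1.0 / n
yy, xx = (np.mgrid[0:n, 0:n] + 0.5) * h
alpha = ((xx - 0.5) ** 2 + (yy - 0.5) ** 2 < 0.25 ** 2).astype(float)
alpha = gaussian_filter(alpha, sigma=2)            # smooth the sharp VoF field
kappa = vof_curvature(alpha, h)
ring = np.abs(np.hypot(xx - 0.5, yy - 0.5) - 0.25) < h
print(kappa[ring].mean())                          # ~ 1/R = 4 for R = 0.25
```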
International Halley Watch: Discipline specialists for large scale phenomena
NASA Technical Reports Server (NTRS)
Brandt, J. C.; Niedner, M. B., Jr.
1986-01-01
The largest-scale structures of comets, their tails, are extremely interesting from a physical point of view, and some of their properties are among the most spectacular displayed by comets. Because the tail(s) is an important component part of a comet, the Large-Scale Phenomena (L-SP) Discipline was created as one of eight different observational methods in which Halley data would be encouraged and collected from all around the world under the auspices of the International Halley Watch (IHW). The L-SP Discipline Specialist (DS) Team resides at NASA/Goddard Space Flight Center under the leadership of John C. Brandt, Malcolm B. Niedner, and their team of image-processing and computer specialists; Jurgan Rahe at NASA Headquarters completes the formal DS science staff. The team has adopted the study of disconnection events (DEs) as its principal science target, and it is because of the rapid changes which occur in connection with DEs that such extensive global coverage was deemed necessary to assemble a complete record.
Perfusion directed 3D mineral formation within cell-laden hydrogels.
Sawyer, Stephen William; Shridhar, Shivkumar Vishnempet; Zhang, Kairui; Albrecht, Lucas; Filip, Alex; Horton, Jason; Soman, Pranav
2018-06-08
Despite the promise of stem cell engineering and new advances in bioprinting technologies, one of the major challenges in the manufacturing of large-scale bone tissue scaffolds is the inability to perfuse nutrients throughout thick constructs. Here, we report a scalable method to create thick, perfusable bone constructs using a combination of cell-laden hydrogels and a 3D printed sacrificial polymer. Osteoblast-like Saos-2 cells were encapsulated within a gelatin methacrylate (GelMA) hydrogel, and 3D printed polyvinyl alcohol (PVA) pipes were used to create perfusable channels. A custom-built bioreactor was used to perfuse osteogenic media directly through the channels in order to induce mineral deposition, which was subsequently quantified via microCT. Histological staining was used to verify mineral deposition around the perfused channels, while COMSOL modeling was used to simulate oxygen diffusion between adjacent channels. This information was used to design a scaled-up construct containing a 3D array of perfusable channels within cell-laden GelMA. Progressive matrix mineralization was observed by cells surrounding perfused channels, as opposed to random mineral deposition in static constructs. MicroCT confirmed a direct relationship between channel mineralization within perfused constructs and time within the bioreactor. Furthermore, the scalable method presented in this work serves as a model for how large-scale bone tissue replacement constructs could be made using commonly available 3D printers, sacrificial materials, and hydrogels.
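A rough 1D stand-in for the oxygen-diffusion question the COMSOL model addresses: steady diffusion with uniform cellular uptake between two perfused channels held at a fixed concentration has a closed-form parabolic profile. All parameter values below are generic assumptions, not the paper's.

```python
# Sketch: steady 1D oxygen profile between two perfused channels,
# solving D c'' = Q with c(0) = c(L) = c0. All numbers are assumptions.
import numpy as np

D = 2.0e-9          # m^2/s, O2 diffusivity in hydrogel (assumed)
Q = 2.0e-3          # mol/(m^3 s), volumetric uptake by cells (assumed)
c0 = 0.2            # mol/m^3 at each channel wall (assumed)
L = 1.0e-3          # m, channel-to-channel spacing (assumed)

x = np.linspace(0, L, 101)
c = c0 - Q / (2 * D) * x * (L - x)      # analytic solution of D c'' = Q
print("midpoint O2:", c[50], "anoxic?", c.min() <= 0.0)
```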
Multiscale/multiresolution landslides susceptibility mapping
NASA Astrophysics Data System (ADS)
Grozavu, Adrian; Cătălin Stanga, Iulian; Valeriu Patriche, Cristian; Toader Juravle, Doru
2014-05-01
Within the European strategies, landslides are considered an important threat that requires detailed studies to identify areas where these processes could occur in the future and to design scientific and technical plans for landslide risk mitigation. To this end, assessing and mapping landslide susceptibility is an important preliminary step. Generally, landslide susceptibility at small scale (for large regions) can be assessed through a qualitative approach (expert judgement), based on a few variables, while studies at medium and large scale require a quantitative approach (e.g. multivariate statistics), a larger set of variables and, necessarily, a landslide inventory. Obviously, the results vary more or less from one scale to another, depending on the available input data, but also on the applied methodology. Since it is almost impossible to have a complete landslide inventory over large regions (e.g. at continental level), it is very important to verify the compatibility and validity of results obtained at different scales, identifying the differences and fixing the inherent errors. This paper aims at assessing and mapping landslide susceptibility at regional level through a multiscale, multiresolution approach, from small scale and low resolution to large scale and high resolution of data and results, comparing the compatibility of the results. While the former could be used for studies at European and national level, the latter allow validation of the results, including through field surveys. The test area, namely the Barlad Plateau (more than 9000 sq. km), is located in Eastern Romania, covering a region where both the natural environment and the human factor create a causal context that favors these processes. The landslide predictors were initially derived from various databases available at pan-European level and progressively completed and/or enhanced together with the scale and the resolution: the topography (from SRTM at 90 meters to digital elevation models based on topographical maps, 1:25,000 and 1:5,000), the lithology (from geological maps, 1:200,000), land cover and land use (from CLC 2006 to maps derived from orthorectified aerial images, 0.5 meters resolution), rainfall (from Worldclim, ECAD to our own data), the seismicity (the seismic zonation of Romania) etc. The landslide inventory was created as polygonal data based on aerial images (resolution 0.5 meters), the information being considered at county level (NUTS 3) and, eventually, at communal level (LAU2). The methodological framework is based on logistic regression as a quantitative method and the analytic hierarchy process as a semi-qualitative method, both being applied once identically for all scales and once recalibrated for each scale and resolution (from 1:1,000,000 and one km pixel resolution to 1:25,000 and ten meters resolution). The predictive performance of the two models was assessed using the ROC (Receiver Operating Characteristic) curve and the AUC (Area Under Curve) parameter, and the results indicate a good correspondence between the susceptibility estimated for the test samples (0.855-0.890) and for the validation samples (0.830-0.865). Finally, the results were compared in pairs in order to fix the errors at small scale and low resolution and to optimize the methodology for landslide susceptibility mapping over large areas.
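A skeleton of the quantitative branch of this workflow (the AHP branch is omitted): fit a logistic regression of landslide presence/absence on predictor layers and validate with the ROC AUC, as the study does. The predictors here are synthetic stand-ins for the slope, lithology, land-cover, and rainfall rasters.

```python
# Sketch: logistic-regression susceptibility model scored with ROC AUC.
# Synthetic predictors stand in for real raster layers.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(6)
X = rng.normal(size=(5000, 4))                       # slope, rainfall, etc.
p = 1 / (1 + np.exp(-(1.5 * X[:, 0] + 0.8 * X[:, 1] - 1.0)))
y = rng.random(5000) < p                             # landslide inventory (0/1)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
model = LogisticRegression().fit(X_tr, y_tr)
susceptibility = model.predict_proba(X_te)[:, 1]     # mapped per pixel in practice
print("AUC:", round(roc_auc_score(y_te, susceptibility), 3))
```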
Kelley, Shana O.; Mirkin, Chad A.; Walt, David R.; Ismagilov, Rustem F.; Toner, Mehmet; Sargent, Edward H.
2015-01-01
Rapid progress in identifying disease biomarkers has increased the importance of creating high-performance detection technologies. Over the last decade, the design of many detection platforms has focused on either the nano or micro length scale. Here, we review recent strategies that combine nano- and microscale materials and devices to produce large improvements in detection sensitivity, speed and accuracy, allowing previously undetectable biomarkers to be identified in clinical samples. Microsensors that incorporate nanoscale features can now rapidly detect disease-related nucleic acids expressed in patient samples. New microdevices that separate large clinical samples into nanocompartments allow precise quantitation of analytes, and microfluidic systems that utilize nanoscale binding events can detect rare cancer cells in the bloodstream more accurately than before. These advances will lead to faster and more reliable clinical diagnostic devices. PMID:25466541
NASA Astrophysics Data System (ADS)
Kelley, Shana O.; Mirkin, Chad A.; Walt, David R.; Ismagilov, Rustem F.; Toner, Mehmet; Sargent, Edward H.
2014-12-01
Rapid progress in identifying disease biomarkers has increased the importance of creating high-performance detection technologies. Over the last decade, the design of many detection platforms has focused on either the nano or micro length scale. Here, we review recent strategies that combine nano- and microscale materials and devices to produce large improvements in detection sensitivity, speed and accuracy, allowing previously undetectable biomarkers to be identified in clinical samples. Microsensors that incorporate nanoscale features can now rapidly detect disease-related nucleic acids expressed in patient samples. New microdevices that separate large clinical samples into nanocompartments allow precise quantitation of analytes, and microfluidic systems that utilize nanoscale binding events can detect rare cancer cells in the bloodstream more accurately than before. These advances will lead to faster and more reliable clinical diagnostic devices.
Large Scale Simulation Platform for NODES Validation Study
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sotorrio, P.; Qin, Y.; Min, L.
2017-04-27
This report summarizes the Large Scale (LS) simulation platform created for the Eaton NODES project. The simulation environment consists of both a wholesale market simulator and a distribution simulator; it includes the CAISO wholesale market model and a PG&E footprint of 25-75 feeders to validate scalability under a scenario of 33% RPS in California, with an additional 17% of DERs coming from distribution and customers. The simulator can generate hourly unit commitment, 5-minute economic dispatch, and 4-second AGC regulation signals. The simulator is also capable of simulating more than 10,000 individual controllable devices. Simulated DERs include water heaters, EVs, residential and light commercial HVAC/buildings, and residential-level battery storage. Feeder-level voltage regulators and capacitor banks are also simulated for feeder-level real and reactive power management and Volt/VAR control.
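To make the three nested time scales concrete, here is a toy loop showing how hourly unit commitment, 5-minute economic dispatch, and 4-second AGC signals can be layered; every function body is an invented placeholder, not Eaton/NODES code.

```python
# Toy illustration of the simulator's three nested time scales.
# All signal functions are placeholders for the real market/grid models.
HOUR, DISPATCH, AGC = 3600, 300, 4

def unit_commitment(hour):         # placeholder: which units are on
    return {"gas_1": True, "hydro_1": hour % 24 > 6}

def economic_dispatch(commit, t):  # placeholder setpoint in MW
    return 100.0 + 10.0 * (t % HOUR) / HOUR

def agc_signal(setpoint, t):       # placeholder regulation around setpoint
    return setpoint + 0.5 * ((t // AGC) % 2 * 2 - 1)

for t in range(0, 2 * HOUR, AGC):  # simulate two hours at 4-s resolution
    if t % HOUR == 0:
        commit = unit_commitment(t // HOUR)
    if t % DISPATCH == 0:
        setpoint = economic_dispatch(commit, t)
    reg = agc_signal(setpoint, t)  # 4-second AGC value
```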
Shock heating in numerical simulations of kink-unstable coronal loops
Bareford, M. R.; Hood, A. W.
2015-01-01
The importance of shock heating within coronal magnetic fields has hitherto been a neglected area of study. We present new results obtained from nonlinear magnetohydrodynamic simulations of straight coronal loops. This work shows how the energy released from the magnetic field, following an ideal instability, can be converted into thermal energy, thereby heating the solar corona. Fast dissipation of magnetic energy is necessary for coronal heating, and this requirement is compatible with the time scales associated with ideal instabilities. Therefore, we choose an initial loop configuration that is susceptible to the fast-growing kink, an instability that is likely to be created by convectively driven vortices occurring where the loop field intersects the photosphere (i.e. the loop footpoints). The large-scale deformation of the field caused by the kinking creates the conditions for the formation of strong current sheets and magnetic reconnection, which have previously been considered as sites of heating, under the assumption of an enhanced resistivity. However, our simulations indicate that slow mode shocks are the primary heating mechanism, since, as well as creating current sheets, magnetic reconnection also generates plasma flows that are faster than the slow magnetoacoustic wave speed. PMID:25897092
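For reference, the shock-formation criterion mentioned above involves the standard MHD slow-mode phase speed (a textbook expression, not a formula taken from the paper), where θ is the angle between the wavevector and the magnetic field:

```latex
\[
v_{\mathrm{slow}}^{2} = \tfrac{1}{2}\!\left[c_s^{2}+v_A^{2}
  -\sqrt{\left(c_s^{2}+v_A^{2}\right)^{2}-4\,c_s^{2}v_A^{2}\cos^{2}\theta}\,\right],
\qquad c_s^{2}=\frac{\gamma p}{\rho},\qquad v_A^{2}=\frac{B^{2}}{\mu_{0}\rho}
\]
```

Flows driven faster than this speed can steepen into the slow-mode shocks identified as the primary heating mechanism.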
A Weyl-Dirac cosmological model with DM and DE
NASA Astrophysics Data System (ADS)
Israelit, Mark
2011-03-01
In the Weyl-Dirac (W-D) framework a spatially closed cosmological model is considered. It is assumed that the space-time of the universe has a chaotic Weylian microstructure but is described on a large scale by Riemannian geometry. Locally, fields of the Weyl connection vector act as creators of massive bosons having spin 1. It is suggested that these bosons, called weylons, provide most of the dark matter in the universe. At the beginning the universe is a spherically symmetric geometric entity without matter. Primary matter is created by Dirac's gauge function very close to the beginning. In the early epoch, when the temperature of the universe achieves its maximum, chaotically oriented Weyl vector fields localized in micro-cells create weylons. In the dust dominated period Dirac's gauge function gives rise to dark energy, the latter causing the cosmic acceleration at present. This oscillatory universe has an initial radius identical to the Planck length, 1.616 × 10⁻³³ cm; at present the cosmic scale factor is 3.21 × 10²⁸ cm, while its maximum value is 8.54 × 10²⁸ cm. All forms of matter are created by geometrically based functions of the W-D theory.
Magesa, Stephen M; Lengeler, Christian; deSavigny, Don; Miller, Jane E; Njau, Ritha JA; Kramer, Karen; Kitua, Andrew; Mwita, Alex
2005-01-01
Introduction Malaria is the largest cause of health services attendance, hospital admissions and child deaths in Tanzania. At the Abuja Summit in April 2000 Tanzania committed itself to protect 60% of its population at high risk of malaria by 2005. The country is, therefore, determined to ensure that sustainable malaria control using insecticide-treated nets is carried out on a national scale. Case description Tanzania has been involved for two decades in the research process for developing insecticide-treated nets as a malaria control tool, from testing insecticides and net types, to assessing their efficacy and effectiveness, and exploring new ways of distribution. Since 2000, the emphasis has changed from a project approach to that of a concerted multi-stakeholder action for taking insecticide-treated nets to national scale (NATNETS). This means creating conditions that make insecticide-treated nets accessible and affordable to all those at risk of malaria in the country. This paper describes Tanzania's experience in (1) creating an enabling environment for insecticide-treated nets scale-up, (2) promoting the development of a commercial sector for insecticide-treated nets, and (3) targeting pregnant women with highly subsidized insecticide-treated nets through a national voucher scheme. As a result, nearly 2 million insecticide-treated nets and 2.2 million re-treatment kits were distributed in 2004. Conclusion National upscaling of insecticide-treated nets is possible when the programme is well designed, coordinated and supported by committed stakeholders; the Abuja target of protecting 60% of those at high risk is feasible, even for large endemic countries. PMID:16042780
Geopolymers and Their Uses: Review
NASA Astrophysics Data System (ADS)
Burduhos Nergis, D. D.; Abdullah, M. M. A. B.; Vizureanu, P.; Tahir, M. F. M.
2018-06-01
Outlining the past-present history of the study of alumino-silicate materials, it is well known that geopolymers are inorganic polymers obtained from a chemical reaction, also known as geopolymerisation, between an alkaline solution and a solid rich in aluminium and silicon. There is still some controversy surrounding the alkaline activators used to create geopolymer concrete, because homogeneous mixtures composed of two (NaOH and Na2SiO3) or more chemicals in varying proportions are usually highly corrosive and hard to handle. In order to replace Portland cement, many wastes have been used in recent studies to create "friendly" cements by geopolymerisation. In this short review we present basic information about how to create and use geopolymers, the alkaline activators and raw materials that can be used, and conclusions. One question that needs to be asked: can these materials replace Portland cement on a large scale?
Development of the Large-Scale Forcing Data to Support MC3E Cloud Modeling Studies
NASA Astrophysics Data System (ADS)
Xie, S.; Zhang, Y.
2011-12-01
The large-scale forcing fields (e.g., vertical velocity and advective tendencies) are required to run single-column and cloud-resolving models (SCMs/CRMs), which are the two key modeling frameworks widely used to link field data to climate model developments. In this study, we use an advanced objective analysis approach to derive the required forcing data from the soundings collected by the Midlatitude Continental Convective Cloud Experiment (MC3E) in support of its cloud modeling studies. MC3E is the latest major field campaign conducted during the period 22 April 2011 to 06 June 2011 in south-central Oklahoma through a joint effort between the DOE ARM program and the NASA Global Precipitation Measurement Program. One of its primary goals is to provide a comprehensive dataset that can be used to describe the large-scale environment of convective cloud systems and evaluate model cumulus parameterizations. The objective analysis used in this study is the constrained variational analysis method. A unique feature of this approach is the use of domain-averaged surface and top-of-the-atmosphere (TOA) observations (e.g., precipitation and radiative and turbulent fluxes) as constraints to adjust atmospheric state variables from soundings by the smallest possible amount to conserve column-integrated mass, moisture, and static energy, so that the final analysis data are dynamically and thermodynamically consistent. To address potential uncertainties in the surface observations, an ensemble forcing dataset will be developed. Multi-scale forcing will also be created for simulating convective systems at various scales. At the meeting, we will provide more details about the forcing development and present some preliminary analysis of the characteristics of the large-scale forcing structures for several selected convective systems observed during MC3E.
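The least-change adjustment at the heart of the constrained variational analysis can be illustrated with a linear toy problem: nudge a state vector by the smallest amount so that a budget constraint holds exactly. The sketch below is an idealized linear analogue under that assumption, not the ARM production analysis code.

```python
# Minimal sketch of a constrained least-change adjustment: move sounding-
# derived state x0 by the smallest (least-squares) amount so that linear
# budget constraints A x = b hold exactly (Lagrange-multiplier solution).
import numpy as np

def variational_adjust(x0, A, b):
    """Return argmin ||x - x0||^2 subject to A @ x = b."""
    lam = np.linalg.solve(A @ A.T, A @ x0 - b)  # Lagrange multipliers
    return x0 - A.T @ lam

x0 = np.array([1.0, 2.0, 3.0, 4.0])   # hypothetical state profile
A = np.array([[1.0, 1.0, 1.0, 1.0]])  # toy "column budget" constraint
b = np.array([9.0])                   # observed budget total
x = variational_adjust(x0, A, b)
print(x, A @ x)                       # constraint now satisfied exactly
```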
The Large Scale Distribution of Water Ice in the Polar Regions of the Moon
NASA Astrophysics Data System (ADS)
Jordan, A.; Wilson, J. K.; Schwadron, N.; Spence, H. E.
2017-12-01
For in situ resource utilization, one must know where water ice is on the Moon. Many datasets have revealed both surface deposits of water ice and subsurface deposits of hydrogen near the lunar poles, but it has proved difficult to resolve the differences among the locations of these deposits. Despite these datasets disagreeing on how deposits are distributed on small scales, we show that most of these datasets do agree on the large scale distribution of water ice. We present data from the Cosmic Ray Telescope for the Effects of Radiation (CRaTER) on the Lunar Reconnaissance Orbiter (LRO), LRO's Lunar Exploration Neutron Detector (LEND), the Neutron Spectrometer on Lunar Prospector (LPNS), LRO's Lyman Alpha Mapping Project (LAMP), LRO's Lunar Orbiter Laser Altimeter (LOLA), and Chandrayaan-1's Moon Mineralogy Mapper (M3). All, including those that show clear evidence for water ice, reveal surprisingly similar trends with latitude, suggesting that both surface and subsurface datasets are measuring ice. All show that water ice increases towards the poles, and most demonstrate that its signature appears at about ±70° latitude and increases poleward. This is consistent with simulations of how surface and subsurface cold traps are distributed with latitude. This large scale agreement constrains the origin of the ice, suggesting that an ancient cometary impact (or impacts) created a large scale deposit that has been rendered locally heterogeneous by subsequent impacts. Furthermore, it also shows that water ice may be available down to ±70°—latitudes that are more accessible than the poles for landing.
Updates to WFC3/UVIS Filter-Dependent and Filter-Distinct Distortion Corrections
NASA Astrophysics Data System (ADS)
Martlin, Catherine; Kozhurina-Platais, Vera; McKay, Myles; Sabbi, Elena
2018-06-01
The WFC3/UVIS filter wheel contains 63 filters that cover a large range of wavelengths from the near ultraviolet to the near infrared. Previously, analysis was completed on the 14 most used UVIS filters to calibrate geometric distortions. These distortions are due to a combination of the optical assembly of HST as well as the variabilities in the composition of individual filters. We report recent updates to reference files that aid in correcting for these distortions for an additional 22 UVIS narrow- and medium-band filters and 4 unique UVIS filters. They were created following a calibration of the large-scale optical distortions and fine-scale filter-dependent distortions. Furthermore, we present results on a study into a selection of unique polynomial coefficient terms from all solved filters, which allows us to better investigate the filter-dependent patterns across a large range of wavelengths. These updates will provide important enhancements for HST/WFC3 users, as they allow more accurate alignment of images across the range of UVIS filters.
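As an illustration of how polynomial distortion corrections of this kind are applied, the sketch below evaluates a second-order 2D polynomial mapping from raw to corrected pixel coordinates; the coefficients are invented placeholders, not values from the WFC3/UVIS reference files.

```python
# Illustrative sketch of a polynomial geometric correction: map raw pixel
# coordinates through 2nd-order 2D polynomials. Coefficients are made up.
import numpy as np

def correct(x, y, cx, cy):
    """Map raw (x, y) to corrected coordinates with 2nd-order polynomials."""
    terms = np.array([np.ones_like(x), x, y, x * x, x * y, y * y])
    return cx @ terms, cy @ terms

cx = np.array([0.0, 1.0, 0.0, 1e-6, -2e-6, 1e-6])  # hypothetical coefficients
cy = np.array([0.0, 0.0, 1.0, -1e-6, 1e-6, 2e-6])  # hypothetical coefficients
x, y = np.array([1024.0]), np.array([512.0])
print(correct(x, y, cx, cy))
```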
Large-Eddy Simulation of Internal Flow through Human Vocal Folds
NASA Astrophysics Data System (ADS)
Lasota, Martin; Šidlof, Petr
2018-06-01
The phonatory process occurs when air is expelled from the lungs through the glottis and the pressure drop causes flow-induced oscillations of the vocal folds. The flow fields created in phonation are highly unsteady, and coherent vortex structures are also generated. For accuracy, it is essential to compute on a humanlike computational domain with an appropriate mathematical model. The work deals with numerical simulation of air flow within the space between the plicae vocales and plicae vestibulares. In addition to the dynamic width of the rima glottidis, where the sound is generated, the lateral ventriculus laryngis and sacculus laryngis are included in the computational domain as well. The paper presents results from OpenFOAM obtained with large-eddy simulation using second-order finite volume discretization of the incompressible Navier-Stokes equations. Large-eddy simulations with different subgrid-scale models are executed on a structured mesh. In these cases, only subgrid-scale models that represent turbulence via a turbulent viscosity and the Boussinesq approximation are used in the subglottal and supraglottal areas of the larynx.
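For reference, the eddy-viscosity closure alluded to above is the textbook Boussinesq approximation with a Smagorinsky-type subgrid viscosity (standard form; the paper's exact set of subgrid-scale models may differ in detail):

```latex
\[
\tau_{ij}-\tfrac{1}{3}\tau_{kk}\delta_{ij} = -2\,\nu_{t}\,\bar{S}_{ij},
\qquad
\nu_{t} = \left(C_{s}\Delta\right)^{2}\left|\bar{S}\right|,
\qquad
\left|\bar{S}\right| = \sqrt{2\,\bar{S}_{ij}\bar{S}_{ij}}
\]
```

Here Δ is the filter width, C_s the Smagorinsky constant, and \bar{S}_{ij} the resolved strain-rate tensor.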
Dispersion in Fractures with Ramified Dissolution Patterns
NASA Astrophysics Data System (ADS)
Xu, Le; Marks, Benjy; Toussaint, Renaud; Flekkøy, Eirik G.; Måløy, Knut J.
2018-04-01
The injection of a reactive fluid into an open fracture may modify the fracture surface locally and create a ramified structure around the injection point. This structure will have a significant impact on the dispersion of the injected fluid due to increased permeability, which will introduce large velocity fluctuations into the fluid. Here, we have injected a fluorescent tracer fluid into a transparent artificial fracture with such a ramified structure. The transparency of the model makes it possible to follow the detailed dispersion of the tracer concentration. The experiments have been compared to two dimensional (2D) computer simulations which include both convective motion and molecular diffusion. A comparison was also performed between the dispersion from an initially ramified dissolution structure and the dispersion from an initially circular region. A significant difference was seen both at small and large length scales. At large length scales, the persistence of the anisotropy of the concentration distribution far from the ramified structure is discussed with reference to some theoretical considerations and comparison with simulations.
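The 2D simulations combine convective motion and molecular diffusion; that is, they evolve the standard advection-diffusion equation (textbook form, not a formula quoted from the paper) for the tracer concentration c in a velocity field u with diffusion coefficient D:

```latex
\[
\frac{\partial c}{\partial t} + \mathbf{u}\cdot\nabla c = D\,\nabla^{2} c
\]
```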
OMERO and Bio-Formats 5: flexible access to large bioimaging datasets at scale
NASA Astrophysics Data System (ADS)
Moore, Josh; Linkert, Melissa; Blackburn, Colin; Carroll, Mark; Ferguson, Richard K.; Flynn, Helen; Gillen, Kenneth; Leigh, Roger; Li, Simon; Lindner, Dominik; Moore, William J.; Patterson, Andrew J.; Pindelski, Blazej; Ramalingam, Balaji; Rozbicki, Emil; Tarkowska, Aleksandra; Walczysko, Petr; Allan, Chris; Burel, Jean-Marie; Swedlow, Jason
2015-03-01
The Open Microscopy Environment (OME) has built and released Bio-Formats, a Java-based library for reading proprietary image file formats, and OMERO, an enterprise data management platform, both under open source licenses. In this report, we describe new versions of Bio-Formats and OMERO that are specifically designed to support large, multi-gigabyte or terabyte scale datasets that are routinely collected across most domains of biological and biomedical research. Bio-Formats reads image data directly from native proprietary formats, bypassing the need for conversion into a standard format. It implements the concept of a file set, a container that defines the contents of multi-dimensional data comprised of many files. OMERO uses Bio-Formats to read files natively, and provides a flexible access mechanism that supports several different storage and access strategies. These new capabilities of OMERO and Bio-Formats make them especially useful in imaging applications like digital pathology, high content screening and light sheet microscopy that routinely create large datasets that must be managed and analyzed.
Karasek, Robert; Choi, BongKyoo; Ostergren, Per-Olof; Ferrario, Marco; De Smet, Patrick
2007-01-01
The comparative scale properties of "JCQ-like" questionnaires with respect to the JCQ have been little studied. We assessed the validity and reliability of two methods for generating comparable scale scores between the Job Content Questionnaire (JCQ) and JCQ-like questionnaires in sub-populations of the large Job Stress, Absenteeism and Coronary Heart Disease European Cooperative (JACE) study: the Swedish version of the Demand-Control Questionnaire (DCQ) and a transformed Multinational Monitoring of Trends and Determinants in Cardiovascular Disease Project (MONICA) questionnaire. A random population sample of all Malmo males and females aged 52-58 years (n = 682) was given a new test questionnaire containing both instruments (the JCQ and the DCQ). Comparability-facilitating algorithms were created (Method I). For the transformed Milan MONICA questionnaire, a simple weighting system was used (Method II). The converted scale scores from the JCQ-like questionnaires were found to be reliable and highly correlated with those of the original JCQ. However, agreements for the high job strain group between the JCQ and the DCQ, and between the JCQ and the DCQ with Method I applied, were only moderate (kappa). Use of a multiple-level job strain scale generated higher levels of job strain agreement, as did a new job strain definition that excludes the intermediate levels of the job strain distribution. The two methods were valid and generally reliable.
Is There Future Utility in Nuclear Weapons Nuclear Weapons Save Lives
2014-02-13
operate with relative impunity short of large-scale conflict. Some point to a nuclear India and Pakistan as an example of instability concern. In...1997, South Asia observer Neil Joeck argued that “India and Pakistan’s nuclear capabilities have not created strategic stability (and) do not reduce...elimination of illiteracy, provision of sustainable energy, debt relief for developing countries, clearance of landmines and more has been estimated
William W. Oliver
2001-01-01
Conflicts over changing demands on our increasingly scarce stands of late successional ponderosa pine could be abated by increasing the proportion of stands with late successional attributes in the forest land base. However, we don't know whether these attributes can be developed through the management of younger stands. Nor do we know whether late successional...
Selected Papers on Low-Energy Antiprotons and Possible Applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Noble, Robert
1998-09-19
The only realistic means by which to create a facility at Fermilab to produce large amounts of low energy antiprotons is to use resources which already exist. There is simply too little money and manpower at this point in time to generate new accelerators on a time scale before the turn of the century. Therefore, innovation is required to modify existing equipment to provide the services required by experimenters.
Strategic Plan for the U.S. Climate Change Science Program
2003-07-01
the Amazon have been conducted in the framework of the Large-Scale Biosphere-Atmosphere Experiment in Amazonia (LBA), a cooperative international...native rodent, the deer mouse (Peromyscus maniculatus). Public health officials wanted to understand the cause of the outbreak so they could develop...winter of 1992, were thought to have created favorable conditions for an increase in local rodent populations. It was suggested that a cascading series of
Jennifer C. Jenkins; Richard A. Birdsey
2000-01-01
As interest grows in the role of forest growth in the carbon cycle, and as simulation models are applied to predict future forest productivity at large spatial scales, the need for reliable and field-based data for evaluation of model estimates is clear. We created estimates of potential forest biomass and annual aboveground production for the Chesapeake Bay watershed...
Full-color large-scaled computer-generated holograms for physical and non-physical objects
NASA Astrophysics Data System (ADS)
Matsushima, Kyoji; Tsuchiyama, Yasuhiro; Sonobe, Noriaki; Masuji, Shoya; Yamaguchi, Masahiro; Sakamoto, Yuji
2017-05-01
Several full-color high-definition CGHs are created for reconstructing 3D scenes that include real physical objects. The fields of the physical objects are generated or captured by employing three techniques: a 3D scanner, synthetic aperture digital holography, and multi-viewpoint images. Full-color reconstruction of the high-definition CGHs is realized by RGB color filters. The optical reconstructions are presented to verify these techniques.
NASA Technical Reports Server (NTRS)
Wang, Wenlong; Mandra, Salvatore; Katzgraber, Helmut G.
2016-01-01
In this paper, we propose a patch planting method for creating arbitrarily large spin glass instances with known ground states. The scaling of the computational complexity of these instances with various block numbers and sizes is investigated and compared with random instances using population annealing Monte Carlo and the quantum annealing DW2X machine. The method can be useful for benchmarking tests for future generation quantum annealing machines, classical and quantum mechanical optimization algorithms.
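As a toy illustration of planting a known ground state (using the simple gauge trick, which is much cruder than the paper's patch-planting construction but conveys the idea of building an instance whose optimum is known in advance), consider:

```python
# Toy illustration of "planting" a known Ising ground state via the gauge
# trick (every bond made satisfied). This is NOT the paper's patch-planting
# method, which scales to large instances with nontrivial hardness.
import numpy as np

rng = np.random.default_rng(1)
n = 8
s = rng.choice([-1, 1], size=n)          # the planted ground state

mag = np.triu(rng.random((n, n)), k=1)   # random positive bond magnitudes
J = mag * np.outer(s, s)                 # J_ij = |J_ij| s_i s_j satisfies all bonds

def energy(conf):
    return -conf @ J @ conf              # H = -sum_{i<j} J_ij s_i s_j

configs = (np.array(c) * 2 - 1 for c in np.ndindex(*(2,) * n))
best = min(energy(c) for c in configs)   # brute-force over 2^n states
assert np.isclose(best, energy(s))       # the planted state is a ground state
```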
ERIC Educational Resources Information Center
Lewin, Keith M.; Sabates, Ricardo
2011-01-01
This paper explores patterns of growth in participation in six Anglophone and seven Francophone countries in SSA. The countries are Kenya, Malawi, Nigeria, Tanzania, Uganda, Zambia, Benin, Burkina Faso, Cameroon, Madagascar, Mali, Niger and Senegal. These countries all have large scale Universal Primary Education programmes and all have…
ERIC Educational Resources Information Center
Vien, Nguyen Khac
Educational policy in Viet Nam has closely followed the revolutionary movement. In the essentially democratic period from 1945 to 1960, Viet Nam created a nationwide 10-grade school system and fought illiteracy on a large scale. By 1960, as socialism began to predominate, especially in the North, traditional educational methods and values began to…
Characterizing Multiple Wireless Sensor Networks for Large-Scale Radio Tomography
2015-03-01
with other transceivers over a wireless frequency. A base station transceiver collects the information and processes the information into something...or most other obstructions in between the two links [4]. A base station transceiver is connected to a processing computer to collect the RSS of each... transceivers at four different heights to create a Three-Dimensional (3-D) RTI network. Using shadowing-based RTI, this research demonstrated that RTI
Optical metasurfaces for high angle steering at visible wavelengths
Lin, Dianmin; Melli, Mauro; Poliakov, Evgeni; ...
2017-05-23
Metasurfaces have facilitated the replacement of conventional optical elements with ultrathin and planar photonic structures. Previous designs of metasurfaces were limited to small deflection angles and small ranges of the angle of incidence. Here, we have created two types of Si-based metasurfaces to steer visible light to a large deflection angle. These structures exhibit high diffraction efficiencies over a broad range of angles of incidence. We have demonstrated metasurfaces working both in transmission and reflection modes based on conventional thin film silicon processes that are suitable for the large-scale fabrication of high-performance devices.
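The steering geometry is governed by the standard grating relation (general diffraction physics, not a formula quoted from the paper): for grating period p, incidence angle θ_i and diffraction order m,

```latex
\[
p\left(\sin\theta_{m}-\sin\theta_{i}\right)=m\,\lambda
\]
```

so steering visible light to large angles requires periods comparable to the wavelength, which is why the subwavelength Si structures matter here.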
An effective online data monitoring and saving strategy for large-scale climate simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xian, Xiaochen; Archibald, Rick; Mayer, Benjamin
Large-scale climate simulation models have been developed and widely used to generate historical data and study future climate scenarios. These simulation models often have to run for a couple of months to understand the changes in the global climate over the course of decades. This long-duration simulation process creates a huge amount of data with both high temporal and spatial resolution information; however, how to effectively monitor and record the climate changes based on these large-scale simulation results that are continuously produced in real time still remains to be resolved. Due to the slow process of writing data to disk, the current practice is to save a snapshot of the simulation results at a constant, slow rate although the data generation process runs at a very high speed. This study proposes an effective online data monitoring and saving strategy over the temporal and spatial domains with the consideration of practical storage and memory capacity constraints. Finally, our proposed method is able to intelligently select and record the most informative extreme values in the raw data generated from real-time simulations in the context of better monitoring climate changes.
An effective online data monitoring and saving strategy for large-scale climate simulations
Xian, Xiaochen; Archibald, Rick; Mayer, Benjamin; ...
2018-01-22
Large-scale climate simulation models have been developed and widely used to generate historical data and study future climate scenarios. These simulation models often have to run for a couple of months to understand the changes in the global climate over the course of decades. This long-duration simulation process creates a huge amount of data with both high temporal and spatial resolution information; however, how to effectively monitor and record the climate changes based on these large-scale simulation results that are continuously produced in real time still remains to be resolved. Due to the slow process of writing data to disk, the current practice is to save a snapshot of the simulation results at a constant, slow rate although the data generation process runs at a very high speed. This study proposes an effective online data monitoring and saving strategy over the temporal and spatial domains with the consideration of practical storage and memory capacity constraints. Finally, our proposed method is able to intelligently select and record the most informative extreme values in the raw data generated from real-time simulations in the context of better monitoring climate changes.
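One simple building block for this kind of online selection is a fixed-memory buffer of the most extreme values seen so far; the min-heap sketch below is a stand-in for the paper's strategy, which additionally budgets across the temporal and spatial domains.

```python
# Minimal sketch: stream through simulation output and keep only the k most
# extreme values seen so far, in O(k) memory, using a min-heap. A stand-in
# for the paper's strategy, not its actual implementation.
import heapq
import random

def top_k_extremes(stream, k):
    heap = []                                # min-heap of the k largest values
    for t, value in stream:
        if len(heap) < k:
            heapq.heappush(heap, (value, t))
        elif value > heap[0][0]:
            heapq.heapreplace(heap, (value, t))
    return sorted(heap, reverse=True)

sim_output = ((t, random.gauss(15.0, 5.0)) for t in range(100_000))
print(top_k_extremes(sim_output, k=10))      # ten most extreme samples
```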
Reid, Beth; Ho, Shirley; Padmanabhan, Nikhil; ...
2015-11-17
The Baryon Oscillation Spectroscopic Survey (BOSS), part of the Sloan Digital Sky Survey (SDSS) III project, has provided the largest survey of galaxy redshifts available to date, in terms of both the number of galaxy redshifts measured by a single survey, and the effective cosmological volume covered. Key to analysing the clustering of these data to provide cosmological measurements is understanding the detailed properties of this sample. Potential issues include variations in the target catalogue caused by changes either in the targeting algorithm or properties of the data used, the pattern of spectroscopic observations, the spatial distribution of targets for which redshifts were not obtained, and variations in the target sky density due to observational systematics. We document here the target selection algorithms used to create the galaxy samples that comprise BOSS. We also present the algorithms used to create large-scale structure catalogues for the final Data Release (DR12) samples and the associated random catalogues that quantify the survey mask. The algorithms are an evolution of those used by the BOSS team to construct catalogues from earlier data, and have been designed to accurately quantify the galaxy sample. Furthermore, the code used, designated mksample, is released with this paper.
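A minimal sketch of what a random catalogue does: sample points uniformly on the sphere and keep those inside the survey footprint. The rectangular mask below is an invented placeholder for the real BOSS geometry handled by mksample.

```python
# Illustrative sketch of building a random catalogue that traces a survey
# footprint. The mask function is a placeholder, not the BOSS survey mask.
import numpy as np

rng = np.random.default_rng(42)

def uniform_sky(n):
    ra = rng.uniform(0.0, 360.0, n)                      # degrees
    dec = np.degrees(np.arcsin(rng.uniform(-1, 1, n)))   # uniform on sphere
    return ra, dec

def in_mask(ra, dec):                                    # placeholder footprint
    return (ra > 110) & (ra < 260) & (dec > -5) & (dec < 60)

ra, dec = uniform_sky(1_000_000)
keep = in_mask(ra, dec)
randoms = np.column_stack([ra[keep], dec[keep]])         # the random catalogue
```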
Extra-Tropical Cyclones at Climate Scales: Comparing Models to Observations
NASA Astrophysics Data System (ADS)
Tselioudis, G.; Bauer, M.; Rossow, W.
2009-04-01
Climate is often defined as the accumulation of weather, and weather is not the concern of climate models. Justification for this latter sentiment has long been hidden behind coarse model resolutions and blunt validation tools based on climatological maps. The spatial-temporal resolutions of today's climate models and observations are converging onto meteorological scales, however, which means that with the correct tools we can test the largely unproven assumption that climate model weather is correct enough that its accumulation results in a robust climate simulation. Towards this effort we introduce a new tool for extracting detailed cyclone statistics from observations and climate model output. These include the usual cyclone characteristics (centers, tracks), but also adaptive cyclone-centric composites. We have created a novel dataset, the MAP Climatology of Mid-latitude Storminess (MCMS), which provides a detailed 6 hourly assessment of the areas under the influence of mid-latitude cyclones, using a search algorithm that delimits the boundaries of each system from the outer-most closed SLP contour. Using this we then extract composites of cloud, radiation, and precipitation properties from sources such as ISCCP and GPCP to create a large comparative dataset for climate model validation. A demonstration of the potential usefulness of these tools in process-based climate model evaluation studies will be shown.
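One step of such a cyclone-identification scheme can be sketched as finding local minima of a sea-level-pressure field; the code below is a simplified stand-in (the MCMS contour-walking to the outermost closed SLP contour is considerably more involved and is not reproduced here).

```python
# Hedged sketch of cyclone-center detection: grid-point minima of SLP.
# Thresholds and the toy field are illustrative, not MCMS parameters.
import numpy as np
from scipy.ndimage import minimum_filter

def cyclone_centers(slp, size=5, max_center_pressure=1005.0):
    """Return (row, col) indices of local SLP minima below a threshold (hPa)."""
    local_min = slp == minimum_filter(slp, size=size)
    deep_enough = slp < max_center_pressure
    return np.argwhere(local_min & deep_enough)

slp = 1012.0 + np.random.default_rng(0).normal(0, 4, (90, 180))  # toy field
print(cyclone_centers(slp))
```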
NASA Astrophysics Data System (ADS)
Harrington, Kathleen; CLASS Collaboration
2018-01-01
The search for inflationary primordial gravitational waves and the optical depth to reionization, both through their imprint on the large angular scale correlations in the polarization of the cosmic microwave background (CMB), has created the need for high sensitivity measurements of polarization across large fractions of the sky at millimeter wavelengths. These measurements are subject to instrumental and atmospheric 1/f noise, which has motivated the development of polarization modulators to facilitate the rejection of these large systematic effects. Variable-delay polarization modulators (VPMs) are used in the Cosmology Large Angular Scale Surveyor (CLASS) telescopes as the first element in the optical chain to rapidly modulate the incoming polarization. VPMs consist of a linearly polarizing wire grid in front of a movable flat mirror; varying the distance between the grid and the mirror produces a changing phase shift between polarization states parallel and perpendicular to the grid which modulates Stokes U (linear polarization at 45°) and Stokes V (circular polarization). The reflective and scalable nature of the VPM enables its placement as the first optical element in a reflecting telescope. This simultaneously allows a lock-in style polarization measurement and the separation of sky polarization from any instrumental polarization farther along in the optical chain. The Q-Band CLASS VPM was the first VPM to begin observing the CMB full time in 2016. I will be presenting its design and characterization as well as demonstrating how modulating polarization significantly rejects atmospheric and instrumental long time scale noise.
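In an idealized monochromatic description (a textbook retarder model, not the CLASS instrument's measured transfer function), a grid-mirror separation z introduces a phase delay φ that rotates the Stokes vector in the U-V plane:

```latex
\[
\phi(z)\approx\frac{4\pi z}{\lambda}\cos\theta,
\qquad
U' = U\cos\phi - V\sin\phi,
\qquad
V' = U\sin\phi + V\cos\phi
\]
```

Sweeping z thus modulates U and V while leaving Q unmodulated, which is what enables the lock-in style measurement described above.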
Using First Differences to Reduce Inhomogeneity in Radiosonde Temperature Datasets.
NASA Astrophysics Data System (ADS)
Free, Melissa; Angell, James K.; Durre, Imke; Lanzante, John; Peterson, Thomas C.; Seidel, Dian J.
2004-11-01
The utility of a “first difference” method for producing temporally homogeneous large-scale mean time series is assessed. Starting with monthly averages, the method involves dropping data around the time of suspected discontinuities and then calculating differences in temperature from one year to the next, resulting in a time series of year-to-year differences for each month at each station. These first difference time series are then combined to form large-scale means, and mean temperature time series are constructed from the first difference series. When applied to radiosonde temperature data, the method introduces random errors that decrease with the number of station time series used to create the large-scale time series and increase with the number of temporal gaps in the station time series. Root-mean-square errors for annual means of datasets produced with this method using over 500 stations are estimated at no more than 0.03 K, with errors in trends less than 0.02 K decade⁻¹ for 1960–97 at 500 mb. For a 50-station dataset, errors in trends in annual global means introduced by the first differencing procedure may be as large as 0.06 K decade⁻¹ (for six breaks per series), which is greater than the standard error of the trend. Although the first difference method offers significant resource and labor advantages over methods that attempt to adjust the data, it introduces an error in large-scale mean time series that may be unacceptable in some cases.
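The first difference procedure itself is compact enough to sketch: difference each station series year to year (gaps propagate as missing values and are skipped in the average), average the differences across stations, and rebuild the large-scale series by cumulative summation. The data below are invented for illustration.

```python
# Minimal sketch of the first-difference method: per-station year-to-year
# differences, cross-station averaging, cumulative-sum reconstruction.
import numpy as np

stations = np.array([
    [0.1, 0.3, np.nan, 0.6, 0.5, 0.9],   # one series per station (deg C)
    [0.0, 0.2, 0.4, 0.5, np.nan, 0.8],   # NaN marks dropped/suspect data
    [0.2, 0.1, 0.3, 0.7, 0.6, 1.0],
])

diffs = np.diff(stations, axis=1)        # year-to-year first differences
mean_diff = np.nanmean(diffs, axis=0)    # large-scale mean of differences
large_scale = np.concatenate([[0.0], np.cumsum(mean_diff)])  # rebuilt series
print(large_scale)
```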
Sundramoorthy, Ashok K.; Wang, Yilei; Wang, Jing; Che, Jianfei; Thong, Ya Xuan; Lu, Albert Chee W.; Chan-Park, Mary B.
2015-01-01
Graphene is a promising candidate material for transparent conductive films because of its excellent conductivity and one-carbon-atom thickness. Graphene oxide flakes prepared by Hummers method are typically several microns in size and must be pieced together in order to create macroscopic films. We report a macro-scale thin film fabrication method which employs a three-dimensional (3-D) surfactant, 4-sulfocalix[4]arene (SCX), as a lateral aggregating agent. After electrochemical exfoliation, the partially oxidized graphene (oGr) flakes are dispersed with SCX. The SCX forms micelles, which adsorb on the oGr flakes to enhance their dispersion and also promote aggregation into large-scale thin films under vacuum filtration. A thin oGr/SCX film can be shaved off from the aggregated oGr/SCX cake by immersing the cake in water. The oGr/SCX thin film floating on the water can subsequently be lifted from the water surface with a substrate. The reduced oGr (red-oGr) films can be as thin as 10−20 nm with a transparency of >90% and sheet resistance of 890 ± 47 kΩ/sq. This method of electrochemical exfoliation followed by SCX-assisted suspension and hydrazine reduction avoids using large amounts of strong acid (unlike Hummers method), is relatively simple, and can easily form a large-scale conductive and transparent film from an oGr/SCX suspension. PMID:26040436
Li, Jian-Feng; Bush, Jenifer; Xiong, Yan; Li, Lei; McCormack, Matthew
2011-01-01
Protein-protein interactions (PPIs) constitute the regulatory network that coordinates diverse cellular functions. There are growing needs in plant research for creating protein interaction maps behind complex cellular processes and at a systems biology level. However, only a few approaches have been successfully used for large-scale surveys of PPIs in plants, each having advantages and disadvantages. Here we present split firefly luciferase complementation (SFLC) as a highly sensitive and noninvasive technique for in planta PPI investigation. In this assay, the separate halves of a firefly luciferase can come into close proximity and transiently restore its catalytic activity only when their fusion partners, namely the two proteins of interest, interact with each other. The assay becomes quantitative and amenable to high throughput when the Arabidopsis mesophyll protoplast system and a microplate luminometer are employed for protein expression and luciferase measurement, respectively. Using the SFLC assay, we could monitor the dynamics of rapamycin-induced and ascomycin-disrupted interaction between Arabidopsis FRB and human FKBP proteins in a near real-time manner. As a proof of concept for a large-scale PPI survey, we further applied the SFLC assay to testing 132 binary PPIs among 8 auxin response factors (ARFs) and 12 Aux/IAA proteins from Arabidopsis. Our results demonstrate that the SFLC assay is ideal for in vivo quantitative PPI analysis in plant cells and is particularly powerful for large-scale binary PPI screens.
Kinetic dissipation and anisotropic heating in a turbulent collisionless plasma
DOE Office of Scientific and Technical Information (OSTI.GOV)
Parashar, T. N.; Shay, M. A.; Cassak, P. A.
The kinetic evolution of the Orszag-Tang vortex is studied using collisionless hybrid simulations. In magnetohydrodynamics (MHD) this configuration leads rapidly to broadband turbulence. At large length scales, the evolution of the hybrid simulations is very similar to MHD, with magnetic power spectra displaying scaling similar to a Kolmogorov scaling of -5/3. At small scales, differences from MHD arise, as energy dissipates into heat almost exclusively through the magnetic field. The magnetic energy spectrum of the hybrid simulation shows a break where linear theory predicts that the Hall term in Ohm's law becomes significant, leading to dispersive kinetic Alfvén waves. A key result is that protons are heated preferentially in the plane perpendicular to the mean magnetic field, creating a proton temperature anisotropy of the type observed in the corona and solar wind.
Systematic Analysis of Splice-Site-Creating Mutations in Cancer
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jayasinghe, Reyka G.; Cao, Song; Gao, Qingsong
For the past decade, cancer genomic studies have focused on mutations leading to splice-site disruption, overlooking those having splice-creating potential. Here, we applied a bioinformatic tool, MiSplice, for the large-scale discovery of splice-site-creating mutations (SCMs) across 8,656 TCGA tumors. We report 1,964 originally mis-annotated mutations having clear evidence of creating alternative splice junctions. TP53 and GATA3 have 26 and 18 SCMs, respectively, and ATRX has 5 from lower-grade gliomas. Mutations in 11 genes, including PARP1, BRCA1, and BAP1, were experimentally validated for splice-site-creating function. Notably, we found that neoantigens induced by SCMs are likely several folds more immunogenic compared to missense mutations, exemplified by the recurrent GATA3 SCM. Further, high expression of PD-1 and PD-L1 was observed in tumors with SCMs, suggesting candidates for immune blockade therapy. Finally, our work highlights the importance of integrating DNA and RNA data for understanding the functional and the clinical implications of mutations in human diseases.
Systematic Analysis of Splice-Site-Creating Mutations in Cancer.
Jayasinghe, Reyka G; Cao, Song; Gao, Qingsong; Wendl, Michael C; Vo, Nam Sy; Reynolds, Sheila M; Zhao, Yanyan; Climente-González, Héctor; Chai, Shengjie; Wang, Fang; Varghese, Rajees; Huang, Mo; Liang, Wen-Wei; Wyczalkowski, Matthew A; Sengupta, Sohini; Li, Zhi; Payne, Samuel H; Fenyö, David; Miner, Jeffrey H; Walter, Matthew J; Vincent, Benjamin; Eyras, Eduardo; Chen, Ken; Shmulevich, Ilya; Chen, Feng; Ding, Li
2018-04-03
For the past decade, cancer genomic studies have focused on mutations leading to splice-site disruption, overlooking those having splice-creating potential. Here, we applied a bioinformatic tool, MiSplice, for the large-scale discovery of splice-site-creating mutations (SCMs) across 8,656 TCGA tumors. We report 1,964 originally mis-annotated mutations having clear evidence of creating alternative splice junctions. TP53 and GATA3 have 26 and 18 SCMs, respectively, and ATRX has 5 from lower-grade gliomas. Mutations in 11 genes, including PARP1, BRCA1, and BAP1, were experimentally validated for splice-site-creating function. Notably, we found that neoantigens induced by SCMs are likely several folds more immunogenic compared to missense mutations, exemplified by the recurrent GATA3 SCM. Further, high expression of PD-1 and PD-L1 was observed in tumors with SCMs, suggesting candidates for immune blockade therapy. Our work highlights the importance of integrating DNA and RNA data for understanding the functional and the clinical implications of mutations in human diseases.
Systematic Analysis of Splice-Site-Creating Mutations in Cancer
Jayasinghe, Reyka G.; Cao, Song; Gao, Qingsong; ...
2018-04-05
For the past decade, cancer genomic studies have focused on mutations leading to splice-site disruption, overlooking those having splice-creating potential. Here, we applied a bioinformatic tool, MiSplice, for the large-scale discovery of splice-site-creating mutations (SCMs) across 8,656 TCGA tumors. We report 1,964 originally mis-annotated mutations having clear evidence of creating alternative splice junctions. TP53 and GATA3 have 26 and 18 SCMs, respectively, and ATRX has 5 from lower-grade gliomas. Mutations in 11 genes, including PARP1, BRCA1, and BAP1, were experimentally validated for splice-site-creating function. Notably, we found that neoantigens induced by SCMs are likely several folds more immunogenic compared to missense mutations, exemplified by the recurrent GATA3 SCM. Further, high expression of PD-1 and PD-L1 was observed in tumors with SCMs, suggesting candidates for immune blockade therapy. Finally, our work highlights the importance of integrating DNA and RNA data for understanding the functional and the clinical implications of mutations in human diseases.
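As a toy illustration of the underlying idea (MiSplice itself works from RNA-seq junction evidence; this simplified motif check is only for intuition), a single substitution can introduce a canonical GT donor dinucleotide and hence a candidate new splice junction:

```python
# Toy check: does a substitution create a new canonical GT splice donor?
# The sequence and mutation are invented; real SCM calling uses RNA-seq
# junction support, not motif scanning alone.
REF = "CAGGAAGCACCTTGGA"

def creates_donor(ref, pos, alt):
    """True if substituting alt at pos creates a new GT dinucleotide."""
    mut = ref[:pos] + alt + ref[pos + 1:]
    before = {i for i in range(len(ref) - 1) if ref[i:i + 2] == "GT"}
    after = {i for i in range(len(mut) - 1) if mut[i:i + 2] == "GT"}
    return bool(after - before)

print(creates_donor(REF, 4, "T"))  # A->T creates a GT donor at position 3
```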
Creating and validating GIS measures of urban design for health research.
Purciel, Marnie; Neckerman, Kathryn M; Lovasi, Gina S; Quinn, James W; Weiss, Christopher; Bader, Michael D M; Ewing, Reid; Rundle, Andrew
2009-12-01
Studies relating urban design to health have been impeded by the unfeasibility of conducting field observations across large areas and the lack of validated objective measures of urban design. This study describes measures for five dimensions of urban design - imageability, enclosure, human scale, transparency, and complexity - created using public geographic information systems (GIS) data from the US Census and city and state government. GIS measures were validated for a sample of 588 New York City block faces using a well-documented field observation protocol. Correlations between GIS and observed measures ranged from 0.28 to 0.89. Results show valid urban design measures can be constructed from digital sources.
Multi-dimensional PIC-simulations of parametric instabilities for shock-ignition conditions
NASA Astrophysics Data System (ADS)
Riconda, C.; Weber, S.; Klimo, O.; Héron, A.; Tikhonchuk, V. T.
2013-11-01
Laser-plasma interaction is investigated for conditions relevant to the shock-ignition (SI) scheme of inertial confinement fusion using two-dimensional particle-in-cell (PIC) simulations of an intense laser beam propagating in a hot, large-scale, non-uniform plasma. The temporal evolution and interdependence of Raman (SRS) and Brillouin (SBS) side- and backscattering as well as Two-Plasmon Decay (TPD) are studied. TPD develops in concomitance with SRS, creating a broad spectrum of plasma waves near the quarter-critical density. They are rapidly saturated due to plasma cavitation within a few picoseconds. The hot electron spectrum created by SRS and TPD is relatively soft, limited to energies below one hundred keV.
Web-based segmentation and display of three-dimensional radiologic image data.
Silverstein, J; Rubenstein, J; Millman, A; Panko, W
1998-01-01
In many clinical circumstances, viewing sequential radiological image data as three-dimensional models is proving beneficial. However, designing customized computer-generated radiological models is beyond the scope of most physicians, due to specialized hardware and software requirements. We have created a simple method for Internet users to remotely construct and locally display three-dimensional radiological models using only a standard web browser. Rapid model construction is achieved by distributing the hardware intensive steps to a remote server. Once created, the model is automatically displayed on the requesting browser and is accessible to multiple geographically distributed users. Implementation of our server software on large scale systems could be of great service to the worldwide medical community.
Creating and validating GIS measures of urban design for health research
Purciel, Marnie; Neckerman, Kathryn M.; Lovasi, Gina S.; Quinn, James W.; Weiss, Christopher; Bader, Michael D.M.; Ewing, Reid; Rundle, Andrew
2012-01-01
Studies relating urban design to health have been impeded by the unfeasibility of conducting field observations across large areas and the lack of validated objective measures of urban design. This study describes measures for five dimensions of urban design – imageability, enclosure, human scale, transparency, and complexity – created using public geographic information systems (GIS) data from the US Census and city and state government. GIS measures were validated for a sample of 588 New York City block faces using a well-documented field observation protocol. Correlations between GIS and observed measures ranged from 0.28 to 0.89. Results show valid urban design measures can be constructed from digital sources. PMID:22956856
Thatcher, T L; Wilson, D J; Wood, E E; Craig, M J; Sextro, R G
2004-08-01
Scale modeling is a useful tool for analyzing complex indoor spaces. Scale model experiments can reduce experimental costs, improve control of flow and temperature conditions, and provide a practical method for pretesting full-scale system modifications. However, changes in physical scale and working fluid (air or water) can complicate interpretation of the equivalent effects in the full-scale structure. This paper presents a detailed scaling analysis of a water tank experiment designed to model a large indoor space, and experimental results obtained with this model to assess the influence of furniture and people on the pollutant concentration field at breathing height. Theoretical calculations are derived for predicting the effects of the loss of molecular diffusion, small-scale eddies, turbulent kinetic energy, and turbulent mass diffusivity in a scale model, even without Reynolds number matching. Pollutant dispersion experiments were performed in a water-filled 30:1 scale model of a large room, using uranine dye injected continuously from a small point source. Pollutant concentrations were measured in a plane, using laser-induced fluorescence techniques, for three interior configurations: unobstructed, table-like obstructions, and table-like and figure-like obstructions. Concentrations within the measurement plane varied by more than an order of magnitude, even after the concentration field was fully developed. Objects in the model interior had a significant effect on both the concentration field and fluctuation intensity in the measurement plane. PRACTICAL IMPLICATION: This scale model study demonstrates both the utility of scale models for investigating dispersion in indoor environments and the significant impact of turbulence created by furnishings and people on pollutant transport from floor-level sources. In a room with no furniture or occupants, the average concentration can vary by about a factor of 3 across the room. Adding furniture and occupants can increase this spatial variation by another factor of 3.
Native fish conservation areas: A vision for large-scale conservation of native fish communities
Williams, Jack E.; Williams, Richard N.; Thurow, Russell F.; Elwell, Leah; Philipp, David P.; Harris, Fred A.; Kershner, Jeffrey L.; Martinez, Patrick J.; Miller, Dirk; Reeves, Gordon H.; Frissell, Christopher A.; Sedell, James R.
2011-01-01
The status of freshwater fishes continues to decline despite substantial conservation efforts to reverse this trend and recover threatened and endangered aquatic species. Lack of success is partially due to working at smaller spatial scales and focusing on habitats and species that are already degraded. Protecting entire watersheds and aquatic communities, which we term "native fish conservation areas" (NFCAs), would complement existing conservation efforts by protecting intact aquatic communities while allowing compatible uses. Four critical elements need to be met within a NFCA: (1) maintain processes that create habitat complexity, diversity, and connectivity; (2) nurture all of the life history stages of the fishes being protected; (3) include a large enough watershed to provide long-term persistence of native fish populations; and (4) provide management that is sustainable over time. We describe how a network of protected watersheds could be created that would anchor aquatic conservation needs in river basins across the country.
Incorporating Measurement Non-Equivalence in a Cross-Study Latent Growth Curve Analysis
Flora, David B.; Curran, Patrick J.; Hussong, Andrea M.; Edwards, Michael C.
2009-01-01
A large literature emphasizes the importance of testing for measurement equivalence in scales that may be used as observed variables in structural equation modeling applications. When the same construct is measured across more than one developmental period, as in a longitudinal study, it can be especially critical to establish measurement equivalence, or invariance, across the developmental periods. Similarly, when data from more than one study are combined into a single analysis, it is again important to assess measurement equivalence across the data sources. Yet, how to incorporate non-equivalence when it is discovered is not well described for applied researchers. Here, we present an item response theory approach that can be used to create scale scores from measures while explicitly accounting for non-equivalence. We demonstrate these methods in the context of a latent curve analysis in which data from two separate studies are combined to create a single longitudinal model spanning several developmental periods. PMID:19890440
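The scale scores in question rest on item response models such as the two-parameter logistic (2PL) model, shown here in textbook form (the paper's full approach also allows group-specific parameters for non-invariant items):

```latex
\[
P\!\left(y_{ij}=1\mid\theta_{i}\right)
  =\frac{1}{1+\exp\!\left[-a_{j}\!\left(\theta_{i}-b_{j}\right)\right]}
\]
```

Here θ_i is the latent trait of person i, and a_j and b_j are the discrimination and difficulty of item j; non-equivalence is accommodated by letting a_j and b_j differ across studies for items that fail invariance tests.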
Optical Detection and Sizing of Single Nano-Particles Using Continuous Wetting Films
Hennequin, Yves; McLeod, Euan; Mudanyali, Onur; Migliozzi, Daniel; Ozcan, Aydogan; Dinten, Jean-Marc
2013-01-01
The physical interaction between nano-scale objects and liquid interfaces can create unique optical properties, enhancing the signatures of the objects with sub-wavelength features. Here we show that the evaporation on a wetting substrate of a polymer solution containing sub-micrometer or nano-scale particles creates liquid micro-lenses that arise from the local deformations of the continuous wetting film. These micro-lenses have properties similar to axicon lenses that are known to create beams with a long depth of focus. This enhanced depth of focus allows detection of single nanoparticles using a low magnification microscope objective lens, achieving a relatively wide field-of-view, while also lifting the constraints on precise focusing onto the object plane. Hence, by creating these liquid axicon lenses through spatial deformations of a continuous thin wetting film, we transfer the challenge of imaging individual nano-particles to detecting the light focused by these lenses. As a proof of concept, we demonstrate the detection and sizing of single nano-particles (100 and 200 nm), CpGV granuloviruses as well as Staphylococcus epidermidis bacteria over a wide field of view of e.g., 5.10×3.75 mm2 using a ×5 objective lens with a numerical aperture of 0.15. In addition to conventional lens-based microscopy, this continuous wetting film based approach is also applicable to lensfree computational on-chip imaging, which can be used to detect single nano-particles over a large field-of-view of e.g., >20-30 mm2. These results could be especially useful for high-throughput field-analysis of nano-scale objects using compact and cost-effective microscope designs. PMID:23889001
NASA Astrophysics Data System (ADS)
Hussain, M.; Chen, D.
2014-11-01
Buildings, the basic unit of an urban landscape, host most of its socio-economic activities and play an important role in the creation of urban land-use patterns. The spatial arrangement of different building types creates varied urban land-use clusters which can provide insight into the relationships between social, economic, and living spaces. The classification of such urban clusters can help in policy-making and resource management. In many countries, including the UK, no national-level cadastral database containing information on individual building types exists in the public domain. In this paper, we present a framework for inferring the functional types of buildings based on the analysis of their form (e.g. geometrical properties such as area and perimeter, and layout) and spatial relationships from a large topographic and address-based GIS database. Machine learning algorithms along with exploratory spatial analysis techniques are used to create the classification rules. The classification is extended to two further levels based on the functions (use) of buildings derived from address-based data. The developed methodology was applied to the Manchester metropolitan area using the Ordnance Survey's MasterMap®, a large-scale topographic and address-based dataset available for the UK.
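A hedged sketch of the classification step: learn simple rules over building-form features with a decision tree. The feature values and labels below are invented placeholders, not Ordnance Survey MasterMap data or the authors' rule set.

```python
# Illustrative sketch: classify building type from form features (footprint
# area, perimeter) with a decision tree. Synthetic placeholder data only.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(7)
n = 300
area = np.concatenate([rng.normal(90, 20, n),       # houses (m^2)
                       rng.normal(2000, 400, n)])   # commercial/industrial
perimeter = np.concatenate([rng.normal(40, 8, n),
                            rng.normal(220, 40, n)])
X = np.column_stack([area, perimeter])
y = np.array(["residential"] * n + ["non-residential"] * n)

clf = DecisionTreeClassifier(max_depth=3).fit(X, y)
print(clf.predict([[110.0, 45.0], [1800.0, 200.0]]))
```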
Felo, Michael; Christensen, Brandon; Higgins, John
2013-01-01
The bioreactor volume delineating the selection of primary clarification technology is not always easily defined. Development of a commercial scale process for the manufacture of therapeutic proteins requires scale-up from a few liters to thousands of liters. While the separation techniques used for protein purification are largely conserved across scales, the separation techniques for primary cell culture clarification vary with scale. Process models were developed to compare monoclonal antibody production costs using two cell culture clarification technologies. One process model was created for cell culture clarification by disc stack centrifugation with depth filtration. A second process model was created for clarification by multi-stage depth filtration. Analyses were performed to examine the influence of bioreactor volume, product titer, depth filter capacity, and facility utilization on overall operating costs. At bioreactor volumes <1,000 L, clarification using multi-stage depth filtration offers cost savings compared to clarification using centrifugation. For bioreactor volumes >5,000 L, clarification using centrifugation followed by depth filtration offers significant cost savings. For bioreactor volumes of ∼ 2,000 L, clarification costs are similar between depth filtration and centrifugation. At this scale, factors including facility utilization, available capital, ease of process development, implementation timelines, and process performance characterization play an important role in clarification technology selection. In the case study presented, a multi-product facility selected multi-stage depth filtration for cell culture clarification at the 500 and 2,000 L scales of operation. Facility implementation timelines, process development activities, equipment commissioning and validation, scale-up effects, and process robustness are examined. © 2013 American Institute of Chemical Engineers.
Large-Scale Wind Disturbances Promote Tree Diversity in a Central Amazon Forest
Marra, Daniel Magnabosco; Chambers, Jeffrey Q.; Higuchi, Niro; Trumbore, Susan E.; Ribeiro, Gabriel H. P. M.; dos Santos, Joaquim; Negrón-Juárez, Robinson I.; Reu, Björn; Wirth, Christian
2014-01-01
Canopy gaps created by wind-throw events, or blowdowns, create a complex mosaic of forest patches varying in disturbance intensity and recovery in the Central Amazon. Using field and remote sensing data, we investigated the short-term (four-year) effects of large (>2000 m2) blowdown gaps created during a single storm event in January 2005 near Manaus, Brazil, to study (i) how forest structure and composition vary with disturbance gradients and (ii) whether tree diversity is promoted by niche differentiation related to wind-throw events at the landscape scale. In the forest area affected by the blowdown, tree mortality ranged from 0 to 70%, and was highest on plateaus and slopes. Less impacted areas in the region affected by the blowdown had overlapping characteristics with a nearby unaffected forest in tree density (583±46 trees ha−1) (mean±99% Confidence Interval) and basal area (26.7±2.4 m2 ha−1). Highly impacted areas had tree density and basal area as low as 120 trees ha−1 and 14.9 m2 ha−1, respectively. In general, these structural measures correlated negatively with an index of tree mortality intensity derived from satellite imagery. Four years after the blowdown event, differences in size-distribution, fraction of resprouters, floristic composition and species diversity still correlated with disturbance measures such as tree mortality and gap size. Our results suggest that the gradients of wind disturbance intensity encompassed in large blowdown gaps (>2000 m2) promote tree diversity. Specialists for particular disturbance intensities existed along the entire gradient. The existence of species or genera taking an intermediate position between undisturbed and gap specialists led to a peak of rarefied richness and diversity at intermediate disturbance levels. A diverse set of species differing widely in requirements and recruitment strategies forms the initial post-disturbance cohort, thus lending a high resilience towards wind disturbances at the community level. PMID:25099118
Vanacker, Peter; Heldner, Mirjam R; Amiguet, Michael; Faouzi, Mohamed; Cras, Patrick; Ntaios, George; Arnold, Marcel; Mattle, Heinrich P; Gralla, Jan; Fischer, Urs; Michel, Patrik
2016-06-01
Endovascular treatment for acute ischemic stroke with a large vessel occlusion was recently shown to be effective. We aimed to develop a score capable of predicting large vessel occlusions eligible for endovascular treatment during early hospital management. This was a retrospective cohort study at two tertiary Swiss stroke centers, with no interventions. Consecutive acute ischemic stroke patients (1,645 patients; Acute STroke Registry and Analysis of Lausanne registry) who had CT angiography within 6 and 12 hours of symptom onset were categorized according to occlusion site. Demographic and clinical information was used in logistic regression analysis to derive predictors of large vessel occlusion (defined as intracranial carotid, basilar, and M1 segment of middle cerebral artery occlusions). Based on the logistic regression coefficients, an integer score was created and validated internally and externally (848 patients; Bernese Stroke Registry). Large vessel occlusions were present in 316 patients (21%) in the derivation cohort and 566 (28%) in the external validation cohort. Five predictors added significantly to the score: National Institutes of Health Stroke Scale at admission, hemineglect, female sex, atrial fibrillation, and no history of stroke and prestroke handicap (modified Rankin Scale score < 2). Diagnostic accuracy in the internal and external validation cohorts was excellent (area under the receiver operating characteristic curve, 0.84 for both). The score performed slightly better than the National Institutes of Health Stroke Scale alone in terms of prediction error (Wilcoxon signed rank test, p < 0.001) and discriminatory power in the derivation and pooled cohorts (area under the receiver operating characteristic curve, 0.81 vs 0.80; DeLong test, p = 0.02). Our score accurately predicts the presence of emergent large vessel occlusions eligible for endovascular treatment. However, incorporating additional demographic and historical information available on hospital arrival provides minimal incremental predictive value compared with the National Institutes of Health Stroke Scale alone.
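A common way to build such an integer score is to scale and round logistic regression coefficients; the following is a minimal sketch on synthetic data (the coefficients, point values, and scoring rule are illustrative, not the published score):

```python
# Sketch: derive an integer clinical score from logistic-regression coefficients.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 2000
# Hypothetical admission variables (names follow the abstract's predictors)
nihss = rng.integers(0, 30, n)
hemineglect = rng.integers(0, 2, n)
female = rng.integers(0, 2, n)
afib = rng.integers(0, 2, n)
no_prior_stroke = rng.integers(0, 2, n)
X = np.column_stack([nihss, hemineglect, female, afib, no_prior_stroke])

# Synthetic outcome: large vessel occlusion yes/no (made-up true coefficients)
logit = -4 + 0.25 * nihss + 0.8 * hemineglect + 0.3 * female \
        + 0.5 * afib + 0.4 * no_prior_stroke
y = rng.random(n) < 1 / (1 + np.exp(-logit))

model = LogisticRegression(max_iter=1000).fit(X, y)
# Integer points: scale each coefficient by the smallest one and round
beta = model.coef_[0]
points = np.round(beta / np.abs(beta).min()).astype(int)
print(dict(zip(["NIHSS", "neglect", "female", "AF", "no prior stroke"], points)))
```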
Principle of Parsimony, Fake Science, and Scales
NASA Astrophysics Data System (ADS)
Yeh, T. C. J.; Wan, L.; Wang, X. S.
2017-12-01
Considering the difficulties in predicting the exact motions of water molecules, and the scale of our interests (bulk behaviors of many molecules), Fick's law (the diffusion concept) was created to predict solute diffusion processes in space and time. G.I. Taylor (1921) demonstrated that the random motion of molecules reaches the Fickian regime in less than a second if our sampling scale is large enough to reach the ergodic condition. Fick's law is widely accepted for describing molecular diffusion as such. This fits the definition of the parsimony principle at the scale of our concern. Similarly, the advection-dispersion or convection-dispersion equation (ADE or CDE) has been found quite satisfactory for analysis of concentration breakthroughs of solute transport in uniformly packed soil columns. This is attributed to the fact that the solute is often released over the entire cross-section of the column, which samples many pore-scale heterogeneities and meets the ergodicity assumption. Further, the uniformly packed column contains a large number of stationary pore-size heterogeneities. The solute thus reaches the Fickian regime after traveling a short distance along the column. Moreover, breakthrough curves are concentrations integrated over the column cross-section (the scale of our interest), and they meet the ergodicity assumption embedded in the ADE and CDE. To the contrary, the scales of heterogeneity in most groundwater pollution problems evolve as contaminants travel. They are much larger than the scale of our observations and our interests, so that the ergodic and Fickian conditions are difficult to satisfy. Upscaling Fick's law for solute dispersion, and deriving universal rules of dispersion for field- or basin-scale pollution migration, are merely misuses of the parsimony principle and lead to fake science (i.e., the development of theories for predicting processes that cannot be observed). The appropriate principle of parsimony for these situations dictates mapping large-scale heterogeneities in as much detail as possible and adapting Fick's law for the effects of the small-scale heterogeneity that results from our inability to characterize it in detail.
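For reference, Fick's first law and the one-dimensional advection-dispersion equation (ADE) invoked above take the following standard textbook forms (generic notation, not the author's):

```latex
% Fick's first law and the 1-D advection-dispersion equation (ADE):
\[
  q = -D \, \frac{\partial C}{\partial x},
  \qquad
  \frac{\partial C}{\partial t}
    = D \, \frac{\partial^2 C}{\partial x^2}
    - v \, \frac{\partial C}{\partial x},
\]
% where C is solute concentration, q the diffusive flux, v the mean
% pore-water velocity, and D the (Fickian) dispersion coefficient.
```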
Testing the robustness of Citizen Science projects: Evaluating the results of pilot project COMBER.
Chatzigeorgiou, Giorgos; Faulwetter, Sarah; Dailianis, Thanos; Smith, Vincent Stuart; Koulouri, Panagiota; Dounas, Costas; Arvanitidis, Christos
2016-01-01
As a term, Citizen Science (CS) covers a great variety of approaches and scopes involving many different fields of science. The number of relevant projects globally has increased significantly in recent years. Large-scale ecological questions can be answered only through extended observation networks, and CS projects can support this effort. Although the need for such projects is apparent, an important part of the scientific community casts doubt on the reliability of CS data sets. The pilot CS project COMBER was created in order to provide evidence to answer this question for coastal marine biodiversity monitoring. The results of the current analysis show that a carefully designed CS project with clear hypotheses, wide participation and data set validation can be a valuable tool for detecting large-scale, long-term changes in marine biodiversity patterns, and therefore for relevant management and conservation issues.
PLUM: Parallel Load Balancing for Unstructured Adaptive Meshes. Degree awarded by Colorado Univ.
NASA Technical Reports Server (NTRS)
Oliker, Leonid
1998-01-01
Dynamic mesh adaption on unstructured grids is a powerful tool for computing large-scale problems that require grid modifications to efficiently resolve solution features. By locally refining and coarsening the mesh to capture physical phenomena of interest, such procedures make standard computational methods more cost effective. Unfortunately, an efficient parallel implementation of these adaptive methods is rather difficult to achieve, primarily due to the load imbalance created by the dynamically-changing nonuniform grid. This requires significant communication at runtime, leading to idle processors and adversely affecting the total execution time. Nonetheless, it is generally thought that unstructured adaptive-grid techniques will constitute a significant fraction of future high-performance supercomputing. Various dynamic load balancing methods have been reported to date; however, most of them either lack a global view of loads across processors or do not apply their techniques to realistic large-scale applications.
Stauffer, Reto; Mayr, Georg J; Messner, Jakob W; Umlauf, Nikolaus; Zeileis, Achim
2017-06-15
Flexible spatio-temporal models are widely used to create reliable and accurate estimates for precipitation climatologies. Most models are based on square-root-transformed monthly or annual means, for which a normal distribution seems to be appropriate. This assumption becomes invalid on a daily time scale, as the data involve large fractions of zero observations and are limited to non-negative values. We develop a novel spatio-temporal model to estimate the full climatological distribution of precipitation on a daily time scale over complex terrain using a left-censored normal distribution. The results demonstrate that the new method is able to account for the non-normal distribution and the large fraction of zero observations. The new climatology provides the full climatological distribution at a very high spatial and temporal resolution, and is competitive with, or even outperforms, existing methods, even for arbitrary locations.
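A minimal sketch of the left-censored normal likelihood idea on the square-root scale (assuming SciPy is available; the authors' full spatio-temporal model with covariates is not reproduced here):

```python
# Sketch: fit a normal distribution left-censored at zero to daily data where
# dry days pile up as zeros (as in sqrt-transformed precipitation).
import numpy as np
from scipy import stats
from scipy.optimize import minimize

def censored_normal_nll(params, y, threshold=0.0):
    """Negative log-likelihood of a normal left-censored at `threshold`.

    Zeros (dry days) contribute the probability mass below the threshold;
    positive observations contribute the usual normal density.
    """
    mu, log_sigma = params
    sigma = np.exp(log_sigma)
    censored = y <= threshold
    ll = stats.norm.logcdf(threshold, mu, sigma) * censored.sum()
    ll += stats.norm.logpdf(y[~censored], mu, sigma).sum()
    return -ll

rng = np.random.default_rng(2)
latent = rng.normal(loc=-0.5, scale=1.5, size=5000)   # sqrt-precip scale
y = np.maximum(latent, 0.0)                            # censor dry days at zero
fit = minimize(censored_normal_nll, x0=[0.0, 0.0], args=(y,))
print("estimated mu, sigma:", fit.x[0], np.exp(fit.x[1]))
```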
Predictive wind turbine simulation with an adaptive lattice Boltzmann method for moving boundaries
NASA Astrophysics Data System (ADS)
Deiterding, Ralf; Wood, Stephen L.
2016-09-01
Operating horizontal axis wind turbines create large-scale turbulent wake structures that affect the power output of downwind turbines considerably. The computational prediction of this phenomenon is challenging as efficient low dissipation schemes are necessary that represent the vorticity production by the moving structures accurately and that are able to transport wakes without significant artificial decay over distances of several rotor diameters. We have developed a parallel adaptive lattice Boltzmann method for large eddy simulation of turbulent weakly compressible flows with embedded moving structures that considers these requirements rather naturally and enables first principle simulations of wake-turbine interaction phenomena at reasonable computational costs. The paper describes the employed computational techniques and presents validation simulations for the Mexnext benchmark experiments as well as simulations of the wake propagation in the Scaled Wind Farm Technology (SWIFT) array consisting of three Vestas V27 turbines in triangular arrangement.
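For readers unfamiliar with the method class, a generic D2Q9 BGK lattice Boltzmann update looks like the sketch below; it omits the paper's adaptivity, large eddy modeling, and moving boundaries, and all grid and flow parameters are arbitrary:

```python
# Generic D2Q9 lattice Boltzmann (BGK) sketch for weakly compressible 2-D flow
# on a periodic domain; illustrative only, not the authors' adaptive method.
import numpy as np

nx, ny, tau = 200, 80, 0.6                      # grid size and relaxation time
c = np.array([[0,0],[1,0],[0,1],[-1,0],[0,-1],
              [1,1],[-1,1],[-1,-1],[1,-1]])     # lattice velocities
w = np.array([4/9] + [1/9]*4 + [1/36]*4)        # lattice weights

def equilibrium(rho, ux, uy):
    cu = c[:, 0, None, None] * ux + c[:, 1, None, None] * uy
    usq = ux**2 + uy**2
    return rho * w[:, None, None] * (1 + 3*cu + 4.5*cu**2 - 1.5*usq)

rho = np.ones((nx, ny))
ux = 0.05 * np.ones((nx, ny))                   # small initial streamwise flow
uy = 0.01 * np.sin(2*np.pi*np.arange(nx)/nx)[:, None] * np.ones((nx, ny))
f = equilibrium(rho, ux, uy)

for step in range(500):
    # Streaming: shift each population along its lattice velocity (periodic)
    for i in range(9):
        f[i] = np.roll(np.roll(f[i], c[i, 0], axis=0), c[i, 1], axis=1)
    # Macroscopic moments
    rho = f.sum(axis=0)
    ux = (f * c[:, 0, None, None]).sum(axis=0) / rho
    uy = (f * c[:, 1, None, None]).sum(axis=0) / rho
    # BGK collision: relax toward local equilibrium
    f += (equilibrium(rho, ux, uy) - f) / tau

print("mean streamwise velocity:", ux.mean())
```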
Generation of large scale GHZ states with the interactions of photons and quantum-dot spins
NASA Astrophysics Data System (ADS)
Miao, Chun; Fang, Shu-Dong; Dong, Ping; Yang, Ming; Cao, Zhuo-Liang
2018-03-01
We present a deterministic scheme for generating large-scale GHZ states in a cavity-quantum-dot system. A singly charged quantum dot is embedded in a double-sided optical microcavity with partially reflective top and bottom mirrors. The GHZ-type Bell spin state can be created, and two n-spin GHZ states can be perfectly fused into a 2n-spin GHZ state with the help of n ancilla single-photon pulses. The implementation of the current scheme depends only on photon detection and does not require multi-qubit gates or multi-qubit measurements. Discussions of the effects of cavity loss, side leakage and exciton-cavity coupling strength on the fidelity of the generated states show that the fidelity can remain high enough by controlling system parameters. The current scheme is thus simple and feasible in experiment.
What do the data show? Fostering physical intuition with ClimateBits and NASA Earth Observations
NASA Astrophysics Data System (ADS)
Schollaert Uz, S.; Ward, K.
2017-12-01
Through data visualizations using global satellite imagery available in NASA Earth Observations (NEO), we explain Earth science concepts (e.g. albedo, urban heat island effect, phytoplankton). We also provide examples of ways to explore the satellite data in NEO within a new blog series. This is an ideal tool for scientists and non-scientists alike who want to quickly check satellite imagery for large scale features or patterns. NEO analysis requires no software or plug-ins; only a browser and an internet connection. You can even check imagery and perform simple analyses from your smart phone. NEO can be used to create graphics for presentations and papers or as a first step before acquiring data for more rigorous analysis. NEO has potential application to easily explore large scale environmental and climate patterns that impact operations and infrastructure. This is something we are currently exploring with end user groups.
NASA Technical Reports Server (NTRS)
Bernhardt, Paul A.; Scales, W. A.
1990-01-01
Ionospheric plasma density irregularities can be produced by chemical releases into the upper atmosphere. F-region plasma modification occurs by: (1) chemically enhancing the electron number density; (2) chemically reducing the electron population; or (3) physically convecting the plasma from one region to another. The three processes (production, loss, and transport) determine the effectiveness of ionospheric chemical releases in subtle and surprising ways. Initially, a chemical release produces a localized change in plasma density. Subsequent processes, however, can lead to enhanced transport in chemically modified regions. Ionospheric modification by chemical releases excites artificial enhancements in airglow intensities through exothermic chemical reactions between the newly created plasma species. Numerical models were developed to describe the creation and evolution of large-scale density irregularities and airglow clouds generated by artificial means. Experimental data compare favorably with these models. It was found that chemical releases produce transient, large-amplitude perturbations in electron density which can evolve into fine-scale irregularities via nonlinear transport properties.
Inherent polarization entanglement generated from a monolithic semiconductor chip
Horn, Rolf T.; Kolenderski, Piotr; Kang, Dongpeng; Abolghasem, Payam; Scarcella, Carmelo; Frera, Adriano Della; Tosi, Alberto; Helt, Lukas G.; Zhukovsky, Sergei V.; Sipe, J. E.; Weihs, Gregor; Helmy, Amr S.; Jennewein, Thomas
2013-01-01
Creating miniature chip scale implementations of optical quantum information protocols is a dream for many in the quantum optics community. This is largely because of the promise of stability and scalability. Here we present a monolithically integratable chip architecture upon which is built a photonic device primitive called a Bragg reflection waveguide (BRW). Implemented in gallium arsenide, we show that, via the process of spontaneous parametric down-conversion, the BRW is capable of directly producing polarization-entangled photons without additional path difference compensation, spectral filtering or post-selection. After splitting the twin photons immediately after they emerge from the chip, we perform a variety of correlation tests on the photon pairs and show non-classical behaviour in their polarization. Combined with the BRW's versatile architecture, our results mark the BRW design as a serious contender on which to build large-scale implementations of optical quantum processing devices. PMID:23896982
NASA Astrophysics Data System (ADS)
Padden, M.; Whalen, K.
2013-12-01
Students in a large, second-year environmental earth science class made significant changes to their daily lives over a three-week period to learn how small-scale actions interact with global-scale issues such as water and energy supplies, waste management and agriculture. The Lifestyle Project (Kirk and Thomas, 2003) was slightly adapted to fit a large-class setting (350 students). Students made changes to their lifestyles in self-selected categories (water, home heating, transportation, waste, food) and kept journals over a three-week period as the changes increased in difficulty. The goal of this study is to gain an understanding of which aspects of the project played a pivotal role in impacting long-term learning. Content analysis of the journal entries and follow-up interviews are used to investigate whether the Lifestyle Project is having a lasting impact on the students 18 months after the initial assignment.
NASA Astrophysics Data System (ADS)
Guthoff, Rudolf F.; Zhivov, Andrey; Stachs, Oliver
2010-02-01
The aim of the study was to produce two-dimensional reconstruction maps of the living corneal sub-basal nerve plexus by in vivo laser scanning confocal microscopy in real time. CLSM source data (frame rate 30 Hz, 384×384 pixels) were used to create large-scale maps of the scanned area by selecting the Automatic Real Time (ART) composite mode. The mapping algorithm is based on an affine transformation. Microscopy of the sub-basal nerve plexus was performed on normal and LASIK eyes as well as on rabbit eyes. Real-time mapping of the sub-basal nerve plexus was performed at large scale, up to a size of 3.2 mm × 3.2 mm. The developed method enables real-time in vivo mapping of the sub-basal nerve plexus, which is strictly necessary for statistically sound conclusions about morphometric plexus alterations.
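Frame-to-mosaic registration by an affine transformation can be sketched with standard feature matching; the OpenCV-based approach below on grayscale frames is an assumption for illustration, since the instrument's ART composite mode is proprietary:

```python
# Sketch: estimate an affine map between an incoming frame and the growing
# mosaic reference, then warp the frame into the reference coordinates.
import cv2
import numpy as np

def affine_register(ref, frame):
    """Estimate the affine map from `frame` to `ref` (both grayscale uint8)."""
    orb = cv2.ORB_create(1000)
    k1, d1 = orb.detectAndCompute(ref, None)
    k2, d2 = orb.detectAndCompute(frame, None)
    matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(d2, d1)
    src = np.float32([k2[m.queryIdx].pt for m in matches])   # frame points
    dst = np.float32([k1[m.trainIdx].pt for m in matches])   # reference points
    M, _ = cv2.estimateAffinePartial2D(src, dst, method=cv2.RANSAC)
    # Warp the frame into the reference frame's coordinate system
    h, w = ref.shape
    return cv2.warpAffine(frame, M, (w, h))
```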
Network models of biology, whether curated or derived from large-scale data analysis, are critical tools in the understanding of cancer mechanisms and in the design and personalization of therapies. The NDEx Project (Network Data Exchange) will create, deploy, and maintain an open-source, web-based software platform and public website to enable scientists, organizations, and software applications to share, store, manipulate, and publish biological networks.
Battlespace Awareness: Heterogeneous Sensor Maps of Large Scale, Complex Environments
2017-06-13
reference frames enable a system designer to describe the position of any sensor or platform at any point of time. This section introduces the...analysis to evaluate the quality of reconstructions created by our algorithms. CloudCompare is an open-source tool designed for this purpose [65]. In...structure of the data. The data term seeks to keep the proposed solution (u) similar to the originally observed values ( f ). A systems designer must
ERIC Educational Resources Information Center
Fleisch, Brahm; Shindler, Jennifer
2009-01-01
This monograph looks at patterns and prevalence of initial school enrolment, late entry, attainment promotion, and repetition in urban South Africa. The paper pays special attention to the particular gender nature of the patterns of school participation. The study analyses data generated in the genuine representative cohort study, Birth-to-Twenty…
PIPELINEs: Creating Comparable Clinical Knowledge Efficiently by Linking Trial Platforms
Shrier, AA; Antonijevic, Z; Beckman, RA; Campbell, RK; Chen, C; Flaherty, KT; Loewy, J; Lacombe, D; Madhavan, S; Selker, HP; Esserman, LJ
2016-01-01
Adaptive, seamless, multisponsor, multitherapy clinical trial designs executed as large scale platforms, could create superior evidence more efficiently than single‐sponsor, single‐drug trials. These trial PIPELINEs also could diminish barriers to trial participation, increase the representation of real‐world populations, and create systematic evidence development for learning throughout a therapeutic life cycle, to continually refine its use. Comparable evidence could arise from multiarm design, shared comparator arms, and standardized endpoints—aiding sponsors in demonstrating the distinct value of their innovative medicines; facilitating providers and patients in selecting the most appropriate treatments; assisting regulators in efficacy and safety determinations; helping payers make coverage and reimbursement decisions; and spurring scientists with translational insights. Reduced trial times and costs could enable more indications, reduced development cycle times, and improved system financial sustainability. Challenges to overcome range from statistical to operational to collaborative governance and data exchange. PMID:27643536
Vertically migrating swimmers generate aggregation-scale eddies in a stratified column.
Houghton, Isabel A; Koseff, Jeffrey R; Monismith, Stephen G; Dabiri, John O
2018-04-01
Biologically generated turbulence has been proposed as an important contributor to nutrient transport and ocean mixing [1-3]. However, to produce non-negligible transport and mixing, such turbulence must produce eddies at scales comparable to the length scales of stratification in the ocean. It has previously been argued that biologically generated turbulence is limited to the scale of the individual animals involved [4], which would make turbulence created by highly abundant centimetre-scale zooplankton such as krill irrelevant to ocean mixing. Their small size notwithstanding, zooplankton form dense aggregations tens of metres in vertical extent as they undergo diurnal vertical migration over hundreds of metres [3,5,6]. This behaviour potentially introduces additional length scales, such as the scale of the aggregation, that are of relevance to animal interactions with the surrounding water column. Here we show that the collective vertical migration of centimetre-scale swimmers, as represented by the brine shrimp Artemia salina, generates aggregation-scale eddies that mix a stable density stratification, resulting in an effective turbulent diffusivity up to three orders of magnitude larger than the molecular diffusivity of salt. These observed large-scale mixing eddies are the result of flow in the wakes of the individual organisms coalescing to form a large-scale downward jet during upward swimming, even in the presence of a strong density stratification relative to typical values observed in the ocean. The results illustrate the potential for marine zooplankton to considerably alter the physical and biogeochemical structure of the water column, with potentially widespread effects owing to their high abundance in climatically important regions of the ocean [7].
NASA Astrophysics Data System (ADS)
Bruno, Luigi; Decuzzi, Paolo; Gentile, Francesco
2016-01-01
The promise of nanotechnology lies in the possibility of engineering matter on the nanoscale and creating technological interfaces that, because of their small scales, may directly interact with biological objects, creating new strategies for the treatment of pathologies that are otherwise beyond the reach of conventional medicine. Nanotechnology is inherently a multiscale, multiphenomena challenge. Fundamental understanding and highly accurate predictive methods are critical to the successful manufacturing of nanostructured materials, bio/mechanical devices and systems. In biomedical engineering, and in the mechanical analysis of biological tissues, classical continuum approaches are routinely utilized, even though these disregard the discrete nature of tissues, which are an interpenetrating network of a matrix (the extracellular matrix, ECM) and a generally large but finite number of cells with sizes falling in the micrometer range. Here, we introduce a nano-mechanical theory that accounts for the non-continuum nature of biological and other discrete systems. This discrete field theory, doublet mechanics (DM), is a technique to model the mechanical behavior of materials over multiple scales, ranging from some millimeters down to a few nanometers. In the paper, we use this theory to predict the response of a granular material to an externally applied load. Such a representation is extremely attractive in modeling biological tissues, which may be considered as a spatial set of a large number of particulates (cells) dispersed in an extracellular matrix. Possibly more important than this, using digital image correlation (DIC) optical methods, we provide an experimental verification of the model.
Winslow, Luke A.; Read, Jordan S.; Hanson, Paul C.; Stanley, Emily H.
2014-01-01
With lake abundances in the thousands to millions, creating an intuitive understanding of the distribution of morphology and processes in lakes is challenging. To improve researchers' understanding of large-scale lake processes, we developed a parsimonious mathematical model based on the Pareto distribution to describe the distribution of lake morphology (area, perimeter and volume). While debate continues over which mathematical representation best fits any one distribution of lake morphometric characteristics, we recognize the need for a simple, flexible model to advance understanding of how the interaction between morphometry and function dictates scaling across large populations of lakes. These models make clear the relative contribution of lakes to the total amount of lake surface area, volume, and perimeter. They also highlight that the critical thresholds at which total perimeter, area and volume would be evenly distributed across lake size-classes correspond to Pareto slopes of 0.63, 1 and 1.12, respectively. These models of morphology can be used in combination with models of process to create overarching “lake population”-level models of process. To illustrate this potential, we combine the model of surface area distribution with a model of carbon mass accumulation rate. We found that even though smaller lakes contribute relatively less to total surface area than larger lakes, the increase in carbon accumulation rate with decreasing lake size is strong enough to bias the distribution of carbon mass accumulation towards smaller lakes. This analytical framework provides a relatively simple approach to upscaling morphology and process that is easily generalizable to other ecosystem processes.
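The even-distribution thresholds quoted above follow from integrating the Pareto density over size classes; a small sketch with illustrative size-class edges (not the paper's data):

```python
# Sketch: for a Pareto lake-area distribution with N(A > a) ~ a^-c (so the
# density is ~ a^-(c+1)), the total area in a class [a1, a2] is proportional
# to the integral of a * a^-(c+1) = a^-c, i.e. (a2^(1-c) - a1^(1-c)) / (1-c).
# At c = 1 this integral is log(a2/a1): area is spread evenly per log decade.
import numpy as np

def area_per_logclass(c, edges):
    if np.isclose(c, 1.0):
        return np.log(edges[1:] / edges[:-1])      # equal share per decade
    expo = 1.0 - c
    return (edges[1:]**expo - edges[:-1]**expo) / expo

edges = np.logspace(-3, 4, 8)                      # lake areas, km2 (assumed)
for c in (0.8, 1.0, 1.2):
    share = area_per_logclass(c, edges)
    share /= share.sum()
    print(f"c={c}: area share per decade =", np.round(share, 3))
```

With c below 1 the large lakes dominate total area; above 1 the small lakes do, which is the sense in which the fitted slope acts as a critical threshold.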
Meta-analysis on Macropore Flow Velocity in Soils
NASA Astrophysics Data System (ADS)
Liu, D.; Gao, M.; Li, H. Y.; Chen, X.; Leung, L. R.
2017-12-01
Macropore flow is ubiquitous in soils and is an important hydrologic process that is not well explained by traditional hydrologic theories. Macropore Flow Velocity (MFV) is an important parameter used to describe macropore flow and quantify its effects on runoff generation and solute transport. However, the dominant factors controlling MFV are still poorly understood, and the typical ranges of MFV measured in the field are not clearly defined. To address these issues, we conducted a meta-analysis based on a database created from 246 experiments on MFV collected from 76 journal articles. For a fair comparison, a conceptually unified definition of MFV is introduced to convert MFV measured with different approaches and at various scales, including soil core, field, trench or hillslope scales. The potential controlling factors of MFV considered include scale, travel distance, hydrologic conditions, site factors, macropore morphology, soil texture, and land use. The results show that MFV is about 2-3 orders of magnitude larger than the corresponding values of saturated hydraulic conductivity. MFV is much larger at the trench and hillslope scales than at the field-profile and soil-core scales, and shows a significant positive correlation with travel distance. Generally, higher irrigation intensity tends to trigger faster MFV, especially at the field-profile scale, where MFV and irrigation intensity have a significant positive correlation. At the trench and hillslope scales, the presence of large macropores (diameter > 10 mm) is a key factor determining MFV. The geometric mean of MFV for sites with large macropores was found to be about 8 times larger than for those without large macropores. For sites with large macropores, MFV increases with macropore diameter. However, no noticeable difference in MFV was observed among different soil textures and land uses. Comparing the existing equations used to describe MFV, the Poiseuille equation significantly overestimates the observed values, while Manning-type equations generate reasonable values. The insights from this study will shed light on future field campaigns and modeling of macropore flow.
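The Poiseuille-versus-Manning contrast reported above can be illustrated with a quick calculation for a single gravity-driven vertical macropore (all parameter values below are assumed, not taken from the study):

```python
# Sketch: compare Poiseuille and Manning-type velocity estimates for a single
# vertical cylindrical macropore under a unit hydraulic gradient.
rho, g, mu = 1000.0, 9.81, 1.0e-3     # water: kg/m3, m/s2, Pa*s
r = 0.005                              # macropore radius, m (10 mm diameter)

# Poiseuille mean velocity for laminar flow in a tube, gradient S = 1
v_poiseuille = rho * g * r**2 / (8 * mu)

# Manning-type estimate: v = R^(2/3) * S^(1/2) / n, with hydraulic radius
# R = r/2 for a full circular pipe, S = 1, and an assumed wall roughness n
n = 0.05
v_manning = (r / 2)**(2/3) / n

print(f"Poiseuille: {v_poiseuille:7.2f} m/s")   # ~30 m/s, unrealistically fast
print(f"Manning:    {v_manning:7.3f} m/s")      # ~0.4 m/s, plausible magnitude
```

The two-orders-of-magnitude gap between the estimates mirrors the meta-analysis finding that Poiseuille overestimates while Manning-type equations land in the observed range.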
Supernova explosions in magnetized, primordial dark matter haloes
NASA Astrophysics Data System (ADS)
Seifried, D.; Banerjee, R.; Schleicher, D.
2014-05-01
The first supernova explosions are potentially relevant sources for the production of the first large-scale magnetic fields. For this reason, we present a set of high-resolution simulations studying the effect of supernova explosions on magnetized, primordial haloes. We focus on the evolution of an initially small-scale magnetic field formed during the collapse of the halo. We vary the degree of magnetization, the halo mass, and the amount of explosion energy in order to account for expected variations as well as to infer systematic dependences of the results on initial conditions. Our simulations suggest that core collapse supernovae with an explosion energy of 10^51 erg and more violent pair instability supernovae with 10^53 erg are able to disrupt haloes with masses up to about 10^6 and 10^7 M⊙, respectively. The peak of the magnetic field spectra shows a continuous shift towards smaller k-values, i.e. larger length scales, over time, reaching values as low as k = 4. On small scales, the magnetic energy decreases at the cost of the energy on large scales, resulting in a well-ordered magnetic field with a strength up to ~10^-8 G depending on the initial conditions. The coherence length of the magnetic field inferred from the spectra reaches values up to 250 pc, in agreement with those obtained from autocorrelation functions. We find the coherence length to be as large as 50 per cent of the radius of the supernova bubble. Extrapolating this relation to later stages, we suggest that significantly strong magnetic fields with coherence lengths as large as 1.5 kpc could be created. We discuss possible implications of our results for processes like recollapse of the halo, first galaxy formation, and the magnetization of the intergalactic medium.
Reisner, A E
2005-11-01
The building and expansion of large-scale swine facilities has created considerable controversy in many neighboring communities, but to date, no systematic analysis has been done of the types of claims made during these conflicts. This study examined how local newspapers in one state covered the transition from the dominance of smaller, diversified swine operations to large, single-purpose pig production facilities. To look at publicly made statements concerning large-scale swine facilities (LSSF), the study collected all articles related to LSSF from 22 daily Illinois newspapers over a 3-yr period (a total of 1,737 articles). The most frequent sets of claims used by proponents of LSSF were that the environment was not harmed, that state regulations were sufficiently strict, and that the state economically needed this type of agriculture. The most frequent claims made by opponents were that LSSF harmed the environment and neighboring communities and that stricter regulations were needed. Proponents' claims were primarily defensive and, to some degree, underplayed the advantages of LSSF. Pro- and anti-LSSF groups were talking at cross-purposes, to some degree. Even across similar themes, those in favor of LSSF and those opposed were addressing different sets of concerns. The newspaper claims did not indicate any effective alliances forming between local anti-LSSF groups and national environmental or animal rights groups.
NASA Astrophysics Data System (ADS)
McClain, Bobbi J.; Porter, William F.
2000-11-01
Satellite imagery is a useful tool for large-scale habitat analysis; however, its limitations need to be tested. We tested these limitations by varying the methods of a habitat evaluation for white-tailed deer (Odocoileus virginianus) in the Adirondack Park, New York, USA, utilizing harvest data to create and validate the assessment models. We used two classified images, one with a large minimum mapping unit but high accuracy and one with no minimum mapping unit but slightly lower accuracy, to test the sensitivity of the evaluation to these differences. We tested the utility of two methods of assessment: habitat suitability index modeling and pattern recognition modeling. We varied the scale at which the models were applied by using five separate sizes of analysis windows. Results showed that the presence of a large minimum mapping unit eliminates important details of the habitat. Window size is relatively unimportant if the data are averaged to a large resolution (i.e., township), but if the data are used at the smaller resolution, then the window size is an important consideration. In the Adirondacks, the proportion of hardwood and softwood in an area is most important to the spatial dynamics of deer populations. The low occurrence of open area in all parts of the park either limits the effect of this cover type on the population or limits our ability to detect the effect. The arrangement and interspersion of cover types were not significant to deer populations.
Sparse synthetic aperture with Fresnel elements (S-SAFE) using digital incoherent holograms
Kashter, Yuval; Rivenson, Yair; Stern, Adrian; Rosen, Joseph
2015-01-01
Creating a large-scale synthetic aperture makes it possible to break the resolution boundaries dictated by the wave nature of light in common optical systems. However, implementation is challenging, since the generation of a large, continuous mosaic synthetic aperture composed of many patterns is complicated in terms of both phase matching and time-multiplexing duration. In this study we present an advanced configuration for an incoherent holographic imaging system with super-resolution qualities that creates a partial synthetic aperture. The new system, termed sparse synthetic aperture with Fresnel elements (S-SAFE), enables a significant decrease in the number of recorded elements, and it is free from positional constraints on their location. Additionally, in order to obtain the best image quality, we propose an optimal mosaicking structure derived on the basis of physical and numerical considerations, and introduce three reconstruction approaches which are compared and discussed. The super-resolution capabilities of the proposed scheme and its limitations are analyzed, numerically simulated and experimentally demonstrated. PMID:26367947
The Style Troika Model: A Structural Model of the Thinking and Creating Styles Scale
ERIC Educational Resources Information Center
Ibérico Nogueira, Sara; Almeida, Leonor; Garcês, Soraia; Pocinho, Margarida; Wechsler, Solange
2016-01-01
Individuals express their creativity through a variety of thinking and creating styles (Wechsler, 2006, 2007). These constructs underlie the Thinking and Creating Styles Scale (TCSS), which is used to identify individuals' creating styles. The aim of this research is to assess the factorial structure of the Portuguese version of the TCSS. Two…
QuickEval: a web application for psychometric scaling experiments
NASA Astrophysics Data System (ADS)
Van Ngo, Khai; Storvik, Jehans J.; Dokkeberg, Christopher A.; Farup, Ivar; Pedersen, Marius
2015-01-01
QuickEval is a web application for carrying out psychometric scaling experiments. It offers the possibility of running controlled experiments in a laboratory, or large-scale experiments over the web for people all over the world. It is a unique, one-of-a-kind web application, and it is software needed in the image quality field. It is also, to the best of our knowledge, the first software to support the three most common scaling methods: paired comparison, rank order, and category judgement. Hopefully, a side effect of this newly created software is that it will lower the threshold for performing psychometric experiments, improve the quality of the experiments being carried out, make it easier to reproduce experiments, and increase research on image quality both in academia and industry. The web application is available at www.colourlab.no/quickeval.
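Paired-comparison data of the kind QuickEval collects are classically converted to interval scale values with Thurstone Case V scaling; the sketch below uses made-up win counts, and QuickEval's own analysis may differ:

```python
# Sketch: Thurstone Case V scaling of paired-comparison preference data.
import numpy as np
from scipy.stats import norm

# Hypothetical win counts: wins[i, j] = times stimulus i preferred over j
wins = np.array([[ 0, 30, 42],
                 [20,  0, 35],
                 [ 8, 15,  0]], dtype=float)
trials = wins + wins.T
with np.errstate(invalid="ignore"):
    p = wins / trials                  # proportion of times i beats j
np.fill_diagonal(p, 0.5)
p = np.clip(p, 0.01, 0.99)             # avoid infinite z-scores at 0 or 1
z = norm.ppf(p)                        # proportions -> standard normal deviates
scale = z.mean(axis=1)                 # Case V: row means give scale values
print("scale values:", np.round(scale - scale.min(), 3))
```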
Impact vaporization: Late time phenomena from experiments
NASA Technical Reports Server (NTRS)
Schultz, P. H.; Gault, D. E.
1987-01-01
While simple airflow produced by the outward movement of the ejecta curtain can be scaled to large dimensions, the interaction between an impact-vaporized component and the ejecta curtain is more complicated. The goal of these experiments was to examine such interaction in a real system involving crater growth, ejection of material, two-phase mixtures of gas and dust, and strong pressure gradients. The results will be complemented by theoretical studies at laboratory scales in order to separate the various parameters for planetary-scale processes. These experiments prompt, however, the following conclusions that may have relevance at broader scales. First, under near-vacuum or low atmospheric pressures, an expanding vapor cloud scours the surrounding surface in advance of arriving ejecta. Second, the effect of early-time vaporization is relatively unimportant at late times. Third, the overpressure created within the crater cavity by significant vaporization results in increased cratering efficiency and larger aspect ratios.
Machtans, Craig S.; Thogmartin, Wayne E.
2014-01-01
The publication of a U.S. estimate of bird–window collisions by Loss et al. is an example of the somewhat contentious approach of using extrapolations to obtain large-scale estimates from small-scale studies. We review the approach by Loss et al. and other authors who have published papers on human-induced avian mortality and describe the drawbacks and advantages to publishing what could be considered imperfect science. The main drawback is the inherent and somewhat unquantifiable bias of using small-scale studies to scale up to a national estimate. The direct benefits include development of new methodologies for creating the estimates, an explicit treatment of known biases with acknowledged uncertainty in the final estimate, and the novel results. Other overarching benefits are that these types of papers are catalysts for improving all aspects of the science of estimates and for policies that must respond to the new information.
The Seasonal Predictability of Extreme Wind Events in the Southwest United States
NASA Astrophysics Data System (ADS)
Seastrand, Simona Renee
Extreme wind events are a common phenomenon in the Southwest United States. Entities such as the United States Air Force (USAF) find the Southwest appealing for many reasons, primarily for its expansive, unpopulated, and electronically unpolluted space for large-scale training and testing. However, wind events can create hazards for the USAF: surface wind gusts can impact the take-off and landing of all aircraft, can tip the airframes of large wing-surface aircraft during maneuvers close to the ground, and can even impact weapons systems. This dissertation is comprised of three sections intended to further our knowledge and understanding of wind events in the Southwest. The first section builds a climatology of wind events for seven locations in the Southwest during the twelve 3-month seasons of the year, and further examines the wind events in relation to terrain and the large-scale flow of the atmosphere. The second section builds upon the first by taking the wind events and generating mid-level composites for each of the twelve 3-month seasons. In the third section, teleconnections identified as consistent with the large-scale circulation in the second section were used as predictor variables to build a Poisson regression model for each of the twelve 3-month seasons. The purpose of this research is to increase our understanding of the climatology of extreme wind events, increase our understanding of how the large-scale circulation influences extreme wind events, and create a model to enhance predictability of extreme wind events in the Southwest. Knowledge from this work will help protect personnel and property associated with not only the USAF, but all those in the Southwest.
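A Poisson regression of seasonal event counts on teleconnection indices, as described above, can be sketched as follows (statsmodels is assumed; the indices and coefficients are stand-ins, not the dissertation's predictors):

```python
# Sketch: Poisson regression of extreme-wind-event counts per season on
# hypothetical teleconnection indices.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n_seasons = 60
enso = rng.normal(size=n_seasons)     # e.g., a Nino-3.4-like index (assumed)
pna = rng.normal(size=n_seasons)      # e.g., a PNA-like index (assumed)
X = sm.add_constant(np.column_stack([enso, pna]))

# Synthetic counts generated from made-up "true" coefficients
counts = rng.poisson(np.exp(0.8 + 0.3 * enso - 0.2 * pna))

model = sm.GLM(counts, X, family=sm.families.Poisson()).fit()
print(model.summary().tables[1])
# Expected event count for a season with given index values:
print("expected events:", model.predict(np.array([[1.0, 1.5, -0.5]])))
```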
Jossen, Valentin; Schirmer, Cedric; Mostafa Sindi, Dolman; Eibl, Regine; Kraume, Matthias; Pörtner, Ralf; Eibl, Dieter
2016-01-01
The potential of human mesenchymal stem cells (hMSCs) for allogeneic cell therapies has created a large amount of interest. However, this presupposes the availability of efficient scale-up procedures. Promising results have been reported for stirred bioreactors that operate with microcarriers. Recent publications focusing on microcarrier-based stirred bioreactors have demonstrated the successful use of Computational Fluid Dynamics (CFD) and suspension criteria (N_S1u, N_S1) for rapidly scaling up hMSC expansions from mL- to pilot scale. Nevertheless, one obstacle may be the formation of large microcarrier-cell aggregates, which may result in mass transfer limitations and inhomogeneous distributions of stem cells in the culture broth. The dependence of microcarrier-cell aggregate formation on impeller speed and shear stress levels was investigated for human adipose-derived stromal/stem cells (hASCs) at the spinner scale by recording the Sauter mean diameter (d32) versus time. Cultivation at the suspension criteria provided d32 values between 0.2 and 0.7 mm, the highest cell densities (1.25 × 10^6 hASCs mL^-1), and the highest expansion factors (117.0 ± 4.7 on day 7), while maintaining the expression of specific surface markers. Furthermore, the suitability of the suspension criterion N_S1u was investigated for scaling up microcarrier-based processes in wave-mixed bioreactors for the first time. PMID:26981131
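The Sauter mean diameter used above has the standard definition d32 = Σ(n_i d_i^3) / Σ(n_i d_i^2); a tiny sketch with made-up aggregate-size counts:

```python
# Sketch: compute the Sauter mean diameter d32 from binned size counts.
import numpy as np

d = np.array([0.18, 0.22, 0.35, 0.50, 0.65])    # aggregate diameters, mm (assumed)
counts = np.array([120, 80, 40, 15, 5])          # number observed per bin (assumed)

d32 = (counts * d**3).sum() / (counts * d**2).sum()
print(f"Sauter mean diameter d32 = {d32:.2f} mm")
```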
NASA Astrophysics Data System (ADS)
Sasaki, Yuki; Kitaura, Ryo; Yuk, Jong Min; Zettl, Alex; Shinohara, Hisanori
2016-04-01
By utilizing graphene-sandwiched structures recently developed in this laboratory, we are able to visualize small droplets of liquids at the nanometer scale. We have found that water droplets as small as several tens of nanometers, sandwiched between two single-layer graphene sheets, are frequently observed by TEM. Due to the electron beam irradiation during TEM observation, these sandwiched droplets frequently move from one place to another and tend to form small bubbles inside. The synthesis of large-area, high-quality single-domain graphene is essential to prepare the graphene-sandwiched cell that safely encapsulates the droplets at the nanometer scale.
Statistical Model Applied to NetFlow for Network Intrusion Detection
NASA Astrophysics Data System (ADS)
Proto, André; Alexandre, Leandro A.; Batista, Maira L.; Oliveira, Isabela L.; Cansian, Adriano M.
Computers and network services have become a guaranteed presence in many places. This growth has been accompanied by an increase in illicit events, and computer and network security has therefore become an essential concern in any computing environment. Many methodologies have been created to identify these events; however, with the increasing number of users and services on the Internet, monitoring a large network environment is difficult. This paper proposes a methodology for event detection in large-scale networks. The proposal approaches anomaly detection using the NetFlow protocol and statistical methods, monitoring the environment within a timeframe suitable for the application.
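A minimal sketch of the statistical flavor of such NetFlow-based detection, using a simple per-port z-score against a historical baseline (the paper's actual model is not reproduced, and all counts are invented):

```python
# Sketch: flag anomalous flow volume for one service port via a z-score
# against a historical baseline of per-window flow counts.
import numpy as np

rng = np.random.default_rng(4)
# Hypothetical NetFlow aggregation: flows per 5-minute window, one day of history
baseline = rng.poisson(200, size=288)
current_window = 540                              # newly observed flow count

mu, sigma = baseline.mean(), baseline.std(ddof=1)
z = (current_window - mu) / sigma
if z > 3.0:                                       # 3-sigma alert threshold (assumed)
    print(f"ALERT: flow count {current_window} deviates {z:.1f} sigma from baseline")
```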
Yuan, Liang (Leon); Herman, Peter R.
2016-01-01
Three-dimensional (3D) periodic nanostructures underpin a promising research direction on the frontiers of nanoscience and technology to generate advanced materials for exploiting novel photonic crystal (PC) and nanofluidic functionalities. However, the formation of uniform and defect-free 3D periodic structures over large areas that can further integrate into multifunctional devices has remained a major challenge. Here, we introduce a laser scanning holographic method for 3D exposure in thick photoresist that combines the unique advantages of large-area 3D holographic interference lithography (HIL) with the flexible patterning of laser direct writing to form both micro- and nano-structures in a single exposure step. Phase mask interference patterns accumulated over multiple overlapping scans are shown to stitch seamlessly and form uniform 3D nanostructures, with the beam size scaled down to a small 200 μm diameter. In this way, laser scanning is presented as a facile means to embed 3D PC structure within microfluidic channels for integration into an optofluidic lab-on-chip, demonstrating a new laser HIL writing approach for creating multi-scale integrated microsystems. PMID:26922872
Development of a 3D Stream Network and Topography for Improved Large-Scale Hydraulic Modeling
NASA Astrophysics Data System (ADS)
Saksena, S.; Dey, S.; Merwade, V.
2016-12-01
Most digital elevation models (DEMs) used for hydraulic modeling do not include channel bed elevations. As a result, the DEMs are complemented with additional bathymetric data for accurate hydraulic simulations. Existing methods to acquire bathymetric information through field surveys or through conceptual models are limited to reach-scale applications. With an increasing focus on large-scale hydraulic modeling of rivers, a framework to estimate and incorporate bathymetry for an entire stream network is needed. This study proposes an interpolation-based algorithm to estimate bathymetry for a stream network by modifying the reach-based empirical River Channel Morphology Model (RCMM). The effect of a 3D stream network that includes river bathymetry is then investigated by creating a 1D hydraulic model (HEC-RAS) and a 2D hydrodynamic model (Integrated Channel and Pond Routing) for the Upper Wabash River Basin in Indiana, USA. Results show improved simulation of flood depths and storage in the floodplain. Similarly, the impact of incorporating river bathymetry is more significant in the 2D model than in the 1D model.
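The idea of burning an estimated channel bathymetry into a DEM can be sketched with a toy parabolic cross-section; this is only conceptual, since the study modifies the reach-based RCMM rather than using this simple rule:

```python
# Sketch: lower DEM cells inside a channel mask toward a parabolic
# cross-section, deepest at the channel centerline.
import numpy as np

def burn_bathymetry(dem, channel_mask, thalweg_depth):
    """Subtract a parabolic depth profile from in-channel DEM cells, row by row."""
    out = dem.copy()
    for row in range(dem.shape[0]):
        cols = np.where(channel_mask[row])[0]
        if cols.size < 2:
            continue
        # Normalized cross-channel coordinate: -1 and +1 at banks, 0 at center
        x = np.linspace(-1.0, 1.0, cols.size)
        depth = thalweg_depth * (1.0 - x**2)       # parabola, max depth at center
        out[row, cols] = dem[row, cols] - depth
    return out

dem = np.full((5, 20), 100.0)                      # flat floodplain at 100 m
mask = np.zeros_like(dem, dtype=bool)
mask[:, 8:14] = True                               # channel occupies columns 8-13
print(burn_bathymetry(dem, mask, thalweg_depth=3.0)[0])
```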
NASA Astrophysics Data System (ADS)
Yulaeva, E.; Fan, Y.; Moosdorf, N.; Richard, S. M.; Bristol, S.; Peters, S. E.; Zaslavsky, I.; Ingebritsen, S.
2015-12-01
The Digital Crust EarthCube building block creates a framework for integrating disparate 3D/4D information from multiple sources into a comprehensive model of the structure and composition of the Earth's upper crust, and demonstrates the utility of this model in several research scenarios. One such scenario is the estimation of various crustal properties related to fluid dynamics (e.g. permeability and porosity) at each node of any arbitrary unstructured 3D grid to support continental-scale numerical models of fluid flow and transport. Starting from Macrostrat, an existing 4D database of 33,903 chronostratigraphic units, and employing GeoDeepDive, a software system for extracting structured information from unstructured documents, we construct 3D gridded fields of sediment/rock porosity, permeability and geochemistry for large sedimentary basins of North America, which will be used to improve our understanding of large-scale fluid flow, chemical weathering rates, and geochemical fluxes into the ocean. In this talk, we discuss the methods, data gaps (particularly in geologically complex terrain), and various physical and geological constraints on interpolation and uncertainty estimation.
Fast Quantum State Transfer and Entanglement Renormalization Using Long-Range Interactions.
Eldredge, Zachary; Gong, Zhe-Xuan; Young, Jeremy T; Moosavian, Ali Hamed; Foss-Feig, Michael; Gorshkov, Alexey V
2017-10-27
In short-range interacting systems, the speed at which entanglement can be established between two separated points is limited by a constant Lieb-Robinson velocity. Long-range interacting systems are capable of faster entanglement generation, but the degree of the speedup possible is an open question. In this Letter, we present a protocol capable of transferring a quantum state across a distance L in d dimensions using long-range interactions with a strength bounded by 1/r^{α}. If α
White, Mark; Wells, John S G; Butterworth, Tony
2014-09-01
To examine the literature related to a large-scale quality improvement initiative, the 'Productive Ward: Releasing Time to Care', providing a bibliometric profile that tracks the level of interest and the scale of roll-out and adoption, and discussing the implications for sustainability. Productive Ward: Releasing Time to Care (aka Productive Ward) is probably one of the most ambitious quality improvement efforts undertaken by the UK NHS. Politically and financially supported, its main driver was the NHS Institute for Innovation and Improvement. The NHS Institute closed in early 2013, leaving a void of resources, knowledge and expertise. UK roll-out of the initiative is well established and has arguably peaked. International interest in the initiative, however, continues to develop. A comprehensive literature review was undertaken to identify the literature related to the Productive Ward and its implementation (January 2006-June 2013). A bibliometric analysis examined the trends and identified and measured interest, spread and uptake. Overall distribution patterns identify a declining trend of interest, with reduced numbers of grey literature and evaluation publications. However, detailed examination of the data shows no reduction in peer-reviewed outputs. There is some evidence that international uptake of the initiative continues to generate publications and create interest. Sustaining this initiative in the UK will require re-energising, a new focus and financing. The transition period created by the closure of its creator may well contribute to further reduced levels of interest and publication outputs in the UK. However, international implementation, evaluation and associated publications could serve to attract professional/academic interest in this well-established, positively reported quality improvement initiative. This paper provides nurses and ward teams involved in quality improvement programmes with a detailed, current-state examination and analysis of the Productive Ward literature, highlighting the bibliometric patterns of this large-scale, international quality improvement programme. It serves to disseminate updated publication information to those in clinical practice who are involved in the Productive Ward or a similar quality improvement initiative. © 2014 John Wiley & Sons Ltd.
'Fracking', Induced Seismicity and the Critical Earth
NASA Astrophysics Data System (ADS)
Leary, P.; Malin, P. E.
2012-12-01
Issues of 'fracking' and induced seismicity are reverse-analogous to the equally complex issues of well productivity in hydrocarbon, geothermal and ore reservoirs. In low hazard reservoir economics, poorly producing wells and low grade ore bodies are many while highly producing wells and high grade ores are rare but high pay. With induced seismicity factored in, however, the same distribution physics reverses the high/low pay economics: large fracture-connectivity systems are hazardous hence low pay, while high probability small fracture-connectivity systems are non-hazardous hence high pay. Put differently, an economic risk abatement tactic for well productivity and ore body pay is to encounter large-scale fracture systems, while an economic risk abatement tactic for 'fracking'-induced seismicity is to avoid large-scale fracture systems. Well productivity and ore body grade distributions arise from three empirical rules for fluid flow in crustal rock: (i) power-law scaling of grain-scale fracture density fluctuations; (ii) spatial correlation between fluctuations in well-core porosity and the logarithm of well-core permeability; (iii) frequency distributions of permeability governed by a lognormality skewness parameter. The physical origin of rules (i)-(iii) is the universal existence of a critical-state-percolation grain-scale fracture-density threshold for crustal rock. Crustal fractures are effectively long-range spatially-correlated distributions of grain-scale defects permitting fluid percolation on mm to km scales. The rule is: the larger the fracture system, the more intense the percolation throughput. As percolation pathways are spatially erratic and unpredictable on all scales, they are difficult to model with sparsely sampled well data. Phenomena such as well productivity, induced seismicity, and ore body fossil fracture distributions are collectively extremely difficult to predict. Risk associated with unpredictable reservoir well productivity and ore body distributions can be managed by operating in a context which affords many small failures for a few large successes. In reverse view, 'fracking' and induced seismicity could be rationally managed in a context in which many small successes can afford a few large failures. However, just as there is every incentive to acquire information leading to higher rates of productive well drilling and ore body exploration, there are equal incentives for acquiring information leading to lower rates of 'fracking'-induced seismicity. Current industry practice of using an effective medium approach to reservoir rock creates an uncritical sense that property distributions in rock are essentially uniform. Well-log data show that the reverse is true: the larger the length scale the greater the deviation from uniformity. Applying the effective medium approach to large-scale rock formations thus appears to be unnecessarily hazardous. It promotes the notion that large scale fluid pressurization acts against weakly cohesive but essentially uniform rock to produce large-scale quasi-uniform tensile discontinuities. Indiscriminate hydrofracturing appears to be vastly more problematic in reality than as pictured by the effective medium hypothesis. The spatial complexity of rock, especially at large scales, provides ample reason to find more controlled pressurization strategies for enhancing in situ flow.
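The three empirical rules (i)-(iii) lend themselves to a compact statement. The following LaTeX sketch is a hedged formalization — the abstract gives only verbal rules, so the functional forms (power-law exponent β, linear porosity/log-permeability coupling α, and the lognormal form) are illustrative assumptions:

```latex
\[
  \text{(i)}\quad S_{\delta\phi}(k) \;\propto\; k^{-\beta}
  \qquad \text{(power-law spatial scaling of fracture-density fluctuations)}
\]
\[
  \text{(ii)}\quad \delta \log \kappa(x) \;\approx\; \alpha\,\delta\phi(x)
  \qquad \text{(porosity fluctuations track log-permeability)}
\]
\[
  \text{(iii)}\quad \kappa \;\sim\; \mathrm{Lognormal}(\mu,\sigma^{2})
  \qquad \text{(skewed frequency distribution of permeability)}
\]
```

Under rules of this form, permeability fluctuations grow with length scale, which is exactly why the effective-medium (uniform-rock) picture criticized in the abstract degrades as formations get larger.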
The seesaw space, a vector space to identify and characterize large-scale structures at 1 AU
NASA Astrophysics Data System (ADS)
Lara, A.; Niembro, T.
2017-12-01
We introduce the seesaw space, an orthonormal space formed by the local and the global fluctuations of any of the four basic solar parameters: velocity, density, magnetic field and temperature, at any heliospheric distance. The fluctuations compare the standard deviation of a three-hour moving average against the running average of the parameter over a month (considered as the local fluctuations) and over a year (the global fluctuations). We created this new vector space to identify the arrival of transients at any spacecraft without the need for an observer. We applied our method to the one-minute resolution data of the WIND spacecraft from 1996 to 2016. To study the behavior of the seesaw norms in terms of the solar cycle, we computed annual histograms and fitted piecewise functions formed by two log-normal distributions, and observed that one of the distributions is due to large-scale structures while the other is due to the ambient solar wind. The norm values at which the piecewise functions change vary in terms of the solar cycle. We compared the seesaw norms of each of the basic parameters due to the arrival of coronal mass ejections, co-rotating interaction regions and sector boundaries reported in the literature. High seesaw norms are due to large-scale structures. We found three critical values of the norms that can be used to determine the arrival of coronal mass ejections. We present as well general comparisons of the norms during the two maxima and the minimum solar cycle periods, and the differences of the norms due to large-scale structures depending on each period.
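The construction described here (a 3-hour fluctuation measure referenced against monthly and yearly running averages, combined into a norm) is simple enough to sketch. The Python below is a minimal, assumed reading of that recipe — the exact normalization and the threshold values are the paper's, not reproduced here:

```python
import numpy as np

def seesaw_norm(series, fluct_window="3h", local_window="30D", global_window="365D"):
    """Sketch of a 'seesaw space' norm for a solar-wind parameter.

    `series` is a pandas Series of 1-minute data with a datetime index.
    Local axis: short-term variability relative to a monthly running mean;
    global axis: the same variability relative to a yearly running mean.
    The exact combination is an assumption based on the abstract's wording.
    """
    sigma = series.rolling(fluct_window).std()          # 3-hour variability
    local_fluct = sigma / series.rolling(local_window).mean()
    global_fluct = sigma / series.rolling(global_window).mean()
    # Norm in the 2-D orthonormal (local, global) space; large values
    # flag large-scale structures such as ICMEs.
    return np.sqrt(local_fluct**2 + global_fluct**2)

# Hypothetical usage with WIND proton density in a Series `n_p`:
# norm = seesaw_norm(n_p)
# events = norm[norm > threshold]   # threshold from the fitted log-normal mixture
```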
NASA Astrophysics Data System (ADS)
Avetissian, A. K.
2017-07-01
New cosmic scales, completely different from the Planck scales, have been disclosed in the framework of the so-called "Non-Inflationary Cosmology" (NIC), created by the author during the last decade. The proposed new ideas shed light on some hidden inaccuracies within the essence of Planck's scales in Modern Cosmology, so the new scales have been nominated as "NAIRI (New Alternative Ideas Regenerating Irregularities) Cosmic Scales" (NCS). The NCS is believed to be realistic due to qualitative and quantitative correspondences with observational and experimental data. The basic concept of the NCS has been created based on two hypotheses, about the cosmological time-evolution of Planck's constant and about multi-photon processes. Together with the hypothesis about the domination of Bose statistics in the early Universe and the possibility of a large-scale Bose condensate, these predictions have been converted into phenomena on which the bases of an alternative theory of cosmology have been investigated. The "Cosmic Small (Local) Bang" (CSB) phenomenon predicted by the author has been investigated in the model of a galaxy, and as a consequence of the CSB the possibility of a Super-Strong Shock Wave (SSW) has been postulated. Thus, based on the CSB and SSW phenomena, NIC guarantees a non-accretion mechanism for the generation of galaxies and the super-massive black holes in their cores, as well as the creation of supernovas and massive stars (super-massive stars exceeding even 100M⊙). The possibility of gravitational radiation (GR) by the central black hole of the galaxy, or even by the disk (or the whole galaxy!), has been investigated.
Using Unplanned Fires to Help Suppressing Future Large Fires in Mediterranean Forests
Regos, Adrián; Aquilué, Núria; Retana, Javier; De Cáceres, Miquel; Brotons, Lluís
2014-01-01
Despite the huge resources invested in fire suppression, the impact of wildfires has considerably increased across the Mediterranean region since the second half of the 20th century. Modulating fire suppression efforts in mild weather conditions is an appealing but hotly-debated strategy to use unplanned fires and associated fuel reduction to create opportunities for suppression of large fires in future adverse weather conditions. Using a spatially-explicit fire–succession model developed for Catalonia (Spain), we assessed this opportunistic policy by using two fire suppression strategies that reproduce how firefighters in extreme weather conditions exploit previous fire scars as firefighting opportunities. We designed scenarios by combining different levels of fire suppression efficiency and climatic severity for a 50-year period (2000–2050). An opportunistic fire suppression policy induced large-scale changes in fire regimes and decreased the area burnt under extreme climate conditions, but only accounted for up to 18–22% of the area to be burnt in reference scenarios. The area suppressed in adverse years tended to increase in scenarios with increasing amounts of area burnt during years dominated by mild weather. Climate change had counterintuitive effects on opportunistic fire suppression strategies. Climate warming increased the incidence of large fires under uncontrolled conditions but also indirectly increased opportunities for enhanced fire suppression. Therefore, to shift fire suppression opportunities from adverse to mild years, we would require a disproportionately large amount of area burnt in mild years. We conclude that the strategic planning of fire suppression resources has the potential to become an important cost-effective fuel-reduction strategy at large spatial scale. We do however suggest that this strategy should probably be accompanied by other fuel-reduction treatments applied at broad scales if large-scale changes in fire regimes are to be achieved, especially in the wider context of climate change. PMID:24727853
Developing closed life support systems for large space habitats
NASA Technical Reports Server (NTRS)
Phillips, J. M.; Harlan, A. D.; Krumhar, K. C.
1978-01-01
In anticipation of possible large-scale, long-duration space missions which may be conducted in the future, NASA has begun to investigate the research and technology development requirements to create life support systems for large space habitats. An analysis suggests the feasibility of regenerating food in missions which exceed four years' duration. Regeneration of food in space may be justified for missions of shorter duration when large crews must be supported at remote sites such as lunar bases and space manufacturing facilities. It is thought that biological components consisting principally of traditional crop and livestock species will prove to be the most acceptable means of closing the food cycle. A description is presented of the preliminary results of a study of potential biological components for large space habitats. Attention is given to controlled ecosystems, Russian life support system research, controlled-environment agriculture, and the social aspects of the life-support system.
NASA Astrophysics Data System (ADS)
Doulamis, A.; Doulamis, N.; Ioannidis, C.; Chrysouli, C.; Grammalidis, N.; Dimitropoulos, K.; Potsiou, C.; Stathopoulou, E.-K.; Ioannides, M.
2015-08-01
Outdoor large-scale cultural sites are mostly sensitive to environmental, natural and human-made factors, implying an imminent need for a spatio-temporal assessment to identify regions of potential cultural interest (material degradation, structuring, conservation). On the other hand, quite different actors are involved in Cultural Heritage research (archaeologists, curators, conservators, simple users), each with diverse needs. All these statements advocate that a 5D modelling (3D geometry plus time plus levels of details) is ideally required for the preservation and assessment of outdoor large-scale cultural sites, which is currently implemented as a simple aggregation of 3D digital models at different times and levels of details. The main bottleneck of such an approach is its complexity, making 5D modelling impossible to validate in real-life conditions. In this paper, a cost-effective and affordable framework for 5D modelling is proposed, based on a spatial-temporal dependent aggregation of 3D digital models, incorporating a predictive assessment procedure to indicate which regions (surfaces) of an object should be reconstructed at higher levels of details at the next time instances and which at lower ones. In this way, dynamic change history maps are created, indicating spatial probabilities of regions needing further 3D modelling at forthcoming instances. Using these maps, a predictive assessment can be made, that is, surfaces can be localized within the objects where a high-accuracy reconstruction process needs to be activated at the forthcoming time instances. The proposed 5D Digital Cultural Heritage Model (5D-DCHM) is implemented using open interoperable standards based on the CityGML framework, which also allows the description of additional semantic metadata information. Visualization aspects are also supported to allow easy manipulation, interaction and representation of the 5D-DCHM geometry and the respective semantic information. The open source 3DCityDB incorporating a PostgreSQL geo-database is used to manage and manipulate 3D data and their semantics.
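The "dynamic change history map" idea — a per-region probability of needing high-detail reconstruction at the next epoch — can be sketched as a simple recursive update. The following Python is a hedged toy illustration; the decay model, threshold, and all names are assumptions, not the paper's actual predictive assessment procedure:

```python
import numpy as np

def update_change_history(prob, changed, decay=0.8):
    """Toy change-history update: exponentially forget old evidence,
    then reinforce regions observed to have changed this epoch.

    prob    : array of per-region probabilities from previous epochs
    changed : boolean array, True where change was detected this epoch
    """
    return decay * prob + (1.0 - decay) * changed.astype(float)

# Hypothetical usage over successive survey epochs:
# prob = np.zeros(n_regions)
# for epoch_changes in observed_changes:        # one boolean array per epoch
#     prob = update_change_history(prob, epoch_changes)
# high_lod_regions = np.where(prob > 0.5)[0]    # reconstruct these in detail
```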
Mishra, Bud; Daruwala, Raoul-Sam; Zhou, Yi; Ugel, Nadia; Policriti, Alberto; Antoniotti, Marco; Paxia, Salvatore; Rejali, Marc; Rudra, Archisman; Cherepinsky, Vera; Silver, Naomi; Casey, William; Piazza, Carla; Simeoni, Marta; Barbano, Paolo; Spivak, Marina; Feng, Jiawu; Gill, Ofer; Venkatesh, Mysore; Cheng, Fang; Sun, Bing; Ioniata, Iuliana; Anantharaman, Thomas; Hubbard, E Jane Albert; Pnueli, Amir; Harel, David; Chandru, Vijay; Hariharan, Ramesh; Wigler, Michael; Park, Frank; Lin, Shih-Chieh; Lazebnik, Yuri; Winkler, Franz; Cantor, Charles R; Carbone, Alessandra; Gromov, Mikhael
2003-01-01
We collaborate in a research program aimed at creating a rigorous framework, experimental infrastructure, and computational environment for understanding, experimenting with, manipulating, and modifying a diverse set of fundamental biological processes at multiple scales and spatio-temporal modes. The novelty of our research is based on an approach that (i) requires coevolution of experimental science and theoretical techniques and (ii) exploits a certain universality in biology guided by a parsimonious model of evolutionary mechanisms operating at the genomic level and manifesting at the proteomic, transcriptomic, phylogenic, and other higher levels. Our current program in "systems biology" endeavors to marry large-scale biological experiments with the tools to ponder and reason about large, complex, and subtle natural systems. To achieve this ambitious goal, ideas and concepts are combined from many different fields: biological experimentation, applied mathematical modeling, computational reasoning schemes, and large-scale numerical and symbolic simulations. From a biological viewpoint, the basic issues are many: (i) understanding common and shared structural motifs among biological processes; (ii) modeling biological noise due to interactions among a small number of key molecules or loss of synchrony; (iii) explaining the robustness of these systems in spite of such noise; and (iv) cataloging multistatic behavior and adaptation exhibited by many biological processes.
A Scalable Cyberinfrastructure for Interactive Visualization of Terascale Microscopy Data
Venkat, A.; Christensen, C.; Gyulassy, A.; Summa, B.; Federer, F.; Angelucci, A.; Pascucci, V.
2017-01-01
The goal of the recently emerged field of connectomics is to generate a wiring diagram of the brain at different scales. To identify brain circuitry, neuroscientists use specialized microscopes to perform multichannel imaging of labeled neurons at a very high resolution. CLARITY tissue clearing allows imaging labeled circuits through entire tissue blocks, without the need for tissue sectioning and section-to-section alignment. Imaging the large and complex non-human primate brain with sufficient resolution to identify and disambiguate between axons, in particular, produces massive data, creating great computational challenges to the study of neural circuits. Researchers require novel software capabilities for compiling, stitching, and visualizing large imagery. In this work, we detail the image acquisition process and a hierarchical streaming platform, ViSUS, that enables interactive visualization of these massive multi-volume datasets using a standard desktop computer. The ViSUS visualization framework has previously been shown to be suitable for 3D combustion simulation, climate simulation and visualization of large scale panoramic images. The platform is organized around a hierarchical cache oblivious data layout, called the IDX file format, which enables interactive visualization and exploration in ViSUS, scaling to the largest 3D images. In this paper we showcase the ViSUS framework used in an interactive setting with the microscopy data. PMID:28638896
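The key data-layout idea behind this kind of hierarchical, cache-oblivious format — keeping coarse-resolution samples contiguous on disk so a preview streams first — can be sketched with bit interleaving. The Python below conveys the concept only; the actual HZ-order arithmetic of the IDX format differs in detail, so `morton3` and `hz_level` should be read as assumptions illustrating the idea:

```python
def morton3(x, y, z, bits=10):
    """Bit-interleaved (Z-order) index for a 3-D sample coordinate."""
    code = 0
    for i in range(bits):
        code |= ((x >> i) & 1) << (3 * i)
        code |= ((y >> i) & 1) << (3 * i + 1)
        code |= ((z >> i) & 1) << (3 * i + 2)
    return code

def hz_level(code):
    """Assumed level rule: the resolution level of a sample follows from the
    position of the lowest set bit of its interleaved code, so all samples of
    a coarse level can be stored contiguously and streamed first."""
    if code == 0:
        return 0
    return (code & -code).bit_length()  # 1-based index of the lowest set bit

# A streaming viewer would sort samples by (hz_level, code) and read
# prefixes of that order to refine the image progressively.
```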
Blueprint for a microwave trapped ion quantum computer.
Lekitsch, Bjoern; Weidt, Sebastian; Fowler, Austin G; Mølmer, Klaus; Devitt, Simon J; Wunderlich, Christof; Hensinger, Winfried K
2017-02-01
The availability of a universal quantum computer may have a fundamental impact on a vast number of research fields and on society as a whole. An increasingly large scientific and industrial community is working toward the realization of such a device. An arbitrarily large quantum computer may best be constructed using a modular approach. We present a blueprint for a trapped ion-based scalable quantum computer module, making it possible to create a scalable quantum computer architecture based on long-wavelength radiation quantum gates. The modules control all operations as stand-alone units, are constructed using silicon microfabrication techniques, and are within reach of current technology. To perform the required quantum computations, the modules make use of long-wavelength radiation-based quantum gate technology. To scale this microwave quantum computer architecture to a large size, we present a fully scalable design that makes use of ion transport between different modules, thereby allowing arbitrarily many modules to be connected to construct a large-scale device. A high error-threshold surface error correction code can be implemented in the proposed architecture to execute fault-tolerant operations. With appropriate adjustments, the proposed modules are also suitable for alternative trapped ion quantum computer architectures, such as schemes using photonic interconnects.
2016-04-30
...focus on novel onshore/offshore and small/large scale wind turbine designs for expanding their operational range and increasing their efficiency at... of maintenance options created by the implementation of PHM in wind turbines. When an RUL is predicted for a subsystem, there are multiple choices... The section titled Example—Wind Turbine With an Outcome-Based Contract presents a case study for a PHM-enabled wind turbine with and without an...
Molecular Beam Epitaxial Growth of GaAs on (631) Oriented Substrates
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cruz Hernandez, Esteban; Rojas Ramirez, Juan-Salvador; Contreras Hernandez, Rocio
2007-02-09
In this work, we report the study of the homoepitaxial growth of GaAs on (631) oriented substrates by molecular beam epitaxy (MBE). We observed the spontaneous formation of a high density of large-scale features on the surface. The hill-like features are elongated towards the [-5, 9, 3] direction. We show the dependence of these structures on the growth conditions and we present the possibility of creating quantum wire structures on this surface.
How plume-ridge interaction shapes the crustal thickness pattern of the Réunion hotspot track
NASA Astrophysics Data System (ADS)
Bredow, Eva; Steinberger, Bernhard; Gassmöller, Rene; Dannberg, Juliane
2017-08-01
The Réunion mantle plume has shaped a large area of the Earth's surface over the past 65 million years: from the Deccan Traps in India along the hotspot track comprising the island chains of the Laccadives, Maldives, and Chagos Bank on the Indian plate and the Mascarene Plateau on the African plate up to the currently active volcanism at La Réunion Island. This study addresses the question of how the Réunion plume, especially in interaction with the Central Indian Ridge, created the complex crustal thickness pattern of the hotspot track. For this purpose, the mantle convection code ASPECT was used to design three-dimensional numerical models, which consider the specific location of the plume underneath moving plates and surrounded by large-scale mantle flow. The results show the crustal thickness pattern produced by the plume, which altogether agrees well with topographic maps. Two features in particular are consistently reproduced by the models: the distinctive gap in the hotspot track between the Maldives and Chagos is created by the combination of the ridge geometry and plume-ridge interaction; and the Rodrigues Ridge, a narrow crustal structure which connects the hotspot track and the Central Indian Ridge, appears as the surface expression of a long-distance sublithospheric flow channel. This study therefore provides further insight into how small-scale surface features are generated by the complex interplay between mantle and lithospheric processes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, Marvin; Bose, James; Beier, Richard
2004-12-01
The assets that Citizen Potawatomi Nation holds were evaluated to help define the strengths and weaknesses to be used in pursuing economic prosperity. With this baseline assessment, a Planning Team will create a vision for the tribe to integrate into long-term energy and business strategies. Identification of energy efficiency devices, systems and technologies was made, and an estimation of cost benefits of the more promising ideas is submitted for possible inclusion into the final energy plan. Multiple energy resources and sources were identified and their attributes were assessed to determine the appropriateness of each. Methods of saving energy were evaluated and reported on, and potential revenue-generating sources that specifically fit the tribe were identified and reported. A primary goal is to create long-term energy strategies to explore development of tribal utility options and analyze renewable energy and energy efficiency options. Associated goals are to consider exploring energy efficiency and renewable economic development projects involving the following topics: (1) Home-scale projects may include construction of a home with energy efficiency or renewable energy features and retrofitting an existing home to add energy efficiency or renewable energy features. (2) Community-scale projects may include medium to large scale energy efficiency building construction, retrofit projects, or installation of community renewable energy systems. (3) Small business development may include the creation of a tribal enterprise that would manufacture and distribute solar and wind powered equipment for ranches and farms or create a contracting business to include energy efficiency and renewable retrofits such as geothermal heat pumps. (4) Commercial-scale energy projects may include, at a larger scale, the formation of a tribal utility formed to sell power to the commercial grid, or to transmit and distribute power throughout the tribal community, or hydrogen production, and propane and natural-gas distribution systems.
Snow Tweets: Emergency Information Dissemination in a US County During 2014 Winter Storms
Bonnan-White, Jess; Shulman, Jason; Bielecke, Abigail
2014-01-01
Introduction: This paper describes how American federal, state, and local organizations created, sourced, and disseminated emergency information via social media in preparation for several winter storms in one county in the state of New Jersey (USA). Methods: Postings submitted to Twitter for three winter storm periods were collected from selected organizations, along with a purposeful sample of select private local users. Storm-related posts were analyzed for stylistic features (hashtags, retweet mentions, embedded URLs). Sharing and re-tweeting patterns were also mapped using NodeXL. Results: Results indicate emergency management entities were active in providing preparedness and response information during the selected winter weather events. A large number of posts, however, did not include unique Twitter features that maximize dissemination and discovery by users. Visual representations of interactions illustrate opportunities for developing stronger relationships among agencies. Discussion: Whereas previous research predominantly focuses on large-scale national or international disaster contexts, the current study instead provides needed analysis in a small-scale context. With practice during localized events like extreme weather, effective information dissemination in large events can be enhanced. PMID:25685629
Is the negative IOD during 2016 the reason for monsoon failure over southwest peninsular India?
NASA Astrophysics Data System (ADS)
Sreelekha, P. N.; Babu, C. A.
2018-01-01
The study investigates the mechanism responsible for the deficit rainfall over southwest peninsular India during the 2016 monsoon season. Analysis shows that the large-scale variation in circulation pattern due to the strong, negative Indian Ocean Dipole phenomenon was the reason for the deficit rainfall. Significant reduction in the number of northward-propagating monsoon-organized convections together with fast propagation over the southwest peninsular India resulted in reduction in rainfall. On the other hand, their persistence for longer time over the central part of India resulted in normal rainfall. It was found that the strong convection over the eastern equatorial Indian Ocean creates strong convergence over that region. The combined effect of the sinking due to the well-developed Walker circulation originated over the eastern equatorial Indian Ocean and the descending limb of the monsoon Hadley cell caused strong subsidence over the western equatorial Indian Ocean. The tail of this large-scale sinking extended up to the southern parts of India. This hinders formation of monsoon-organized convections leading to a large deficiency of rainfall during monsoon 2016 over the southwest peninsular India.
CORALINA: a universal method for the generation of gRNA libraries for CRISPR-based screening.
Köferle, Anna; Worf, Karolina; Breunig, Christopher; Baumann, Valentin; Herrero, Javier; Wiesbeck, Maximilian; Hutter, Lukas H; Götz, Magdalena; Fuchs, Christiane; Beck, Stephan; Stricker, Stefan H
2016-11-14
The bacterial CRISPR system is fast becoming the most popular genetic and epigenetic engineering tool due to its universal applicability and adaptability. The desire to deploy CRISPR-based methods in a large variety of species and contexts has created an urgent need for the development of easy, time- and cost-effective methods enabling large-scale screening approaches. Here we describe CORALINA (comprehensive gRNA library generation through controlled nuclease activity), a method for the generation of comprehensive gRNA libraries for CRISPR-based screens. CORALINA gRNA libraries can be derived from any source of DNA without the need of complex oligonucleotide synthesis. We show the utility of CORALINA for human and mouse genomic DNA, its reproducibility in covering the most relevant genomic features including regulatory, coding and non-coding sequences and confirm the functionality of CORALINA generated gRNAs. The simplicity and cost-effectiveness make CORALINA suitable for any experimental system. The unprecedented sequence complexities obtainable with CORALINA libraries are a necessary pre-requisite for less biased large scale genomic and epigenomic screens.
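Since the passage's central claim is that nuclease-generated fragments can cover a genome comprehensively without synthesized oligos, a toy calculation makes the scale concrete. The Python sketch below simulates random gRNA-sized fragments and the expected fold-coverage; the fragment sizes, the uniform-sampling model, and all names are illustrative assumptions, not the published CORALINA protocol:

```python
import random

def gRNA_sized_fragments(dna, n_frags, lo=18, hi=22, seed=0):
    """Simulate controlled digestion as uniform random fragmentation of an
    input DNA string into gRNA-sized pieces (sizes assumed, not protocol)."""
    rng = random.Random(seed)
    frags = []
    for _ in range(n_frags):
        size = rng.randint(lo, hi)
        start = rng.randrange(0, len(dna) - size)
        frags.append(dna[start:start + size])
    return frags

def expected_coverage(genome_len, n_frags, mean_size=20):
    """Expected fold-coverage of the source DNA by the fragment library."""
    return n_frags * mean_size / genome_len

# Illustrative: a billion 20-nt fragments of a human-scale genome give
# expected_coverage(3.1e9, 1e9) ≈ 6.5x — coverage of regulatory, coding
# and non-coding sequence alike, with no oligo design step.
```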
Impact of lateral boundary conditions on regional analyses
NASA Astrophysics Data System (ADS)
Chikhar, Kamel; Gauthier, Pierre
2017-04-01
Regional and global climate models are usually validated by comparison to derived observations or reanalyses. Using a model in data assimilation results in a direct comparison to observations to produce its own analyses that may reveal systematic errors. In this study, regional analyses over North America are produced based on the fifth-generation Canadian Regional Climate Model (CRCM5) combined with the variational data assimilation system of the Meteorological Service of Canada (MSC). CRCM5 is driven at its boundaries by global analyses from ERA-Interim or produced with the global configuration of the CRCM5. Assimilation cycles for the months of January and July 2011 revealed systematic errors in winter through large values in the mean analysis increments. This bias is attributed to the coupling of the lateral boundary conditions of the regional model with the driving data, particularly over the northern boundary, where a rapidly changing large-scale circulation created significant cross-boundary flows. Increasing the time frequency of the lateral driving and applying a large-scale spectral nudging significantly improved the circulation through the lateral boundaries, which translated into much better agreement with observations.
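Large-scale spectral nudging, named here as part of the remedy, relaxes only the longest wavelengths of the regional solution toward the driving analysis while leaving small scales free. A minimal numpy sketch of the idea follows; the mode cutoff, nudging weight, and plain 2-D FFT formulation are illustrative assumptions, not the CRCM5 implementation:

```python
import numpy as np

def spectral_nudge(field, driver, keep_modes=4, weight=0.1):
    """Relax the largest spatial scales of `field` toward `driver`.

    field, driver : 2-D arrays on the same grid (e.g. 500 hPa geopotential)
    keep_modes    : highest wavenumber treated as 'large scale' (assumed)
    weight        : per-step relaxation strength (assumed)
    """
    f_hat = np.fft.fft2(field)
    d_hat = np.fft.fft2(driver)
    ky = np.fft.fftfreq(field.shape[0]) * field.shape[0]
    kx = np.fft.fftfreq(field.shape[1]) * field.shape[1]
    large = (np.abs(ky)[:, None] <= keep_modes) & (np.abs(kx)[None, :] <= keep_modes)
    # Nudge only the retained large-scale Fourier modes toward the driver;
    # all higher wavenumbers evolve freely.
    f_hat[large] += weight * (d_hat[large] - f_hat[large])
    return np.real(np.fft.ifft2(f_hat))
```

Applied each model step, this keeps the regional circulation consistent with the driving analysis across the boundaries without suppressing the fine-scale detail the regional model is meant to add.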
The future of management: The NASA paradigm
NASA Technical Reports Server (NTRS)
Harris, Philip R.
1992-01-01
Prototypes of 21st century management, especially for large-scale enterprises, may well be found within the aerospace industry. The space era inaugurated a number of projects of such scope and magnitude that another type of management had to be created to ensure successful achievement. The challenges will be not just technological and managerial, but also human and cultural in dimension. Futurists, students of management, and those concerned with technological administration would do well to review the literature of emerging space management for its wider implications. NASA offers a paradigm, or demonstrated model, of future trends in the field of management at large. More research is needed on issues of leadership, both for Earth-based space projects and for space-based programs with managers there. There is a need to realize that large-scale technical enterprises, such as those undertaken in space, require a new form of management. NASA and other responsible agencies are urged to study excellence in space macromanagement, including the necessary multidisciplinary skills. Two recommended targets are the application of general living systems theory and macromanagement concepts for space stations in the 1990s.
Johannessen, Liv Karen; Obstfelder, Aud; Lotherington, Ann Therese
2013-05-01
The purpose of this paper is to explore the making and scaling of information infrastructures, as well as how the conditions for scaling a component may change for the vendor. The first research question is how the making and scaling of a healthcare information infrastructure can be done and by whom. The second question is what scope for manoeuvre there might be for vendors aiming to expand their market. This case study is based on an interpretive approach, whereby data is gathered through participant observation and semi-structured interviews. A case study of the making and scaling of an electronic system for general practitioners ordering laboratory services from hospitals is described as comprising two distinct phases. The first may be characterized as an evolving phase, when development, integration and implementation were achieved in small steps, and the vendor, together with end users, had considerable freedom to create the solution according to the users' needs. The second phase was characterized by a large-scale procurement process over which regional healthcare authorities exercised much more control and the needs of groups other than the end users influenced the design. The making and scaling of healthcare information infrastructures is not simply a process of evolution, in which the end users use and change the technology. It also consists of large steps, during which different actors, including vendors and healthcare authorities, may make substantial contributions. This process requires work, negotiation and strategies. The conditions for the vendor may change dramatically, from considerable freedom and close relationships with users and customers in the small-scale development, to losing control of the product and being required to engage in more formal relations with customers in the wider public healthcare market. Onerous procurement processes may be one of the reasons why large-scale implementation of information projects in healthcare is difficult and slow. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
Improving Design Efficiency for Large-Scale Heterogeneous Circuits
NASA Astrophysics Data System (ADS)
Gregerson, Anthony
Despite increases in logic density, many Big Data applications must still be partitioned across multiple computing devices in order to meet their strict performance requirements. Among the most demanding of these applications is high-energy physics (HEP), which uses complex computing systems consisting of thousands of FPGAs and ASICs to process the sensor data created by experiments at particle accelerators such as the Large Hadron Collider (LHC). Designing such computing systems is challenging due to the scale of the systems, the exceptionally high-throughput and low-latency performance constraints that necessitate application-specific hardware implementations, the requirement that algorithms are efficiently partitioned across many devices, and the possible need to update the implemented algorithms during the lifetime of the system. In this work, we describe our research to develop flexible architectures for implementing such large-scale circuits on FPGAs. In particular, this work is motivated by (but not limited in scope to) high-energy physics algorithms for the Compact Muon Solenoid (CMS) experiment at the LHC. To make efficient use of logic resources in multi-FPGA systems, we introduce Multi-Personality Partitioning, a novel form of the graph partitioning problem, and present partitioning algorithms that can significantly improve resource utilization on heterogeneous devices while also reducing inter-chip connections. To reduce the high communication costs of Big Data applications, we also introduce Information-Aware Partitioning, a partitioning method that analyzes the data content of application-specific circuits, characterizes their entropy, and selects circuit partitions that enable efficient compression of data between chips. We employ our information-aware partitioning method to improve the performance of the hardware validation platform for evaluating new algorithms for the CMS experiment. Together, these research efforts help to improve the efficiency and decrease the cost of developing the large-scale, heterogeneous circuits needed to enable large-scale applications in high-energy physics and other important areas.
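The entropy-driven idea behind Information-Aware Partitioning is straightforward to illustrate: score a candidate cut by the Shannon entropy of the signals it severs rather than by their raw bit widths, since low-entropy traffic compresses well between chips. The Python below is a hedged sketch of that scoring step only — the function names and the summation model are assumptions, not the dissertation's algorithm:

```python
import math
from collections import Counter

def signal_entropy(samples):
    """Shannon entropy (bits per symbol) of the values observed on a signal
    during simulation traces."""
    counts = Counter(samples)
    total = len(samples)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def cut_cost(cut_signals, traces):
    """Estimated inter-chip bandwidth of a candidate partition: the sum of
    each cut signal's entropy, i.e. its compressed size, rather than its
    raw width (simplified additive model, an assumption)."""
    return sum(signal_entropy(traces[s]) for s in cut_signals)

# A partitioner would then prefer, among balanced cuts, the one minimizing
# cut_cost: severing low-entropy signals costs little after compression.
```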
Rudel, Thomas K
2009-07-01
Humans transformed landscapes at an unprecedented scale and pace during the 20th century, creating sprawling urban areas in affluent countries and large-scale agricultural expanses in the tropics. To date, attempts to explain these processes in other disciplines have had a disembodied, ahistorical quality to them. A sociological account of these changes emphasizes the role of strategic actions by states and coalitions of interested parties in transforming landscapes. It identifies the agents of change and the timing of transformative events. Case studies of suburban sprawl and tropical deforestation illustrate the value of the sociological approach and the wide range of situations to which it applies.
Xiong, Xiaorui R.; Liang, Feixue; Li, Haifu; Mesik, Lukas; Zhang, Ke K.; Polley, Daniel B.; Tao, Huizhong W.; Xiao, Zhongju; Zhang, Li I.
2013-01-01
Binaural integration in the central nucleus of inferior colliculus (ICC) plays a critical role in sound localization. However, its arithmetic nature and underlying synaptic mechanisms remain unclear. Here, we showed in mouse ICC neurons that the contralateral dominance is created by a “push-pull”-like mechanism, with contralaterally dominant excitation and more bilaterally balanced inhibition. Importantly, binaural spiking response is generated apparently from an ipsilaterally-mediated scaling of contralateral response, leaving frequency tuning unchanged. This scaling effect is attributed to a divisive attenuation of contralaterally-evoked synaptic excitation onto ICC neurons with their inhibition largely unaffected. Thus, a gain control mediates the linear transformation from monaural to binaural spike responses. The gain value is modulated by interaural level difference (ILD) primarily through scaling excitation to different levels. The ILD-dependent synaptic scaling and gain adjustment allow ICC neurons to dynamically encode interaural sound localization cues while maintaining an invariant representation of other independent sound attributes. PMID:23972599
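The "scaling" arithmetic described in this abstract amounts to multiplying the contralateral response by an ILD-dependent gain. The LaTeX sketch below is a hedged formalization; the functional form of the gain is assumed for illustration and may differ from the paper's quantitative model:

```latex
\[
  R_{\mathrm{bin}}(f,\mathrm{ILD}) \;=\; g(\mathrm{ILD})\, R_{\mathrm{contra}}(f),
  \qquad 0 < g \le 1,
\]
\[
  g(\mathrm{ILD}) \;=\; \frac{1}{1 + k(\mathrm{ILD})}
  \quad \text{(divisive attenuation of excitation, } k \ge 0\text{)}.
\]
```

Because the gain multiplies the entire tuning curve R_contra(f), the shape of the frequency tuning is preserved while the overall spike rate is rescaled — the linear monaural-to-binaural transformation the abstract describes.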
SurveyWiz and factorWiz: JavaScript Web pages that make HTML forms for research on the Internet.
Birnbaum, M H
2000-05-01
SurveyWiz and factorWiz are Web pages that act as wizards to create HTML forms that enable one to collect data via the Web. SurveyWiz allows the user to enter survey questions or personality test items with a mixture of text boxes and scales of radio buttons. One can add demographic questions of age, sex, education, and nationality with the push of a button. FactorWiz creates the HTML for within-subjects, two-factor designs as large as 9 x 9, or higher order factorial designs up to 81 cells. The user enters levels of the row and column factors, which can be text, images, or other multimedia. FactorWiz generates the stimulus combinations, randomizes their order, and creates the page. In both programs HTML is displayed in a window, and the user copies it to a text editor to save it. When uploaded to a Web server and supported by a CGI script, the created Web pages allow data to be collected, coded, and saved on the server. These programs are intended to assist researchers and students in quickly creating studies that can be administered via the Web.
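As a concrete illustration of the factorial-page generation described above, here is a short Python sketch mimicking what a factorWiz-style wizard emits: every row x column stimulus combination, shuffled, each presented with a radio-button rating scale. The markup, the CGI action path, and the function name are simplified assumptions, not Birnbaum's actual generated HTML:

```python
import itertools
import random
from html import escape

def factor_page(rows, cols, scale=9, seed=None):
    """Generate an HTML form for a within-subjects two-factor design:
    all row x column cells in randomized order, one radio scale per cell."""
    cells = list(itertools.product(rows, cols))
    random.Random(seed).shuffle(cells)          # randomize trial order
    items = []
    for i, (r, c) in enumerate(cells):
        buttons = "".join(
            f'<input type="radio" name="q{i}" value="{v}">{v} '
            for v in range(1, scale + 1)
        )
        items.append(f"<p>{escape(r)} / {escape(c)}<br>{buttons}</p>")
    return ('<form method="post" action="/cgi-bin/save">'  # assumed CGI path
            + "\n".join(items)
            + '<input type="submit"></form>')

# e.g. a full 9 x 9 design (81 cells), as mentioned in the abstract:
# html = factor_page([f"row {i}" for i in range(9)],
#                    [f"col {j}" for j in range(9)], seed=42)
```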
NASA Astrophysics Data System (ADS)
Arevalo, S.; Atwood, C.; Bell, P.; Blacker, T. D.; Dey, S.; Fisher, D.; Fisher, D. A.; Genalis, P.; Gorski, J.; Harris, A.; Hill, K.; Hurwitz, M.; Kendall, R. P.; Meakin, R. L.; Morton, S.; Moyer, E. T.; Post, D. E.; Strawn, R.; Veldhuizen, D. v.; Votta, L. G.; Wynn, S.; Zelinski, G.
2008-07-01
In FY2008, the U.S. Department of Defense (DoD) initiated the Computational Research and Engineering Acquisition Tools and Environments (CREATE) program, a $360M program with a two-year planning phase and a ten-year execution phase. CREATE will develop and deploy three computational engineering tool sets for DoD acquisition programs to use to design aircraft, ships and radio-frequency antennas. The planning and execution of CREATE are based on the 'lessons learned' from case studies of large-scale computational science and engineering projects. The case studies stress the importance of a stable, close-knit development team; a focus on customer needs and requirements; verification and validation; flexible and agile planning, management, and development processes; risk management; realistic schedules and resource levels; balanced short- and long-term goals and deliverables; and stable, long-term support by the program sponsor. Since it began in FY2008, the CREATE program has built a team and project structure, developed requirements and begun validating them, identified candidate products, established initial connections with the acquisition programs, begun detailed project planning and development, and generated the initial collaboration infrastructure necessary for success by its multi-institutional, multidisciplinary teams.
NASA Astrophysics Data System (ADS)
Ercolano, Barbara; Weber, Michael L.; Owen, James E.
2018-01-01
Circumstellar discs with large dust-depleted cavities and vigorous accretion on to the central star are often considered signposts for (multiple) giant planet formation. In this Letter, we show that X-ray photoevaporation operating in discs with modest (factors 3-10) gas-phase depletion of carbon and oxygen at large radii (>15 au) yields the inner radius and accretion rates for most of the observed discs, without the need to invoke giant planet formation. We present one-dimensional viscous evolution models of discs affected by X-ray photoevaporation assuming moderate gas-phase depletion of carbon and oxygen, well within the range reported by recent observations. Our models use a simplified prescription for scaling the X-ray photoevaporation rates and profiles at different metallicity, and our quantitative result depends on this scaling. While more rigorous hydrodynamical modelling of mass-loss profiles at low metallicities is required to constrain the observational parameter space that can be explained by our models, the general conclusion that metal sequestering at large radii may be responsible for the observed diversity of transition discs is shown to be robust. Gap opening by giant planet formation may still be responsible for a number of observed transition discs with large cavities and very high accretion rate.
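One-dimensional viscous evolution models of the kind referred to here typically solve the standard surface-density diffusion equation with a photoevaporative sink term. As a hedged sketch (this is the classic Lynden-Bell & Pringle form; the paper's specific metallicity scaling of the wind term is its own prescription and is only indicated schematically):

```latex
\[
  \frac{\partial \Sigma}{\partial t}
  \;=\; \frac{3}{r}\,\frac{\partial}{\partial r}
  \left[ r^{1/2}\,\frac{\partial}{\partial r}
  \left( \nu\, \Sigma\, r^{1/2} \right) \right]
  \;-\; \dot{\Sigma}_{\mathrm{wind}}\!\left(r;\, L_X,\, Z_{\mathrm{C,O}}\right),
\]
```

where Σ(r, t) is the gas surface density, ν the kinematic viscosity, and the sink term carries the X-ray photoevaporation mass-loss profile, here assumed to depend on the X-ray luminosity and the gas-phase carbon/oxygen depletion.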
NASA Technical Reports Server (NTRS)
Niles, P.B.
2008-01-01
The chemistry, sedimentology, and geology of the Meridiani sedimentary deposits are best explained by eolian reworking of the sublimation residue of a large-scale ice/dust deposit. This large ice deposit was located in close proximity to Terra Meridiani and incorporated large amounts of dust, sand, and SO2 aerosols generated by impacts and volcanism during early martian history. Sulfate formation and chemical weathering of the initial igneous material is hypothesized to have occurred inside the ice when the darker mineral grains were heated by solar radiant energy. This created conditions in which small films of liquid water were created in and around the mineral grains. This water dissolved the SO2 and reacted with the mineral grains, forming an acidic environment under low water/rock conditions. Subsequent sublimation of this ice deposit left behind large amounts of weathered sublimation residue, which became the source material for the eolian process that deposited the Terra Meridiani deposit. The following features of the Meridiani sediments are best explained by this model: the large scale of the deposit, its mineralogic similarity across large distances, the cation-conservative nature of the weathering processes, the presence of acidic groundwaters on a basaltic planet, the accumulation of a thick sedimentary sequence outside of a topographic basin, and the low water/rock ratio needed to explain the presence of very soluble minerals and elements in the deposit. Remote sensing studies have linked the Meridiani deposits to a number of other martian surface features through mineralogic similarities, geomorphic similarities, and regional associations. These include layered deposits in Arabia Terra, interior layered deposits in the Valles Marineris system, southern Elysium/Aeolis, Amazonis Planitia, and the Hellas basin, Aram Chaos, Aureum Chaos, and Iani Chaos. The common properties shared by these deposits suggest that all of these deposits share a common formation process which must have acted over a large area of Mars. The results of this study suggest a mechanism for volatile transport on Mars without invoking an early greenhouse. They also imply a common formation mechanism for most of the sulfate minerals and layered deposits on Mars, which explains their common occurrence.
Challenges in Modeling and Measuring Learning Trajectories
ERIC Educational Resources Information Center
Confrey, Jere; Jones, R. Seth; Gianopulos, Garron
2015-01-01
Briggs and Peck make a compelling case for creating new, more intuitive measures of learning, based on creating vertical scales using learning trajectories (LT) in place of "domain sampling." We believe that the importance of creating measurement scales that coordinate recognizable landmarks in learning trajectories with interval scales…
Tracing Galactic Outflows to the Source: Spatially Resolved Feedback in M83 with COS
NASA Astrophysics Data System (ADS)
Aloisi, Alessandra
2016-10-01
Star-formation (SF) feedback plays a vital role in shaping galaxy properties, but there are many open questions about how this feedback is created, propagated, and felt by galaxies. SF-driven feedback can be observationally constrained with rest-frame UV absorption-line spectroscopy that accesses a range of powerful gas density and kinematic diagnostics. Studies at both high and low redshift show clear evidence for large-scale outflows in star-forming galaxies that scale with galaxy SF rate. However, by sampling one sightline or the galaxy as a whole, these studies are not tailored to reveal how the large-scale outflows develop from their ultimate sources at the scale of individual SF regions. We propose the first spatially-resolved COS G130M/G160M (1130-1800 A) study of the ISM in the nearby (4.6 Mpc) face-on spiral starburst M83 using individual young star clusters as background sources. This is the first down-the-barrel study where blueshifted absorptions can be identified directly with outflowing gas in a spatially resolved fashion. The kpc-scale flows sampled by the COS pointings will be anchored to the properties of the large-scale (10-100 kpc) flows thanks to the wealth of multi-wavelength observations of M83 from X-ray to radio. A comparison of COS data with mock spectra from constrained simulations of spiral galaxies with FIRE (Feedback In Realistic Environments; a code with unprecedented 1-100 pc spatial resolution and self-consistent treatments of stellar feedback) will provide an important validation of these simulations and will supply the community with a powerful and well-tested tool for galaxy formation predictions applicable to all redshifts.
Innovative Visualizations Shed Light on Avian Nocturnal Migration
Shamoun-Baranes, Judy; Farnsworth, Andrew; Aelterman, Bart; Alves, Jose A.; Azijn, Kevin; Bernstein, Garrett; Branco, Sérgio; Desmet, Peter; Dokter, Adriaan M.; Horton, Kyle; Kelling, Steve; Kelly, Jeffrey F.; Leijnse, Hidde; Rong, Jingjing; Sheldon, Daniel; Van den Broeck, Wouter; Van Den Meersche, Jan Klaas; Van Doren, Benjamin Mark; van Gasteren, Hans
2016-01-01
Globally, billions of flying animals undergo seasonal migrations, many of which occur at night. The temporal and spatial scales at which migrations occur and our inability to directly observe these nocturnal movements makes monitoring and characterizing this critical period in migratory animals’ life cycles difficult. Remote sensing, therefore, has played an important role in our understanding of large-scale nocturnal bird migrations. Weather surveillance radar networks in Europe and North America have great potential for long-term low-cost monitoring of bird migration at scales that have previously been impossible to achieve. Such long-term monitoring, however, poses a number of challenges for the ornithological and ecological communities: how does one take advantage of this vast data resource, integrate information across multiple sensors and large spatial and temporal scales, and visually represent the data for interpretation and dissemination, considering the dynamic nature of migration? We assembled an interdisciplinary team of ecologists, meteorologists, computer scientists, and graphic designers to develop two different flow visualizations, which are interactive and open source, in order to create novel representations of broad-front nocturnal bird migration to address a primary impediment to long-term, large-scale nocturnal migration monitoring. We have applied these visualization techniques to mass bird migration events recorded by two different weather surveillance radar networks covering regions in Europe and North America. These applications show the flexibility and portability of such an approach. The visualizations provide an intuitive representation of the scale and dynamics of these complex systems, are easily accessible for a broad interest group, and are biologically insightful. Additionally, they facilitate fundamental ecological research, conservation, mitigation of human–wildlife conflicts, improvement of meteorological products, and public outreach, education, and engagement. PMID:27557096
Gray, Nicola; Lewis, Matthew R; Plumb, Robert S; Wilson, Ian D; Nicholson, Jeremy K
2015-06-05
A new generation of metabolic phenotyping centers are being created to meet the increasing demands of personalized healthcare, and this has resulted in a major requirement for economical, high-throughput metabonomic analysis by liquid chromatography-mass spectrometry (LC-MS). Meeting these new demands represents an emerging bioanalytical problem that must be solved if metabolic phenotyping is to be successfully applied to large clinical and epidemiological sample sets. Ultraperformance (UP)LC-MS-based metabolic phenotyping, based on 2.1 mm i.d. LC columns, enables comprehensive metabolic phenotyping but, when employed for the analysis of thousands of samples, results in high solvent usage. The use of UPLC-MS employing 1 mm i.d. columns for metabolic phenotyping rather than the conventional 2.1 mm i.d. methodology shows that the resulting optimized microbore method provided equivalent or superior performance in terms of peak capacity, sensitivity, and robustness. On average, we also observed, when using the microbore scale separation, an increase in response of 2-3 fold over that obtained with the standard 2.1 mm scale method. When applied to the analysis of human urine, the 1 mm scale method showed no decline in performance over the course of 1000 analyses, illustrating that microbore UPLC-MS represents a viable alternative to conventional 2.1 mm i.d. formats for routine large-scale metabolic profiling studies while also resulting in a 75% reduction in solvent usage. The modest increase in sensitivity provided by this methodology also offers the potential to either reduce sample consumption or increase the number of metabolite features detected with confidence due to the increased signal-to-noise ratios obtained. Implementation of this miniaturized UPLC-MS method of metabolic phenotyping results in clear analytical, economic, and environmental benefits for large-scale metabolic profiling studies with similar or improved analytical performance compared to conventional UPLC-MS.
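The reported 75% solvent saving follows directly from column-diameter scaling: at equal linear velocity, volumetric flow scales with the column's cross-sectional area. A worked check, using only the diameters stated in the abstract (standard chromatographic scaling, not a calculation taken from the paper):

```latex
\[
  \frac{F_{1.0}}{F_{2.1}}
  \;=\; \left( \frac{d_{1.0}}{d_{2.1}} \right)^{2}
  \;=\; \left( \frac{1.0\ \mathrm{mm}}{2.1\ \mathrm{mm}} \right)^{2}
  \;\approx\; 0.23,
\]
```

i.e. roughly a 77% reduction in mobile-phase consumption in the ideal case, consistent with the ~75% figure quoted once practical flow rates are rounded. The same area scaling also explains the 2-3 fold response gain: the same analyte mass elutes in a smaller solvent volume, raising its concentration at the detector.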
Properties of galaxies reproduced by a hydrodynamic simulation
NASA Astrophysics Data System (ADS)
Vogelsberger, M.; Genel, S.; Springel, V.; Torrey, P.; Sijacki, D.; Xu, D.; Snyder, G.; Bird, S.; Nelson, D.; Hernquist, L.
2014-05-01
Previous simulations of the growth of cosmic structures have broadly reproduced the `cosmic web' of galaxies that we see in the Universe, but failed to create a mixed population of elliptical and spiral galaxies, because of numerical inaccuracies and incomplete physical models. Moreover, they were unable to track the small-scale evolution of gas and stars to the present epoch within a representative portion of the Universe. Here we report a simulation that starts 12 million years after the Big Bang and traces 13 billion years of cosmic evolution with 12 billion resolution elements in a cube 106.5 megaparsecs on a side. It yields a reasonable population of ellipticals and spirals, reproduces the observed distribution of galaxies in clusters and the characteristics of hydrogen on large scales, and at the same time matches the `metal' and hydrogen content of galaxies on small scales.
NASA Technical Reports Server (NTRS)
Farrell, W. M.; McLain, J. L.; Collier, M. R.; Keller, J. W.
2017-01-01
Analogous to terrestrial dust devils, charged dust in Mars dust devils should become vertically stratified in the convective features, creating large-scale E-fields. This E-field in a Martian-like atmosphere has been shown to stimulate the development of a Townsend discharge (electron avalanche) that acts to dissipate charge in regions where charge build-up occurs. While the stratification of the charged dust is a source of the electrical energy, the uncharged particulates in the dust population may absorb a portion of these avalanching electrons, thereby inhibiting dissipation and leading to the development of anomalously large E-field values. We performed a laboratory study that does indeed show enhanced E-field strengths between an anode and cathode when dust-absorbing filaments (acting as particulates) are placed in the avalanching electron flow. Further, the E-field threshold required to create an impulsive spark discharge increases as more filaments are placed between the anode and cathode. We conclude that the spatially separated charged dust creates the charge centers and E-fields in a dust devil, but the under-charged portion of the population acts to reduce Townsend electron dissipation currents, further reinforcing the development of larger-than-expected E-fields.
NASA Astrophysics Data System (ADS)
Hossain, U. H.; Ensinger, W.
2015-12-01
Devices operating in space, e.g. in satellites, are hit by cosmic rays. These include so-called HZE ions, with high mass (Z) and energy (E). These highly energetic heavy ions penetrate deeply into materials and deposit a large amount of energy, typically several keV per nm along their range, creating serious damage. Space vehicles use polymers, which degrade under ion bombardment. HZE ion irradiation can be simulated experimentally in large-scale accelerators. In the present study, the radiation damage of aliphatic vinyl- and fluoro-polymers by heavy ions with energies in the GeV range is described. The ions cause bond scission and create volatile small molecular species, leading to considerable mass loss from the polymers. Since hydrogen-, oxygen-, and fluorine-containing molecules are created and these elements are depleted, the remaining material is richer in carbon than the original polymers and contains conjugated C=C double bonds. This process is investigated by measuring the optical band gap with UV-Vis absorption spectrometry as a function of ion fluence. The results show how the optical band gaps shift from the UV into the visible region upon ion irradiation for the different polymers.
Farmer, Cristan A; Aman, Michael G
2010-01-01
Although often lacking "malice", aggression is fairly common in children with intellectual or developmental disability (I/DD). Despite this, there are no scales available that are appropriate for an in-depth analysis of aggressive behavior in this population. Such scales are needed for the study of aggressive behavior, which is a common target symptom in clinical trials. We assessed the reliability and validity of the Children's Scale of Hostility and Aggression: Reactive/Proactive (C-SHARP), a new aggression scale created for children with I/DD. Data are presented from a survey of 365 children with I/DD aged 3-21 years. Interrater reliability was very high for the Problem Scale, which characterizes type of aggression. Reliability was lower but largely acceptable for the Provocation Scale, which assesses motivation. Validity of the Problem Scale was supported by expected differences in children with autism, Down syndrome, comorbid disruptive behavior disorders (DBDs) and ADHD. The Provocation Scale, which categorizes behavior as proactive or reactive, showed expected differences in children with DBD, but was less effective in those with ADHD. The C-SHARP appears to have fundamentally sound psychometric characteristics, although more research is needed.
Spatio-Temporal Variability of Groundwater Storage in India
NASA Technical Reports Server (NTRS)
Bhanja, Soumendra; Rodell, Matthew; Li, Bailing; Mukherjee, Abhijit
2016-01-01
Groundwater level measurements from 3907 monitoring wells, distributed within 22 major river basins of India, are assessed to characterize their spatial and temporal variability. Groundwater storage (GWS) anomalies (relative to the long-term mean) exhibit strong seasonality, with annual maxima observed during the monsoon season and minima during the pre-monsoon season. Spatial variability of GWS anomalies increases with the extent of measurements, following a power law relationship, i.e., log-(spatial variability) is linearly dependent on log-(spatial extent). In addition, the impact of well spacing on spatial variability and the power law relationship is investigated. We found that the mean GWS anomaly sampled at a 0.25 degree grid scale is close to the unweighted average over all wells. The absolute error corresponding to each basin grows with increasing scale, i.e., from 0.25 degree to 1 degree. It was observed that small changes in extent could create very large changes in spatial variability at large grid scales. Spatial variability of the GWS anomaly has been found to vary with climatic conditions. To our knowledge, this is the first study of the effects of well spacing on groundwater spatial variability. The results may be useful for interpreting large-scale groundwater variations from unevenly spaced or sparse groundwater well observations, or for siting and prioritizing wells in a network for groundwater management. The output of this study could be used to maintain a cost-effective groundwater monitoring network in the study region, and the approach can also be used in other parts of the globe.
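The power law described above lends itself to a straightforward log-log fit. A minimal sketch in Python, using hypothetical numbers rather than the study's well data:

```python
import numpy as np

# Hypothetical GWS-anomaly spatial variability (cm) at several spatial
# extents (km); real values would come from the monitoring-well records.
extent_km = np.array([50, 100, 200, 400, 800, 1600])
variability_cm = np.array([2.1, 3.0, 4.4, 6.2, 9.1, 13.0])

# Power law: variability = a * extent**b, i.e. a straight line in log-log space.
b, log_a = np.polyfit(np.log(extent_km), np.log(variability_cm), 1)
print(f"exponent b = {b:.2f}, prefactor a = {np.exp(log_a):.2f} cm")
```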
NASA Astrophysics Data System (ADS)
Zhang, Chen; Huang, Xiaohu; Liu, Hongfei; Chua, Soo Jin; Ross, Caroline A.
2016-12-01
Vertically aligned, highly ordered, large-area arrays of nanostructures are important building blocks for multifunctional devices. Here, ZnO nanorod arrays are selectively synthesized on Si substrates by a solution method within patterns created by nanoimprint lithography. Two growth modes, two-dimensional nucleation-driven wedding cakes and screw dislocation-driven spirals, are inferred to determine the top-end morphologies of the nanorods. Sub-bandgap photoluminescence of the nanorods is greatly enhanced by manipulating the hydrogen donors via a post-growth thermal treatment. Lasing is facilitated in nanorods with faceted top ends formed by the wedding-cake growth mode. This work demonstrates the control of oxide nanostructure morphologies on a large scale and the optimization of their optical performance.
NASA Astrophysics Data System (ADS)
Darner, R.; Shuster, W.
2016-12-01
Expansion of the urban environment can alter the landscape and creates challenges for how cities deal with energy and water. Large volumes of stormwater in areas served by combined sanitary and stormwater sewer systems present one such challenge. Managing the water as near to the source as possible creates an environment that allows more infiltration and evapotranspiration. Stormwater control measures (SCMs) associated with this type of development, often called green infrastructure, include rain gardens, pervious or porous pavements, bioswales, green or blue roofs, and others. In this presentation, we examine the hydrology of green infrastructure in urban sewersheds in Cleveland and Columbus, OH. We present the need for data throughout the water cycle and the challenges of collecting field data at a small scale (a single rain garden instrumented to measure inflows, outflow, weather, soil moisture, and groundwater levels) and at a macro scale (a project including low-cost rain gardens, highly engineered rain gardens, groundwater wells, weather stations, soil moisture sensors, and combined sewer flow monitoring). Results will include quantifying the effectiveness of SCMs in intercepting stormwater for different precipitation event sizes. Analysis of the small-scale deployment will demonstrate the role of active adaptive management in ongoing optimization over multiple years of data collection.
Unique wing scale photonics of male Rajah Brooke's birdwing butterflies.
Wilts, Bodo D; Giraldo, Marco A; Stavenga, Doekele G
2016-01-01
Ultrastructures in butterfly wing scales can take many shapes, resulting in the often striking coloration of many butterflies due to interference of light. The plethora of coloration mechanisms is dazzling, but often only single mechanisms are described for specific animals. We have here investigated the male Rajah Brooke's birdwing, Trogonoptera brookiana, a large butterfly from Malaysia, which is marked by striking, colorful wing patterns. The dorsal side is decorated with large, iridescent green patterning, while the ventral side of the wings is primarily brown-black with small white, blue and green patches on the hindwings. Dense arrays of red hairs, creating a distinct collar as well as contrasting areas ventrally around the thorax, enhance the butterfly's beauty. The remarkable coloration is realized by a diversity of intricate nanostructures in the hairs as well as the wing scales. The red collar hairs contain a broad-band absorbing pigment as well as UV-reflecting multilayers resembling the photonic structures of Morpho butterflies; the white wing patches consist of scales with prominent thin-film reflectors; the blue patches have scales with ridge multilayers and centrally concentrated melanin. The green wing areas consist of strongly curved scales, which possess a uniquely arranged photonic structure consisting of multilayers and melanin baffles that produces highly directional reflections. Rajah Brooke's birdwing thus employs a variety of structural and pigmentary coloration mechanisms to achieve its stunning optical appearance. The intriguing use of order and disorder in related photonic structures in the butterfly wing scales may inspire novel optical materials as well as investigations into the development of these nanostructures in vivo.
Constraining primordial vector mode from B-mode polarization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Saga, Shohei; Ichiki, Kiyotomo; Shiraishi, Maresuke, E-mail: saga.shohei@nagoya-u.jp, E-mail: maresuke.shiraishi@pd.infn.it, E-mail: ichiki@a.phys.nagoya-u.ac.jp
The B-mode polarization spectrum of the Cosmic Microwave Background (CMB) may be the smoking gun not only of the primordial tensor mode but also of the primordial vector mode. If there exist nonzero vector-mode metric perturbations in the early Universe, they are known to be supported by anisotropic stress fluctuations of free-streaming particles such as neutrinos, and to create characteristic signatures in the CMB temperature, E-mode, and B-mode polarization anisotropies. We place constraints on the properties of the primordial vector mode, characterized by the vector-to-scalar ratio r_v and the spectral index n_v of the vector-shear power spectrum, from the Planck and BICEP2 B-mode data. We find that, for scale-invariant initial spectra, the ΛCDM model including the vector mode fits the data better than the model including the tensor mode. The difference in χ² between the vector and tensor models is Δχ² = 3.294 because, on large scales, the vector mode generates smaller temperature fluctuations than the tensor mode, which is preferred by the data. In contrast, the tensor mode can fit the data set equally well if we allow a significantly blue-tilted spectrum. We find that the best-fitting tensor mode has a large blue tilt and leads to an indistinct reionization bump on larger angular scales. The slightly red-tilted vector mode supported by the current data set can also create O(10⁻²²) G magnetic fields at cosmological recombination. Our constraints should motivate research that considers models of the early Universe that involve the vector mode.
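For orientation, the two parameters constrained above presumably enter through a power-law vector-shear spectrum. A hedged sketch, with the pivot-scale and tilt conventions assumed for illustration rather than taken from the paper:

$$\mathcal{P}_v(k) = r_v\,\mathcal{P}_\zeta(k_0)\left(\frac{k}{k_0}\right)^{n_v-1},\qquad r_v \equiv \frac{\mathcal{P}_v(k_0)}{\mathcal{P}_\zeta(k_0)},$$

where $k_0$ is a pivot scale and $\mathcal{P}_\zeta$ the primordial curvature spectrum; the exact normalization and tilt convention are defined in the paper itself.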
NASA Astrophysics Data System (ADS)
Harrison, L.; Hafs, A. W.; Utz, R.; Dunne, T.
2013-12-01
The habitat complexity of a riverine ecosystem substantially influences aquatic communities, and especially the bioenergetics of drift-feeding fish. We coupled hydrodynamic and bioenergetic models to assess the influence of habitat complexity, generated via large woody debris (LWD) additions, on juvenile Chinook salmon (Oncorhynchus tshawytscha) growth potential in a river that lacked large wood. Model simulations indicated that LWD diversified the flow field, creating pronounced velocity gradients, which enhanced fish feeding and resting activities at the micro-habitat (sub-meter) scale. Fluid drag created by individual wood structures increased under higher wood loading rates, leading to a 5-19% reduction in the reach-averaged velocity. We found that wood loading was asymptotically related to the reach-scale growth potential, suggesting that the river became saturated with LWD and additional loading would produce minimal benefit. In our study reach, LWD additions could potentially quadruple the potential growth area available before that limit was reached. Wood depletion in the world's rivers has been widely documented, leading to widespread attempts by river managers to reverse this trend by adding wood to simplified aquatic habitats, though systematic prediction of the effects of wood on fish growth has not previously been accomplished. We offer a quantitative, theory-based approach for assessing the role of wood in habitat potential as it affects fish growth at the micro-habitat and reach scales.
Fig. 1. Predicted flow field and salmon growth potential maps produced from model simulations with no woody debris (panels A and D), a low density (B and E), and a high density (C and F) of woody debris.
Dynamic permeability in fault damage zones induced by repeated coseismic fracturing events
NASA Astrophysics Data System (ADS)
Aben, F. M.; Doan, M. L.; Mitchell, T. M.
2017-12-01
Off-fault fracture damage in upper crustal fault zones changes the fault zone properties and affects various co- and interseismic processes. One of these properties is the permeability of the fault damage zone rocks, which is generally higher than that of the surrounding host rock. This allows large-scale fluid flow through the fault zone, which affects fault healing and promotes mineral transformation processes. Moreover, it might play an important role in thermal fluid pressurization during an earthquake rupture. The damage zone permeability is dynamic due to coseismic damaging. It is crucial for earthquake mechanics and for longer-term processes to understand what the dynamic permeability structure of a fault looks like and how it evolves with repeated earthquakes. To better detail coseismically induced permeability, we performed uniaxial split Hopkinson pressure bar experiments on quartz-monzonite rock samples. Two sample sets were created and analyzed: single-loaded samples subjected to varying loading intensities, with damage varying from apparently intact to pulverized, and samples loaded at a constant intensity but with a varying number of repeated loadings. The first set resembles a dynamic permeability structure created by a single large earthquake; the second resembles a permeability structure created by several earthquakes. Afterwards, the permeability and acoustic velocities were measured as a function of confining pressure. The permeability in both datasets shows a large and non-linear increase over several orders of magnitude (from 10⁻²⁰ up to 10⁻¹⁴ m²) with an increasing amount of fracture damage. This, combined with microstructural analyses of the varying degrees of damage, suggests a percolation threshold. The percolation threshold does not coincide with the pulverization threshold. With increasing confining pressure, the permeability may drop by up to two orders of magnitude, which supports the possibility of large coseismic fluid pulses over relatively large distances along a fault. Also, the relatively small threshold could potentially increase permeability in a large volume of rock, given that previous earthquakes have already damaged these rocks.
Sigehuzi, Tomoo; Tanaka, Hajime
2004-11-01
We study the phase-separation behavior of an off-symmetric fluid mixture induced by a "double temperature quench." We first quench a system into the unstable region. After a large phase-separated structure is formed, we quench the system again, more deeply, and follow the pattern-evolution process. The second quench makes the domains formed by the first quench unstable and leads to double phase separation; that is, small droplets are formed inside the large domains created by the first quench. The complex coarsening behavior of this hierarchical structure having two characteristic length scales is studied in detail by using digital image analysis. We find three distinct time regimes in the time evolution of the structure factor of the system. In the first regime, small droplets coarsen with time inside large domains; there, a large domain containing small droplets can be regarded as an isolated system. Later, however, the coarsening of small droplets stops when they start to interact via diffusion with the large domain containing them. Finally, small droplets disappear due to the Lifshitz-Slyozov mechanism. Thus the observed behavior can be explained by the crossover of the nature of a large domain from an isolated to an open system; this is a direct consequence of the existence of the two characteristic length scales.
Evolution of the Tropical Cyclone Integrated Data Exchange And Analysis System (TC-IDEAS)
NASA Technical Reports Server (NTRS)
Turk, J.; Chao, Y.; Haddad, Z.; Hristova-Veleva, S.; Knosp, B.; Lambrigtsen, B.; Li, P.; Licata, S.; Poulsen, W.; Su, H.;
2010-01-01
The Tropical Cyclone Integrated Data Exchange and Analysis System (TC-IDEAS) is being jointly developed by the Jet Propulsion Laboratory (JPL) and the Marshall Space Flight Center (MSFC) as part of NASA's Hurricane Science Research Program. The long-term goal is to create a comprehensive tropical cyclone database of satellite and airborne observations, in-situ measurements and model simulations containing parameters that pertain to the thermodynamic and microphysical structure of the storms; the air-sea interaction processes; and the large-scale environment.
Privacy Challenges of Genomic Big Data.
Shen, Hong; Ma, Jian
2017-01-01
With the rapid advancement of high-throughput DNA sequencing technologies, genomics has become a big data discipline in which large-scale genetic information on human individuals can be obtained efficiently at low cost. However, such massive amounts of personal genomic data create tremendous challenges for privacy, especially given the emergence of the direct-to-consumer (DTC) industry that provides genetic testing services. Here we review recent developments in genomic big data and their implications for privacy. We also discuss the current dilemmas and future challenges of genomic privacy.
Nonlinear penetration of whistler pulses into collisional plasmas via conductivity modifications
NASA Technical Reports Server (NTRS)
Urrutia, J. M.; Stenzel, R. L.
1991-01-01
A strong electromagnetic impulse (about 0.2 microsec) with central frequency in the whistler-wave regime is applied to a large laboratory plasma dominated by Coulomb collisions. Local electron heating at the antenna and transport along B0 create a channel of high conductivity along which the whistler pulse penetrates with little damping. Because of its rapid temporal evolution, this new form of modulational instability does not involve ducting by density gradients which require ion time scales to develop.
Sieblist, Christian; Jenzsch, Marco; Pohlscheidt, Michael
2016-08-01
The production of monoclonal antibodies by mammalian cell culture in bioreactors of up to 25,000 L is state-of-the-art technology in the biotech industry. During the lifecycle of a product, several scale-up activities and technology transfers are typically executed to enable the supply chain strategy of a global pharmaceutical company. Given the sensitivity of mammalian cells to physicochemical culture conditions, process and equipment knowledge are critical to avoid impacts on timelines, product quantity, and quality. In particular, the fluid dynamics of large-scale bioreactors versus small-scale models need to be described, and similarity demonstrated, in light of the Quality by Design approach promoted by the FDA. This approach comprises an associated design space which is established during process characterization and validation in bench-scale bioreactors. Therefore the establishment of predictive models and simulation tools for the major operating conditions of stirred vessels (mixing, mass transfer, and shear forces), based on fundamental engineering principles, has experienced a renaissance in recent years. This work illustrates the systematic characterization of a large variety of bioreactor designs deployed in a global manufacturing network, ranging from small bench-scale equipment to large-scale production equipment (25,000 L). Several traditional methods to determine power input, mixing, mass transfer, and shear forces have been used to create a database and identify differences between various impeller types and configurations in the operating ranges typically applied in cell culture processes at manufacturing scale. In addition, the extrapolation of different empirical models, e.g. Cooke et al. (Paper presented at the proceedings of the 2nd international conference of bioreactor fluid dynamics, Cranfield, UK, 1988), has been assessed for validity in these operational ranges. Results for selected designs are shown and serve as examples of structured characterization to enable fast and agile process transfers, scale-up, and troubleshooting.
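The empirical scale-up models referred to above typically take power-law forms. As an illustration (a standard Van't Riet-type correlation, not necessarily the one assessed in this work), the volumetric mass-transfer coefficient is often written as

$$k_L a = C \left(\frac{P}{V}\right)^{\alpha} u_g^{\beta},$$

where $P/V$ is the specific power input, $u_g$ the superficial gas velocity, and $C$, $\alpha$, $\beta$ fitted constants that differ between vessel and impeller designs, which is why such correlations must be re-validated across a manufacturing network.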
Large scale features and energetics of the hybrid subtropical low `Duck' over the Tasman Sea
NASA Astrophysics Data System (ADS)
Pezza, Alexandre Bernardes; Garde, Luke Andrew; Veiga, José Augusto Paixão; Simmonds, Ian
2014-01-01
New aspects of the genesis and partial tropical transition of a rare hybrid subtropical cyclone on the eastern Australian coast are presented. The `Duck' (March 2001) has attracted renewed attention because its underlying genesis mechanisms were remarkably similar to those of the first South Atlantic hurricane (March 2004). Here we put this cyclone in climate perspective, showing that it belongs to a class within the lowest 1% frequency percentile in the Southern Hemisphere as a function of its thermal evolution. A large-scale analysis reveals a combined influence from an existing tropical cyclone and a persistent mid-latitude block. A Lagrangian tracer showed that the upper-level air parcels arriving at the cyclone's center had been modified by the blocking. Lorenz energetics is used to identify connections with both tropical and extratropical processes, and to reveal how these create the large-scale environment conducive to the development of the vortex. The results reveal that the blocking exerted the most important influence, with a strong peak in barotropic generation of kinetic energy over a large area traversed by the air parcels just before genesis. A secondary peak coincided with the first time the cyclone developed an upper-level warm core, but with insufficient amplitude to allow a full tropical transition. The applications of this technique are numerous and promising, particularly for the use of global climate models to infer changes in environmental parameters associated with severe storms.
Nanopatterning of Crystalline Silicon Using Anodized Aluminum Oxide Templates for Photovoltaics
NASA Astrophysics Data System (ADS)
Chao, Tsu-An
A novel thin-film anodized aluminum oxide templating process was developed and applied to create nanopatterns on crystalline silicon that enhance its optical properties. The thin-film anodized aluminum oxide was created to improve on the conventional thick-aluminum templating method, with the aim of enabling potential large-scale fabrication. A unique two-step anodizing method was introduced to create high-quality nanopatterns, and this process was demonstrated to be superior to the original one-step approach. Optical characterization of the nanopatterned silicon showed up to a 10% reduction in reflection in the short-wavelength range. Scanning electron microscopy was also used to analyze the nanopatterned surface structure, and it was found that interpore spacing and pore density can be tuned by changing the anodizing potential.
Clinical terminology support for a national ambulatory practice outcomes research network.
Ricciardi, Thomas N; Lieberman, Michael I; Kahn, Michael G; Masarie, F E
2005-01-01
The Medical Quality Improvement Consortium (MQIC) is a nationwide collaboration of 74 healthcare delivery systems, consisting of 3755 clinicians, who contribute de-identified clinical data from the same commercial electronic medical record (EMR) for quality reporting, outcomes research and clinical research in public health and practice benchmarking. Despite the existence of a common, centrally-managed, shared terminology for core concepts (medications, problem lists, observation names), a substantial "back-end" information management process is required to ensure terminology and data harmonization for creating multi-facility clinically-acceptable queries and comparable results. We describe the information architecture created to support terminology harmonization across this data-sharing consortium and discuss the implications for large scale data sharing envisioned by proponents for the national adoption of ambulatory EMR systems.
Kinetic Simulations of the Interruption of Large-Amplitude Shear-Alfvén Waves in a High- β Plasma
Squire, J.; Kunz, M. W.; Quataert, E.; ...
2017-10-12
Using two-dimensional hybrid-kinetic simulations, we explore the nonlinear "interruption" of standing and traveling shear-Alfvén waves in collisionless plasmas. Interruption involves a self-generated pressure anisotropy removing the restoring force of a linearly polarized Alfvénic perturbation, and occurs for wave amplitudes δB⊥/B₀ ≳ β^(−1/2) (where β is the ratio of thermal to magnetic pressure). We use highly elongated domains to obtain maximal scale separation between the wave and the ion gyroscale. For standing waves above the amplitude limit, we find that the large-scale magnetic field of the wave decays rapidly. The dynamics are strongly affected by the excitation of oblique firehose modes, which transition into long-lived parallel fluctuations at the ion gyroscale and cause significant particle scattering. Traveling waves are damped more slowly, but are also influenced by small-scale parallel fluctuations created by the decay of firehose modes. Our results demonstrate that collisionless plasmas cannot support linearly polarized Alfvén waves above δB⊥/B₀ ~ β^(−1/2). They also provide a vivid illustration of two key aspects of low-collisionality plasma dynamics: (i) the importance of velocity-space instabilities in regulating plasma dynamics at high β, and (ii) how nonlinear collisionless processes can transfer mechanical energy directly from the largest scales into thermal energy and microscale fluctuations, without the need for a scale-by-scale turbulent cascade.
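Written out, the interruption threshold quoted above is

$$\frac{\delta B_\perp}{B_0} \gtrsim \beta^{-1/2}, \qquad \beta \equiv \frac{8\pi n k_B T}{B_0^2},$$

where the definition of β as the ratio of thermal to magnetic pressure follows the abstract; the Gaussian-units form on the right is an illustrative convention, not quoted from the paper.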
Abnormal ranges of vital signs in children in Japanese prehospital settings.
Nosaka, Nobuyuki; Muguruma, Takashi; Knaup, Emily; Tsukahara, Kohei; Enomoto, Yuki; Kaku, Noriyuki
2015-10-01
The revised Fire Service Law obliges each prefectural government in Japan to establish a prehospital acuity scale. The Foundation for Ambulance Service Development (FASD) created an acuity scale for use as a reference. Our preliminary survey revealed that 32 of 47 prefectures applied the FASD scale directly for children. This scale defines abnormal ranges of heart rate and respiratory rate in young children. This study aimed to evaluate the validity of the abnormal ranges on the FASD scale and to assess its overall performance for triage purposes in paediatric patients. We evaluated the validity of the ranges by comparing published centile charts for these vital signs with records of 1,296 ambulance patients. A large portion of the abnormal ranges on the scale substantially overlapped with the normal centile charts. Triage decisions using the FASD scale of vital signs properly classified 22% (n = 287) of children. The sensitivity and specificity for high urgency were as high as 91% (95% confidence interval, 82-96%) and as low as 18% (95% confidence interval, 16-20%), respectively. We conclude that there is room for improvement in the abnormal ranges on the FASD scale.
NASA Astrophysics Data System (ADS)
Abbaspour, K. C.; Rouholahnejad, E.; Vaghefi, S.; Srinivasan, R.; Yang, H.; Kløve, B.
2015-05-01
A combination of driving forces is increasing pressure on local, national, and regional water supplies needed for irrigation, energy production, industrial uses, domestic purposes, and the environment. In many parts of Europe, groundwater quantity, and in particular quality, have undergone severe degradation, and water levels have decreased, resulting in negative environmental impacts. Rapid improvements in the economies of the eastern European countries and uncertainties with regard to freshwater availability create challenges for water managers. At the same time, climate change adds a new level of uncertainty with regard to freshwater supplies. In this research we build and calibrate an integrated hydrological model of Europe using the Soil and Water Assessment Tool (SWAT) program. Different components of water resources are simulated, and crop yield and water quality are considered at the Hydrological Response Unit (HRU) level. The water resources are quantified at the subbasin level with monthly time intervals. Leaching of nitrate into groundwater is also simulated at a finer spatial level (HRU). The use of large-scale, high-resolution water resources models enables consistent and comprehensive examination of integrated system behavior through physically based, data-driven simulation. In this article we discuss issues with data availability and the calibration of large-scale distributed models, and outline procedures for model calibration and uncertainty analysis. The calibrated model and results provide information support to the European Water Framework Directive and lay the basis for further assessment of the impact of climate change on water availability and quality. The approach and methods developed are general and can be applied to any large region around the world.
Generating descriptive visual words and visual phrases for large-scale image applications.
Zhang, Shiliang; Tian, Qi; Hua, Gang; Huang, Qingming; Gao, Wen
2011-09-01
Bag-of-visual-Words (BoWs) representation has been applied to various problems in the fields of multimedia and computer vision. The basic idea is to represent images as visual documents composed of repeatable and distinctive visual elements, which are comparable to text words. Notwithstanding its great success and wide adoption, a visual vocabulary created from single-image local descriptors is often shown to be not as effective as desired. In this paper, descriptive visual words (DVWs) and descriptive visual phrases (DVPs) are proposed as the visual correspondences to text words and phrases, where visual phrases refer to frequently co-occurring visual word pairs. Since images are the carriers of visual objects and scenes, a descriptive visual element set can be composed from the visual words and their combinations that are effective in representing certain visual objects or scenes. Based on this idea, a general framework is proposed for generating DVWs and DVPs for image applications. In a large-scale image database containing 1506 object and scene categories, the visual words and visual word pairs descriptive of certain objects or scenes are identified and collected as the DVWs and DVPs. Experiments show that the DVWs and DVPs are informative and descriptive and, thus, more comparable with text words than the classic visual words. We apply the identified DVWs and DVPs in several applications, including large-scale near-duplicated image retrieval, image search re-ranking, and object recognition. The combination of DVW and DVP performs better than the state of the art in large-scale near-duplicated image retrieval in terms of accuracy, efficiency, and memory consumption. The proposed image search re-ranking algorithm, DWPRank, outperforms the state-of-the-art algorithm by 12.4% in mean average precision and is about 11 times faster.
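A minimal sketch of the two-stage idea described above: quantize local descriptors into visual words, then count co-occurring word pairs as phrase candidates. All data here are hypothetical, and image-level co-occurrence stands in for the paper's spatial co-occurrence; this is not the authors' implementation.

```python
import numpy as np
from collections import Counter
from itertools import combinations
from sklearn.cluster import MiniBatchKMeans

# Hypothetical local descriptors (SIFT-like, 128-D) for ten images;
# a real pipeline would extract these with an interest-point detector.
rng = np.random.default_rng(0)
descriptors_per_image = [rng.random((200, 128)) for _ in range(10)]

# 1) Visual vocabulary: quantize all descriptors into k visual words.
vocab = MiniBatchKMeans(n_clusters=50, random_state=0)
vocab.fit(np.vstack(descriptors_per_image))

# 2) Candidate visual phrases: count word pairs that co-occur per image
#    (a simplification of the paper's spatially co-occurring pairs).
pair_counts = Counter()
for desc in descriptors_per_image:
    words = sorted(set(vocab.predict(desc)))
    pair_counts.update(combinations(words, 2))

print(pair_counts.most_common(5))  # most frequent pairs = phrase candidates
```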
Quantifying the Impacts of Large Scale Integration of Renewables in Indian Power Sector
NASA Astrophysics Data System (ADS)
Kumar, P.; Mishra, T.; Banerjee, R.
2017-12-01
India's power sector is responsible for nearly 37 percent of India's greenhouse gas emissions. For a fast-emerging economy like India, whose population and energy consumption are poised to rise rapidly in the coming decades, renewable energy can play a vital role in decarbonizing the power sector. In this context, India has targeted a 33-35 percent reduction in emission intensity (with respect to 2005 levels) along with large-scale renewable energy targets (100 GW solar, 60 GW wind, and 10 GW biomass energy by 2022) in the INDCs submitted under the Paris Agreement. However, large-scale integration of renewable energy is a complex process that faces a number of challenges, such as capital intensiveness, matching intermittent generation to loads with limited storage capacity, and reliability. In this context, this study attempts to assess the technical feasibility of integrating renewables into the Indian electricity mix by 2022 and to analyze the implications for power sector operations. This study uses TIMES, a bottom-up energy optimization model with unit commitment and dispatch features. We model coal- and gas-fired units discretely, with region-wise representation of wind and solar resources. The dispatch features are used for operational analysis of power plant units under ramp rate and minimum generation constraints. The study analyzes India's electricity sector transition for the year 2022 with three scenarios: a base case (no renewable additions), an INDC scenario (100 GW solar, 60 GW wind, 10 GW biomass), and a low-RE scenario (50 GW solar, 30 GW wind), created to analyze the implications of large-scale integration of variable renewable energy. The results provide insights into the trade-offs involved in achieving mitigation targets and the associated investment decisions. The study also examines the operational reliability and flexibility requirements of the system for integrating renewables.
Base Station Placement Algorithm for Large-Scale LTE Heterogeneous Networks.
Lee, Seungseob; Lee, SuKyoung; Kim, Kyungsoo; Kim, Yoon Hyuk
2015-01-01
Data traffic demands in cellular networks today are increasing at an exponential rate, giving rise to the development of heterogeneous networks (HetNets), in which small cells complement traditional macro cells by extending coverage to indoor areas. However, the deployment of small cells as part of HetNets creates a key challenge for operators' network planning. In particular, massive and unplanned deployment of base stations can cause high interference, severely degrading network performance. Although different mathematical modeling and optimization methods have been used to approach various problems related to this issue, most traditional network planning models are ill-equipped to deal with HetNet-specific characteristics due to their focus on classical cellular network designs. Furthermore, increased wireless data demands have driven mobile operators to roll out large-scale networks of small long term evolution (LTE) cells. Therefore, in this paper, we aim to derive an optimal network planning algorithm for large-scale LTE HetNets. Recently, attempts have been made to apply evolutionary algorithms (EAs) to the field of radio network planning, since they are characterized as global optimization methods. Yet EA performance often deteriorates rapidly as search space dimensionality grows. To overcome this limitation when designing optimal network deployments for large-scale LTE HetNets, we decompose the problem and tackle its subcomponents individually. Noting that some HetNet cells have strong correlations due to inter-cell interference, we propose a correlation grouping approach in which cells are grouped together according to their mutual interference. Both the simulation and analytical results indicate that the proposed solution outperforms a random-grouping based EA, as well as an EA that detects interacting variables by monitoring changes in the objective function, in terms of system throughput.
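A minimal sketch of the correlation-grouping idea described above: cells are grouped by mutual interference, and each group is optimized separately. The interference matrix, the threshold, the (1+1) evolutionary strategy, and the toy objective are all hypothetical stand-ins for the paper's LTE system model:

```python
import numpy as np

rng = np.random.default_rng(1)
n_cells = 12
# Hypothetical symmetric matrix of mutual interference between cells.
interference = rng.random((n_cells, n_cells))
interference = (interference + interference.T) / 2
np.fill_diagonal(interference, 0.0)

# 1) Correlation grouping: greedily group cells whose mutual
#    interference exceeds a threshold, so each group is optimized alone.
threshold = 0.8
groups, unassigned = [], set(range(n_cells))
while unassigned:
    seed = unassigned.pop()
    group = {seed} | {j for j in unassigned if interference[seed, j] > threshold}
    unassigned -= group
    groups.append(sorted(group))

# 2) Per-group (1+1) evolutionary strategy on a toy objective:
#    in-group power over interference leaked to the rest of the network.
def objective(powers, cells):
    leak = sum(interference[i, j] * p
               for p, i in zip(powers, cells)
               for j in range(n_cells) if j not in cells)
    return powers.sum() / (1.0 + leak)

for cells in groups:
    best = rng.random(len(cells))
    for _ in range(200):
        cand = np.clip(best + rng.normal(0.0, 0.1, len(cells)), 0.0, 1.0)
        if objective(cand, cells) > objective(best, cells):
            best = cand
    print(cells, round(objective(best, cells), 3))
```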
Multibiodose radiation emergency triage categorization software.
Ainsbury, Elizabeth A; Barnard, Stephen; Barrios, Lleonard; Fattibene, Paola; de Gelder, Virginie; Gregoire, Eric; Lindholm, Carita; Lloyd, David; Nergaard, Inger; Rothkamm, Kai; Romm, Horst; Scherthan, Harry; Thierens, Hubert; Vandevoorde, Charlot; Woda, Clemens; Wojcik, Andrzej
2014-07-01
In this note, the authors describe the MULTIBIODOSE software, which has been created as part of the MULTIBIODOSE project. The software enables doses estimated by networks of laboratories, using up to five retrospective (biological and physical) assays, to be combined to give a single estimate of triage category for each individual potentially exposed to ionizing radiation in a large-scale radiation accident or incident. The MULTIBIODOSE software has been created in Java. The usage of the software is based on the MULTIBIODOSE Guidance: the program creates a link to a single SQLite database for each incident, and the database is administered by the lead laboratory. The software has been tested with Java runtime environments 6 and 7 on a number of different Windows, Mac, and Linux systems, using data from a recent intercomparison exercise. The Java program MULTIBIODOSE_1.0.jar is freely available to download from http://www.multibiodose.eu/software or by contacting the software administrator: MULTIBIODOSE-software@gmx.com.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chamana, Manohar; Mather, Barry A
A library of load variability classes is created to produce scalable synthetic data sets using historical high-speed raw data. These data are collected from distribution monitoring units connected at the secondary side of a distribution transformer. Because of the irregular patterns and large volume of the historical high-speed data sets, utilizing current load characterization and modeling techniques is challenging. Multi-resolution analysis techniques are applied to extract the necessary components and eliminate the unnecessary components from the historical high-speed raw data to create the library of classes, which is then utilized to create new synthetic load data sets. A validation is performed to ensure that the synthesized data sets contain the same variability characteristics as the training data sets. The synthesized data sets are intended to be utilized in quasi-static time-series studies for distribution system planning on a granular scale, such as detailed PV interconnection studies.
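A minimal sketch of the multi-resolution idea described above, using a discrete wavelet transform to separate a load time series into coarse and detail components; the PyWavelets calls are real, but the series, the wavelet choice, and which components to keep are hypothetical:

```python
import numpy as np
import pywt

# Hypothetical high-speed load measurements (kW) over one day;
# real data would come from the distribution monitoring units.
t = np.linspace(0, 24, 24 * 3600)
load = (50 + 10 * np.sin(2 * np.pi * t / 24)
        + np.random.default_rng(2).normal(0, 2, t.size))

# Multi-resolution analysis: decompose, keep the variability components
# of interest, discard the rest, then reconstruct a synthetic series.
coeffs = pywt.wavedec(load, "db4", level=5)
coeffs[0][:] = 0  # drop the coarsest trend, keep the variability classes
synthetic = pywt.waverec(coeffs, "db4")
print(load.shape, synthetic.shape)
```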
Mnemonic convergence in social networks: The emergent properties of cognition at a collective level.
Coman, Alin; Momennejad, Ida; Drach, Rae D; Geana, Andra
2016-07-19
The development of shared memories, beliefs, and norms is a fundamental characteristic of human communities. These emergent outcomes are thought to occur owing to a dynamic system of information sharing and memory updating, which fundamentally depends on communication. Here we report results on the formation of collective memories in laboratory-created communities. We manipulated conversational network structure in a series of real-time, computer-mediated interactions in fourteen 10-member communities. The results show that mnemonic convergence, measured as the degree of overlap among community members' memories, is influenced by both individual-level information-processing phenomena and by the conversational social network structure created during conversational recall. By studying laboratory-created social networks, we show how large-scale social phenomena (i.e., collective memory) can emerge out of microlevel local dynamics (i.e., mnemonic reinforcement and suppression effects). The social-interactionist approach proposed herein points to optimal strategies for spreading information in social networks and provides a framework for measuring and forging collective memories in communities of individuals.
Srinivasa, Narayan; Zhang, Deying; Grigorian, Beayna
2014-03-01
This paper describes a novel architecture for enabling robust and efficient neuromorphic communication. The architecture combines two concepts: 1) synaptic time multiplexing (STM), which trades space for processing speed to create an intragroup communication approach that is independent of firing rate and offers more flexibility in connectivity than cross-bar architectures; and 2) wired multiple-input multiple-output (MIMO) communication with orthogonal frequency-division multiplexing (OFDM) techniques, which enables robust and efficient intergroup communication for neuromorphic systems. The MIMO-OFDM concept for the proposed architecture was analyzed by simulating a large-scale spiking neural network architecture. The analysis shows that the neuromorphic system with MIMO-OFDM exhibits robust and efficient communication while operating in real time with a high bit rate. By combining STM with MIMO-OFDM techniques, the resulting system offers flexible and scalable connectivity as well as a power- and area-efficient solution for implementing very large-scale spiking neural architectures in hardware.
MINC 2.0: A Flexible Format for Multi-Modal Images.
Vincent, Robert D; Neelin, Peter; Khalili-Mahani, Najmeh; Janke, Andrew L; Fonov, Vladimir S; Robbins, Steven M; Baghdadi, Leila; Lerch, Jason; Sled, John G; Adalat, Reza; MacDonald, David; Zijdenbos, Alex P; Collins, D Louis; Evans, Alan C
2016-01-01
It is often useful for an imaging data format to afford rich metadata, be flexible, scale to very large file sizes, support multi-modal data, and have strong inbuilt mechanisms for data provenance. Beginning in 1992, MINC was developed as a system for flexible, self-documenting representation of neuroscientific imaging data with arbitrary orientation and dimensionality. The MINC system incorporates three broad components: a file format specification, a programming library, and a growing set of tools. In the early 2000s the MINC developers created MINC 2.0, which added support for 64-bit file sizes, internal compression, and a number of other modern features. Because of its extensible design, it has been easy to incorporate details of provenance in the header metadata, including an explicit processing history, unique identifiers, and vendor-specific scanner settings. This makes MINC ideal for use in large-scale imaging studies and databases. It also makes it easy to adapt to new scanning sequences and modalities.
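Since MINC 2.0 is built on HDF5, generic HDF5 tooling can inspect a volume and its metadata tree. A minimal sketch in Python; the filename is a placeholder, and the internal group layout shown ("minc-2.0/image/0/image") is the conventional one rather than guaranteed for every file, so a robust reader should use the MINC library itself:

```python
import h5py

# MINC 2.0 files are HDF5 containers, so h5py can peek inside directly.
with h5py.File("scan.mnc", "r") as f:          # "scan.mnc" is a placeholder
    vol = f["minc-2.0/image/0/image"][...]      # assumed conventional layout
    print(vol.shape, vol.dtype)
    f.visit(print)                              # list the header/metadata tree
```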
a Voxel-Based Metadata Structure for Change Detection in Point Clouds of Large-Scale Urban Areas
NASA Astrophysics Data System (ADS)
Gehrung, J.; Hebel, M.; Arens, M.; Stilla, U.
2018-05-01
Mobile laser scanning has the potential not only to create detailed representations of urban environments, but also to determine changes at a very detailed level. An environment representation for change detection in large-scale urban environments based on point clouds has drawbacks in terms of memory scalability. Volumes, however, are a promising building block for memory-efficient change detection methods. The challenge of working with 3D occupancy grids is that the usual raycasting-based methods applied for their generation lead to artifacts caused by traversal of unfavorably discretized space. These artifacts have the potential to distort the state of voxels in close proximity to planar structures. In this work we propose a raycasting approach that utilizes knowledge about planar surfaces to completely prevent this kind of artifact. To demonstrate the capabilities of our approach, we also propose a method for the iterative volumetric approximation of point clouds that speeds up the raycasting by 36 percent.
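For context, the grid raycasting the artifact discussion refers to typically steps a ray voxel by voxel with the classic Amanatides-Woo traversal. A minimal generic sketch, not the authors' plane-aware variant:

```python
import numpy as np

def traverse(origin, direction, voxel_size, n_steps):
    """Yield integer voxel indices along a ray (Amanatides-Woo traversal)."""
    direction = direction / np.linalg.norm(direction)
    voxel = np.floor(origin / voxel_size).astype(int)
    step = np.sign(direction).astype(int)
    # Distance along the ray to the next voxel boundary on each axis.
    next_boundary = (voxel + (step > 0)) * voxel_size
    t_max = np.where(direction != 0, (next_boundary - origin) / direction, np.inf)
    t_delta = np.where(direction != 0, voxel_size / np.abs(direction), np.inf)
    for _ in range(n_steps):
        yield tuple(voxel)
        axis = np.argmin(t_max)      # cross the nearest boundary first
        voxel[axis] += step[axis]
        t_max[axis] += t_delta[axis]

for v in traverse(np.array([0.1, 0.2, 0.3]), np.array([1.0, 0.5, 0.2]), 0.5, 6):
    print(v)
```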
Testing the robustness of Citizen Science projects: Evaluating the results of pilot project COMBER
Faulwetter, Sarah; Dailianis, Thanos; Smith, Vincent Stuart; Koulouri, Panagiota; Dounas, Costas; Arvanitidis, Christos
2016-01-01
Background: Citizen Science (CS) as a term covers a great variety of approaches and scopes involving many different fields of science. The number of relevant projects globally has increased significantly in recent years. Large-scale ecological questions can be answered only through extended observation networks, and CS projects can support this effort. Although the need for such projects is apparent, an important part of the scientific community casts doubt on the reliability of CS data sets. New information: The pilot CS project COMBER was created in order to provide evidence to address this question in coastal marine biodiversity monitoring. The results of the current analysis show that a carefully designed CS project with clear hypotheses, wide participation, and data set validation can be a valuable tool for detecting large-scale and long-term changes in marine biodiversity patterns, and therefore for relevant management and conservation issues. PMID:28174507
A large-scale solar dynamics observatory image dataset for computer vision applications.
Kucuk, Ahmet; Banda, Juan M; Angryk, Rafal A
2017-01-01
The National Aeronautics and Space Administration (NASA) Solar Dynamics Observatory (SDO) mission has given us unprecedented insight into the Sun's activity. By capturing approximately 70,000 images a day, this mission has created one of the richest and biggest repositories of solar image data available to mankind. With such massive amounts of information, researchers have been able to produce great advances in detecting solar events. In this resource, we compile SDO solar data into a single repository in order to provide the computer vision community with a standardized and curated large-scale dataset of several hundred thousand solar events found in high-resolution solar images. This publicly available resource, along with the generation source code, will accelerate computer vision research on NASA's solar image data by reducing the time spent on data acquisition and curation from the multiple sources we have compiled. By improving the quality of the data with thorough curation, we anticipate wider adoption and interest, from the computer vision community to the solar physics community.
Newmark local time stepping on high-performance computing architectures
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rietmann, Max, E-mail: max.rietmann@erdw.ethz.ch; Institute of Geophysics, ETH Zurich; Grote, Marcus, E-mail: marcus.grote@unibas.ch
In multi-scale complex media, finite element meshes often require areas of local refinement, creating small elements that can dramatically reduce the global time-step for wave-propagation problems due to the CFL condition. Local time stepping (LTS) algorithms allow an explicit time-stepping scheme to adapt the time-step to the element size, allowing near-optimal time-steps everywhere in the mesh. We develop an efficient multilevel LTS-Newmark scheme and implement it in a widely used continuous finite element seismic wave-propagation package. In particular, we extend the standard LTS formulation with adaptations to continuous finite element methods that can be implemented very efficiently with very strong element-size contrasts (more than 100x). Capable of running on large CPU and GPU clusters, we present both synthetic validation examples and large-scale, realistic application examples to demonstrate the performance and applicability of the method and implementation on thousands of CPU cores and hundreds of GPUs.
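The CFL restriction that motivates LTS can be written compactly. A sketch, where the constant $C$ and the power-of-two level structure are illustrative assumptions rather than the paper's exact scheme:

$$\Delta t_e \le C\,\frac{h_e}{c_e}, \qquad \Delta t_{\text{local}} = \frac{\Delta t_{\text{global}}}{2^{p}},$$

where $h_e$ is the local element size, $c_e$ the local wave speed, and refined regions at level $p$ take $2^p$ substeps for every global step, so small elements no longer throttle the whole mesh.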
Paying for performance: Performance incentives increase desire for the reward object.
Hur, Julia D; Nordgren, Loran F
2016-09-01
The current research examines how exposure to performance incentives affects one's desire for the reward object. We hypothesized that the flexible nature of performance incentives creates an attentional fixation on the reward object (e.g., money), which leads people to become more desirous of the rewards. Results from five laboratory experiments and one large-scale field study provide support for this prediction. When performance was incentivized with monetary rewards, participants reported being more desirous of money (Study 1), put in more effort to earn additional money in an ensuing task (Study 2), and were less willing to donate money to charity (Study 4). We replicated the result with nonmonetary rewards (Study 5). We also found that performance incentives increased attention to the reward object during the task, which in part explains the observed effects (Study 6). A large-scale field study replicated these findings in a real-world setting (Study 7). One laboratory experiment failed to replicate (Study 3).
LSSGalPy: Interactive Visualization of the Large-scale Environment Around Galaxies
NASA Astrophysics Data System (ADS)
Argudo-Fernández, M.; Duarte Puertas, S.; Ruiz, J. E.; Sabater, J.; Verley, S.; Bergond, G.
2017-05-01
New tools are needed to handle the growth of data in astrophysics delivered by recent and upcoming surveys. We aim to build open-source, lightweight, flexible, and interactive software designed to visualize extensive three-dimensional (3D) tabular data. Entirely written in the Python language, we have developed interactive tools to browse and visualize the positions of galaxies in the universe with respect to its large-scale structures (LSS). Motivated by a previous study, we created two codes using Mollweide projection and wedge diagram visualizations, in which survey galaxies can be overplotted on the LSS of the universe. These are interactive representations whose visualizations can be controlled by widgets. We have released these open-source codes, which have been designed to be easily re-used and customized by the scientific community to fulfill their needs. The codes are adaptable to other kinds of 3D tabular data and are robust enough to handle several million objects.
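A minimal sketch of the Mollweide-projection view described above, using matplotlib's built-in projection; the coordinates are random placeholders for galaxy positions (the axes expect radians, with longitude wrapped to [-π, π]):

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(3)
ra = rng.uniform(-np.pi, np.pi, 500)      # placeholder right ascension (rad)
dec = np.arcsin(rng.uniform(-1, 1, 500))  # placeholder declination (rad)

fig = plt.figure(figsize=(8, 4))
ax = fig.add_subplot(111, projection="mollweide")
ax.scatter(ra, dec, s=2)                  # overplot galaxy positions
ax.grid(True)
plt.show()
```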
Social networks and environmental outcomes.
Barnes, Michele L; Lynham, John; Kalberg, Kolter; Leung, PingSun
2016-06-07
Social networks can profoundly affect human behavior, which is the primary force driving environmental change. However, empirical evidence linking microlevel social interactions to large-scale environmental outcomes has remained scarce. Here, we leverage comprehensive data on information-sharing networks among large-scale commercial tuna fishers to examine how social networks relate to shark bycatch, a global environmental issue. We demonstrate that the tendency for fishers to primarily share information within their ethnic group creates segregated networks that are strongly correlated with shark bycatch. However, some fishers share information across ethnic lines, and examinations of their bycatch rates show that network contacts are more strongly related to fishing behaviors than ethnicity. Our findings indicate that social networks are tied to actions that can directly impact marine ecosystems, and that biases toward within-group ties may impede the diffusion of sustainable behaviors. Importantly, our analysis suggests that enhanced communication channels across segregated fisher groups could have prevented the incidental catch of over 46,000 sharks between 2008 and 2012 in a single commercial fishery.
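A toy calculation of the kind of segregation statistic at issue, with an invented graph and group labels (not the study's data), might look like:

    # Share of information-sharing ties that stay within a group.
    import networkx as nx

    G = nx.Graph()
    group = {"a": 1, "b": 1, "c": 1, "d": 2, "e": 2, "f": 2}  # invented labels
    G.add_edges_from([("a", "b"), ("b", "c"), ("d", "e"), ("e", "f"), ("c", "d")])

    within = sum(1 for u, v in G.edges if group[u] == group[v])
    print(f"within-group share of ties: {within / G.number_of_edges():.2f}")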
Non-radial pulsations and large-scale structure in stellar winds
NASA Astrophysics Data System (ADS)
Blomme, R.
2009-07-01
Almost all early-type stars show Discrete Absorption Components (DACs) in their ultraviolet spectral lines. These can be attributed to Co-rotating Interaction Regions (CIRs): large-scale spiral-shaped structures that sweep through the stellar wind. We used the Zeus hydrodynamical code to model the CIRs. In the model, the CIRs are caused by "spots" on the stellar surface. Through the radiative acceleration these spots create fast streams in the stellar wind material. Where the fast and slow streams collide, a CIR is formed. By varying the parameters of the spots, we quantitatively fit the observed DACs in HD 64760. An important result from our work is that the spots do not rotate with the same velocity as the stellar surface. The fact that the cause of the CIRs is not fixed on the surface eliminates many potential explanations. The only remaining explanation is that the CIRs are due to the interference pattern of a number of non-radial pulsations.
Progress of the PV Technology Incubator Project Towards an Enhanced U.S. Manufacturing Base
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ullal, H.; Mitchell, R.; Keyes, B.
In this paper, we report on the major accomplishments of the U.S. Department of Energy's (DOE) Solar Energy Technologies Program (SETP) Photovoltaic (PV) Technology Incubator project. The Incubator project facilitates a company's transition from developing a solar cell or PV module prototype to pilot- and large-scale U.S. manufacturing. The project targets small businesses that have demonstrated proof-of-concept devices or processes in the laboratory. Their success supports U.S. Secretary of Energy Steven Chu's SunShot Initiative, which seeks to achieve PV technologies that are cost-competitive without subsidies at large scale with fossil-based energy sources by the end of this decade. The Incubator project has enhanced U.S. PV manufacturing capacity and created more than 1200 clean energy jobs, resulting in an increase in American economic competitiveness. The investment raised to date by these PV Incubator companies as a result of DOE's $59 million investment totals nearly $1.3 billion.
Khomtchouk, Bohdan B; Van Booven, Derek J; Wahlestedt, Claes
2014-01-01
The graphical visualization of gene expression data using heatmaps has become an integral component of modern-day medical research. Heatmaps are used extensively to plot quantitative differences in gene expression levels, such as those measured with RNAseq and microarray experiments, to provide qualitative large-scale views of the transcriptomic landscape. Creating high-quality heatmaps is a computationally intensive task, often requiring considerable programming experience, particularly for customizing features to the specific dataset at hand. Our software for creating publication-quality heatmaps is developed with the R and C++ programming languages and the OpenGL application programming interface (API) to produce industry-grade, high-performance graphics. We create a graphical user interface (GUI) software package called HeatmapGenerator for Windows OS and Mac OS X as an intuitive, user-friendly alternative for researchers with minimal prior coding experience, allowing them to create publication-quality heatmaps using R graphics without sacrificing their desired level of customization. The simplicity of HeatmapGenerator is that it only requires the user to upload a preformatted input file and download the publicly available R software language, among a few other operating system-specific requirements. Advanced features such as color, text labels, scaling, legend construction, and even database storage can be easily customized with no prior programming knowledge. We provide an intuitive and user-friendly software package, HeatmapGenerator, to create high-quality, customizable heatmaps generated using the high-resolution color graphics capabilities of R. The software is available for Microsoft Windows and Apple Mac OS X. HeatmapGenerator is released under the GNU General Public License and publicly available at: http://sourceforge.net/projects/heatmapgenerator/. The Mac OS X direct download is available at: http://sourceforge.net/projects/heatmapgenerator/files/HeatmapGenerator_MAC_OSX.tar.gz/download. The Windows OS direct download is available at: http://sourceforge.net/projects/heatmapgenerator/files/HeatmapGenerator_WINDOWS.zip/download.
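HeatmapGenerator itself wraps R graphics; purely as a sketch of the kind of expression heatmap such tools automate, a few lines of Python on synthetic data suffice:

    import numpy as np
    import matplotlib.pyplot as plt

    rng = np.random.default_rng(1)
    expr = rng.normal(size=(20, 6))        # 20 genes x 6 samples, synthetic
    genes = [f"gene{i}" for i in range(20)]
    samples = [f"s{j}" for j in range(6)]

    fig, ax = plt.subplots()
    im = ax.imshow(expr, aspect="auto", cmap="RdBu_r")
    ax.set_xticks(range(6))
    ax.set_xticklabels(samples)
    ax.set_yticks(range(20))
    ax.set_yticklabels(genes)
    fig.colorbar(im, label="expression (z-score)")
    plt.show()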
Recent Advances in Geospatial Visualization with the New Google Earth
NASA Astrophysics Data System (ADS)
Anderson, J. C.; Poyart, E.; Yan, S.; Sargent, R.
2017-12-01
Google Earth's detailed, world-wide imagery and terrain data provide a rich backdrop for geospatial visualization at multiple scales, from global to local. The Keyhole Markup Language (KML) is an open standard that has been the primary way for users to author and share data visualizations in Google Earth. Despite its ease of use and flexibility for relatively small amounts of data, users can quickly run into difficulties and limitations working with large-scale or time-varying datasets using KML in Google Earth. Recognizing these challenges, we present our recent work toward extending Google Earth to be a more powerful data visualization platform. We describe a new KML extension to simplify the display of multi-resolution map tile pyramids - which can be created by analysis platforms like Google Earth Engine, or by a variety of other map tile production pipelines. We also describe how this implementation can pave the way to creating novel data visualizations by leveraging custom graphics shaders. Finally, we present our investigations into native support in Google Earth for data storage and transport formats that are well-suited for big raster and vector data visualization. Taken together, these capabilities make it easier to create and share new scientific data visualization experiences using Google Earth, and simplify the integration of Google Earth with existing map data products, services, and analysis pipelines.
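The new KML extension itself is not reproduced here, but the standard-KML building block it generalizes is a GroundOverlay that drapes one map tile over a latitude/longitude box; the tile URL and coordinates below are placeholders:

    # Write a single-tile GroundOverlay in standard KML.
    overlay = """<?xml version="1.0" encoding="UTF-8"?>
    <kml xmlns="http://www.opengis.net/kml/2.2">
      <GroundOverlay>
        <name>tile z3 x2 y1</name>
        <Icon><href>https://example.com/tiles/3/2/1.png</href></Icon>
        <LatLonBox>
          <north>40.9799</north><south>0.0</south>
          <east>-90.0</east><west>-135.0</west>
        </LatLonBox>
      </GroundOverlay>
    </kml>"""
    with open("tile_overlay.kml", "w") as f:
        f.write(overlay)

A full tile pyramid repeats this pattern per tile, using KML Regions and NetworkLinks so that finer tiles load only when the viewer zooms in.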
Beckett, Stephen J.; Williams, Hywel T. P.
2013-01-01
Phage and their bacterial hosts are the most diverse and abundant biological entities in the oceans, where their interactions have a major impact on marine ecology and ecosystem function. The structure of interaction networks for natural phage–bacteria communities offers insight into their coevolutionary origin. At small phylogenetic scales, observed communities typically show a nested structure, in which both hosts and phages can be ranked by their range of resistance and infectivity, respectively. A qualitatively different multi-scale structure is seen at larger phylogenetic scales; a natural assemblage sampled from the Atlantic Ocean displays large-scale modularity and local nestedness within each module. Here, we show that such ‘nested-modular’ interaction networks can be produced by a simple model of host–phage coevolution in which infection depends on genetic matching. Negative frequency-dependent selection causes diversification of hosts (to escape phages) and phages (to track their evolving hosts). This creates a diverse community of bacteria and phage, maintained by kill-the-winner ecological dynamics. When the resulting communities are visualized as bipartite networks of who infects whom, they show the nested-modular structure characteristic of the Atlantic sample. The statistical significance and strength of this observation varies depending on whether the interaction networks take into account the density of the interacting strains, with implications for interpretation of interaction networks constructed by different methods. Our results suggest that the apparently complex community structures associated with marine bacteria and phage may arise from relatively simple coevolutionary origins. PMID:24516719
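A toy version of the matching rule at the heart of such models (genotypes and threshold invented; the published model's details are not reproduced) can be written in a few lines:

    # Infection by genetic matching: a phage infects a host when their
    # genotype bitstrings differ in at most `tol` positions.
    import numpy as np

    rng = np.random.default_rng(2)
    n_loci, tol = 12, 2
    hosts = rng.integers(0, 2, size=(8, n_loci))    # 8 host genotypes
    phages = rng.integers(0, 2, size=(6, n_loci))   # 6 phage genotypes

    # Bipartite who-infects-whom matrix from Hamming distances
    dist = (phages[:, None, :] != hosts[None, :, :]).sum(axis=2)
    print((dist <= tol).astype(int))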
DOE Office of Scientific and Technical Information (OSTI.GOV)
Makarashvili, Vakhtang; Merzari, Elia; Obabko, Aleksandr; ...
2017-06-07
We analyze the potential performance benefits of estimating expected quantities in large eddy simulations of turbulent flows using true ensembles rather than ergodic time averaging. Multiple realizations of the same flow are simulated in parallel, using slightly perturbed initial conditions to create unique instantaneous evolutions of the flow field. Each realization is then used to calculate statistical quantities. Provided each instance is sufficiently de-correlated, this approach potentially allows considerable reduction in the time to solution beyond the strong scaling limit for a given accuracy. This study focuses on the theory and implementation of the methodology in Nek5000, a massively parallel open-source spectral element code.
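The contrast between the two averaging strategies can be sketched on a toy stochastic signal (an AR(1) process standing in for a turbulent quantity; Nek5000 itself is not involved):

    import numpy as np

    def realization(n_steps, seed):
        # One 'flow realization' from a slightly perturbed initial condition.
        r = np.random.default_rng(seed)
        x = np.empty(n_steps)
        x[0] = 1.0 + 1e-3 * r.normal()
        for t in range(1, n_steps):
            x[t] = 0.95 * x[t - 1] + r.normal()
        return x

    # One long run versus 16 short, de-correlated runs with the same
    # total number of samples.
    long_run = realization(16_000, seed=0)
    ensemble = np.array([realization(1_000, seed=s) for s in range(16)])
    print("time average:    ", long_run.mean())
    print("ensemble average:", ensemble.mean())

In the ensemble case the 16 short runs could execute concurrently, which is the source of the reduction in time to solution beyond the strong-scaling limit.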
PLSS Scale Demonstration of MTSA Temperature Swing Adsorption Bed Concept for CO2 Removal/Rejection
NASA Technical Reports Server (NTRS)
Iacomini, Christine S.; Powers, Aaron; Paul, Heather L.
2009-01-01
Metabolic heat regenerated temperature swing adsorption (MTSA) incorporated into a portable life support system (PLSS) is being explored as a viable means of removing and rejecting carbon dioxide (CO2) from an astronaut's ventilation loop. Sorbent pellets used in previous work are inherently difficult to heat and cool quickly. Further, their use in packed beds creates a large, undesirable pressure drop. Thus, work has been done to assess the application and performance of aluminum foam wash-coated with a layer of sorbent. A to-scale sorbent bed, as envisioned for use in a Martian PLSS, was designed, built, and tested. Performance of the assembly with regard to CO2 adsorption and pressure drop was assessed, and the results are presented.
Profitability and sustainability of small-medium scale palm biodiesel plant
NASA Astrophysics Data System (ADS)
Solikhah, Maharani Dewi; Kismanto, Agus; Raksodewanto, Agus; Peryoga, Yoga
2017-06-01
The mandatory blending of 20% biodiesel (B20) has been in effect since January 2016, creating a huge market for the biodiesel industry. Building a large-scale biodiesel plant (>100,000 tons/year) is most favorable for biodiesel producers since it gives a lower production cost, and this cost becomes a challenge for small- to medium-scale biodiesel plants. However, current biodiesel plants in Indonesia are located mainly in Java and Sumatra and distribute biodiesel around Indonesia, so there is an additional cost for transportation from area to area. This factor becomes an opportunity for small- to medium-scale biodiesel plants to compete with large ones. This paper discusses the profitability of small- to medium-scale biodiesel plants at a capacity of 50 tons/day using CPO and its derivatives. The study was conducted by performing economic analysis of scenarios for a biodiesel plant using raw material of stearin, PFAD, or multiple feedstocks. The feasibility of the scenarios was also compared with respect to transportation cost and selling price. The economic assessment shows that profitability is highly affected by raw material price, so it is important to secure the source of raw materials and to consider a multi-feedstock design for a small- to medium-scale biodiesel plant to become sustainable. It was concluded that small- to medium-scale biodiesel plants will be profitable and sustainable if they are connected to a palm oil mill, have a captive market, and are located at least 200 km from other biodiesel plants. The use of multiple feedstocks could increase the IRR from 18.68% to 56.52%.
NASA Astrophysics Data System (ADS)
Wilson, Robert H.; Vishwanath, Karthik; Mycek, Mary-Ann
2009-02-01
Monte Carlo (MC) simulations are considered the "gold standard" for mathematical description of photon transport in tissue, but they can require large computation times. Therefore, it is important to develop simple and efficient methods for accelerating MC simulations, especially when a large "library" of related simulations is needed. A semi-analytical method involving MC simulations and a path-integral (PI) based scaling technique generated time-resolved reflectance curves from layered tissue models. First, a zero-absorption MC simulation was run for a tissue model with fixed scattering properties in each layer. Then, a closed-form expression for the average classical path of a photon in tissue was used to determine the percentage of time that the photon spent in each layer, to create a weighted Beer-Lambert factor to scale the time-resolved reflectance of the simulated zero-absorption tissue model. This method is a unique alternative to other scaling techniques in that it does not require the path length or number of collisions of each photon to be stored during the initial simulation. Effects of various layer thicknesses and absorption and scattering coefficients on the accuracy of the method will be discussed.
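The scaling step can be illustrated as follows; the zero-absorption curve, layer properties, and time fractions below are invented, and the paper's closed-form average-path expression is not reproduced:

    # Weighted Beer-Lambert scaling of a zero-absorption reflectance curve:
    # R(t) = R0(t) * exp(-sum_i mu_a_i * f_i * v * t), with f_i the fraction
    # of its flight time a photon spends in layer i.
    import numpy as np

    t = np.linspace(0.01e-9, 2e-9, 200)      # time [s]
    R0 = t * np.exp(-t / 0.3e-9)             # fake zero-absorption MC curve
    v = 3e8 / 1.4                            # photon speed in tissue, c/n

    mu_a = np.array([50.0, 200.0])           # absorption [1/m] per layer
    f = np.array([0.7, 0.3])                 # time fraction per layer

    R = R0 * np.exp(-(mu_a * f).sum() * v * t)
    print(R[:5])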
The place of algae in agriculture: policies for algal biomass production.
Trentacoste, Emily M; Martinez, Alice M; Zenk, Tim
2015-03-01
Algae have been used for food and nutraceuticals for thousands of years, and the large-scale cultivation of algae, or algaculture, has existed for over half a century. More recently algae have been identified and developed as renewable fuel sources, and the cultivation of algal biomass for various products is transitioning to commercial-scale systems. It is crucial during this period that institutional frameworks (i.e., policies) support and promote development and commercialization and anticipate and stimulate the evolution of the algal biomass industry as a source of renewable fuels, high value protein and carbohydrates and low-cost drugs. Large-scale cultivation of algae merges the fundamental aspects of traditional agricultural farming and aquaculture. Despite this overlap, algaculture has not yet been afforded a position within agriculture or the benefits associated with it. Various federal and state agricultural support and assistance programs are currently appropriated for crops, but their extension to algal biomass is uncertain. These programs are essential for nascent industries to encourage investment, build infrastructure, disseminate technical experience and information, and create markets. This review describes the potential agricultural policies and programs that could support algal biomass cultivation, and the barriers to the expansion of these programs to algae.
Supporting observation campaigns with high resolution modeling
NASA Astrophysics Data System (ADS)
Klocke, Daniel; Brueck, Matthias; Voigt, Aiko
2017-04-01
High-resolution simulation in support of measurement campaigns offers a promising and emerging way to create large-scale context for small-scale observations of cloud and precipitation processes. As these simulations include the coupling of measured small-scale processes with the circulation, they also help to integrate the modeling and observational research communities and allow for detailed model evaluations against dedicated observations. In connection with the measurement campaign NARVAL (December 2013 and August 2016), simulations with a grid spacing of 2.5 km for the tropical Atlantic region (9000x3300 km), with local refinement to 1.2 km for the western part of the domain, were performed using the icosahedral non-hydrostatic (ICON) general circulation model. These simulations are in turn used to drive large-eddy-resolving simulations with the same model for selected days in the High Definition Clouds and Precipitation for Advancing Climate Prediction (HD(CP)2) project. The simulations are presented with a focus on selected results showing the benefit for the scientific communities doing atmospheric measurements and numerical modeling of climate and weather. Additionally, an outlook will be given on how similar simulations will support the NAWDEX measurement campaign in the North Atlantic and the AC3 measurement campaign in the Arctic.
NASA Astrophysics Data System (ADS)
Yeom, Jae Min; Yum, Seong Soo; Liu, Yangang; Lu, Chunsong
2017-09-01
Entrainment and mixing processes and their effects on cloud microphysics in the continental stratocumulus clouds observed in Oklahoma during the RACORO campaign are analyzed in the frame of homogeneous and inhomogeneous mixing concepts by combining the approaches of microphysical correlation, mixing diagram, and transition scale (number). A total of 110 horizontally penetrated cloud segments is analyzed. Mixing diagram and cloud microphysical relationship analyses show homogeneous mixing trait of positive relationship between liquid water content (L) and mean volume of droplets (V) (i.e., smaller droplets in more diluted parcel) in most cloud segments. Relatively small temperature and humidity differences between the entraining air from above the cloud top and cloudy air and relatively large turbulent dissipation rate are found to be responsible for this finding. The related scale parameters (i.e., transition length and transition scale number) are relatively large, which also indicates high likelihood of homogeneous mixing. Clear positive relationship between L and vertical velocity (W) for some cloud segments is suggested to be evidence of vertical circulation mixing, which may further enhance the positive relationship between L and V created by homogeneous mixing.
Corridors affect plants, animals, and their interactions in fragmented landscapes
Tewksbury, Joshua J.; Levey, Douglas J.; Haddad, Nick M.; Sargent, Sarah; Orrock, John L.; Weldon, Aimee; Danielson, Brent J.; Brinkerhoff, Jory; Damschen, Ellen I.; Townsend, Patricia
2002-01-01
Among the most popular strategies for maintaining populations of both plants and animals in fragmented landscapes is to connect isolated patches with thin strips of habitat, called corridors. Corridors are thought to increase the exchange of individuals between habitat patches, promoting genetic exchange and reducing population fluctuations. Empirical studies addressing the effects of corridors have either been small in scale or have ignored confounding effects of increased habitat area created by the presence of a corridor. These methodological difficulties, coupled with a paucity of studies examining the effects of corridors on plants and plant–animal interactions, have sparked debate over the purported value of corridors in conservation planning. We report results of a large-scale experiment that directly address this debate. In eight large-scale experimental landscapes that control for patch area and test alternative mechanisms of corridor function, we demonstrate that corridors not only increase the exchange of animals between patches, but also facilitate two key plant–animal interactions: pollination and seed dispersal. Our results show that the beneficial effects of corridors extend beyond the area they add, and suggest that increased plant and animal movement through corridors will have positive impacts on plant populations and community interactions in fragmented landscapes. PMID:12239344
NASA Astrophysics Data System (ADS)
Riley, W. J.; Dwivedi, D.; Ghimire, B.; Hoffman, F. M.; Pau, G. S. H.; Randerson, J. T.; Shen, C.; Tang, J.; Zhu, Q.
2015-12-01
Numerical model representations of decadal- to centennial-scale soil-carbon dynamics are a dominant cause of uncertainty in climate change predictions. Recent attempts by some Earth System Model (ESM) teams to integrate previously unrepresented soil processes (e.g., explicit microbial processes, abiotic interactions with mineral surfaces, vertical transport), poor performance of many ESM land models against large-scale and experimental manipulation observations, and complexities associated with spatial heterogeneity highlight the nascent nature of our community's ability to accurately predict future soil carbon dynamics. I will present recent work from our group to develop a modeling framework to integrate pore-, column-, watershed-, and global-scale soil process representations into an ESM (ACME), and apply the International Land Model Benchmarking (ILAMB) package for evaluation. At the column scale and across a wide range of sites, observed depth-resolved carbon stocks and their 14C derived turnover times can be explained by a model with explicit representation of two microbial populations, a simple representation of mineralogy, and vertical transport. Integrating soil and plant dynamics requires a 'process-scaling' approach, since all aspects of the multi-nutrient system cannot be explicitly resolved at ESM scales. I will show that one approach, the Equilibrium Chemistry Approximation, improves predictions of forest nitrogen and phosphorus experimental manipulations and leads to very different global soil carbon predictions. Translating model representations from the site- to ESM-scale requires a spatial scaling approach that either explicitly resolves the relevant processes, or more practically, accounts for fine-resolution dynamics at coarser scales. To that end, I will present recent watershed-scale modeling work that applies reduced order model methods to accurately scale fine-resolution soil carbon dynamics to coarse-resolution simulations. Finally, we contend that creating believable soil carbon predictions requires a robust, transparent, and community-available benchmarking framework. I will present an ILAMB evaluation of several of the above-mentioned approaches in ACME, and attempt to motivate community adoption of this evaluation approach.
NASA Astrophysics Data System (ADS)
Andersen, G.; Dearborn, M.; McHarg, G.
2010-09-01
We are investigating new technologies for creating ultra-large apertures (>20 m) for space-based imagery. Our approach has been to create diffractive primaries in flat membranes deployed from compact payloads. These structures are attractive in that they are much simpler to fabricate, launch, and deploy than conventional three-dimensional optics. In this case the flat focusing element is a photon sieve: a large number of holes in an otherwise opaque substrate, located according to an underlying Fresnel Zone Plate (FZP) geometry. The advantages over the FZP are the absence of support struts, which lead to diffraction spikes in the far field, and of non-uniform tension, which can cause wrinkling of the substrate. Furthermore, with modifications in hole size and distribution we can achieve improved resolution and contrast over conventional optics. The trade-offs in using diffractive optics are the large amounts of dispersion and decreased efficiency. We present both theoretical and experimental results from small-scale prototypes. Several key solutions to issues of limited bandwidth and efficiency have been addressed. Along with these we have studied the materials aspects in order to optimize performance and achieve a scalable solution to an on-orbit demonstrator. Our current efforts are directed towards an on-orbit 1 m solar observatory demonstration deployed from a CubeSat bus.
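The underlying zone geometry is easy to reproduce; the design wavelength, focal length, and hole-sizing rule below are illustrative only, and the optimized apodization of a real sieve is not reproduced:

    # Fresnel zone plate radii, r_n = sqrt(n*lam*f + (n*lam/2)**2), with
    # sieve holes scattered over the transparent (odd) zones.
    import numpy as np

    lam = 550e-9                          # design wavelength [m]
    f = 100.0                             # focal length [m]
    n = np.arange(1, 2000, 2)             # odd (transparent) zone indices
    r = np.sqrt(n * lam * f + (n * lam / 2.0) ** 2)

    # The local zone width sets the hole-diameter scale: w_n ~ lam*f/(2*r_n)
    w = lam * f / (2.0 * r)
    print(f"outer ring radius {r[-1]*1e3:.0f} mm, zone width {w[-1]*1e6:.0f} um")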
NASA Astrophysics Data System (ADS)
Nomori, Koji; Kitamura, Koji; Motomura, Yoichi; Nishida, Yoshifumi; Yamanaka, Tatsuhiro; Komatsubara, Akinori
In Japan, childhood injury prevention is an urgent issue. Safety measures informed by knowledge created from injury data are essential for preventing childhood injuries; the injury-prevention approach of product modification is especially important. Risk assessment is one of the most fundamental methods for designing safe products. Conventional risk assessment has been carried out subjectively because product makers have little data on injuries. This paper deals with evidence-based risk assessment, in which artificial intelligence technologies are strongly needed. It describes a new method of foreseeing the usage of products, which is the first step of evidence-based risk assessment, and presents a retrieval system for injury data. The system enables a product designer to foresee how children use a product and which types of injuries occur due to the product in the daily environment. The developed system consists of large-scale injury data, text mining technology, and probabilistic modeling technology. Large-scale text data on childhood injuries were collected from medical institutions by an injury surveillance system. Types of behavior toward a product were derived from the injury text data using text mining technology. The relationships among products, types of behavior, types of injuries, and characteristics of children were modeled with a Bayesian network. The fundamental functions of the developed system and examples of new findings obtained with it are reported in this paper.
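A toy version of the kind of query such a system answers, with every number and category invented, is simply a lookup in a conditional probability table of the sort a Bayesian network encodes:

    # P(injury type | product, age group) from an invented CPT.
    cpt = {
        ("chair", "1-3y"): {"fall": 0.70, "bump": 0.25, "cut": 0.05},
        ("chair", "4-6y"): {"fall": 0.50, "bump": 0.40, "cut": 0.10},
        ("scissors", "4-6y"): {"fall": 0.05, "bump": 0.10, "cut": 0.85},
    }

    def most_likely_injury(product, age):
        dist = cpt[(product, age)]
        return max(dist, key=dist.get), dist

    injury, dist = most_likely_injury("chair", "1-3y")
    print(f"most likely injury with a chair at age 1-3: {injury} ({dist})")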
Pan, Xin; Qi, Jian-cheng; Long, Ming; Liang, Hao; Chen, Xiao; Li, Han; Li, Guang-bo; Zheng, Hao
2010-01-01
The close phylogenetic relationship between humans and non-human primates makes non-human primates an irreplaceable model for the study of human infectious diseases. In this study, we describe the development of a large-scale automatic multi-functional isolation chamber for use with medium-sized laboratory animals carrying infectious diseases. The isolation chamber, including the transfer chain, disinfection chain, negative air pressure isolation system, animal welfare system, and the automated system, is designed to meet all biological safety standards. To create an internal chamber environment that is completely isolated from the exterior, variable frequency drive blowers are used in the air-intake and air-exhaust system, precisely controlling the filtered air flow and providing an air-barrier protection. A double door transfer port is used to transfer material between the interior of the isolation chamber and the outside. A peracetic acid sterilizer and its associated pipeline allow for complete disinfection of the isolation chamber. All of the isolation chamber parameters can be automatically controlled by a programmable computerized menu, allowing for work with different animals in different-sized cages depending on the research project. The large-scale multi-functional isolation chamber provides a useful and safe system for working with infectious medium-sized laboratory animals in high-level bio-safety laboratories. PMID:20872984
PIV measurements of in-cylinder, large-scale structures in a water-analogue Diesel engine
NASA Astrophysics Data System (ADS)
Kalpakli Vester, A.; Nishio, Y.; Alfredsson, P. H.
2016-11-01
Swirl and tumble are large-scale structures that develop in an engine cylinder during the intake stroke. Their structure and strength depend on the design of the inlet ports and valves, but also on the valve lift history. Engine manufacturers make their designs to obtain a specific flow structure that is assumed to give the best engine performance. Despite many efforts, there are still open questions, such as how swirl and tumble depend on the dynamics of the valves/piston and how cycle-to-cycle variations should be minimized. In collaboration with the Swedish vehicle industry we perform PIV measurements of the flow dynamics during the intake stroke inside a cylinder of a water-analogue engine model having the same geometrical characteristics as a typical truck Diesel engine. Water can be used since the flow is nearly incompressible during the intake stroke. The flow from the valves moves radially outwards, hits the vertical walls of the cylinder, entrains surrounding fluid, moves along the cylinder walls, and creates a central backflow, i.e. a tumble motion. Depending on the port and valve design and orientation, no, low, or high swirl can be established. For the first time, the effect of the dynamic motion of the piston/valves on the large-scale structures is captured. Supported by the Swedish Energy Agency, Scania CV AB and Volvo GTT, through the FFI program.
Talking About The Smokes: a large-scale, community-based participatory research project.
Couzos, Sophia; Nicholson, Anna K; Hunt, Jennifer M; Davey, Maureen E; May, Josephine K; Bennet, Pele T; Westphal, Darren W; Thomas, David P
2015-06-01
To describe the Talking About The Smokes (TATS) project according to the World Health Organization guiding principles for conducting community-based participatory research (PR) involving indigenous peoples, to assist others planning large-scale PR projects. The TATS project was initiated in Australia in 2010 as part of the International Tobacco Control Policy Evaluation Project, and surveyed a representative sample of 2522 Aboriginal and Torres Strait Islander adults to assess the impact of tobacco control policies. The PR process of the TATS project, which aimed to build partnerships to create equitable conditions for knowledge production, was mapped and summarised onto a framework adapted from the WHO principles. The mapping covered processes of consultation and approval, partnerships and research agreements, communication, funding, ethics and consent, and the data and benefits of the research. The TATS project involved baseline and follow-up surveys conducted in 34 Aboriginal community-controlled health services and one Torres Strait community. Consistent with the WHO PR principles, the TATS project built on community priorities and strengths through strategic partnerships from project inception, and demonstrated the value of research agreements and trusting relationships to foster shared decision making, capacity building and a commitment to Indigenous data ownership. Community-based PR methodology, by definition, needs adaptation to local settings and priorities. The TATS project demonstrates that large-scale research can be participatory, with strong Indigenous community engagement and benefits.
MESUR: Usage-Based Metrics of Scholarly Impact
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bollen, Johan; Rodriguez, Marko A.; Van de Sompel, Herbert
2007-01-30
The evaluation of scholarly communication items is now largely a matter of expert opinion or metrics derived from citation data. Both approaches can fail to take into account the myriad of factors that shape scholarly impact. Usage data has emerged as a promising complement to existing methods of assessment, but the formal groundwork to reliably and validly apply usage-based metrics of scholarly impact is lacking. The Andrew W. Mellon Foundation funded MESUR project constitutes a systematic effort to define, validate and cross-validate a range of usage-based metrics of scholarly impact by creating a semantic model of the scholarly communication process. The constructed model will serve as the basis for creating a large-scale semantic network that seamlessly relates citation, bibliographic and usage data from a variety of sources. A subsequent program that uses the established semantic network as a reference data set will determine the characteristics and semantics of a variety of usage-based metrics of scholarly impact. This paper outlines the architecture and methodology adopted by the MESUR project and its future direction.
2015-01-01
Large area arrays of magnetic, semiconducting, and insulating nanorings were created by coupling colloidal lithography with nanoscale electrodeposition. This versatile nanoscale fabrication process allows for the independent tuning of the spacing, diameter, and width of the nanorings with typical values of 1.0 μm, 750 nm, and 100 nm, respectively, and was used to form nanorings from a host of materials: Ni, Co, bimetallic Ni/Au, CdSe, and polydopamine. These nanoring arrays have potential applications in memory storage, optical materials, and biosensing. A modified version of this nanoscale electrodeposition process was also used to create arrays of split gold nanorings. The size of the split nanoring opening was controlled by the angle of photoresist exposure during the fabrication process and could be varied from 50% down to 10% of the ring circumference. The large area (cm2 scale) gold split nanoring array surfaces exhibited strong polarization-dependent plasmonic absorption bands for wavelengths from 1 to 5 μm. Plasmonic nanoscale split ring arrays are potentially useful as tunable dichroic materials throughout the infrared and near-infrared spectral regions. PMID:25553204
Dow, Christopher B; Collins, Brandon M; Stephens, Scott L
2016-03-01
Finding novel ways to plan and implement landscape-level forest treatments that protect sensitive wildlife and other key ecosystem components, while also reducing the risk of large-scale, high-severity fires, can prove to be difficult. We examined alternative approaches to landscape-scale fuel-treatment design for the same landscape. These approaches included two treatment scenarios generated from an optimization algorithm that reduces modeled fire spread across the landscape, one with resource-protection constraints and one without; a scenario consisting of the fuel-treatment network actually implemented; and a no-treatment scenario. For all four scenarios, we modeled hazardous fire potential based on conditional burn probabilities and projected fire emissions. Results demonstrate that in all three active treatment scenarios, hazardous fire potential, fire area, and emissions were reduced by approximately 50% relative to the untreated condition. They also show that incorporating constraints is more effective at reducing modeled fire outputs, possibly due to the greater aggregation of treatments, which creates greater continuity of fuel-treatment blocks across the landscape. The implementation of fuel-treatment networks using planning techniques that incorporate real-world constraints can reduce the risk of large problematic fires, allow for landscape-level heterogeneity that can provide necessary ecosystem services, create mixed forest stand structures on a landscape, and promote resilience in the uncertain future of climate change.
Leaders' perspectives in the Yellowstone to Yukon Conservation Initiative
Mattson, D.J.; Clark, S.G.; Byrd, K.L.; Brown, S.R.; Robinson, B.
2011-01-01
The Yellowstone to Yukon Conservation Initiative (Y2Y) was created in 1993 to advance conservation in a 1.2 million km2 portion of the North American Rocky Mountains. We assembled 21 people with influence over Y2Y in a workshop to elucidate perspectives on challenges and solutions for this organization at a key point in its evolution, and used Q method to define four perspectives on challenges and three on solutions. Participants were differentiated by four models for effecting change (vision-based advocacy, practice-based learning, political engagement, and scientific management), with emphasis on the first three. Those with authority in Y2Y aligned with vision-based advocacy and expressed ambivalence about practice-based adaptive learning and rigorous appraisals of existing strategies. Workshop results were consistent with an apparent trend toward organizational maturation focused on stabilizing revenues, developing formal organizational arrangements, and focusing strategies. Consolidation of power in Y2Y around a long-standing formula does not bode well for the effectiveness of Y2Y. We recommend that leaders in Y2Y and similar organizations focused on large-scale conservation create and maintain an open system, philosophically and operationally, that capitalizes on the diverse perspectives and skills of the individuals who are attracted to such efforts. We also recommend that the Y2Y initiative be followed closely to harvest additional lessons for potential application to large-scale conservation efforts elsewhere. © Springer Science+Business Media, LLC (outside the USA) 2011.
A Mapping of Drug Space from the Viewpoint of Small Molecule Metabolism
Basuino, Li; Chambers, Henry F.; Lee, Deok-Sun; Wiest, Olaf G.; Babbitt, Patricia C.
2009-01-01
Small molecule drugs target many core metabolic enzymes in humans and pathogens, often mimicking endogenous ligands. The effects may be therapeutic or toxic, but are frequently unexpected. A large-scale mapping of the intersection between drugs and metabolism is needed to better guide drug discovery. To map the intersection between drugs and metabolism, we have grouped drugs and metabolites by their associated targets and enzymes using ligand-based set signatures created to quantify their degree of similarity in chemical space. The results reveal the chemical space that has been explored for metabolic targets, where successful drugs have been found, and what novel territory remains. To aid other researchers in their drug discovery efforts, we have created an online resource of interactive maps linking drugs to metabolism. These maps predict the “effect space” comprising likely target enzymes for each of the 246 MDDR drug classes in humans. The online resource also provides species-specific interactive drug-metabolism maps for each of the 385 model organisms and pathogens in the BioCyc database collection. Chemical similarity links between drugs and metabolites predict potential toxicity, suggest routes of metabolism, and reveal drug polypharmacology. The metabolic maps enable interactive navigation of the vast biological data on potential metabolic drug targets and the drug chemistry currently available to prosecute those targets. Thus, this work provides a large-scale approach to ligand-based prediction of drug action in small molecule metabolism. PMID:19701464
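A minimal sketch of the chemical-similarity computation underlying such maps, assuming the RDKit library and using two well-known molecules as stand-ins (the paper's set-signature method is not reproduced):

    # Tanimoto similarity between Morgan fingerprints of two molecules.
    from rdkit import Chem, DataStructs
    from rdkit.Chem import AllChem

    aspirin = Chem.MolFromSmiles("CC(=O)Oc1ccccc1C(=O)O")
    salicylate = Chem.MolFromSmiles("OC(=O)c1ccccc1O")

    fp1 = AllChem.GetMorganFingerprintAsBitVect(aspirin, 2, nBits=2048)
    fp2 = AllChem.GetMorganFingerprintAsBitVect(salicylate, 2, nBits=2048)
    print(f"Tanimoto similarity: {DataStructs.TanimotoSimilarity(fp1, fp2):.2f}")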
NASA Astrophysics Data System (ADS)
Uritskaya, Olga Y.
2005-05-01
Results of fractal stability analysis of daily exchange rate fluctuations of more than 30 floating currencies for a 10-year period are presented. It is shown for the first time that small- and large-scale dynamical instabilities of national monetary systems correlate with deviations of the detrended fluctuation analysis (DFA) exponent from the value 1.5 predicted by the efficient market hypothesis. The observed dependence is used for classification of long-term stability of floating exchange rates as well as for revealing various forms of distortion of stable currency dynamics prior to large-scale crises. A normal range of DFA exponents consistent with crisis-free long-term exchange rate fluctuations is determined, and several typical scenarios of unstable currency dynamics with DFA exponents fluctuating beyond the normal range are identified. It is shown that monetary crashes are usually preceded by prolonged periods of abnormal (decreased or increased) DFA exponent, with the after-crash exponent tending to the value 1.5 indicating a more reliable exchange rate dynamics. Statistically significant regression relations (R=0.99, p<0.01) between duration and magnitude of currency crises and the degree of distortion of monofractal patterns of exchange rate dynamics are found. It is demonstrated that the parameters of these relations characterizing small- and large-scale crises are nearly equal, which implies a common instability mechanism underlying these events. The obtained dependences have been used as a basic ingredient of a forecasting technique which provided correct in-sample predictions of monetary crisis magnitude and duration over various time scales. The developed technique can be recommended for real-time monitoring of dynamical stability of floating exchange rate systems and creating advanced early-warning-system models for currency crisis prevention.
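A compact implementation of the core DFA estimate (order-1 detrending), applied here to a synthetic random-walk "exchange rate," for which the expected exponent is the efficient-market value 1.5:

    import numpy as np

    def dfa_exponent(x, scales):
        # Detrended fluctuation analysis with local linear detrending.
        y = np.cumsum(x - np.mean(x))                # profile
        F = []
        for s in scales:
            n_seg = len(y) // s
            segs = y[:n_seg * s].reshape(n_seg, s)
            t = np.arange(s)
            ms = [np.mean((seg - np.polyval(np.polyfit(t, seg, 1), t)) ** 2)
                  for seg in segs]
            F.append(np.sqrt(np.mean(ms)))
        # The DFA exponent is the slope of log F(s) versus log s.
        return np.polyfit(np.log(scales), np.log(F), 1)[0]

    rng = np.random.default_rng(4)
    rate = np.cumsum(rng.normal(size=5000))          # random-walk series
    print(f"DFA exponent ~ {dfa_exponent(rate, [16, 32, 64, 128, 256]):.2f}")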
Chen, S. N.; Iwawaki, T.; Morita, K.; Antici, P.; Baton, S. D.; Filippi, F.; Habara, H.; Nakatsutsumi, M.; Nicolaï, P.; Nazarov, W.; Rousseaux, C.; Starodubstev, M.; Tanaka, K. A.; Fuchs, J.
2016-01-01
The ability to produce long scale-length (i.e., millimeter-scale), homogeneous plasmas is of interest for studying a wide range of fundamental plasma processes. We present here a validated experimental platform to create and diagnose uniform plasmas with a density close to or above the critical density. The target consists of a polyimide tube filled with an ultra-low-density plastic foam that is heated by x-rays produced by a long-pulse laser irradiating a copper foil placed at one end of the tube. The density and temperature of the ionized foam were retrieved using x-ray radiography, and proton radiography was used to verify the uniformity of the plasma. Plasma temperatures of 5–10 eV and densities around 10^21 cm^-3 are measured. This well-characterized platform of uniform density and temperature plasma is of interest for experiments using large-scale laser platforms conducting High Energy Density Physics investigations. PMID:26923471
Wafer-scale growth of VO2 thin films using a combinatorial approach
Zhang, Hai-Tian; Zhang, Lei; Mukherjee, Debangshu; Zheng, Yuan-Xia; Haislmaier, Ryan C.; Alem, Nasim; Engel-Herbert, Roman
2015-01-01
Transition metal oxides offer functional properties beyond conventional semiconductors. Bridging the gap between the fundamental research frontier in oxide electronics and their realization in commercial devices demands a wafer-scale growth approach for high-quality transition metal oxide thin films. Such a method requires excellent control over the transition metal valence state to avoid performance deterioration, which has proved challenging. Here we present a scalable growth approach that enables precise valence state control. By creating an oxygen activity gradient across the wafer, a continuous valence state library is established to directly identify the optimal growth condition. Single-crystalline VO2 thin films have been grown on wafer scale, exhibiting more than four orders of magnitude change in resistivity across the metal-to-insulator transition. It is demonstrated that 'electronic grade' transition metal oxide films can be realized on a large scale using a combinatorial growth approach, which can be extended to other multivalent oxide systems. PMID:26450653
Federated queries of clinical data repositories: Scaling to a national network.
Weber, Griffin M
2015-06-01
Federated networks of clinical research data repositories are rapidly growing in size from a handful of sites to true national networks with more than 100 hospitals. This study creates a conceptual framework for predicting how various properties of these systems will scale as they continue to expand. Starting with actual data from Harvard's four-site Shared Health Research Information Network (SHRINE), the framework is used to imagine a future 4000 site network, representing the majority of hospitals in the United States. From this it becomes clear that several common assumptions of small networks fail to scale to a national level, such as all sites being online at all times or containing data from the same date range. On the other hand, a large network enables researchers to select subsets of sites that are most appropriate for particular research questions. Developers of federated clinical data networks should be aware of how the properties of these networks change at different scales and design their software accordingly.
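One of those failing assumptions is easy to quantify: if each site is independently online with probability p (the value below is illustrative), the chance that every site is online collapses with network size:

    p = 0.99
    for n in (4, 40, 400, 4000):
        print(f"n={n:5d}  P(all sites online) = {p**n:.3g}")

At four sites the network is fully available about 96% of the time; at 4000 sites, essentially never, so software must be designed to tolerate partial responses.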
A Close Look At The Relationship Between WMAP (ILC) Small-Scale Features And Galactic HI Structure
NASA Astrophysics Data System (ADS)
Verschuur, Gerrit L.
2012-05-01
Galactic HI emission profiles surrounding two pairs of features located where large-scale filaments at very different velocities overlap were decomposed into Gaussian components. Families of components defined by similarity of center velocities and line widths were identified and found to be spatially related. Each of the two pairs of HI peaks straddles a high-frequency continuum source revealed in the WMAP survey data. It is suggested that where filamentary HI features are directly interacting, high-frequency continuum radiation is being produced. The previously hypothesized mechanism for producing high-frequency continuum radiation, involving free-free emission from electrons in the interstellar medium, in this case created where HI filaments interact to produce fractional ionizations of order 5 to 15%, fits the data very closely. The results confirm that WMAP data on small-scale structures believed to be cosmological in origin are in fact compromised by the presence of intervening galactic sources of interstellar electrons clumped on scales typical of interstellar HI structure.
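The decomposition step is standard non-linear least squares; a sketch with a synthetic two-component profile (not survey data) using scipy:

    import numpy as np
    from scipy.optimize import curve_fit

    def two_gaussians(v, a1, v1, s1, a2, v2, s2):
        return (a1 * np.exp(-0.5 * ((v - v1) / s1) ** 2)
                + a2 * np.exp(-0.5 * ((v - v2) / s2) ** 2))

    v = np.linspace(-60, 60, 300)                    # velocity axis [km/s]
    rng = np.random.default_rng(5)
    T = (two_gaussians(v, 3.0, -25.0, 6.0, 1.5, 10.0, 12.0)
         + 0.05 * rng.normal(size=v.size))           # noisy synthetic profile

    p0 = [2.0, -20.0, 5.0, 1.0, 5.0, 10.0]           # rough initial guesses
    popt, _ = curve_fit(two_gaussians, v, T, p0=p0)
    print("fitted (amplitude, center, width) pairs:", popt.round(2))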
A modeling process to understand complex system architectures
NASA Astrophysics Data System (ADS)
Robinson, Santiago Balestrini
2009-12-01
In recent decades, several tools have been developed by the armed forces, and their contractors, to test the capability of a force. These campaign-level analysis tools, oftentimes characterized as constructive simulations, are generally expensive to create and execute, and at best they are extremely difficult to verify and validate. This central observation, that analysts are relying more and more on constructive simulations to predict the performance of future networks of systems, leads to the two central objectives of this thesis: (1) to enable the quantitative comparison of architectures in terms of their ability to satisfy a capability without resorting to constructive simulations, and (2) when constructive simulations must be created, to quantitatively determine how to spend the modeling effort amongst the different system classes. The first objective led to Hypothesis A, the first of the two main hypotheses, which states that by studying the relationships between the entities that compose an architecture, one can infer how well it will perform a given capability. The method used to test the hypothesis is based on two assumptions: (1) that the capability can be defined as a cycle of functions, and (2) that it is possible to estimate the probability that a function-based relationship occurs between any two types of entities. If these two requirements are met, then by creating random functional networks, different architectures can be compared in terms of their ability to satisfy a capability. In order to test this hypothesis, a novel process for creating representative functional networks of large-scale system architectures was developed. The process, named Digraph Modeling for Architectures (DiMA), was tested by comparing its results to those of complex constructive simulations. Results indicate that if the inputs assigned to DiMA are correct (in the tests they were based on time-averaged data obtained from the agent-based model), DiMA is able to identify which of any two architectures is better more than 98% of the time. The second objective led to Hypothesis B, the second of the main hypotheses. This hypothesis stated that by studying the functional relations, the most critical entities composing the architecture could be identified. The critical entities are those for which a slight variation in behavior produces a large variation in the behavior of the overall architecture. These are the entities that must be modeled more carefully and where modeling effort should be expended. This hypothesis was tested by simplifying agent-based models to the non-trivial minimum and executing a large number of different simulations in order to obtain statistically significant results. The tests were conducted by evolving the complex model without any error induced, and then evolving the model once again for each ranking, assigning error to any of the nodes with a probability inversely proportional to the ranking. The results from this hypothesis test indicate that, depending on the structural characteristics of the functional relations, it is useful to use one of two of the intelligent rankings tested, or it is best to expend effort equally amongst all the entities. Random ranking always performed worse than uniform ranking, indicating that if modeling effort is to be prioritized amongst the entities composing a large-scale system architecture, it should be prioritized intelligently. The benefit threshold between intelligent prioritization and no prioritization lies on the large-scale system's chaotic boundary.
If the large-scale system behaves chaotically, small variations in any of the entities tend to have a great impact on the behavior of the entire system. Therefore, even low-ranking entities can still greatly affect the behavior of the model, and error should not be concentrated in any one entity. It was discovered that the threshold can be identified from studying the structure of the networks, in particular the cyclicity, the Off-diagonal Complexity, and the Digraph Algebraic Connectivity. (Abstract shortened by UMI.)
Activity-Based Introductory Physics Reform
NASA Astrophysics Data System (ADS)
Thornton, Ronald
2004-05-01
Physics education research has shown that learning environments that engage students and allow them to take an active part in their learning can lead to large conceptual gains compared to those of good traditional instruction. Examples of successful curricula and methods include Peer Instruction, Just in Time Teaching, RealTime Physics, Workshop Physics, Scale-Up, and Interactive Lecture Demonstrations (ILDs). RealTime Physics promotes interaction among students in a laboratory setting and makes use of powerful real-time data logging tools to teach concepts as well as quantitative relationships. An active learning environment is often difficult to achieve in large lecture sessions, and Workshop Physics and Scale-Up largely eliminate lectures in favor of collaborative student activities. Peer Instruction, Just in Time Teaching, and ILDs make lectures more interactive in complementary ways. This presentation will introduce these reforms and use ILDs with the audience to illustrate the types of curricula and tools used in the curricula above. ILDs make use of real experiments, real-time data logging tools, and student interaction to create an active learning environment in large lecture classes. A short video of students involved in interactive lecture demonstrations will be shown. The results of research studies at various institutions to measure the effectiveness of these methods will be presented.
Lewis, Jesse S.; Farnsworth, Matthew L.; Burdett, Chris L.; Theobald, David M.; Gray, Miranda; Miller, Ryan S.
2017-01-01
Biotic and abiotic factors are increasingly acknowledged to synergistically shape broad-scale species distributions. However, the relative importance of biotic and abiotic factors in predicting species distributions is unclear. In particular, biotic factors, such as predation and vegetation, including those resulting from anthropogenic land-use change, are underrepresented in species distribution modeling, but could improve model predictions. Using generalized linear models and model selection techniques, we used 129 estimates of population density of wild pigs (Sus scrofa) from 5 continents to evaluate the relative importance, magnitude, and direction of biotic and abiotic factors in predicting population density of an invasive large mammal with a global distribution. Incorporating diverse biotic factors, including agriculture, vegetation cover, and large carnivore richness, into species distribution modeling substantially improved model fit and predictions. Abiotic factors, including precipitation and potential evapotranspiration, were also important predictors. The predictive map of population density revealed wide-ranging potential for an invasive large mammal to expand its distribution globally. This information can be used to proactively create conservation/management plans to control future invasions. Our study demonstrates that the ongoing paradigm shift, which recognizes that both biotic and abiotic factors shape species distributions across broad scales, can be advanced by incorporating diverse biotic factors. PMID:28276519
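A sketch of the model-selection logic, assuming statsmodels and entirely synthetic stand-ins for the covariates and the 129 density estimates:

    # Compare an abiotic-only GLM with an abiotic + biotic GLM by AIC.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(6)
    n = 129
    precip = rng.normal(size=n)            # abiotic covariate
    agri = rng.normal(size=n)              # biotic covariate (agriculture)
    carniv = rng.normal(size=n)            # biotic covariate (carnivore richness)
    log_density = (0.3 * precip + 0.5 * agri - 0.4 * carniv
                   + 0.3 * rng.normal(size=n))

    abiotic = sm.add_constant(np.column_stack([precip]))
    full = sm.add_constant(np.column_stack([precip, agri, carniv]))
    m1 = sm.GLM(log_density, abiotic).fit()
    m2 = sm.GLM(log_density, full).fit()
    print(f"AIC abiotic-only: {m1.aic:.1f}   AIC abiotic+biotic: {m2.aic:.1f}")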
Blueprint for a microwave trapped ion quantum computer
Lekitsch, Bjoern; Weidt, Sebastian; Fowler, Austin G.; Mølmer, Klaus; Devitt, Simon J.; Wunderlich, Christof; Hensinger, Winfried K.
2017-01-01
The availability of a universal quantum computer may have a fundamental impact on a vast number of research fields and on society as a whole. An increasingly large scientific and industrial community is working toward the realization of such a device. An arbitrarily large quantum computer may best be constructed using a modular approach. We present a blueprint for a trapped ion–based scalable quantum computer module, making it possible to create a scalable quantum computer architecture based on long-wavelength radiation quantum gates. The modules control all operations as stand-alone units, are constructed using silicon microfabrication techniques, and are within reach of current technology. To perform the required quantum computations, the modules make use of long-wavelength radiation–based quantum gate technology. To scale this microwave quantum computer architecture to a large size, we present a fully scalable design that makes use of ion transport between different modules, thereby allowing arbitrarily many modules to be connected to construct a large-scale device. A high error–threshold surface error correction code can be implemented in the proposed architecture to execute fault-tolerant operations. With appropriate adjustments, the proposed modules are also suitable for alternative trapped ion quantum computer architectures, such as schemes using photonic interconnects. PMID:28164154
An investigation of turbulent transport in the extreme lower atmosphere
NASA Technical Reports Server (NTRS)
Koper, C. A., Jr.; Sadeh, W. Z.
1975-01-01
A model in which the Lagrangian autocorrelation is expressed by a domain integral over a set of usual Eulerian autocorrelations acquired concurrently at all points within a turbulence box is proposed along with a method for ascertaining the statistical stationarity of turbulent velocity by creating an equivalent ensemble to investigate the flow in the extreme lower atmosphere. Simultaneous measurements of turbulent velocity on a turbulence line along the wake axis were carried out utilizing a longitudinal array of five hot-wire anemometers remotely operated. The stationarity test revealed that the turbulent velocity is approximated as a realization of a weakly self-stationary random process. Based on the Lagrangian autocorrelation it is found that: (1) large diffusion time predominated; (2) ratios of Lagrangian to Eulerian time and spatial scales were smaller than unity; and, (3) short and long diffusion time scales and diffusion spatial scales were constrained within their Eulerian counterparts.
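The Eulerian building block of the proposed model is a normalized velocity autocorrelation. A minimal numpy sketch follows (a single synthetic record stands in for one hot-wire signal; the paper's domain integral over all points of the turbulence box is not reproduced):

```python
# Normalized Eulerian temporal autocorrelation of a velocity record.
import numpy as np

def eulerian_autocorrelation(u, max_lag):
    """R(tau) of velocity fluctuations, normalized so R(0) = 1."""
    u = np.asarray(u, dtype=float)
    u = u - u.mean()                          # fluctuations about the mean
    var = np.mean(u * u)
    r = np.empty(max_lag + 1)
    for lag in range(max_lag + 1):
        r[lag] = np.mean(u[: u.size - lag] * u[lag:]) / var
    return r

rng = np.random.default_rng(1)
u = rng.normal(size=4096) + 0.02 * np.cumsum(rng.normal(size=4096))
R = eulerian_autocorrelation(u, 200)
first_neg = int(np.argmax(R < 0)) if np.any(R < 0) else R.size
T_E = np.trapz(R[:first_neg])                 # Eulerian integral scale, samples
print(f"R(0) = {R[0]:.2f}, integral time scale ~ {T_E:.1f} samples")
```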
Exclusively visual analysis of classroom group interactions
NASA Astrophysics Data System (ADS)
Tucker, Laura; Scherr, Rachel E.; Zickler, Todd; Mazur, Eric
2016-12-01
Large-scale audiovisual data that measure group learning are time consuming to collect and analyze. As an initial step towards scaling qualitative classroom observation, we qualitatively coded classroom video using an established coding scheme with and without its audio cues. We find that interrater reliability is as high when using visual data only—without audio—as when using both visual and audio data to code. Also, interrater reliability is high when comparing use of visual and audio data to visual-only data. We see a small bias to code interactions as group discussion when visual and audio data are used compared with video-only data. This work establishes that meaningful educational observation can be made through visual information alone. Further, it suggests that after initial work to create a coding scheme and validate it in each environment, computer-automated visual coding could drastically increase the breadth of qualitative studies and allow for meaningful educational analysis on a far greater scale.
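For readers wanting to reproduce the reliability comparison, a minimal sketch follows. Cohen's kappa is my assumption; the abstract does not name the interrater statistic the authors used, and the segment codes are invented:

```python
# Agreement between visual-only and audio+visual coding of the same
# classroom segments (hypothetical labels).
from sklearn.metrics import cohen_kappa_score

visual_only  = ["group", "group", "lecture", "other", "group", "lecture"]
audio_visual = ["group", "group", "lecture", "group", "group", "lecture"]

print(f"Cohen's kappa: {cohen_kappa_score(visual_only, audio_visual):.2f}")
```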
The Updating of Geospatial Base Data
NASA Astrophysics Data System (ADS)
Alrajhi, Muhamad N.; Konecny, Gottfried
2018-04-01
Topographic mapping issues concern area coverage at different scales and map age. The age of a map is determined by the system of updating. The United Nations (UNGGIM) has attempted to track global map coverage at various scale ranges, which has greatly improved in recent decades. However, the poor state of updating of base maps is still a global problem. In Saudi Arabia, large-scale mapping is carried out for all urban, suburban and rural areas by aerial surveys. Updating is carried out by remapping every 5 to 10 years. Due to rapid urban development this is not satisfactory, but faster update methods are foreseen through the use of high-resolution satellite imagery and improved object-oriented geodatabase structures, which will permit various survey technologies to be used to update the photogrammetrically established geodatabases. The long-term goal is to create a geodata infrastructure such as exists in Great Britain or Germany.
Development of a 3D printer using scanning projection stereolithography
Lee, Michael P.; Cooper, Geoffrey J. T.; Hinkley, Trevor; Gibson, Graham M.; Padgett, Miles J.; Cronin, Leroy
2015-01-01
We have developed a system for the rapid fabrication of low cost 3D devices and systems in the laboratory with micro-scale features yet cm-scale objects. Our system is inspired by maskless lithography, where a digital micromirror device (DMD) is used to project patterns with resolution up to 10 µm onto a layer of photoresist. Large area objects can be fabricated by stitching projected images over a 5 cm² area. The addition of a z-stage allows multiple layers to be stacked to create 3D objects, removing the need for any developing or etching steps but at the same time leading to true 3D devices which are robust, configurable and scalable. We demonstrate the applications of the system by printing a range of micro-scale objects as well as a fully functioning microfluidic droplet device and test its integrity by pumping dye through the channels. PMID:25906401
Properties of galaxies reproduced by a hydrodynamic simulation.
Vogelsberger, M; Genel, S; Springel, V; Torrey, P; Sijacki, D; Xu, D; Snyder, G; Bird, S; Nelson, D; Hernquist, L
2014-05-08
Previous simulations of the growth of cosmic structures have broadly reproduced the 'cosmic web' of galaxies that we see in the Universe, but failed to create a mixed population of elliptical and spiral galaxies, because of numerical inaccuracies and incomplete physical models. Moreover, they were unable to track the small-scale evolution of gas and stars to the present epoch within a representative portion of the Universe. Here we report a simulation that starts 12 million years after the Big Bang, and traces 13 billion years of cosmic evolution with 12 billion resolution elements in a cube of 106.5 megaparsecs on a side. It yields a reasonable population of ellipticals and spirals, reproduces the observed distribution of galaxies in clusters and characteristics of hydrogen on large scales, and at the same time matches the 'metal' and hydrogen content of galaxies on small scales.
Digital disruption 'syndromes'.
Sullivan, Clair; Staib, Andrew
2017-05-18
The digital transformation of hospitals in Australia is occurring rapidly in order to facilitate innovation and improve efficiency. Rapid transformation can cause temporary disruption of hospital workflows and staff as processes are adapted to the new digital workflows. The aim of this paper is to outline various types of digital disruption and some strategies for effective management. A large tertiary university hospital recently underwent a rapid, successful roll-out of an integrated electronic medical record (EMR). We observed this transformation and propose several digital disruption "syndromes" to assist with understanding and management during digital transformation: digital deceleration, digital transparency, digital hypervigilance, data discordance, digital churn and post-digital 'depression'. These 'syndromes' are defined and discussed in detail. Successful management of this temporary digital disruption is important to ensure a successful transition to a digital platform. What is known about this topic? Digital disruption is defined as the changes facilitated by digital technologies that occur at a pace and magnitude that disrupt established ways of value creation, social interactions, doing business and more generally our thinking. Increasing numbers of Australian hospitals are implementing digital solutions to replace traditional paper-based systems for patient care in order to create opportunities for improved care and efficiencies. Such large-scale change has the potential to create transient disruption to workflows and staff. Managing this temporary disruption effectively is an important factor in the successful implementation of an EMR. What does this paper add? A large tertiary university hospital recently underwent a successful rapid roll-out of an integrated electronic medical record (EMR) to become Australia's largest digital hospital over a 3-week period. We observed and assisted with the management of several cultural, behavioural and operational forms of digital disruption which led us to propose some digital disruption 'syndromes'. The definition and management of these 'syndromes' are discussed in detail. What are the implications for practitioners? Minimising the temporary effects of digital disruption in hospitals requires an understanding that these digital 'syndromes' are to be expected and actively managed during large-scale transformation.
NASA Astrophysics Data System (ADS)
Hadsell, Michael John, Jr.
Microbeam radiation therapy (MRT) is a new type of cancer treatment currently being studied at scattered synchrotron sites throughout the world. It has been shown to be capable of ablating aggressive brain tumors in rats while almost completely sparing the surrounding normal tissue. This promising technique has yet to find its way to the clinic, however, because the radiobiological mechanisms behind its efficacy are still largely unknown. This is partly due to the lack of a compact device that could facilitate more large-scale research. The challenges inherent to creating a compact device lie within the structure of MRT, which uses parallel arrays of ultra high-dose, orthovoltage, microplanar beams on the order of 100 μm thick and separated by four to ten times their width. Because of focal spot limitations, current commercial orthovoltage devices are simply not capable of creating such arrays at dose rates high enough for effective treatment while maintaining the microbeam pattern necessary to retain the high therapeutic ratio of the technique. Therefore, the development of a compact MRT device using carbon nanotube (CNT) cathode based X-ray technology is presented here. CNT cathodes have been shown to be capable of creating novel focal spot arrays on a single anode while being robust enough for long-term use in X-ray tubes. Using these cathodes, an X-ray tube with a single focal line has been created for the delivery of MRT dose distributions in radiobiological studies on small animals. In this work, the development process and final design of this specialized device will be detailed, along with the optimization and stabilization of its use for small animal studies. In addition, a detailed characterization of its final capabilities will be given, including a comprehensive measurement of its X-ray focal line dimensions, a description and evaluation of its collimator alignment and microbeam dimensions, and a full-scale phantom-based quantification of its dosimetric output. Finally, future project directions will be described briefly along with plans for a second generation device. Based on the results of this work, it is the author's belief that compact CNT MRT devices have definite commercialization potential for radiobiological research.
Modeling spatially-varying landscape change points in species occurrence thresholds
Wagner, Tyler; Midway, Stephen R.
2014-01-01
Predicting species distributions at scales of regions to continents is often necessary, as large-scale phenomena influence the distributions of spatially structured populations. Land use and land cover are important large-scale drivers of species distributions, and landscapes are known to create species occurrence thresholds, where small changes in a landscape characteristic result in abrupt changes in occurrence. The value of the landscape characteristic at which this change occurs is referred to as a change point. We present a hierarchical Bayesian threshold model (HBTM) that allows for estimating spatially varying parameters, including change points. Our model also allows for modeling estimated parameters in an effort to understand large-scale drivers of variability in land use and land cover on species occurrence thresholds. We use range-wide detection/nondetection data for the eastern brook trout (Salvelinus fontinalis), a stream-dwelling salmonid, to illustrate our HBTM for estimating and modeling spatially varying threshold parameters in species occurrence. We parameterized the model for investigating thresholds in landscape predictor variables that are measured as proportions, and which are therefore restricted to values between 0 and 1. Our HBTM estimated spatially varying thresholds in brook trout occurrence for both the proportion agricultural and urban land uses. There was relatively little spatial variation in change point estimates, although there was spatial variability in the overall shape of the threshold response and associated uncertainty. In addition, regional mean stream water temperature was correlated to the change point parameters for the proportion of urban land use, with the change point value increasing with increasing mean stream water temperature. We present a framework for quantifying macrosystem variability in spatially varying threshold model parameters in relation to important large-scale drivers such as land use and land cover. Although the model presented is a logistic HBTM, it can easily be extended to accommodate other statistical distributions for modeling species richness or abundance.
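To make the threshold idea concrete, here is a deliberately simplified, non-hierarchical sketch: a logistic occurrence model whose linear predictor bends at an estimated change point, fit by maximum likelihood. The paper's actual HBTM is Bayesian, hierarchical, and spatially varying, none of which this toy version attempts.

```python
# Toy logistic change-point model: occurrence probability responds to a
# land-use proportion x only beyond an estimated change point cp.
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit

rng = np.random.default_rng(2)
x = rng.uniform(0, 1, 300)                       # e.g., proportion urban land
p_true = expit(1.5 - 12.0 * np.maximum(0.0, x - 0.15))
y = rng.binomial(1, p_true)                      # detection / nondetection

def neg_log_lik(theta):
    b0, b1, cp = theta
    p = np.clip(expit(b0 + b1 * np.maximum(0.0, x - cp)), 1e-9, 1 - 1e-9)
    return -np.sum(y * np.log(p) + (1 - y) * np.log1p(-p))

fit = minimize(neg_log_lik, x0=[0.0, -1.0, 0.5],
               bounds=[(-10, 10), (-50, 50), (0.01, 0.99)])
print("estimated change point:", round(fit.x[2], 3))
```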
Spatial Heterogeneity, Scale, Data Character and Sustainable Transport in the Big Data Era
NASA Astrophysics Data System (ADS)
Jiang, Bin
2018-04-01
In light of the emergence of big data, I have advocated and argued for a paradigm shift from Tobler's law to scaling law, from Euclidean geometry to fractal geometry, from Gaussian statistics to Paretian statistics, and - more importantly - from Descartes' mechanistic thinking to Alexander's organic thinking. Fractal geometry falls under the third definition of fractal - that is, a set or pattern is fractal if the scaling of far more small things than large ones recurs multiple times (Jiang and Yin 2014) - rather than under the second definition of fractal, which requires a power law between scales and details (Mandelbrot 1982). The new fractal geometry is more towards living geometry that "follows the rules, constraints, and contingent conditions that are, inevitably, encountered in the real world" (Alexander et al. 2012, p. 395), not only for understanding complexity, but also for creating complex or living structure (Alexander 2002-2005). This editorial attempts to clarify why the paradigm shift is essential and to elaborate on several concepts, including spatial heterogeneity (scaling law), scale (or the fourth meaning of scale), data character (in contrast to data quality), and sustainable transport in the big data era.
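One concrete operationalization of "far more small things than large ones" is Jiang's own head/tail breaks classification (Jiang 2013), which the editorial alludes to but does not spell out; the implementation below is therefore a sketch rather than a canonical reference:

```python
# Head/tail breaks: recursively split at the mean while the "head"
# (values above the mean) remains a small minority.
import numpy as np

def head_tail_breaks(values, head_max=0.4):
    values = np.asarray(values, dtype=float)
    breaks = []
    while values.size > 1:
        m = values.mean()
        head = values[values > m]
        if head.size == 0 or head.size / values.size > head_max:
            break
        breaks.append(m)
        values = head
    return breaks

rng = np.random.default_rng(3)
sizes = rng.pareto(1.5, 10_000) + 1.0      # heavy-tailed, e.g. city sizes
print("break levels:", [round(b, 2) for b in head_tail_breaks(sizes)])
# Many recursive break levels indicate that the scaling pattern recurs
# across scales, matching the third definition of fractal invoked above.
```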
Water Flow Testing and Unsteady Pressure Analysis of a Two-Bladed Liquid Oxidizer Pump Inducer
NASA Technical Reports Server (NTRS)
Schwarz, Jordan B.; Mulder, Andrew; Zoladz, Thomas
2011-01-01
The unsteady fluid dynamic performance of a cavitating two-bladed oxidizer turbopump inducer was characterized through sub-scale water flow testing. While testing a novel inlet duct design that included a cavitation suppression groove, unusual high-frequency pressure oscillations were observed. With potential implications for inducer blade loads, these high-frequency components were analyzed extensively in order to understand their origins and impacts to blade loading. Water flow testing provides a technique to determine pump performance without the costs and hazards associated with handling cryogenic propellants. Water has a similar density and Reynolds number to liquid oxygen. In a 70%-scale water flow test, the inducer-only pump performance was evaluated. Over a range of flow rates, the pump inlet pressure was gradually reduced, causing the flow to cavitate near the pump inducer. A nominal, smooth inducer inlet was tested, followed by an inlet duct with a circumferential groove designed to suppress cavitation. A subsequent 52%-scale water flow test in another facility evaluated the combined inducer-impeller pump performance. With the nominal inlet design, the inducer showed traditional cavitation and surge characteristics. Significant bearing loads were created by large side loads on the inducer during synchronous cavitation. The grooved inlet successfully mitigated these loads by greatly reducing synchronous cavitation; however, high-frequency pressure oscillations were observed over a range of frequencies. Analytical signal processing techniques showed these oscillations to be created by a rotating, multi-celled train of pressure pulses, and subsequent CFD analysis suggested that such pulses could be created by the interaction of rotating inducer blades with fluid trapped in a cavitation suppression groove. Despite their relatively low amplitude, these high-frequency pressure oscillations posed a design concern due to their sensitivity to flow conditions and test scale. The amplitude and frequency of oscillations varied considerably over the pump's operating space, making it difficult to predict blade loads.
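As a generic illustration of the kind of signal processing involved (not the study's actual pipeline; the sample rate, tone frequencies, and amplitudes below are invented), a windowed FFT recovers the dominant oscillation frequencies of an unsteady pressure record:

```python
# Locate dominant spectral peaks in a synthetic unsteady pressure signal.
import numpy as np
from scipy.signal import find_peaks

fs = 10_000.0                                   # sample rate, Hz (assumed)
t = np.arange(0, 1.0, 1 / fs)
rng = np.random.default_rng(4)
p = (0.5 * np.sin(2 * np.pi * 95 * t)           # synchronous-like component
     + 0.1 * np.sin(2 * np.pi * 1330 * t)       # high-frequency pulse train
     + 0.05 * rng.normal(size=t.size))          # broadband noise

spec = np.abs(np.fft.rfft(p * np.hanning(p.size)))
freqs = np.fft.rfftfreq(p.size, 1 / fs)
idx, _ = find_peaks(spec, height=0.1 * spec.max())
print("spectral peaks (Hz):", freqs[idx].round(1))
```

Identifying a rotating, multi-celled pulse train additionally requires phase comparisons between circumferentially separated transducers, which a single-channel spectrum like this cannot provide.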
Cryogenic Calcite: A Morphologic and Isotopic Analog to the ALH84001 Carbonates
NASA Technical Reports Server (NTRS)
Niles, P. B.; Leshin, L. A.; Socki, R. A.; Guan, Y.; Ming, D. W.; Gibson, E. K.
2004-01-01
Martian meteorite ALH84001 carbonates preserve large and variable microscale isotopic compositions, which in some way reflect their formation environment. These measurements show large variations (>20‰) in the carbon and oxygen isotopic compositions of the carbonates on a 10-20 micron scale that are correlated with chemical composition. However, the utilization of these data sets for interpreting the formation conditions of the carbonates is complex due to the lack of suitable terrestrial analogs and the difficulty of modeling under non-equilibrium conditions. Thus, the mechanisms and processes that create and preserve large microscale isotopic variations in carbonate minerals are largely unknown. Experimental tests of the possible environments and mechanisms that lead to large microscale isotopic variations can help address these concerns. One possible mechanism for creating large carbon isotopic variations in carbonates involves the freezing of water. Carbonates precipitate during extensive CO2 degassing that occurs during the freezing process as the fluid's decreasing volume drives CO2 out. This rapid CO2 degassing results in a kinetic isotopic fractionation in which the CO2 gas has a much lighter isotopic composition, causing an enrichment of 13C in the remaining dissolved bicarbonate. This study seeks to determine the suitability of cryogenically formed carbonates as analogs to the ALH84001 carbonates. Specifically, our objective is to determine how accurately models using equilibrium fractionation factors approximate the isotopic compositions of cryogenically precipitated carbonates. This includes determining the accuracy of applying equilibrium fractionation factors during a kinetic process, and determining how isotopic variations in the fluid are preserved in microscale variations in the precipitated carbonates.
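The progressive 13C enrichment described here is conventionally modeled as Rayleigh fractionation; the following standard relation (my addition, assuming a constant kinetic fractionation factor, an assumption the study itself sets out to test) gives the flavor:

\[
\frac{R}{R_0} = f^{\,\alpha - 1}
\quad\Longrightarrow\quad
\delta^{13}\mathrm{C} \approx \delta^{13}\mathrm{C}_0 + \varepsilon \ln f ,
\qquad \varepsilon = (\alpha - 1)\times 1000\ \text{‰},
\]

where \(f\) is the fraction of dissolved inorganic carbon remaining and \(\alpha < 1\) is the kinetic fractionation factor for the escaping CO2. As \(f \to 0\) during freezing-driven degassing, the residual bicarbonate, and hence the carbonate precipitated from it, becomes progressively enriched in 13C.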
Policy approaches to renewable energy investment in the Mediterranean region
NASA Astrophysics Data System (ADS)
Patt, A.; Komendantova, N.; Battaglini, A.; Lilliestam, J.; Williges, K.
2009-04-01
Europe's climate policy objective of 20% renewable energy by 2020, and the call by the IPCC to reduce greenhouse gas emissions by 80% by 2050, pose major challenges for the European Union. Several policy options are available to move towards these objectives. In this paper, we will address the most critical policy and governance issues associated with one particular approach to scaling up renewable energy resources: reliance on large-scale energy generation facilities outside the European continent, such as onshore and offshore wind farms and concentrating solar power (CSP) facilities in the Mediterranean region. Several feasibility studies completed over the past three years (German Aerospace Center 2006; German Aerospace Center 2005; Czisch, Elektrotechnik 2005, p. 488; Lorenz, Pinner, Seitz, McKinsey Quarterly 2008, p.10; German Aerospace Center 2005; Knies 2008, The Club of Rome; Khosla, Breaking the Climate Deadlock Briefing Papers, 2008, p.19) have convincingly demonstrated that large-scale wind and CSP projects ought to be very attractive for a number of reasons, including cost, reliability of power supply, and technological maturity. According to these studies it would be technically possible for Europe to rely on large-scale wind and CSP for the majority of its power needs by 2050—indeed enough to completely replace its reliance on fossil fuels for power generation—at competitive cost over its current, carbon-intensive system. While it has been shown to be technically feasible to develop renewable resources in North Africa to account for a large share of Europe's energy needs, doing so would require sustained double-digit rates of growth in generating and long-distance transmission capacity, and would potentially require a very different high-voltage grid architecture within Europe. Doing so at a large scale could require enormous up-front investments in technical capacity, financial instruments and human resources. What are the policy instruments best suited to achieving such growth quickly and smoothly? What bottlenecks—in terms of supply chains, human capital, finance, and transmission capacity—need to be anticipated and addressed if the rate of capacity growth is to be sustained over several decades? What model of governance would create a safe investment climate consistent with new EU legislation (i.e. the EU Renewable Energy Directive) as well as expected post-Kyoto targets and mechanisms? The material that we present here is based on a series of workshops held between November 2008 and January 2009, in which a wide range of stakeholders expressed their views about the fundamental needs for policy intervention. Supplementing the results from these workshops have been additional expert interviews and basic financial modeling. One of the interesting results from this research is the need for a multi-pronged approach. First, there is a need for a support scheme, compatible with and in all cases supplementing the EU REN Directive, that would create a stable market for North African electricity in Europe. Second, there is a need for policies that facilitate the formation of public-private partnerships in North Africa as the specific investment vehicle, as a way to manage some of the uncertainties associated with large-scale investments in the region. Third, attention has to be paid to the development of supply chains within the Mediterranean region, as a way of ensuring the compatibility of such investments with sustainable development.
The study of integration about measurable image and 4D production
NASA Astrophysics Data System (ADS)
Zhang, Chunsen; Hu, Pingbo; Niu, Weiyun
2008-12-01
In this paper, we create geospatial data for three-dimensional (3D) modeling by combining digital photogrammetry and digital close-range photogrammetry. For the large-scale geographical background, we establish a three-dimensional landscape model combining DEM and DOM, based on digital photogrammetry, which uses aerial image data to make "4D" products (DOM: Digital Orthophoto Map, DEM: Digital Elevation Model, DLG: Digital Line Graphic and DRG: Digital Raster Graphic). For the buildings and other artificial features that users are interested in, we reconstruct the real features in three dimensions using digital close-range photogrammetry, following these steps: data collection with non-metric cameras, camera calibration, feature extraction, and image matching (see the sketch below). Finally, we combine the three-dimensional background with locally measured real images of these large geographic data and realize the integration of measurable real imagery with the 4D products. The article discusses the complete workflow and technology, achieving three-dimensional reconstruction and the integration of the large-scale three-dimensional landscape with metric building models.
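A minimal sketch of the feature extraction and image matching steps is given below. The paper names the steps but not the operators, so the use of OpenCV's ORB detector with a brute-force Hamming matcher, and the synthetic stand-in images, are my assumptions:

```python
# Feature extraction and matching between two overlapping views.
import cv2
import numpy as np

rng = np.random.default_rng(6)
img1 = rng.integers(0, 256, (480, 640), dtype=np.uint8)  # stand-in for a view
img2 = np.roll(img1, 15, axis=1)                         # second, shifted view

orb = cv2.ORB_create(nfeatures=2000)
kp1, des1 = orb.detectAndCompute(img1, None)
kp2, des2 = orb.detectAndCompute(img2, None)

matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)
print(f"kept {len(matches)} putative correspondences")
# The correspondences would feed camera calibration and 3D reconstruction
# of the building features described above.
```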
NASA Technical Reports Server (NTRS)
Parsons, David S.; Ordway, David O.; Johnson, Kenneth L.
2013-01-01
This experimental study seeks to quantify the impact various composite parameters have on the structural response of a composite structure in a pyroshock environment. The prediction of an aerospace structure's response to pyroshock induced loading is largely dependent on empirical databases created from collections of development and flight test data. While there is significant structural response data due to pyroshock induced loading for metallic structures, there is much less data available for composite structures. One challenge of developing a composite pyroshock response database as well as empirical prediction methods for composite structures is the large number of parameters associated with composite materials. This experimental study uses data from a test series planned using design of experiments (DOE) methods. Statistical analysis methods are then used to identify which composite material parameters most greatly influence a flat composite panel's structural response to pyroshock induced loading. The parameters considered are panel thickness, type of ply, ply orientation, and pyroshock level induced into the panel. The results of this test will aid in future large scale testing by eliminating insignificant parameters as well as aid in the development of empirical scaling methods for composite structures' response to pyroshock induced loading.
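Since the abstract names both the DOE approach and the four factors, a small sketch of the corresponding full-factorial test matrix may help; the two levels per factor shown are hypothetical, not the study's:

```python
# Full-factorial DOE matrix over the four named composite/test parameters.
from itertools import product

factors = {
    "panel_thickness_in": [0.125, 0.250],
    "ply_type": ["unidirectional", "fabric"],
    "ply_orientation": ["0/90", "+45/-45"],
    "pyroshock_level": ["low", "high"],
}

runs = [dict(zip(factors, combo)) for combo in product(*factors.values())]
for i, run in enumerate(runs, 1):
    print(f"run {i:2d}: {run}")
# 2^4 = 16 runs; a fractional design can halve this when higher-order
# interactions are assumed negligible, which is the economy DOE planning buys.
```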
Geophysical potential for wind energy over the open oceans.
Possner, Anna; Caldeira, Ken
2017-10-24
Wind turbines continuously remove kinetic energy from the lower troposphere, thereby reducing the wind speed near hub height. The rate of electricity generation in large wind farms containing multiple wind arrays is, therefore, constrained by the rate of kinetic energy replenishment from the atmosphere above. In recent years, a growing body of research argues that the rate of generated power is limited to around 1.5 W m−2 within large wind farms. However, in this study, we show that considerably higher power generation rates may be sustainable over some open ocean areas. In particular, the North Atlantic is identified as a region where the downward transport of kinetic energy may sustain extraction rates of 6 W m−2 and above over large areas in the annual mean. Furthermore, our results indicate that the surface heat flux from the oceans to the atmosphere may play an important role in creating regions where sustained high rates of downward transport of kinetic energy and thus, high rates of kinetic energy extraction may be geophysically possible. While no commercial-scale deep water wind farms yet exist, our results suggest that such technologies, if they became technically and economically feasible, could potentially provide civilization-scale power.
NASA Astrophysics Data System (ADS)
de Rooij, G. H.
2010-09-01
Soil water is confined behind the menisci of its water-air interface. Catchment-scale fluxes (groundwater recharge, evaporation, transpiration, precipitation, etc.) affect the matric potential, and thereby the interface curvature and the configuration of the phases. In turn, these affect the fluxes (except precipitation), creating feedbacks between pore-scale and catchment-scale processes. Tracking pore-scale processes beyond the Darcy scale is not feasible. Instead, for a simplified system based on the classical Darcy's Law and Laplace-Young Law we i) clarify how menisci transfer pressure from the atmosphere to the soil water, ii) examine large-scale phenomena arising from pore-scale processes, and iii) analyze the relationship between average meniscus curvature and average matric potential. In stagnant water, changing the gravitational potential or the curvature of the air-water interface changes the pressure throughout the water. Adding small amounts of water can thus profoundly affect water pressures in a much larger volume. The pressure-regulating effect of the interface curvature showcases the meniscus as a pressure port that transfers the atmospheric pressure to the water with an offset directly proportional to its curvature. This property causes an extremely rapid rise of phreatic levels in soils once the capillary fringe extends to the soil surface and the menisci flatten. For large bodies of subsurface water, the curvature and vertical position of any meniscus quantify the uniform hydraulic potential under hydrostatic equilibrium. During unit-gradient flow, the matric potential corresponding to the mean curvature of the menisci should provide a good approximation of the intrinsic phase average of the matric potential.
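The "pressure port" statement can be written compactly with the Laplace-Young relation; the following form is my paraphrase of the abstract's argument rather than a quotation from the paper:

\[
p_w \;=\; p_{\mathrm{atm}} \;-\; \frac{2\sigma}{R},
\qquad
\psi_m \;\equiv\; p_w - p_{\mathrm{atm}} \;=\; -\,\frac{2\sigma}{R},
\]

with \(\sigma\) the air-water surface tension and \(R\) the mean radius of curvature of the meniscus. As menisci flatten (\(R \to \infty\)), the offset \(\psi_m \to 0\), which is exactly the regime in which the abstract's extremely rapid rise of phreatic levels occurs.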
Implementation and evaluation of a community-based interprofessional learning activity.
Luebbers, Ellen L; Dolansky, Mary A; Vehovec, Anton; Petty, Gayle
2017-01-01
Implementation of large-scale, meaningful interprofessional learning activities for pre-licensure students has significant barriers and requires novel approaches to ensure success. To accomplish this goal, faculty at Case Western Reserve University, Ohio, USA, used the Ottawa Model of Research Use (OMRU) framework to create, improve, and sustain a community-based interprofessional learning activity for large numbers of medical students (N = 177) and nursing students (N = 154). The model guided the process and included identification of context-specific barriers and facilitators, continual monitoring and improvement using data, and evaluation of student learning outcomes as well as programme outcomes. First year Case Western Reserve University medical students and undergraduate nursing students participated in team-structured prevention screening clinics in the Cleveland Metropolitan Public School District. Identification of barriers and facilitators assisted with overcoming logistic and scheduling issues, large class size, differing ages and skill levels of students and creating sustainability. Continual monitoring led to three distinct phases of improvement and resulted in the creation of an authentic team structure, role clarification, and relevance for students. Evaluation of student learning included both qualitative and quantitative methods, resulting in statistically significant findings and qualitative themes of learner outcomes. The OMRU implementation model provided a useful framework for successful implementation resulting in a sustainable interprofessional learning activity.
NASA Astrophysics Data System (ADS)
Howard, E. A.; Coleman, K. J.; Barford, C. L.; Kucharik, C.; Foley, J. A.
2005-12-01
Understanding environmental problems that cross physical and disciplinary boundaries requires a more holistic view of the world - a "systems" approach. Yet it is a challenge for many learners to start thinking this way, particularly when the problems are large in scale and not easily visible. We will describe our online university course, "Humans and the Changing Biosphere," which takes a whole-systems perspective for teaching regional to global-scale environmental science concepts, including climate, hydrology, ecology, and human demographics. We will share our syllabus and learning objectives and summarize our efforts to incorporate "best" practices for online teaching. We will describe challenges we have faced, and our efforts to reach different learner types. Our goals for this presentation are: (1) to communicate how a systems approach ties together environmental sciences (including climate, hydrology, ecology, biogeochemistry, and demography) that are often taught as separate disciplines; (2) to generate discussion about challenges of teaching large-scale environmental processes; (3) to share our experiences in teaching these topics online; (4) to receive ideas and feedback on future teaching strategies. We will explain why we developed this course online, and share our experiences about benefits and challenges of teaching over the web - including some suggestions about how to use technology to supplement face-to-face learning experiences (and vice versa). We will summarize assessment data about what students learned during the course, and discuss key misconceptions and barriers to learning. We will highlight the role of an online discussion board in creating classroom community, identifying misconceptions, and engaging different types of learners.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Babu, Sudarsanam Suresh; Love, Lonnie J.; Peter, William H.
Additive manufacturing (AM) is considered an emerging technology that is expected to transform the way industry can make low-volume, high value complex structures. This disruptive technology promises to replace legacy manufacturing methods for the fabrication of existing components in addition to bringing new innovation for new components with increased functional and mechanical properties. This report outlines the outcome of a workshop on large-scale metal additive manufacturing held at Oak Ridge National Laboratory (ORNL) on March 11, 2016. The charter for the workshop was outlined by the Department of Energy (DOE) Advanced Manufacturing Office program manager. The status and impact of the Big Area Additive Manufacturing (BAAM) for polymer matrix composites was presented as the background motivation for the workshop. Following this, the extension of the underlying technology to low-cost metals was proposed with the following goals: (i) high deposition rates (approaching 100 lbs/h); (ii) low cost (<$10/lbs) for steel, iron, aluminum, nickel, as well as higher cost titanium; (iii) large components (major axis greater than 6 ft); and (iv) compliance with property requirements. The above concept was discussed in depth by representatives from different industrial sectors including welding, metal fabrication machinery, energy, construction, aerospace and heavy manufacturing. In addition, DOE's newly launched High Performance Computing for Manufacturing (HPC4MFG) program was reviewed. This program will apply thermo-mechanical models to elucidate deeper understanding of the interactions between design, process, and materials during additive manufacturing. Following these presentations, all the attendees took part in a brainstorming session where everyone identified the top 10 challenges in large-scale metal AM from their own perspective. The feedback was analyzed and grouped in different categories including: (i) CAD-to-part software, (ii) selection of energy source, (iii) systems development, (iv) material feedstock, (v) process planning, (vi) residual stress and distortion, (vii) post-processing, (viii) qualification of parts, (ix) supply chain and (x) business case. Furthermore, an open innovation network methodology was proposed to accelerate the development and deployment of new large-scale metal additive manufacturing technology with the goal of creating a new generation of high deposition rate equipment, affordable feedstocks, and large metallic components to enhance America's economic competitiveness.
NASA Astrophysics Data System (ADS)
Truebenbach, Alexandra; Darling, Jeremy
2018-01-01
We present the VLBA Extragalactic Proper Motion Catalog, a catalog of extragalactic proper motions created using archival VLBI data and our own VLBA astrometry. The catalog contains 713 proper motions, with average uncertainties of ~24 microarcsec/yr, including 40 new or improved proper motion measurements using relative astrometry with the VLBA. We detect the secular aberration drift – the apparent motion of extragalactic objects caused by the solar system's acceleration around the Galactic Center – at 6.3 sigma significance with an amplitude of 1.69 +/- 0.27 microarcsec/yr and an apex consistent with the Galactic Center (275.2 +/- 10.0 deg, -29.4 +/- 8.8 deg). Our dipole model detects the aberration drift at a higher significance than some previous studies (e.g., Titov & Lambert 2013), but at a lower amplitude than expected or previously measured. We then use the correlated relative proper motions of extragalactic objects to place upper limits on the rate of large-scale structure collapse (e.g., Quercellini et al. 2009; Darling 2013). Pairs of small separation objects that are in gravitationally interacting structures such as filaments of large-scale structure will show a net decrease in angular separation (> -15.5 microarcsec/yr) as they move towards each other, while pairs of large separation objects that are gravitationally unbound and move with the Hubble expansion will show no net change in angular separation. With our catalog, we place a 3 sigma limit on the rate of convergence of large-scale structure of -11.4 microarcsec/yr for extragalactic objects within 100 comoving Mpc of each other. We also confirm that large separation objects (> 800 comoving Mpc) move with the Hubble flow to within ~2.2 microarcsec/yr. In the future, we plan to incorporate the upcoming Gaia proper motions into our catalog to achieve a higher precision measurement of the average relative proper motion of gravitationally interacting extragalactic objects and to refine our measurement of the collapse of large-scale structure. This research was performed with support from NSF grant AST-1411605. References: Darling, J. 2013, AJ, 777, L21; Quercellini et al. 2009, Phys. Rev. Lett., 102, 151302; Titov, O. & Lambert, S. 2013, A&A, 559, A95.
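For reference, the secular aberration drift imprints the standard dipole proper-motion field on the sky (a textbook expression, not quoted in the abstract):

\[
\vec{\mu}(\hat{n}) \;=\; \frac{1}{c}\left[\vec{a}_{\odot} - \left(\vec{a}_{\odot}\cdot\hat{n}\right)\hat{n}\right],
\]

where \(\hat{n}\) is the unit vector toward the source and \(\vec{a}_{\odot}\approx v_{\odot}^{2}/R_{0}\), directed at the Galactic Center, is the solar orbital acceleration. The pattern has amplitude \(|\vec{a}_{\odot}|/c\) at 90° from the apex and vanishes at the apex and anti-apex, which is the dipole signature fitted above.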
Intuitive web-based experimental design for high-throughput biomedical data.
Friedrich, Andreas; Kenar, Erhan; Kohlbacher, Oliver; Nahnsen, Sven
2015-01-01
Big data bioinformatics aims at drawing biological conclusions from huge and complex biological datasets. Added value from the analysis of big data, however, is only possible if the data is accompanied by accurate metadata annotation. Particularly in high-throughput experiments, intelligent approaches are needed to keep track of the experimental design, including the conditions that are studied as well as information that might be interesting for failure analysis or further experiments in the future. In addition to the management of this information, means for an integrated design and interfaces for structured data annotation are urgently needed by researchers. Here, we propose a factor-based experimental design approach that enables scientists to easily create large-scale experiments with the help of a web-based system. We present a novel implementation of a web-based interface allowing the collection of arbitrary metadata. To exchange and edit information we provide a spreadsheet-based, human-readable format. Subsequently, sample sheets with identifiers and metainformation for data generation facilities can be created. Data files created after measurement of the samples can be uploaded to a datastore, where they are automatically linked to the previously created experimental design model.
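As a sketch of what the factor-based expansion might look like in practice, the snippet below expands a small factor table into a tab-separated sample sheet with identifiers. Column names, the ID scheme, and the replicate handling are hypothetical, not the format of the system described:

```python
# Expand study factors x replicates into a spreadsheet-style sample sheet.
import csv
from itertools import product

factors = {"genotype": ["WT", "KO"],
           "treatment": ["ctrl", "drug"],
           "timepoint_h": [4, 24]}
replicates = 3

with open("sample_sheet.tsv", "w", newline="") as fh:
    writer = csv.writer(fh, delimiter="\t")
    writer.writerow(["sample_id", *factors, "replicate"])
    row_id = 0
    for combo in product(*factors.values()):
        for rep in range(1, replicates + 1):
            row_id += 1
            writer.writerow([f"S{row_id:03d}", *combo, rep])
# 2 x 2 x 2 conditions x 3 replicates = 24 annotated samples whose IDs can
# later be matched to measured data files, as the abstract describes.
```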
Bini, Stefano A; Mahajan, John
2016-11-01
Little is known about the implementation rate of clinical practice guidelines (CPGs). Our purpose was to report on the adoption rate of CPGs created and implemented by a large orthopedic group using the Delphi consensus method. The draft CPGs were created before the group's annual meeting by 5 teams, each assigned a subset of topics. The draft guidelines included a statement and a summary of the available evidence. Each guideline was debated in both small-group and plenary sessions. Voting was anonymous and a 75% supermajority was required for passage. A Likert scale was used to survey participants' experience with the process at 1 week, and the Kirkpatrick evaluation model was used to gauge the efficacy of the process over a 6-month time frame. Eighty-five orthopedic surgeons attended the meeting. Fifteen guidelines grouped into 5 topics were created. All passed. Eighty-six percent of attendees found the process effective and 84% felt that participating in the process made it more likely that they would adopt the guidelines. At 1 week, an average of 62% of attendees stated they were practicing the guideline as written (range: 35%-72%), and at 6 months, 96% stated they were practicing them (range: 82%-100%). We have demonstrated that a modified Delphi method for reaching consensus can be very effective in both creating CPGs and leading to their adoption. Further, we have shown that the process is well received by participants and that an inclusionary approach can be highly successful.
Clues on the Milky Way disc formation from population synthesis simulations
NASA Astrophysics Data System (ADS)
Robin, A. C.; Reylé, C.; Bienaymé, O.; Fernandez-Trincado, J. G.; Amôres, E. B.
2016-09-01
In recent years the stellar populations of the Milky Way have been investigated from large-scale surveys in different ways, from pure star-count analysis to detailed studies based on spectroscopic surveys. While in the former case the data can constrain the scale height and scale length thanks to completeness, they suffer from high correlation between these two values. On the other hand, spectroscopic surveys suffer from complex selection functions which hardly allow accurate density distributions to be derived. The scale length in particular has been difficult to constrain, resulting in discrepant values in the literature. Here, we investigate the thick disc characteristics by comparing model simulations with large-scale data sets. The simulations are done with the population synthesis model of Besançon. We explore the parameters of the thick disc (shape, local density, age, metallicity) using a Markov Chain Monte Carlo method to constrain the model free parameters (Robin et al. 2014). Correlations between parameters are limited due to the vast spatial coverage of the surveys used (SDSS + 2MASS). We show that the thick disc was created during a long phase of formation, starting about 12 Gyr ago and finishing about 10 Gyr ago, during which gravitational contraction occurred, both vertically and radially. Moreover, in its early phase the thick disc was flaring in the outskirts. We conclude that the thick disc was created prior to the thin disc during a gravitational collapse phase, slowed down by turbulence related to a high star formation rate, as explained for example in Bournaud et al. (2009) or Lehnert et al. (2009). Our result does not favor formation from an initial thin disc thickened later by merger events or by secular evolution of the thin disc. We then study the in-plane distribution of stars in the thin disc from 2MASS and show that the thin disc scale length varies as a function of age, indicating inside-out formation. Moreover, we investigate the warp and flare and demonstrate that the warp amplitude is changing with time and the node angle is slightly precessing. Finally, we show comparisons between the new model and spectroscopic surveys. The new model correctly simulates the kinematics, metallicity, and α-abundance distributions in the solar neighbourhood as well as in the bulge region.
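To give a flavor of the MCMC step (only a flavor: the real fit constrains many thick-disc parameters against SDSS and 2MASS star counts, whereas this sketch fits a single exponential scale height to mock counts with the emcee sampler):

```python
# One-parameter MCMC toy: exponential vertical density fit to mock counts.
import numpy as np
import emcee

rng = np.random.default_rng(5)
z = np.linspace(50, 2000, 40)                    # heights above the plane, pc
counts = rng.poisson(1000 * np.exp(-z / 800.0))  # mock data, h_true = 800 pc

def log_prob(theta):
    h = theta[0]
    if not 100.0 < h < 3000.0:                   # flat prior bounds
        return -np.inf
    lam = 1000 * np.exp(-z / h)
    return float(np.sum(counts * np.log(lam) - lam))  # Poisson log-likelihood

nwalkers, ndim = 16, 1
p0 = 500 + 100 * rng.random((nwalkers, ndim))
sampler = emcee.EnsembleSampler(nwalkers, ndim, log_prob)
sampler.run_mcmc(p0, 2000)
h = sampler.get_chain(discard=500, flat=True)[:, 0]
print(f"scale height: {h.mean():.0f} +/- {h.std():.0f} pc")
```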
ESONET , a milestone towards sustained multidisciplinary ocean observation.
NASA Astrophysics Data System (ADS)
Rolin, J.-F.
2012-04-01
At the end of a 4-year project dedicated to the constitution of a Network of Excellence (NoE) on subsea observatories in Europe, large expectations are still on the agenda. The economic crisis has changed infrastructure construction planning in many ways, but the objectives are quite clear and may be reached at the European scale. The overall objective of the ESONET NoE was to create an organisation able to implement, operate and maintain a sustainable underwater observation network, extending into deep water, capable of monitoring biological, geo-chemical, geological, geophysical and physical processes occurring throughout the water column, sea floor interface and solid earth below. This main objective of ESONET has been met by creating the network of 11 permanent underwater observation sites together with the "ESONET Vi" Virtual Institute, which organises the exchange of staff and joint experiments on EMSO large research infrastructure observatories. Recommendations on best practices, standardization and interoperability concepts concerning underwater observatory equipment have been developed, as synthesized in the so-called ESONET Label document. The ESONET Label is a set of criteria to be met by deep-sea observatory equipment, as well as recommended solutions and options to guarantee optimal operation in the ocean over long time periods. ESONET contributes to the fixed-point sustained observatory community, which extends worldwide, is fully multidisciplinary and in its way may open a new page in ocean sciences history.
Producing ‘superponderomotive’ electrons in a short cavitated channel
NASA Astrophysics Data System (ADS)
Wang, J.; Yang, Y.; Zhao, Z. Q.; Wu, Y. C.; Dong, K. G.; Zhang, T. K.; Gu, Y. Q.
2017-11-01
Particle-in-cell simulations suggest that a short cavitated channel in relativistic near-critical density plasma can be readily created by relativistic self-transparency and the hole-boring effect. The strong self-generated magnetic fields confine the electrons in the plasma channel. Assisted by the magnetic fields, the electrons resonate with the laser fields at the betatron resonance frequency ω_β. Consequently, highly energetic electrons with small divergence can be created in the betatron resonance regime. However, preliminary experimental results show higher temperature and larger divergence compared to our simulation results. We argue that this difference could come from imperfect homogenization of the foam target. Thus, the classical betatron resonance heating regime in a large-scale pre-plasma, which was proposed by Pukhov et al (1996 Phys. Rev. Lett. 76, 3975-8), would explain the experimental results instead.
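For orientation, the betatron frequency of an electron oscillating in an ion channel is commonly quoted as

\[
\omega_{\beta} \;\simeq\; \frac{\omega_{p}}{\sqrt{2\gamma}},
\]

with \(\omega_p\) the plasma frequency of the channel and \(\gamma\) the electron Lorentz factor; this relation is standard in the betatron-resonance literature rather than derived in the abstract. Resonant energy gain occurs when the laser frequency witnessed by the oscillating electron matches \(\omega_\beta\).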
Approximation concepts for efficient structural synthesis
NASA Technical Reports Server (NTRS)
Schmit, L. A., Jr.; Miura, H.
1976-01-01
It is shown that efficient structural synthesis capabilities can be created by using approximation concepts to mesh finite element structural analysis methods with nonlinear mathematical programming techniques. The history of the application of mathematical programming techniques to structural design optimization problems is reviewed. Several rather general approximation concepts are described along with the technical foundations of the ACCESS 1 computer program, which implements several approximation concepts. A substantial collection of structural design problems involving truss and idealized wing structures is presented. It is concluded that since the basic ideas employed in creating the ACCESS 1 program are rather general, its successful development supports the contention that the introduction of approximation concepts will lead to the emergence of a new generation of practical and efficient, large scale, structural synthesis capabilities in which finite element analysis methods and mathematical programming algorithms will play a central role.
NASA Astrophysics Data System (ADS)
Collins, Patrick; Autino, Adriano
2010-06-01
The authors argue that the creation of a popular new industry of passenger space travel could be economically and socially very beneficial in creating new employment in aerospace and related fields in order to supply these services. In doing so, the application of nearly a half-century of technological development that has yet to be used commercially could create many new aerospace engineering business opportunities. In addition, by growing to large scale, space tourism has unique potential to reduce the cost of space travel sharply, thereby making many other activities in space feasible and profitable. The paper discusses the scope for new employment, stimulating economic growth, reducing environmental damage, sustaining education particularly in the sciences, stimulating cultural growth, and preserving peace by eliminating any need for "resource wars".
NASA Technical Reports Server (NTRS)
Singh, N.
1994-01-01
We examined the various likely processes for creating the cavities and found that the mirror force acting on the transversely heated ions is the most likely mechanism. The ponderomotive force causing the wave collapse was found to be much weaker than the mirror force on the transversely heated ions, which are observed inside the cavities along with the lower hybrid waves. Using a hydrodynamic model for the polar wind, we modeled the cavity formation and found that, for the heating rate obtained from the observed waves, the mirror force does create cavities with depletions as observed. Some initial results from this study were published in a recent Geophysical Research Letters and were reported at the Fall AGU meeting in San Francisco. We have continued this investigation using a large-scale semikinetic model.
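For reference, the mirror force invoked here takes the standard guiding-centre form (a textbook relation, not a result of this study):

\[
F_{\parallel} \;=\; -\,\mu\,\frac{\partial B}{\partial s},
\qquad
\mu \;=\; \frac{m\,v_{\perp}^{2}}{2B},
\]

so transverse heating raises the magnetic moment \(\mu\) and strengthens the push of ions up the diverging geomagnetic field, evacuating plasma from the heating region and deepening the density cavities described above.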
King, Gary; Pan, Jennifer; Roberts, Margaret E
2014-08-22
Existing research on the extensive Chinese censorship organization uses observational methods with well-known limitations. We conducted the first large-scale experimental study of censorship by creating accounts on numerous social media sites, randomly submitting different texts, and observing from a worldwide network of computers which texts were censored and which were not. We also supplemented interviews with confidential sources by creating our own social media site, contracting with Chinese firms to install the same censoring technologies as existing sites, and--with their software, documentation, and even customer support--reverse-engineering how it all works. Our results offer rigorous support for the recent hypothesis that criticisms of the state, its leaders, and their policies are published, whereas posts about real-world events with collective action potential are censored.
Development of a Computer Vision Technology for the Forest Products Manufacturing Industry
D. Earl Kline; Richard Conners; Philip A. Araman
1992-01-01
The goal of this research is to create an automated processing/grading system for hardwood lumber that will be of use to the forest products industry. The objective of creating a full scale machine vision prototype for inspecting hardwood lumber will become a reality in calendar year 1992. Space for the full scale prototype has been created at the Brooks Forest...
Development of novel IVD assays: a manufacturer's perspective.
Metcalfe, Thomas A
2010-01-01
IVD manufacturers are heavily reliant on novel IVD assays to fuel their growth and drive innovation within the industry. They represent a key part of the IVD industry's value proposition to customers and the healthcare industry in general, driving product differentiation, helping to create demand for new systems and generating incremental revenue. However, the discovery of novel biomarkers and their qualification for a specific clinical purpose is a high risk undertaking and the large, risky investments associated with doing this on a large scale are incompatible with IVD manufacturer's business models. This article describes the sources of novel IVD assays, the processes for discovering and qualifying novel assays and the reliance of IVD manufacturers on collaborations and in-licensing to source new IVD assays for their platforms.
Resilience of natural gas networks during conflicts, crises and disruptions.
Carvalho, Rui; Buzna, Lubos; Bono, Flavio; Masera, Marcelo; Arrowsmith, David K; Helbing, Dirk
2014-01-01
Human conflict, geopolitical crises, terrorist attacks, and natural disasters can turn large parts of energy distribution networks offline. Europe's current gas supply network is largely dependent on deliveries from Russia and North Africa, creating vulnerabilities to social and political instabilities. During crises, less delivery may mean greater congestion, as the pipeline network is used in ways it has not been designed for. Given the importance of the security of natural gas supply, we develop a model to handle network congestion on various geographical scales. We offer a resilient response strategy to energy shortages and quantify its effectiveness for a variety of relevant scenarios. In essence, Europe's gas supply can be made robust even to major supply disruptions, if a fair distribution strategy is applied.
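As a toy illustration of congestion under disruption (my sketch, not the paper's model: the tiny network and capacities are invented), a max-flow computation shows how losing one supply route caps deliverable gas:

```python
# Max-flow on a tiny capacitated "pipeline" graph, before and during a
# disruption of one supply route.
import networkx as nx

G = nx.DiGraph()
G.add_edge("src", "RU", capacity=10)   # Russian supply (arbitrary units)
G.add_edge("src", "NA", capacity=4)    # North African supply
G.add_edge("RU", "hub", capacity=10)
G.add_edge("NA", "hub", capacity=4)
G.add_edge("hub", "EU", capacity=12)

normal, _ = nx.maximum_flow(G, "src", "EU")
G["src"]["RU"]["capacity"] = 0         # crisis: Russian deliveries halted
crisis, _ = nx.maximum_flow(G, "src", "EU")
print(f"deliverable flow: normal={normal}, during disruption={crisis}")
```

A fair-distribution strategy, as studied in the paper, would then decide how the reduced flow is shared among consumers.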
[Principles of management of All-Russia Disaster Medicine Services].
Sakhno, I I
2000-11-01
Experience from mitigating the consequences of the earthquake in Armenia (1988) has shown that it is essential to create a management system in regions struck by natural disaster, large accident or catastrophe before the arrival of the main forces, in order to carry out reconnaissance and receive the arriving units. This helps to make well-grounded decisions, to set tasks in time, and to organize and conduct emergency and rescue work. The article contains general material concerning the structure of the All-Russia Service of Disaster Medicine (ARSDM), the organization of management at all levels, and the interaction between the components of ARSDM and other subsystems of the Russian Service of Extreme Situations. Recommendations are given on how to organize ARSDM management during the mitigation of the medical and sanitary consequences of large-scale emergencies.
Phenomenological model for the evolution of radio galaxies such as Cygnus A
NASA Astrophysics Data System (ADS)
Artyukh, V. S.
2015-06-01
A phenomenological model for the evolution of classical radio galaxies such as Cygnus A is presented. An activity cycle of the host galaxy in the radio begins with the birth of radio jets, which correspond to shocks on scales of ~1 pc (the radio galaxy B0108+388). In the following stage of the evolution, the radio emission comes predominantly from formations on scales of 10-100 pc, whose physical parameters are close to those of the hot spots of Cygnus A (corresponding to GHz-peaked spectrum radio sources). Further, the hot spots create radio lobes on scales of 10³-10⁴ pc (compact steep-spectrum radio sources). Fully formed radio galaxies have radio jets, hot spots, and giant radio lobes; the direction of the jets can vary in discrete steps with time, creating new hot spots and inflating the radio lobes (as in Cygnus A). In the final stage of the evolutionary cycle, first the radio jets disappear, then the hot spots, and finally the radio lobes (similar to the giant radio galaxies DA 240 and 3C 236). A large fraction of radio galaxies with repeating activity cycles is observed. The close connection between Cygnus A-type radio galaxies and optical quasars is noted, as well as the similarity in the cosmological evolution of powerful radio galaxies and optical quasars.
NASA Astrophysics Data System (ADS)
Hubert-Ferrari, Aurélia; King, Geoffrey; Manighetti, Isabelle; Armijo, Rolando; Meyer, Bertrand; Tapponnier, Paul
2003-04-01
The evolution of the Gulf of Aden and the Anatolian Fault systems is modelled using the principles of elastic fracture mechanics usually applied to smaller-scale cracks or faults. The lithosphere is treated as a plate, and simple boundary conditions are applied that correspond to the known plate boundary geometry and slip vectors. The models provide a simple explanation for many observed geological features. For the Gulf of Aden, the model explains why the ridge propagated from east to west, from the Owen Fracture Zone towards the Afar, and predicts the overall form of its path. The smaller en echelon offsets can be explained by upward propagation from the initially created mantle dyke, while the larger ones may be attributed to the propagating rupture interacting with pre-existing structures. For Anatolia, the modelling suggests that the East Anatolian Fault was created before the North Anatolian Fault could form. Once both faults were formed, however, activity could switch between them. The time scales over which this should take place are not known, but evidence for switching can be found in the historical seismicity. For Aden and Anatolia, pre-existing structures or inhomogeneous stress fields left from earlier orogenic events have modified the processes of propagation, and without an understanding of such features the propagation processes cannot be fully understood. Furthermore, a propagating fault can extend into an active region where it would not have initiated. The North Anatolian Fault encountered slow but active extension when it entered the Aegean about 5 Ma, and the stress field associated with the extending fault has progressively modified Aegean extension. In the central Aegean, activity has been reduced, while to the north-west, on features such as the Gulfs of Evvia and Corinth, it has been increased. The field observation that major structures propagate, and the success of simple elastic models, suggest that the continental crust behaves in an elastic-brittle or elastic-plastic fashion even though laboratory tests may be interpreted to suggest viscous behaviour. There are major problems in scaling from the behaviour of small homogeneous samples to the large, heterogeneous mantle, and large-scale observations should be taken more seriously than extrapolations of laboratory behaviour over many orders of magnitude in space and time. The retention of long-term elasticity and localised failure suggests a similar gross rheology for the oceanic and continental lithospheres. Even though it is incorrect to attribute differences in behaviour to the former being rigid (i.e. elastic) and the latter viscous, oceanic and continental lithosphere behave in different ways. Unlike oceanic crust, continental crust is buoyant and cannot simply be created or destroyed. The process of thickening or thinning works against gravity, preventing large displacements on extensional or contractional features in the upper mantle. The equivalents of ridge or subduction systems are suppressed before they can accommodate large displacements, and activity must shift elsewhere. On the other hand, strike-slip boundaries and extrusion processes are favoured.
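For readers unfamiliar with the vocabulary of elastic fracture mechanics invoked above, the standard mode-I relations are (general background only; these are not equations quoted from the paper):

```latex
% Standard linear elastic fracture mechanics relations (background only;
% the paper's plate-scale model is not reproduced in this record).
K_I = \sigma \sqrt{\pi a}, \qquad \text{propagation when } K_I \ge K_{Ic}
```

where σ is the remote driving stress, a the crack half-length, and K_Ic the fracture toughness. The paper applies boundary conditions in this elastic spirit at the scale of lithospheric plates.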
Progress in long scale length laser plasma interactions
NASA Astrophysics Data System (ADS)
Glenzer, S. H.; Arnold, P.; Bardsley, G.; Berger, R. L.; Bonanno, G.; Borger, T.; Bower, D. E.; Bowers, M.; Bryant, R.; Buckman, S.; Burkhart, S. C.; Campbell, K.; Chrisp, M. P.; Cohen, B. I.; Constantin, C.; Cooper, F.; Cox, J.; Dewald, E.; Divol, L.; Dixit, S.; Duncan, J.; Eder, D.; Edwards, J.; Erbert, G.; Felker, B.; Fornes, J.; Frieders, G.; Froula, D. H.; Gardner, S. D.; Gates, C.; Gonzalez, M.; Grace, S.; Gregori, G.; Greenwood, A.; Griffith, R.; Hall, T.; Hammel, B. A.; Haynam, C.; Heestand, G.; Henesian, M.; Hermes, G.; Hinkel, D.; Holder, J.; Holdner, F.; Holtmeier, G.; Hsing, W.; Huber, S.; James, T.; Johnson, S.; Jones, O. S.; Kalantar, D.; Kamperschroer, J. H.; Kauffman, R.; Kelleher, T.; Knight, J.; Kirkwood, R. K.; Kruer, W. L.; Labiak, W.; Landen, O. L.; Langdon, A. B.; Langer, S.; Latray, D.; Lee, A.; Lee, F. D.; Lund, D.; MacGowan, B.; Marshall, S.; McBride, J.; McCarville, T.; McGrew, L.; Mackinnon, A. J.; Mahavandi, S.; Manes, K.; Marshall, C.; Menapace, J.; Mertens, E.; Meezan, N.; Miller, G.; Montelongo, S.; Moody, J. D.; Moses, E.; Munro, D.; Murray, J.; Neumann, J.; Newton, M.; Ng, E.; Niemann, C.; Nikitin, A.; Opsahl, P.; Padilla, E.; Parham, T.; Parrish, G.; Petty, C.; Polk, M.; Powell, C.; Reinbachs, I.; Rekow, V.; Rinnert, R.; Riordan, B.; Rhodes, M.; Roberts, V.; Robey, H.; Ross, G.; Sailors, S.; Saunders, R.; Schmitt, M.; Schneider, M. B.; Shiromizu, S.; Spaeth, M.; Stephens, A.; Still, B.; Suter, L. J.; Tietbohl, G.; Tobin, M.; Tuck, J.; Van Wonterghem, B. M.; Vidal, R.; Voloshin, D.; Wallace, R.; Wegner, P.; Whitman, P.; Williams, E. A.; Williams, K.; Winward, K.; Work, K.; Young, B.; Young, P. E.; Zapata, P.; Bahr, R. E.; Seka, W.; Fernandez, J.; Montgomery, D.; Rose, H.
2004-12-01
The first experiments on the National Ignition Facility (NIF) have employed the first four beams to measure propagation and laser backscattering losses in large ignition-size plasmas. Gas-filled targets between 2 and 7 mm in length have been heated from one side by overlapping the focal spots of the four beams from one quad operated at 351 nm (3ω) with a total intensity of 2 × 10¹⁵ W cm⁻². The targets were filled with 1 atm of CO₂, producing up to 7 mm long homogeneously heated plasmas with densities of n_e = 6 × 10²⁰ cm⁻³ and temperatures of T_e = 2 keV. The 16 kJ of energy in an NIF quad of beams, illuminating the target from one direction, creates unique conditions for the study of laser-plasma interactions at scale lengths not previously accessible. The propagation through the large-scale plasma was measured with a gated x-ray imager filtered for 3.5 keV x-rays. These data indicate that the beams interact with the full length of this ignition-scale plasma during the last ~1 ns of the experiment. During that time, full-aperture measurements of the stimulated Brillouin scattering and stimulated Raman scattering show scattering into the four focusing lenses of 3% for the smallest length (~2 mm), increasing to 10-12% for ~7 mm. These results demonstrate the NIF experimental capabilities and further provide a benchmark for three-dimensional modelling of laser-plasma interactions at ignition-size scale lengths.
Tortuosity of lightning return stroke channels
NASA Technical Reports Server (NTRS)
Levine, D. M.; Gilson, B.
1984-01-01
Data obtained from photographs of lightning are presented on the tortuosity of return stroke channels. The data were obtained by making piecewise linear fits to the channels and recording the cartesian coordinates of the ends of each linear segment. The mean change between ends of the segments was nearly zero in the horizontal direction and about eight meters in the vertical direction. Histograms of these changes are presented. These data were used to create model lightning channels and to predict the electric fields radiated during return strokes. This was done using a computer-generated random walk in which linear segments were placed end-to-end to form a piecewise linear representation of the channel. The computer selected random numbers for the ends of the segments assuming a normal distribution with the measured statistics. Once the channels were simulated, the electric fields radiated during a return stroke were predicted using a transmission line model on each segment. It was found that realistic channels are obtained with this procedure, but only if the model includes two scales of tortuosity: fine-scale irregularities corresponding to the local channel tortuosity, superimposed on large-scale horizontal drifts. The two scales of tortuosity are also necessary to obtain agreement between the electric fields computed from the simulated channels and the electric fields radiated by real return strokes. Without large-scale drifts, the computed electric fields lack the undulations characteristic of the data.
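The simulation procedure just described translates almost directly into code. Below is a minimal sketch of the two-scale channel generator; the segment means follow the abstract, while the standard deviations, segment count, and drift scale are hypothetical placeholders, and the transmission-line field calculation is only indicated in a comment.

```python
# Sketch of the two-scale tortuous-channel random walk described above.
import numpy as np

rng = np.random.default_rng(0)
n_seg = 500  # number of piecewise linear segments (assumed)
# Fine-scale tortuosity: segment endpoint offsets drawn from normal
# distributions (the near-zero horizontal and ~8 m vertical means are
# from the abstract; the standard deviations are assumptions).
dx = rng.normal(0.0, 4.0, n_seg)
dz = rng.normal(8.0, 4.0, n_seg)
# Large-scale horizontal drift: a slow random walk superimposed on the
# fine-scale steps (scale is an assumption).
drift = np.cumsum(rng.normal(0.0, 0.5, n_seg))
x = np.cumsum(dx) + drift
z = np.cumsum(dz)  # channel height grows segment by segment
channel = np.column_stack([x, z])
# Consecutive rows of `channel` define the linear segments; the radiated
# electric field would then be summed segment by segment using a
# transmission-line current model (not implemented in this sketch).
```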
Comparative Tectonics of Europa and Ganymede
NASA Astrophysics Data System (ADS)
Pappalardo, R. T.; Collins, G. C.; Prockter, L. M.; Head, J. W.
2000-10-01
Europa and Ganymede are sibling satellites with tectonic similarities and differences. Ganymede's ancient dark terrain is crossed by furrows, probably related to ancient large impacts, and has been normal faulted to various degrees. Bright grooved terrain is pervasively deformed at multiple scales and is locally highly strained, consistent with normal faulting of an ice-rich lithosphere above a ductile asthenosphere, along with minor horizontal shear. Little evidence has been identified for compressional structures. The relative roles of tectonism and icy cryovolcanism in creating bright grooved terrain remain an outstanding issue. Some ridge and trough structures within Europa's bands show tectonic similarities to Ganymede's grooved terrain, specifically sawtooth structures resembling normal fault blocks. Small-scale troughs are consistent with widened tension fractures. Shearing has produced transtensional and transpressional structures in Europan bands. Large-scale folds are recognized on Europa, with synclinal small-scale ridges and scarps probably representing folds and/or thrust blocks. Europa's ubiquitous double ridges may have originated as warm ice upwelled along tidally heated fracture zones. The morphological variety of ridges and troughs on Europa implies that care must be taken in inferring their origin. The relative youth of Europa's surface means that the satellite has preserved near-pristine morphologies of many structures, though sputter erosion could have altered the morphology of older topography. Moderate-resolution imaging has revealed less apparent diversity in Ganymede's ridge and trough types. Galileo's 28th orbit has brought new 20 m/pixel imaging of Ganymede, allowing direct comparison to Europa's small-scale structures.
NASA Technical Reports Server (NTRS)
Pogge, Richard W.; Martini, Paul
2002-01-01
We present archival Hubble Space Telescope (HST) images of the nuclear regions of 43 of the 46 Seyfert galaxies found in the volume-limited, spectroscopically complete CfA Redshift Survey sample. Using an improved method of image contrast enhancement, we created detailed high-quality "structure maps" that allow us to study the distributions of dust, star clusters, and emission-line gas in the circumnuclear regions (100-1000 pc scales) and in the associated host galaxy. Essentially all of these Seyfert galaxies have circumnuclear dust structures with morphologies ranging from grand-design two-armed spirals to chaotic dusty disks. In most Seyfert galaxies there is a clear physical connection between the nuclear dust spirals on hundreds-of-parsec scales and large-scale bars and spiral arms in the host galaxies proper. These connections are particularly striking in the interacting and barred galaxies. Such structures are predicted by numerical simulations of gas flows in barred and interacting galaxies and may be related to the fueling of active galactic nuclei by matter inflow from the host galaxy disks. We see no significant differences in the circumnuclear dust morphologies of Seyfert 1s and 2s, and very few Seyfert 2 nuclei are obscured by large-scale dust structures in the host galaxies. If Seyfert 2s are obscured Seyfert 1s, then the obscuration must occur on smaller scales than those probed by HST.
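One common formulation of the structure-map technique (an assumption here; the abstract does not spell out the algorithm) divides the image by a PSF-smoothed copy of itself, flattening the smooth starlight while leaving fine dust lanes and clusters, then re-smooths to control noise. A minimal sketch, assuming a Gaussian stand-in for the instrument PSF:

```python
# Sketch of a structure map in the spirit described above (assumed
# formulation, not the authors' pipeline). A Gaussian PSF is a stand-in;
# real HST work would use the instrument PSF.
import numpy as np
from scipy.ndimage import gaussian_filter

def structure_map(image: np.ndarray, psf_sigma: float = 2.0) -> np.ndarray:
    """Divide an image by its PSF-smoothed copy to enhance fine structure."""
    smoothed = gaussian_filter(image, psf_sigma)
    ratio = image / np.maximum(smoothed, 1e-12)  # guard against zeros
    # Re-smoothing the ratio (with the transposed PSF in published
    # formulations; a symmetric Gaussian makes the transpose moot)
    # suppresses amplified pixel-scale noise.
    return gaussian_filter(ratio, psf_sigma)
```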
Stonestrom, David A.; Blasch, Kyle W.; Constantz, Jim
2003-01-01
Advances in electronics leading to improved sensor technologies, large-scale circuit integration, and attendant miniaturization have created new opportunities to use heat as a tracer of subsurface flow. Because nature provides abundant thermal forcing at the land surface, heat is particularly useful in studying stream-groundwater interactions. This appendix describes methods for obtaining the thermal data needed in heat-based investigations of shallow subsurface flow.
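As background for the appendix's subject (the following is the standard relation in this field, not text quoted from the appendix itself), heat-as-tracer analyses of stream-groundwater exchange typically rest on the one-dimensional conduction-advection equation for temperature T at depth z beneath the streambed:

```latex
% Standard 1-D conduction-advection (heat transport) equation
% (background; not quoted from the appendix).
\frac{\partial T}{\partial t}
  = \kappa_e \frac{\partial^2 T}{\partial z^2}
  - q\,\frac{\rho_w c_w}{\rho c}\,\frac{\partial T}{\partial z}
```

Here κ_e is the effective thermal diffusivity of the saturated sediment, q the vertical water flux, and ρ_w c_w and ρc the volumetric heat capacities of water and bulk sediment; fitting observed temperature time series at several depths to this equation yields the flux q, which is why the thermal data-collection methods the appendix describes are useful.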
NASA Technical Reports Server (NTRS)
Denney, Ewen W.
2015-01-01
The basic vision of AdvoCATE is to automate the creation, manipulation, and management of large-scale assurance cases based on a formal theory of argument structures. Its main purposes are to create and manipulate argument structures for safety assurance cases using the Goal Structuring Notation (GSN), and to serve as a test bed and proof of concept for the formal theory of argument structures. AdvoCATE is available for Windows 7, Macintosh OSX, and Linux. Eventually, AdvoCATE will serve as a dashboard for safety-related information and provide an infrastructure for safety decisions and management.
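A GSN argument structure is a tree of typed nodes (goals, strategies, solutions, context, and so on). The sketch below is a minimal, assumed data-structure illustration in Python; it is not AdvoCATE's actual internal representation, and all node identifiers and statements are hypothetical.

```python
# Minimal sketch of a GSN argument structure (assumed representation for
# illustration; not AdvoCATE's internals).
from dataclasses import dataclass, field
from typing import List

@dataclass
class GsnNode:
    node_id: str
    node_type: str   # "Goal", "Strategy", "Solution", "Context", ...
    statement: str
    children: List["GsnNode"] = field(default_factory=list)

    def is_developed(self) -> bool:
        # A branch is developed if every leaf bottoms out in Solution
        # evidence; undeveloped goals are gaps in the assurance case.
        if self.node_type == "Solution":
            return True
        return bool(self.children) and all(
            c.is_developed() for c in self.children)

# Hypothetical fragment of a safety case:
root = GsnNode("G1", "Goal", "The system is acceptably safe.")
s1 = GsnNode("S1", "Strategy", "Argue over each identified hazard.")
g2 = GsnNode("G2", "Goal", "Hazard H1 is mitigated.",
             [GsnNode("Sn1", "Solution", "Test report TR-01.")])
s1.children.append(g2)
root.children.append(s1)
print(root.is_developed())  # True: every leaf goal has evidence
```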