Lim, Chong Wee; Ohmori, Kenji; Petrov, Ivan Georgiev; Greene, Joseph E.
2004-07-13
A method for forming atomic-scale structures on a surface of a substrate on a large-scale includes creating a predetermined amount of surface vacancies on the surface of the substrate by removing an amount of atoms on the surface of the material corresponding to the predetermined amount of the surface vacancies. Once the surface vacancies have been created, atoms of a desired structure material are deposited on the surface of the substrate to enable the surface vacancies and the atoms of the structure material to interact. The interaction causes the atoms of the structure material to form the atomic-scale structures.
Use of tropical maize for bioethanol production
USDA-ARS's Scientific Manuscript database
Tropical maize is an alternative energy crop being considered as a feedstock for bioethanol production in the North Central and Midwest United States. Tropical maize is advantageous because it produces large amounts of soluble sugars in its stalks, creates a large amount of biomass, and requires lo...
Dynamic travel information personalized and delivered to your cell phone : addendum.
DOT National Transportation Integrated Search
2011-03-01
Real-time travel information must reach a significant number of travelers to create a large amount of travel behavior change. For this project, since the TRAC-IT mobile phone application is used to monitor user context in terms of location, the mobil...
An Earth-System Approach to Understanding the Deepwater Horizon Oil Spill
ERIC Educational Resources Information Center
Robeck, Edward
2011-01-01
The Deepwater Horizon explosion on April 20, 2010, and the subsequent release of oil into the Gulf of Mexico created an ecological disaster of immense proportions. The estimates of the amounts of oil, whether for the amount released per day or the total amount of oil disgorged from the well, call on numbers so large they defy the capacity of most…
Michael D. Conner; Robert C. Wilkinson
1983-01-01
Ips beetles usually attack weakened, dying, or recently felled trees and fresh logging debris. Large numbers of Ips may build up when natural events such as lightning storms, ice storms, tornadoes, wildfires, and droughts create large amounts of pine suitable for the breeding of these beetles. Ips populations may also build up following forestry activities, such as...
Effects of Large Impacts on Mars: Implications for River Formation
NASA Technical Reports Server (NTRS)
Segura, T. L.; Toon, O. B.; Colaprete, A.; Zahnle, K.
2002-01-01
The Martian crater record provides ample evidence of the impacts of large (> 100 km) objects. These objects create hot global debris layers meters or more in depth, cause long term warming, and are capable of melting and precipitating a significant amount of water globally. Additional information is contained in the original extended abstract.
Criminal Intent with Property: A Study of Real Estate Fraud Prediction and Detection
ERIC Educational Resources Information Center
Blackman, David H.
2013-01-01
The large number of real estate transactions across the United States, combined with closing process complexity, creates extremely large data sets that conceal anomalies indicative of fraud. The quantitative amount of damage due to fraud is immeasurable to the lives of individuals who are victims, not to mention the financial impact to…
Ono, Hiroyuki; Saitsu, Hirotomo; Horikawa, Reiko; Nakashima, Shinichi; Ohkubo, Yumiko; Yanagi, Kumiko; Nakabayashi, Kazuhiko; Fukami, Maki; Fujisawa, Yasuko; Ogata, Tsutomu
2018-02-02
Although partial androgen insensitivity syndrome (PAIS) is caused by attenuated responsiveness to androgens, androgen receptor gene (AR) mutations in the coding regions and their splice sites have been identified only in <25% of patients with a diagnosis of PAIS. We performed extensive molecular studies including whole exome sequencing in a Japanese family with PAIS, identifying a deep intronic variant beyond the branch site at intron 6 of AR (NM_000044.4:c.2450-42 G > A). This variant created a splice acceptor motif accompanied by a pyrimidine-rich sequence and two candidate branch sites. Consistent with this, reverse transcriptase (RT)-PCR experiments for cycloheximide-treated lymphoblastoid cell lines revealed a relatively large amount of aberrant mRNA produced by the newly created splice acceptor site and a relatively small amount of wildtype mRNA produced by the normal splice acceptor site. Furthermore, most of the aberrant mRNA was shown to undergo nonsense-mediated decay (NMD) and, even if a small amount of aberrant mRNA escaped NMD, such mRNA was predicted to generate a truncated AR protein missing some functional domains. These findings imply that the deep intronic mutation creating an alternative splice acceptor site resulted in the production of a relatively small amount of wildtype AR mRNA, leading to PAIS.
Advanced Technologies in Safe and Efficient Operating Rooms
2008-02-01
of team leader) o a learning environment (where humans play the role of students). As can be seen, this work is at the confluence of several lines... Routine clinical information systems now have the ability to gather large amounts of data that surgical managers can access to create a... project is to create a computer system for teaching medical students cognitive skills of an attending physician related to diagnosing and treating
T. DeGomez; C.J. Fettig; J.D. McMillin; J.A. Anhold; C.J. Hayes
2008-01-01
Due to high fire hazard and perceived reductions in forest health, thinning of small diameter trees has become a prevalent management activity particularly in dense stands. Creation of large amounts of logging slash, however, has created large quantities of habitat for bark beetles primarily in the Ips genus (Coleoptera: Curculionidae,...
DOE Office of Scientific and Technical Information (OSTI.GOV)
BLEJWAS,THOMAS E.; SANDERS,THOMAS L.; EAGAN,ROBERT J.
2000-01-01
Nuclear power is an important and, the authors believe, essential component of a secure nuclear future. Although nuclear fuel cycles create materials that have some potential for use in nuclear weapons, with appropriate fuel cycles, nuclear power could reduce rather than increase real proliferation risk worldwide. Future fuel cycles could be designed to avoid plutonium production, generate minimal amounts of plutonium in proliferation-resistant amounts or configurations, and/or transparently and efficiently consume plutonium already created. Furthermore, a strong and viable US nuclear infrastructure, of which nuclear power is a large element, is essential if the US is to maintain a leadership or even participatory role in defining the global nuclear infrastructure and controlling the proliferation of nuclear weapons. By focusing on new fuel cycles and new reactor technologies, it is possible to advantageously burn and reduce nuclear materials that could be used for nuclear weapons rather than increase and/or dispose of these materials. Thus, the authors suggest that planners for a secure nuclear future use technology to design an ideal future. In this future, nuclear power creates large amounts of virtually atmospherically clean energy while significantly lowering the threat of proliferation through the thoughtful use, physical security, and agreed-upon transparency of nuclear materials. The authors must develop options for policy makers that bring them as close as practical to this ideal. Just as Atoms for Peace became the ideal for the first nuclear century, they see a potential nuclear future that contributes significantly to power for peace and prosperity.
A new R function, exsic, to assist taxonomists in creating indices
USDA-ARS's Scientific Manuscript database
Taxonomists manage large amounts of specimen data. This is usually initiated in spreadsheets and then converted for publication into locality lists and in indices to associate collectors and collector numbers from herbarium sheets to identifications, a format technically termed an exsiccate list. Th...
How to leverage a bad inventory situation.
Horsfall, G A
1998-11-01
Small manufacturing companies have a hard time taking advantage of the price breaks that result from large purchase orders. Besides the greater amount of money involved, purchasing large quantities of items demands additional space for storing the items. This article describes a company that created a separate inventory management and finance company to provide inventory management services to itself and to market these services to other small companies in its area.
MERCURY IN STAMP SAND DISCHARGES: IMPLICATIONS FOR LAKE SUPERIOR MERCURY CYCLING
Approximately a half billion tons of waste rock from the extraction of native copper and silver ores was discharged into the Lake Superior basin. Stamping was the method of choice to recover these metals from the surrounding poor rock. This process created large amounts of extre...
Social Studies Special Issue: Civic Literacy in a Digital Age
ERIC Educational Resources Information Center
VanFossen, Phillip J.; Berson, Michael J.
2008-01-01
Young people today consume large amounts of information through various media outlets and simultaneously create and distribute their own messages via information and communication technologies and massively multiplayer online gaming. In doing so, these "digital natives" are often exposed to violent, racist, or other deleterious messages.…
ERIC Educational Resources Information Center
Andrade, Alejandro; Danish, Joshua A.; Maltese, Adam V.
2017-01-01
Interactive learning environments with body-centric technologies lie at the intersection of the design of embodied learning activities and multimodal learning analytics. Sensing technologies can generate large amounts of fine-grained data automatically captured from student movements. Researchers can use these fine-grained data to create a…
ERIC Educational Resources Information Center
Lu, Hsin-Min
2010-01-01
Deep penetration of personal computers, data communication networks, and the Internet has created a massive platform for data collection, dissemination, storage, and retrieval. Large amounts of textual data are now available at a very low cost. Valuable information, such as consumer preferences, new product developments, trends, and opportunities,…
Uncovering Problems and Identifying Coping Strategies of Middle Eastern University Students
ERIC Educational Resources Information Center
Alazzi, Khaled; Chiodo, John J.
2006-01-01
For international college students, the failure to achieve their educational goals regarding their program of study creates a large amount of stress. These international students experience pressure to succeed from their families, sponsoring agencies, or even the communities from their home country. For Middle Eastern students who come to study at…
Human-Level Natural Language Understanding: False Progress and Real Challenges
ERIC Educational Resources Information Center
Bignoli, Perrin G.
2013-01-01
The field of Natural Language Processing (NLP) focuses on the study of how utterances composed of human-level languages can be understood and generated. Typically, there are considered to be three intertwined levels of structure that interact to create meaning in language: syntax, semantics, and pragmatics. Not only is a large amount of…
Soil CO2 production in upland tundra where permafrost is thawing
Hanna Lee; Edward A.G. Schuur; Jason G. Vogel
2010-01-01
Permafrost soils store nearly half of global soil carbon (C), and therefore permafrost thawing could lead to large amounts of greenhouse gas emissions via decomposition of soil organic matter. When ice-rich permafrost thaws, it creates a localized surface subsidence called thermokarst terrain, which changes the soil microenvironment. We used soil profile CO2...
Autoclave Meltout of Cast Explosives
1996-08-22
various tanks, kettles, and pelletizing equipment a usable product was recovered. This process creates large amounts of pink water requiring...vacuum treatment melt kettles, flaker belts, and improved material handling equipment in an integrated system. During the 1976/1977 period, AED...McAlester Army Ammo Plant, Oklahoma, to discuss proposed workload and inspect available facilities and equipment. Pilot model production and testing
ERIC Educational Resources Information Center
Zhang, Rui
2013-01-01
The widespread adoption of Electronic Health Record (EHR) has resulted in rapid text proliferation within clinical care. Clinicians' use of copying and pasting functions in EHR systems further compounds this by creating a large amount of redundant clinical information in clinical documents. A mixture of redundant information (especially outdated…
Demands for quick and accurate life cycle assessments create a need for methods to rapidly generate reliable life cycle inventories (LCI). Data mining is a suitable tool for this purpose, especially given the large amount of available governmental data. These data are typically a...
ERIC Educational Resources Information Center
MOGUEROU, PHILIPPE
2005-01-01
In this article, we discuss the recent evolutions of science and engineering doctoral and postdoctoral education in Europe. Indeed, Ph.Ds are crucial to the conduct of research and innovation in the national innovation systems, as they provide a large amount of input into creating the competitive advantage, notably through basic research. First,…
Transient Stability of the US Western Interconnection with High Wind and Solar Generation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Clark, Kara; Miller, Nicholas W.; Shao, Miaolei
The addition of large amounts of wind and solar generation to bulk power systems that are traditionally subject to operating constraints set by transient limitations is the subject of considerable concern in the industry. The US Western Interconnection (WI) is expected to experience substantial additional growth in both wind and solar generation. These plants will, to some extent, displace large central station thermal generation, both coal and gas-fired, which have traditionally helped maintain stability. This paper reports the results of a study that investigated the transient stability of the WI with high penetrations of wind and solar generation. The main goals of this work were to (1) create a realistic, baseline model of the WI, (2) test selected transient stability events, (3) investigate the impact of large amounts of wind and solar generation, and (4) examine means to improve performance.
Fractional labelmaps for computing accurate dose volume histograms
NASA Astrophysics Data System (ADS)
Sunderland, Kyle; Pinter, Csaba; Lasso, Andras; Fichtinger, Gabor
2017-03-01
PURPOSE: In radiation therapy treatment planning systems, structures are represented as parallel 2D contours. For treatment planning algorithms, structures must be converted into labelmap (i.e. 3D image denoting structure inside/outside) representations. This is often done by triangulating a surface from contours, which is converted into a binary labelmap. This surface to binary labelmap conversion can cause large errors in small structures. Binary labelmaps are often represented using one byte per voxel, meaning a large amount of memory is unused. Our goal is to develop a fractional labelmap representation containing non-binary values, allowing more information to be stored in the same amount of memory. METHODS: We implemented an algorithm in 3D Slicer, which converts surfaces to fractional labelmaps by creating 216 binary labelmaps, changing the labelmap origin on each iteration. The binary labelmap values are summed to create the fractional labelmap. In addition, an algorithm is implemented in the SlicerRT toolkit that calculates dose volume histograms (DVH) using fractional labelmaps. RESULTS: We found that with manually segmented RANDO head and neck structures, fractional labelmaps represented structure volume up to 19.07% (average 6.81%) more accurately than binary labelmaps, while occupying the same amount of memory. When compared to baseline DVH from treatment planning software, DVH from fractional labelmaps had agreement acceptance percent (1% ΔD, 1% ΔV) up to 57.46% higher (average 4.33%) than DVH from binary labelmaps. CONCLUSION: Fractional labelmaps promise to be an effective method for structure representation, allowing considerably more information to be stored in the same amount of memory.
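The core of the method above is a supersampling trick: rasterize the same closed surface many times with sub-voxel shifts of the labelmap origin and average the binary results. The sketch below illustrates that idea in Python/NumPy, using an analytic sphere as a stand-in for the triangulated surface; it is not the 3D Slicer / SlicerRT implementation, and the 6x6x6 = 216 offsets simply mirror the count mentioned in the abstract.

```python
# Illustrative sketch of the fractional-labelmap idea: average many binary
# rasterizations of the same structure taken at shifted grid origins.
import numpy as np

def binary_labelmap_sphere(shape, center, radius, origin_shift):
    """Binary rasterization of a sphere with the voxel sampling points shifted by origin_shift."""
    zz, yy, xx = np.indices(shape).astype(float)
    zz += origin_shift[0]; yy += origin_shift[1]; xx += origin_shift[2]
    dist2 = (zz - center[0])**2 + (yy - center[1])**2 + (xx - center[2])**2
    return (dist2 <= radius**2).astype(np.float32)

def fractional_labelmap_sphere(shape, center, radius, steps=6):
    """Average steps**3 shifted binary labelmaps into one fractional labelmap."""
    offsets = (np.arange(steps) + 0.5) / steps - 0.5      # sub-voxel shifts in [-0.5, 0.5)
    acc = np.zeros(shape, dtype=np.float32)
    for dz in offsets:
        for dy in offsets:
            for dx in offsets:
                acc += binary_labelmap_sphere(shape, center, radius, (dz, dy, dx))
    return acc / steps**3                                  # values in [0, 1]; 216 levels for steps=6

if __name__ == "__main__":
    frac = fractional_labelmap_sphere((32, 32, 32), center=(16, 16, 16), radius=5.2)
    binary = (frac >= 0.5).astype(np.float32)
    # Partial-volume voxels at the boundary are exactly what a binary map cannot represent.
    print("fractional volume:", frac.sum(), "binary volume:", binary.sum())
```

The fractional voxel values near the boundary carry the partial-volume information that a thresholded binary map discards, which is why volumes and DVHs computed from them can be more accurate for small structures.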
A Data Mining Approach to Improve Re-Accessibility and Delivery of Learning Knowledge Objects
ERIC Educational Resources Information Center
Sabitha, Sai; Mehrotra, Deepti; Bansal, Abhay
2014-01-01
Today Learning Management Systems (LMS) have become an integral part of learning mechanism of both learning institutes and industry. A Learning Object (LO) can be one of the atomic components of LMS. A large amount of research is conducted into identifying benchmarks for creating Learning Objects. Some of the major concerns associated with LO are…
Shaking It up: How to Run the Best Club and Chapter Program
ERIC Educational Resources Information Center
Peterson, Erin
2012-01-01
Alumni clubs and chapters are powerful tools for keeping alumni connected to each other and the institution, gathering insight into what alumni want from their alma mater, and even raising money for the institution. And while alumni leaders do not need to devote a large amount of their budget to create successful groups, they do need to ensure…
Enhancing Defense Support of Civil Authorities within the National Capital Region
2011-04-04
terrorist organization's center of gravity, critical capabilities, critical requirements, and critical vulnerabilities. For example, to execute the...London Underground bombings (critical capability), the group (center of gravity) needed money (critical requirement/critical vulnerability). Usually...the movement of large amounts of money through banks, the internet, or via credit cards creates a critical vulnerability because law enforcement
Landspotting: Social gaming to collect vast amounts of data for satellite validation
NASA Astrophysics Data System (ADS)
Fritz, S.; Purgathofer, P.; Kayali, F.; Fellner, M.; Wimmer, M.; Sturn, T.; Triebnig, G.; Krause, S.; Schindler, F.; Kollegger, M.; Perger, C.; Dürauer, M.; Haberl, W.; See, L.; McCallum, I.
2012-04-01
At present there is no single satellite-derived global land cover product that is accurate enough to provide reliable estimates of forest or cropland area to determine, e.g., how much additional land is available to grow biofuels or to tackle problems of food security. The Landspotting Project aims to improve the quality of this land cover information by vastly increasing the amount of in-situ validation data available for calibration and validation of satellite-derived land cover. The Geo-Wiki (Geo-Wiki.org) system currently allows users to compare three satellite derived land cover products and validate them using Google Earth. However, there is presently no incentive for anyone to provide this data so the amount of validation through Geo-Wiki has been limited. However, recent competitions have proven that incentive driven campaigns can rapidly create large amounts of input. The LandSpotting Project is taking a truly innovative approach through the development of the Landspotting game. The game engages users whilst simultaneously collecting a large amount of in-situ land cover information. The development of the game is informed by the current raft of successful social gaming that is available on the internet and as mobile applications, many of which are geo-spatial in nature. Games that are integrated within a social networking site such as Facebook illustrate the power to reach and continually engage a large number of individuals. The number of active Facebook users is estimated to be greater than 400 million, where 100 million are accessing Facebook from mobile devices. The Landspotting Game has similar game mechanics as the famous strategy game "Civilization" (i.e. build, harvest, research, war, diplomacy, etc.). When a player wishes to make a settlement, they must first classify the land cover over the area they wish to settle. As the game is played on the earth surface with Google Maps, we are able to record and store this land cover/land use classification geographically. Every player can play the game for free (i.e. a massive multiplayer online game). Furthermore, it is a social game on Facebook (e.g. invite friends, send friends messages, purchase gifts, help friends, post messages onto the wall, etc). The game is played in a web browser, therefore it runs everywhere (where Flash is supported) without requiring the user to install anything additional. At the same time, the Geo-Wiki system will be modified to use the acquired in-situ validation information to create new outputs: a hybrid land cover map, which takes the best information from each individual product to create a single integrated version; a database of validation points that will be freely available to the land cover user community; and a facility that allows users to create a specific targeted validation area, which will then be provided to the crowdsourcing community for validation. These outputs will turn Geo-Wiki into a valuable system for earth system scientists.
Perfume formulation: words and chats.
Ellena, Céline
2008-06-01
What does it mean to create fragrances with materials from chemistry and/or from nature? How are they used to display their characteristic differences, their own personality? Is it easier to create with synthetic raw materials or with essential oils? This review explains why a perfume formulation corresponds in fact to a conversation, an interplay between synthetic and natural perfumery materials. A synthetic raw material carries a single information, and usually is very linear. Its smell is uniform, clear, and faithful. Natural raw materials, on the contrary, provide a strong, complex and generous image. While a synthetic material can be seen as a single word, a natural one such as rose oil could be compared to chatting: cold, warm, sticky, heavy, transparent, pepper, green, metallic, smooth, watery, fruity... full of information. Yet, if a very small amount of the natural material is used, nothing happens, the fragrance will not change. However, if a large amount is used, the rose oil will swallow up everything else. The fragrance will smell of nothing else except rose! To formulate a perfume is not to create a culinary recipe, with only dosing the ingredients in well-balanced amounts. To formulate rather means to flexibly knit materials together with a lively stitch, meeting or repelling each other, building a pleasant form, which is neither fixed, nor solid, nor rigid. A fragrance has an overall structure, which ranges from a clear sound, made up of stable, unique, and linear items, to a background chat, comfortable and reassuring. But that does, of course, not mean that there is only one way of creating a fragrance!
Evaluating the Cassandra NoSQL Database Approach for Genomic Data Persistency.
Aniceto, Rodrigo; Xavier, Rene; Guimarães, Valeria; Hondo, Fernanda; Holanda, Maristela; Walter, Maria Emilia; Lifschitz, Sérgio
2015-01-01
Rapid advances in high-throughput sequencing techniques have created interesting computational challenges in bioinformatics. One of them refers to management of massive amounts of data generated by automatic sequencers. We need to deal with the persistency of genomic data, particularly storing and analyzing these large-scale processed data. To find an alternative to the frequently considered relational database model becomes a compelling task. Other data models may be more effective when dealing with a very large amount of nonconventional data, especially for writing and retrieving operations. In this paper, we discuss the Cassandra NoSQL database approach for storing genomic data. We perform an analysis of persistency and I/O operations with real data, using the Cassandra database system. We also compare the results obtained with a classical relational database system and another NoSQL database approach, MongoDB.
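As a concrete illustration of the column-family approach the paper evaluates, the sketch below stores simple variant records in Cassandra with the Python cassandra-driver. The keyspace, table, and column names are hypothetical and are not taken from the paper; it only shows the write-heavy, prepared-statement style that suits data streaming off automatic sequencers.

```python
# Hypothetical sketch: persisting genomic variant records in Cassandra.
# Schema and identifiers are illustrative, not the paper's actual design.
from cassandra.cluster import Cluster

cluster = Cluster(["127.0.0.1"])          # assumes a local Cassandra node is running
session = cluster.connect()

session.execute("""
    CREATE KEYSPACE IF NOT EXISTS genomics
    WITH replication = {'class': 'SimpleStrategy', 'replication_factor': 1}
""")
session.set_keyspace("genomics")
session.execute("""
    CREATE TABLE IF NOT EXISTS variants (
        sample_id text, chrom text, pos bigint, ref text, alt text,
        PRIMARY KEY ((sample_id, chrom), pos)
    )
""")

# Prepared statements keep heavy write loads (the common case for sequencer output) cheap.
insert = session.prepare(
    "INSERT INTO variants (sample_id, chrom, pos, ref, alt) VALUES (?, ?, ?, ?, ?)"
)
session.execute(insert, ("NA12878", "chr1", 1014143, "C", "T"))

rows = session.execute(
    "SELECT pos, ref, alt FROM variants WHERE sample_id = %s AND chrom = %s",
    ("NA12878", "chr1"),
)
for row in rows:
    print(row.pos, row.ref, row.alt)
cluster.shutdown()
```

The partition key groups all positions of one sample/chromosome on the same nodes, which favors the sequential writes and range reads typical of genomic workloads.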
Advantages of Fast Ignition Scenarios with Two Hot Spots for Space Propulsion Systems
NASA Astrophysics Data System (ADS)
Shmatov, M. L.
The use of the fast ignition scenarios with the attempts to create two hot spots in one blob of the compressed thermonuclear fuel or, briefly, scenarios with two hot spots in space propulsion systems is proposed. The model, predicting that for such scenarios the probability pf of failure of ignition of thermonuclear microexplosion can be significantly less than that for the similar scenarios with the attempts to create one hot spot in one blob of the compressed fuel, is presented. For space propulsion systems consuming a relatively large amount of propellant, a decrease in pf due to the choice of the scenario with two hot spots can result in large, for example, two-fold, increase in the payload mass. Other advantages of the scenarios with two hot spots and some problems related to them are considered.
Single event upset in avionics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Taber, A.; Normand, E.
1993-04-01
Data from military/experimental flights and laboratory testing indicate that typical non radiation-hardened 64K and 256K static random access memories (SRAMs) can experience a significant soft upset rate at aircraft altitudes due to energetic neutrons created by cosmic ray interactions in the atmosphere. It is suggested that error detection and correction (EDAC) circuitry be considered for all avionics designs containing large amounts of semi-conductor memory.
Frequency-Modulated Microwave Photonic Links with Direct Detection: Review and Theory
2010-12-15
create large amounts of signal distortion. Alternatives to MZIs have been proposed, including Fabry-Perot interferometers, fiber Bragg gratings (FBGs...multiplexed, analog signals for applications in cable television distribution. Experimental results for a Fabry-Perot discriminated, FM subcarrier...multiplexed system were presented by [17]. An array of optical frequency modulated DFB lasers and a Fabry-Perot discriminator were used to transmit and
David K. Weaver; Micaela Buteler; Megan L. Hofland; Justin B. Runyon; Christian Nansen; Luther E. Talbert; Peggy Lamb; Gregg R. Carlson
2009-01-01
The wheat stem sawfly, Cephus cinctus Norton, causes severe losses in wheat grown in the northern Great Plains. Much of the affected area is planted in monoculture with wheat, Triticum aestivum L., grown in large fields alternating yearly between crop and no-till fallow. The crop and fallow fields are adjacent. This cropping landscape creates pronounced edge effects of...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lehto, J.; Ikaeheimonen, T.K.; Salbu, B.
The fallout from a major nuclear accident at a nuclear plant may result in a wide-scale contamination of the environment. Cleanup of contaminated areas is of special importance if these areas are populated or cultivated. All cleanup measures generate high amounts of radioactive waste, which have to be treated and disposed of in a safe manner. Scenarios assessing the amounts and activity concentrations of radioactive wastes for various cleanup measures after severe nuclear accidents have been worked out for urban, forest and agricultural areas. These scenarios are based on contamination levels and areas of contaminated lands from a model accident, which simulates a worst case accident at a nuclear power plant. Amounts and activity concentrations of cleanup wastes are not only dependent on the contamination levels and areas of affected lands, but also on the type of deposition, wet or dry, on the time between the deposition and the cleanup work, on the season at which the deposition took place, and finally on the level of cleanup work. In this study practically all types of cleanup wastes were considered, whether or not the corresponding cleanup measures are cost-effective or justified. All cleanup measures are shown to create large amounts of radioactive wastes, but the amounts, as well as the activity concentrations, vary widely from case to case.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miller, Nicholas W.; Shao, Miaolei; Pajic, Slobodan
The addition of large amounts of wind and solar generation to bulk power systems that are traditionally subject to operating constraints set by transient stability and frequency response limitations is the subject of considerable concern in the industry. The US Western Interconnection (WI) is expected to experience substantial additional growth in both wind and solar generation. These plants will, to some extent, displace large central station thermal generation, both coal and gas-fired, which have traditionally helped maintain stability. This paper reports the results of a study that investigated the transient stability and frequency response of the WI with high penetrations of wind and solar generation. The main goals of this work were to (1) create a realistic, baseline model of the WI, (2) test selected transient stability and frequency events, (3) investigate the impact of large amounts of wind and solar generation, and (4) examine means to improve performance.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Clark, Kara; Miller, Nicholas W.; Shao, Miaolei
Adding large amounts of wind and solar generation to bulk power systems that are traditionally subject to operating constraints set by transient stability and frequency response limitations is the subject of considerable concern in the industry. The US Western Interconnection (WI) is expected to experience substantial additional growth in both wind and solar generation. These plants will, to some extent, displace large central station thermal generation, both coal and gas-fired, which have traditionally helped maintain stability. Our paper reports the results of a study that investigated the transient stability and frequency response of the WI with high penetrations of wind and solar generation. Moreover, the main goals of this work were to (1) create a realistic, baseline model of the WI, (2) test selected transient stability and frequency events, (3) investigate the impact of large amounts of wind and solar generation, and (4) examine means to improve performance.
Automating the Generation of the Cassini Tour Atlas Database
NASA Technical Reports Server (NTRS)
Grazier, Kevin R.; Roumeliotis, Chris; Lange, Robert D.
2010-01-01
The Tour Atlas is a large database of geometrical tables, plots, and graphics used by Cassini science planning engineers and scientists primarily for science observation planning. Over time, as the contents of the Tour Atlas grew, the amount of time it took to recreate the Tour Atlas similarly grew--to the point that it took one person a week of effort. When Cassini tour designers estimated that they were going to create approximately 30 candidate Extended Mission trajectories--which needed to be analyzed for science return in a short amount of time--it became a necessity to automate. We report on the automation methodology that reduced the amount of time it took one person to (re)generate a Tour Atlas from a week to, literally, one UNIX command.
Shumaker, L; Fetterolf, D E; Suhrie, J
1998-01-01
The recent availability of inexpensive document scanners and optical character recognition technology has created the ability to process surveys in large numbers with a minimum of operator time. Programs, which allow computer entry of such scanned questionnaire results directly into PC based relational databases, have further made it possible to quickly collect and analyze significant amounts of information. We have created an internal capability to easily generate survey data and conduct surveillance across a number of medical practice sites within a managed care/practice management organization. Patient satisfaction surveys, referring physician surveys and a variety of other evidence gathering tools have been deployed.
Studies on urine treatment by biological purification using Azolla and UV photocatalytic oxidation
NASA Astrophysics Data System (ADS)
Liu, Xiaofeng; Chen, Min; Bian, Zuliang; Liu, Chung-Chu
The amount of water consumed in space station operations is very large. In order to reduce the amount of water that must be resupplied from Earth, the space station needs to resolve the problems of water supply. For this reason, the recovery, regeneration and utilization of the urine of astronauts are of key importance. Many investigations on this subject have been reported. Our research is based on biological absorption and purification, combined with UV photocatalytic oxidation techniques, to achieve comprehensive treatment of urine. In the treatment apparatus we created, the urine solution is used as part of the nutrient solution for the biological components in our bioregenerative life support system. After being absorbed, the nutrients from the urine were decomposed, metabolized and purified, which creates a favorable condition for the follow-up treatment by UV photocatalytic oxidation. After these two processes, the treated urine solution reached Chinese national standards for drinking water quality (GB5749-1985).
Knowledge will Propel Machine Understanding of Content: Extrapolating from Current Examples
Sheth, Amit; Perera, Sujan; Wijeratne, Sanjaya; Thirunarayan, Krishnaprasad
2018-01-01
Machine Learning has been a big success story during the AI resurgence. One particular stand out success relates to learning from a massive amount of data. In spite of early assertions of the unreasonable effectiveness of data, there is increasing recognition for utilizing knowledge whenever it is available or can be created purposefully. In this paper, we discuss the indispensable role of knowledge for deeper understanding of content where (i) large amounts of training data are unavailable, (ii) the objects to be recognized are complex, (e.g., implicit entities and highly subjective content), and (iii) applications need to use complementary or related data in multiple modalities/media. What brings us to the cusp of rapid progress is our ability to (a) create relevant and reliable knowledge and (b) carefully exploit knowledge to enhance ML/NLP techniques. Using diverse examples, we seek to foretell unprecedented progress in our ability for deeper understanding and exploitation of multimodal data and continued incorporation of knowledge in learning techniques.
Selected Papers on Low-Energy Antiprotons and Possible Applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Noble, Robert
1998-09-19
The only realistic means by which to create a facility at Fermilab to produce large amounts of low energy antiprotons is to use resources which already exist. There is simply too little money and manpower at this point in time to generate new accelerators on a time scale before the turn of the century. Therefore, innovation is required to modify existing equipment to provide the services required by experimenters.
Transparency of an instantaneously created electron-positron-photon plasma
NASA Astrophysics Data System (ADS)
Bégué, D.; Vereshchagin, G. V.
2014-03-01
The problem of the expansion of a relativistic plasma generated when a large amount of energy is released in a small volume has been considered by many authors. We use the analytical solution of Bisnovatyi-Kogan and Murzina for the spherically symmetric relativistic expansion. The light curves and the spectra from transparency of an electron-positron-photon plasma are obtained. We compare our results with the work of Goodman.
Development of Telecommunications of Prao ASC Lpi RAS
NASA Astrophysics Data System (ADS)
Isaev, E. A.; Dumskiy, D. V.; Likhachev, S. F.; Shatskaya, M. V.; Pugachev, V. D.; Samodurov, V. A.
The new modern and reliable data storage system was acquired in 2010 in order to develop the internal telecommunication resources of the Observatory. The system is designed to store large amounts of observation data obtained from the three radio-astronomy complexes (PT-22, DKR-1000 and BSA). The digital switching system "Elcom" is installed in the Pushchino Radio Astronomy Observatory to provide the observatory with telephone communications. The phone communication between buildings of the observatory is carried out over fiber-optic data links using IP telephony. The direct optical channel from tracking station RT-22 in Pushchino to the Moscow processing center has been created and put into operation to transfer large amounts of data at the final stage of the establishment of the ground infrastructure for the international space project "Radioastron". A separate backup system for processing and storing data is organized in Pushchino Radio Astronomy Observatory to eliminate data loss during communication sessions with the Space Telescope.
Logistics and quality control for DNA sampling in large multicenter studies.
Nederhand, R J; Droog, S; Kluft, C; Simoons, M L; de Maat, M P M
2003-05-01
To study associations between genetic variation and disease, large bio-banks need to be created in multicenter studies. Therefore, we studied the effects of storage time and temperature on DNA quality and quantity in a simulation experiment with storage up to 28 days frozen, at 4 degrees C, and at room temperature. In the simulation experiment, the conditions did not influence the amount or quality of DNA to an unsatisfactory level. However, the amount of extracted DNA was decreased in frozen samples and in samples that were stored for > 7 days at room temperature. In a sample of patients from 24 countries of the EUROPA trial, obtained by mail with transport times up to 1 month, DNA yield and quality were adequate. From these results we conclude that transport of non-frozen blood by ordinary mail is usable and practical for DNA isolation for polymerase chain reaction in clinical and epidemiological studies.
Software design for analysis of multichannel intracardial and body surface electrocardiograms.
Potse, Mark; Linnenbank, André C; Grimbergen, Cornelis A
2002-11-01
Analysis of multichannel ECG recordings (body surface maps (BSMs) and intracardial maps) requires special software. We created a software package and a user interface on top of a commercial data analysis package (MATLAB) by a combination of high-level and low-level programming. Our software was created to satisfy the needs of a diverse group of researchers. It can handle a large variety of recording configurations. It allows for interactive usage through a fast and robust user interface, and batch processing for the analysis of large amounts of data. The package is user-extensible, includes routines for both common and experimental data processing tasks, and works on several computer platforms. The source code is made intelligible using software for structured documentation and is available to the users. The package is currently used by more than ten research groups analysing ECG data worldwide.
Implementing Solar Technologies at Airports
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kandt, A.; Romero, R.
2014-07-01
Federal agencies, such as the Department of Defense and Department of Homeland Security, as well as numerous private entities are actively pursuing the installation of solar technologies to help reduce fossil fuel energy use and associated emissions, meet sustainability goals, and create more robust or reliable operations. One potential approach identified for siting solar technologies is the installation of solar energy technologies at airports and airfields, which present a significant opportunity for hosting solar technologies due to large amounts of open land. This report focuses largely on the Federal Aviation Administration's (FAA's) policies toward siting solar technologies at airports.
Optimization of a Monte Carlo Model of the Transient Reactor Test Facility
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, Kristin; DeHart, Mark; Goluoglu, Sedat
2017-03-01
The ultimate goal of modeling and simulation is to obtain reasonable answers to problems that don’t have representations which can be easily evaluated while minimizing the amount of computational resources. With the advances during the last twenty years of large scale computing centers, researchers have had the ability to create a multitude of tools to minimize the number of approximations necessary when modeling a system. The tremendous power of these centers requires the user to possess an immense amount of knowledge to optimize the models for accuracy and efficiency. This paper seeks to evaluate the KENO model of TREAT to optimize calculational efforts.
Dust in the Sky: Atmospheric Composition. Modeling of Aerosol Optical Thickness
NASA Technical Reports Server (NTRS)
Chin, Mian; Ginoux, Paul; Kinne, Stefan; Torres, Omar; Holben, Brent; Duncan, Bryan; Martin, Randall; Logan, Jennifer; Higurashi, Akiko; Nakajima, Teruyuki
2000-01-01
Aerosol is any small particle of matter that rests suspended in the atmosphere. Natural sources, such as deserts, create some aerosols; consumption of fossil fuels and industrial activity create other aerosols. All the microscopic aerosol particles add up to a large amount of material floating in the atmosphere. You can see the particles in the haze that floats over polluted cities. Beyond this visible effect, aerosols can actually lower temperatures. They do this by blocking, or scattering, a portion of the sun's energy from reaching the surface. Because of this influence, scientists study the physical properties of atmospheric aerosols. Reliable numerical models for atmospheric aerosols play an important role in research.
NASA Astrophysics Data System (ADS)
Latinović, T. S.; Preradović, D. M.; Barz, C. R.; Latinović, M. T.; Petrica, P. P.; Pop-Vadean, A.
2016-08-01
The amount of data at the global level has grown exponentially. Along with this phenomenon, we need new units of measure such as the exabyte, zettabyte, and yottabyte to express the amount of data. The growth of data creates a situation in which the classic systems for the collection, storage, processing, and visualization of data are losing the battle with the large amount, speed, and variety of data that is generated continuously. Much of this data is created by the Internet of Things, IoT (cameras, satellites, cars, GPS navigation, etc.). It is our challenge to come up with new technologies and tools for the management and exploitation of these large amounts of data. Big Data has been a hot topic in IT circles in recent years. However, Big Data is also recognized in the business world, and increasingly in public administration. This paper proposes an ontology of big data analytics and examines how to enhance business intelligence through big data analytics as a service by presenting a big data analytics services-oriented architecture. This paper also discusses the interrelationship between business intelligence and big data analytics. The proposed approach in this paper might facilitate the research and development of business analytics, big data analytics, and business intelligence as well as intelligent agents.
26 CFR 1.263(a)-4 - Amounts paid to acquire or create intangibles.
Code of Federal Regulations, 2013 CFR
2013-04-01
... the meaning of this paragraph (b)(3). (v) Creation of package design. Amounts paid to develop a package design are treated as amounts that do not create a separate and distinct intangible asset within the meaning of this paragraph (b)(3). For purposes of this section, the term package design means the...
26 CFR 1.263(a)-4 - Amounts paid to acquire or create intangibles.
Code of Federal Regulations, 2014 CFR
2014-04-01
... the meaning of this paragraph (b)(3). (v) Creation of package design. Amounts paid to develop a package design are treated as amounts that do not create a separate and distinct intangible asset within the meaning of this paragraph (b)(3). For purposes of this section, the term package design means the...
26 CFR 1.263(a)-4 - Amounts paid to acquire or create intangibles.
Code of Federal Regulations, 2012 CFR
2012-04-01
... the meaning of this paragraph (b)(3). (v) Creation of package design. Amounts paid to develop a package design are treated as amounts that do not create a separate and distinct intangible asset within the meaning of this paragraph (b)(3). For purposes of this section, the term package design means the...
Engineering Design Handbook. Development Guide for Reliability. Part Two. Design for Reliability
1976-01-01
Component failure rates, however, have been recorded by many sources as a function of use and environment. Some of these sources are listed in Refs. 13-17...other systems capable of creating an explosive reaction. The second category is fairly obvious and includes many variations on methods for providing...about them. 4. Ability to detect signals (including patterns) in high noise environments. 5. Ability to store large amounts of information for long
NASA Technical Reports Server (NTRS)
Pope, Kevin O.
1994-01-01
The Chicxulub Crater in Yucatan, Mexico, is the primary candidate for the impact that caused mass extinctions at the Cretaceous/Tertiary boundary. The target rocks at Chicxulub contain 750 to 1500 m of anhydrite (CaSO4), which was vaporized upon impact, creating a large sulfuric acid aerosol cloud. In this study we apply a hydrocode model of asteroid impact to calculate the amount of sulfuric acid produced. We then apply a radiative transfer model to determine the atmospheric effects. Results include 6 to 9 month period of darkness followed by 12 to 26 years of cooling.
The Livermore Brain: Massive Deep Learning Networks Enabled by High Performance Computing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Barry Y.
The proliferation of inexpensive sensor technologies like the ubiquitous digital image sensors has resulted in the collection and sharing of vast amounts of unsorted and unexploited raw data. Companies and governments who are able to collect and make sense of large datasets to help them make better decisions more rapidly will have a competitive advantage in the information era. Machine Learning technologies play a critical role for automating the data understanding process; however, to be maximally effective, useful intermediate representations of the data are required. These representations or “features” are transformations of the raw data into a form where patterns are more easily recognized. Recent breakthroughs in Deep Learning have made it possible to learn these features from large amounts of labeled data. The focus of this project is to develop and extend Deep Learning algorithms for learning features from vast amounts of unlabeled data and to develop the HPC neural network training platform to support the training of massive network models. This LDRD project succeeded in developing new unsupervised feature learning algorithms for images and video and created a scalable neural network training toolkit for HPC. Additionally, this LDRD helped create the world’s largest freely-available image and video dataset supporting open multimedia research and used this dataset for training our deep neural networks. This research helped LLNL capture several work-for-others (WFO) projects, attract new talent, and establish collaborations with leading academic and commercial partners. Finally, this project demonstrated the successful training of the largest unsupervised image neural network using HPC resources and helped establish LLNL leadership at the intersection of Machine Learning and HPC research.
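For readers unfamiliar with unsupervised feature learning, the toy sketch below shows the basic idea the abstract refers to: an autoencoder trained only on unlabeled images, whose bottleneck activations become learned "features" for later recognition tasks. This minimal PyTorch example with random stand-in data is an illustration only; it is not the project's networks or its HPC training toolkit.

```python
# Minimal sketch of unsupervised feature learning with a small autoencoder.
import torch
import torch.nn as nn

class AutoEncoder(nn.Module):
    def __init__(self, n_pixels=32 * 32, n_features=64):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(n_pixels, 256), nn.ReLU(),
                                     nn.Linear(256, n_features))
        self.decoder = nn.Sequential(nn.Linear(n_features, 256), nn.ReLU(),
                                     nn.Linear(256, n_pixels))

    def forward(self, x):
        z = self.encoder(x)            # learned feature representation
        return self.decoder(z), z

model = AutoEncoder()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()
images = torch.rand(512, 32 * 32)      # stand-in for a large unlabeled image corpus

for epoch in range(5):
    recon, _ = model(images)
    loss = loss_fn(recon, images)      # reconstruction error requires no labels
    opt.zero_grad()
    loss.backward()
    opt.step()

features = model(images)[1].detach()   # features usable by downstream classifiers
print(features.shape)
```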
A Primer on Autonomous Aerial Vehicle Design
Coppejans, Hugo H. G.; Myburgh, Herman C.
2015-01-01
There is a large amount of research currently being done on autonomous micro-aerial vehicles (MAV), such as quadrotor helicopters or quadcopters. The ability to create a working autonomous MAV depends mainly on integrating a simultaneous localization and mapping (SLAM) solution with the rest of the system. This paper provides an introduction for creating an autonomous MAV for enclosed environments, aimed at students and professionals alike. The standard autonomous system and MAV automation are discussed, while we focus on the core concepts of SLAM systems and trajectory planning algorithms. The advantages and disadvantages of using remote processing are evaluated, and recommendations are made regarding the viability of on-board processing. Recommendations are made regarding best practices to serve as a guideline for aspirant MAV designers. PMID:26633410
A large-scale solar dynamics observatory image dataset for computer vision applications.
Kucuk, Ahmet; Banda, Juan M; Angryk, Rafal A
2017-01-01
The National Aeronautics and Space Administration (NASA) Solar Dynamics Observatory (SDO) mission has given us unprecedented insight into the Sun's activity. By capturing approximately 70,000 images a day, this mission has created one of the richest and biggest repositories of solar image data available to mankind. With such massive amounts of information, researchers have been able to produce great advances in detecting solar events. In this resource, we compile SDO solar data into a single repository in order to provide the computer vision community with a standardized and curated large-scale dataset of several hundred thousand solar events found on high resolution solar images. This publicly available resource, along with the generation source code, will accelerate computer vision research on NASA's solar image data by reducing the amount of time spent performing data acquisition and curation from the multiple sources we have compiled. By improving the quality of the data with thorough curation, we anticipate a wider adoption and interest from both the computer vision and solar physics communities.
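For context, the sketch below shows one plausible way to pull a small sample of SDO/AIA images for experimentation using the sunpy library (assumed to be installed, along with astropy). It is only an example of raw data acquisition; it is not the curation pipeline used to build the dataset described above, which compiles events and images from multiple sources.

```python
# One plausible way to fetch a small sample of SDO/AIA images with sunpy's Fido client.
import astropy.units as u
from sunpy.net import Fido, attrs as a

result = Fido.search(
    a.Time("2012-03-07 00:00", "2012-03-07 01:00"),   # an arbitrary well-observed hour
    a.Instrument("AIA"),
    a.Wavelength(171 * u.angstrom),
    a.Sample(10 * u.minute),                          # thin to one image per 10 minutes
)
print(result)                                         # summary of matching records
files = Fido.fetch(result, path="./sdo_aia_171/{file}")
print(f"downloaded {len(files)} FITS files")
```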
Rehabilitation of the Overhead Athlete’s Elbow
Wilk, Kevin E.; Macrina, Leonard C.; Cain, E. Lyle; Dugas, Jeffrey R.; Andrews, James R.
2012-01-01
The activities required during overhead sports, particularly during baseball pitching, produce large forces at the elbow joint. Injuries to the elbow joint frequently occur in the overhead athlete because of the large amount of forces observed during the act of throwing, playing tennis, or playing golf. Injuries may result because of repetitive overuse, leading to tissue failure. Rehabilitation following injury or surgery to the throwing elbow is vital to fully restore normal function and return the athlete to competition as quickly and safely as possible. Rehabilitation of the elbow, whether following injury or postsurgical, must follow a progressive and sequential order, building on the previous phase, to ensure that healing tissues are not compromised. Emphasis is placed on restoring full motion, muscular strength, and neuromuscular control while gradually applying loads to healing tissue. In addition, when one is creating a rehabilitation plan for athletes, it is imperative to treat the entire upper extremity, core, and legs to create and dissipate the forces generated at each joint. PMID:23016113
The emergence of understanding in a computer model of concepts and analogy-making
NASA Astrophysics Data System (ADS)
Mitchell, Melanie; Hofstadter, Douglas R.
1990-06-01
This paper describes Copycat, a computer model of the mental mechanisms underlying the fluidity and adaptability of the human conceptual system in the context of analogy-making. Copycat creates analogies between idealized situations in a microworld that has been designed to capture and isolate many of the central issues of analogy-making. In Copycat, an understanding of the essence of a situation and the recognition of deep similarity between two superficially different situations emerge from the interaction of a large number of perceptual agents with an associative, overlapping, and context-sensitive network of concepts. Central features of the model are: a high degree of parallelism; competition and cooperation among a large number of small, locally acting agents that together create a global understanding of the situation at hand; and a computational temperature that measures the amount of perceptual organization as processing proceeds and that in turn controls the degree of randomness with which decisions are made in the system.
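The "computational temperature" described above can be illustrated with a small, self-contained sketch: the same set of competing options is sampled almost uniformly at high temperature and nearly deterministically once temperature falls. This is a conceptual illustration only, not code from the Copycat model, and the exponential weighting is an assumption chosen for clarity.

```python
# Conceptual sketch of temperature-controlled stochastic choice among competing structures.
import math
import random

def choose(weights, temperature):
    """Pick an index with probability proportional to exp(weight / temperature)."""
    scores = [math.exp(w / max(temperature, 1e-6)) for w in weights]
    total = sum(scores)
    r = random.uniform(0.0, total)
    acc = 0.0
    for i, s in enumerate(scores):
        acc += s
        if r <= acc:
            return i
    return len(scores) - 1

strengths = [1.0, 2.0, 5.0]                 # strengths of competing perceptual structures
for t in (10.0, 1.0, 0.1):                  # temperature drops as organization increases
    picks = [choose(strengths, t) for _ in range(10000)]
    share_best = picks.count(2) / len(picks)
    print(f"temperature {t:>4}: strongest option chosen {share_best:.0%} of the time")
```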
NASA Astrophysics Data System (ADS)
Lambs, Luc; Muller, Etienne; Fromard, F.
2007-08-01
Summary: French Guiana is notable for the extent of its rain forests, which occupy 97% of the country, and the influence of the Amazon along its shores. In fact, the shores and estuaries support a mangrove forest typical of saline conditions. This paper reports the chemical characteristics, conductivity and salinity and the stable isotopes (oxygen and deuterium) of the rivers and shores between the Cayenne area and the border with Surinam. The results show a quite homogenous freshwater pool over the country. However, the low slope of the coast, a result of the wide mud banks deposited by the Amazonian plume, has turned the mouths of the smaller rivers to the northwest, creating large salty areas where mangroves grow several kilometers inland. Despite the large amount of Amazonian water, the Guianan coast exhibits high salinity. In fact, the freshwater itself remains far from the shore, following the north Brazilian current, while only the mud plume arrives at the coast, creating this paradox.
New research discovery may mean less radioactive contamination, safer nuclear power plants
DOE Office of Scientific and Technical Information (OSTI.GOV)
Murph, S.
Murph has now made another nanoparticle breakthrough that could benefit various work environments such as nuclear power plants. Murph and her team have created nanoparticle-treated stainless steel filters that are capable of capturing radioactive vapor materials. Just like air filters capture dust and dirt, these filters are capable of capturing large amounts of radioactive vapors. The new research may one day mean that nuclear power plant workers, and other workers in related fields, will have a safer working environment.
The evolution of educational information systems and nurse faculty roles.
Nelson, Ramona; Meyers, Linda; Rizzolo, Mary Anne; Rutar, Pamela; Proto, Marcia B; Newbold, Susan
2006-01-01
Institutions of higher education are purchasing and/or designing sophisticated administrative information systems to manage such functions as the application, admissions, and registration process, grants management, student records, and classroom scheduling. Although faculty also manage large amounts of data, few automated systems have been created to help faculty improve teaching and learning through the management of information related to individual students, the curriculum, educational programs, and program evaluation. This article highlights the potential benefits that comprehensive educational information systems offer nurse faculty.
Privacy Challenges of Genomic Big Data.
Shen, Hong; Ma, Jian
2017-01-01
With the rapid advancement of high-throughput DNA sequencing technologies, genomics has become a big data discipline where large-scale genetic information on human individuals can be obtained efficiently at low cost. However, such massive amounts of personal genomic data create tremendous challenges for privacy, especially given the emergence of the direct-to-consumer (DTC) industry that provides genetic testing services. Here we review the recent development of genomic big data and its implications for privacy. We also discuss the current dilemmas and future challenges of genomic privacy.
Robust Coordination for Large Sets of Simple Rovers
NASA Technical Reports Server (NTRS)
Tumer, Kagan; Agogino, Adrian
2006-01-01
The ability to coordinate sets of rovers in an unknown environment is critical to the long-term success of many of NASA's exploration missions. Such coordination policies must have the ability to adapt in unmodeled or partially modeled domains and must be robust against environmental noise and rover failures. In addition, such coordination policies must accommodate a large number of rovers without excessive and burdensome hand-tuning. In this paper we present a distributed coordination method that addresses these issues in the domain of controlling a set of simple rovers. The application of these methods allows reliable and efficient robotic exploration in dangerous, dynamic, and previously unexplored domains. Most control policies for space missions are directly programmed by engineers or created through the use of planning tools, and are appropriate for single-rover missions or missions requiring the coordination of a small number of rovers. Such methods typically require significant amounts of domain knowledge and are difficult to scale to large numbers of rovers. The method described in this article aims to address cases where a large number of rovers need to coordinate to solve a complex, time-dependent problem in a noisy environment. In this approach, each rover decomposes a global utility, representing the overall goal of the system, into rover-specific utilities that properly assign credit to the rover's actions. Each rover then has the responsibility to create a control policy that maximizes its own rover-specific utility. We show a method of creating rover utilities that are "aligned" with the global utility, such that when the rovers maximize their own utility, they also maximize the global utility. In addition, we show that our method creates rover utilities that allow the rovers to create their control policies quickly and reliably. Our distributed learning method allows large sets of rovers to be used in unmodeled domains, while providing robustness against rover failures and changing environments. In experimental simulations we show that our method scales well with large numbers of rovers, in addition to being robust against noisy sensor inputs and noisy servo control. The results show that our method is able to scale to large numbers of rovers and achieves up to 400% performance improvement over standard machine learning methods.
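One common way to build such "aligned" rover utilities (offered here only as an illustrative sketch, since the abstract does not spell out the exact utilities used) is a difference-style utility: each rover is rewarded by the global utility minus the global utility recomputed with that rover's contribution removed. The toy global utility and observation encoding below are invented for the example.

```python
from typing import Callable, Dict, List

Observation = List[float]   # e.g., points of interest observed by one rover

def difference_utility(global_utility: Callable[[Dict[str, Observation]], float],
                       joint_state: Dict[str, Observation],
                       rover_id: str) -> float:
    """Illustrative 'aligned' rover utility: global utility of the full system
    minus the global utility with this rover's contribution removed. A rover
    that improves the global utility always improves its own utility."""
    g_full = global_utility(joint_state)
    reduced = {rid: obs for rid, obs in joint_state.items() if rid != rover_id}
    g_without = global_utility(reduced)
    return g_full - g_without

# Toy global utility: number of unique points of interest observed by the team.
def toy_global(joint_state: Dict[str, Observation]) -> float:
    seen = {round(p, 3) for obs in joint_state.values() for p in obs}
    return float(len(seen))

state = {"rover1": [1.0, 2.0], "rover2": [2.0, 3.0]}
print(difference_utility(toy_global, state, "rover1"))  # 1.0: only point 1.0 is unique to rover1
```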
Cloud-enabled large-scale land surface model simulations with the NASA Land Information System
NASA Astrophysics Data System (ADS)
Duffy, D.; Vaughan, G.; Clark, M. P.; Peters-Lidard, C. D.; Nijssen, B.; Nearing, G. S.; Rheingrover, S.; Kumar, S.; Geiger, J. V.
2017-12-01
Developed by the Hydrological Sciences Laboratory at NASA Goddard Space Flight Center (GSFC), the Land Information System (LIS) is a high-performance software framework for terrestrial hydrology modeling and data assimilation. LIS provides the ability to integrate satellite and ground-based observational products and advanced modeling algorithms to extract land surface states and fluxes. Through a partnership with the National Center for Atmospheric Research (NCAR) and the University of Washington, the LIS model is currently being extended to include the Structure for Unifying Multiple Modeling Alternatives (SUMMA). With the addition of SUMMA in LIS, meaningful simulations containing a large multi-model ensemble will be enabled and can provide advanced probabilistic continental-domain modeling capabilities at spatial scales relevant for water managers. The resulting LIS/SUMMA application framework is difficult for non-experts to install due to the large number of dependencies on specific versions of operating systems, libraries, and compilers. This has created a significant barrier to entry for domain scientists who are interested in using the software on their own systems or in the cloud. In addition, the requirement to support multiple runtime environments across the LIS community has created a significant burden on the NASA team. To overcome these challenges, LIS/SUMMA has been deployed using Linux containers, which allow an entire software package along with all dependencies to be installed within a working runtime environment, and Kubernetes, which orchestrates the deployment of a cluster of containers. Within a cloud environment, users can now easily create a cluster of virtual machines and run large-scale LIS/SUMMA simulations. Installations that previously took weeks or months can now be performed in minutes. This presentation will discuss the steps required to create a cloud-enabled large-scale simulation, present examples of its use, and describe the potential deployment of this information technology with other NASA applications.
NASA Technical Reports Server (NTRS)
Niles, P.B.
2008-01-01
The chemistry, sedimentology, and geology of the Meridiani sedimentary deposits are best explained by eolian reworking of the sublimation residue of a large scale ice/dust deposit. This large ice deposit was located in close proximity to Terra Meridiani and incorporated large amounts of dust, sand, and SO2 aerosols generated by impacts and volcanism during early martian history. Sulfate formation and chemical weathering of the initial igneous material is hypothesized to have occurred inside of the ice when the darker mineral grains were heated by solar radiant energy. This created conditions in which small films of liquid water were created in and around the mineral grains. This water dissolved the SO2 and reacted with the mineral grains forming an acidic environment under low water/rock conditions. Subsequent sublimation of this ice deposit left behind large amounts of weathered sublimation residue which became the source material for the eolian process that deposited the Terra Meridiani deposit. The following features of the Meridiani sediments are best explained by this model: The large scale of the deposit, its mineralogic similarity across large distances, the cation-conservative nature of the weathering processes, the presence of acidic groundwaters on a basaltic planet, the accumulation of a thick sedimentary sequence outside of a topographic basin, and the low water/rock ratio needed to explain the presence of very soluble minerals and elements in the deposit. Remote sensing studies have linked the Meridiani deposits to a number of other martian surface features through mineralogic similarities, geomorphic similarities, and regional associations. These include layered deposits in Arabia Terra, interior layered deposits in the Valles Marineris system, southern Elysium/Aeolis, Amazonis Planitia, and the Hellas basin, Aram Chaos, Aureum Chaos, and Ioni Chaos. The common properties shared by these deposits suggest that all of these deposits share a common formation process which must have acted over a large area of Mars. The results of this study suggest a mechanism for volatile transport on Mars without invoking an early greenhouse. They also imply a common formation mechanism for most of the sulfate minerals and layered deposits on Mars, which explains their common occurrence.
Polymer Formulations for Cartilage Repair
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gutowska, Anna; Jasionowski, Marek; Morris, J. E.
2001-05-15
Regeneration of destroyed articular cartilage can be induced by transplantation of cartilage cells into a defect. The best results are obtained with the use of autologous cells. However, obtaining large amounts of autologous cartilage cells creates the problem of leaving a large cartilage defect at the donor site. Techniques are currently being developed to harvest a small number of cells and propagate them in vitro. It is a challenging task, however, because in ordinary cell culture on flat surfaces, chondrocytes do not maintain their in vivo phenotype and irreversibly diminish or cease the synthesis of aggregating proteoglycans. Therefore, research is continuing to develop culture conditions for chondrocytes with a preserved phenotype.
Hot working behavior of selective laser melted and laser metal deposited Inconel 718
NASA Astrophysics Data System (ADS)
Bambach, Markus; Sizova, Irina
2018-05-01
The production of nickel-based high-temperature components is of great importance for the transport and energy sectors. Forging of high-temperature alloys often requires expensive dies and multiple forming steps, and leads to forged parts with tolerances that require machining to create the final shape, producing a large amount of scrap. Additive manufacturing offers the possibility to print the desired shapes directly as net-shape components, requiring little additional machining effort. Especially for high-temperature alloys, which carry a large amount of energy per unit mass, additive manufacturing could be more energy-efficient than forging if the energy contained in the machining scrap exceeds the energy needed for powder production and laser processing. However, the microstructure and performance of 3D-printed parts will not reach the level of forged material unless further expensive processes such as hot isostatic pressing are used. Using the design freedom and the possibility to locally engineer material, additive manufacturing could be combined with forging operations into novel process chains, offering the possibility to reduce the number of forging steps and to create near-net-shape forgings with desired local properties. Some innovative process chains combining additive manufacturing and forging have been patented recently, but almost no scientific knowledge on the workability of 3D-printed preforms exists. The present study investigates the flow stress and microstructure evolution during hot working of preforms produced by laser powder deposition and selective laser melting and puts forward a model for the flow stress.
Sampayan, Stephen E.
1998-01-01
A hybrid emitter exploits the electric field created by a rapidly depoled ferroelectric material. Combining the emission properties of a planar thin film diamond emitter with a ferroelectric alleviates the present technological problems associated with both types of emitters and provides a robust, extremely long life, high current density cathode of the type required by emerging microwave power generation, accelerator technology and display applications. This new hybrid emitter is easy to fabricate and not susceptible to the same failures which plague microstructure field emitter technology. Local electrode geometries and electric field are determined independently from those for optimum transport and brightness preservation. Due to the large amount of surface charge created on the ferroelectric, the emitted electrons have significant energy, thus eliminating the requirement for specialized phosphors in emissive flat-panel displays.
Method for facilitating the introduction of material into cells
Holcomb, David E.; McKnight, Timothy E.
2000-01-01
The present invention is a method for creating a localized disruption within a boundary of a cell or structure. The boundary is exposed to a set of energetically charged particles while the energy of the particles is regulated so that they carry an amount of kinetic energy sufficient to create a localized disruption within an area of the boundary; upon creation of the localized disruption, the remaining kinetic energy decreases to an amount insufficient to cause further damage within the cell or structure beyond the boundary. The present invention is also a method for facilitating the introduction of a material into a cell or structure using the same methodology: the area of the boundary where the localized disruption was created is further excited to create a localized temporary opening, and the material is then introduced through the temporary opening into the cell or structure.
A Parallel Nonrigid Registration Algorithm Based on B-Spline for Medical Images.
Du, Xiaogang; Dang, Jianwu; Wang, Yangping; Wang, Song; Lei, Tao
2016-01-01
The nonrigid registration algorithm based on B-spline Free-Form Deformation (FFD) plays a key role and is widely applied in medical image processing due to its good flexibility and robustness. However, it requires a tremendous amount of computing time to obtain more accurate registration results, especially for large amounts of medical image data. To address this issue, a parallel nonrigid registration algorithm based on B-splines is proposed in this paper. First, the Logarithm Squared Difference (LSD) is used as the similarity metric in the B-spline registration algorithm to improve registration precision. After that, we create a parallel computing strategy and lookup tables (LUTs) to reduce the complexity of the B-spline registration algorithm. As a result, the computing time of the three most time-consuming steps, namely B-spline interpolation, LSD computation, and the analytic gradient of the LSD required by the Nonlinear Conjugate Gradient (NCG) optimization method, is efficiently reduced. Experimental results on registration quality and execution efficiency for a large amount of medical image data show that our algorithm achieves better registration accuracy, in terms of the differences between the best deformation fields and ground truth, and a speedup of 17 times over the single-threaded CPU implementation due to the powerful parallel computing ability of the Graphics Processing Unit (GPU).
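One of the standard ingredients accelerated by lookup tables is cubic B-spline interpolation. The sketch below shows the LUT idea only, using the standard uniform cubic B-spline basis weights; the grid spacing is an illustrative value, and the paper's actual GPU implementation is of course more involved.

```python
import numpy as np

def cubic_bspline_weights(u: np.ndarray) -> np.ndarray:
    """Standard uniform cubic B-spline basis weights for fractional offsets u in [0, 1).
    Returns an array of shape (len(u), 4): the weights of the 4 neighbouring control points."""
    w0 = (1.0 - u) ** 3 / 6.0
    w1 = (3.0 * u ** 3 - 6.0 * u ** 2 + 4.0) / 6.0
    w2 = (-3.0 * u ** 3 + 3.0 * u ** 2 + 3.0 * u + 1.0) / 6.0
    w3 = u ** 3 / 6.0
    return np.stack([w0, w1, w2, w3], axis=1)

# Lookup table: precompute the weights once for a fixed control-point spacing so that
# per-voxel interpolation becomes a table lookup plus a 4-term weighted sum.
GRID_SPACING = 8  # voxels between control points (illustrative value)
LUT = cubic_bspline_weights(np.arange(GRID_SPACING) / GRID_SPACING)

offsets = np.array([0, 3, 7])   # voxel offsets within a grid cell
print(LUT[offsets])             # weights retrieved without re-evaluating the cubics
print(LUT.sum(axis=1))          # partition of unity: each row sums to 1
```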
Geo-reCAPTCHA: Crowdsourcing large amounts of geographic information from earth observation data
NASA Astrophysics Data System (ADS)
Hillen, Florian; Höfle, Bernhard
2015-08-01
The reCAPTCHA concept provides a large amount of valuable information for various applications. First, it provides security, e.g., for a form on a website, by means of a test that only a human could solve. Second, the effort of the user for this test is used to generate additional information, e.g., digitization of books or identification of house numbers. In this work, we present a concept for adapting the reCAPTCHA idea to create user-generated geographic information from earth observation data, and the requirements that arose during conception and implementation are described in detail. Furthermore, the essential parts of a Geo-reCAPTCHA system are described and afterwards transferred to a prototype implementation. An empirical user study is conducted to investigate the Geo-reCAPTCHA approach, assessing the time and quality of the resulting geographic information. Our results show that a Geo-reCAPTCHA on building digitization can be solved by the users of our study in a short amount of time (19.2 s on average) with an overall average digitization accuracy of 82.2%. In conclusion, Geo-reCAPTCHA has the potential to be a reasonable alternative to the typical reCAPTCHA and to become a new, data-rich channel of crowdsourced geographic information.
NASA Astrophysics Data System (ADS)
Hossain, U. H.; Ensinger, W.
2015-12-01
Devices operating in space, e.g. in satellites, are hit by cosmic rays. These include so-called HZE ions, which combine high atomic number (Z) and high energy (E). These highly energetic heavy ions penetrate deeply into materials and deposit a large amount of energy, typically several keV per nm of range, creating serious damage. Space vehicles use polymers, which degrade under ion bombardment. HZE ion irradiation can be simulated experimentally in large-scale accelerators. In the present study, the radiation damage of aliphatic vinyl and fluoropolymers by heavy ions with energies in the GeV range is described. The ions cause bond scission and create volatile small molecular species, leading to considerable mass loss in the polymers. Since hydrogen-, oxygen- and fluorine-containing molecules are created and these elements are depleted, the remaining material is richer in carbon than the original polymers and contains conjugated C=C double bonds. This process is investigated by measuring the optical band gap with UV-Vis absorption spectrometry as a function of ion fluence. The results show how the optical band gaps shift from the UV into the Vis region upon ion irradiation for the different polymers.
NASA Astrophysics Data System (ADS)
Andhavarapu, A.; King, W.; Lindsay, A.; Byrns, B.; Knappe, D.; Fonteno, W.; Shannon, S.
2014-10-01
Plasma source generated nitrogen fertilizer is compared to conventional nitrogen fertilizers in water for plant growth. Root, shoot sizes, and weights are used to examine differences between plant treatment groups. With a simple coaxial structure creating a large-volume atmospheric glow discharge, a 162 MHz generator drives the air plasma. The VHF plasma source emits a steady state glow; the high drive frequency is believed to inhibit the glow-to-arc transition for non-thermal discharge generation. To create the plasma activated water (PAW) solutions used for plant treatment, the discharge is held over distilled water until a 100 ppm nitrate aqueous concentration is achieved. The discharge is used to incorporate nitrogen species into aqueous solution, which is used to fertilize radishes, marigolds, and tomatoes. In a four week experiment, these plants are watered with four different solutions: tap water, dissolved ammonium nitrate DI water, dissolved sodium nitrate DI water, and PAW. Ammonium nitrate solution has the same amount of total nitrogen as PAW; sodium nitrate solution has the same amount of nitrate as PAW. T-tests are used to determine statistical significance in plant group growth differences. PAW fertilization chemical mechanisms are presented.
Augmenting Conceptual Design Trajectory Tradespace Exploration with Graph Theory
NASA Technical Reports Server (NTRS)
Dees, Patrick D.; Zwack, Mathew R.; Steffens, Michael; Edwards, Stephen
2016-01-01
Within conceptual design, changes occur rapidly due to a combination of uncertainty and shifting requirements. To stay relevant in this fluid time, trade studies must also be performed rapidly. In order to drive down analysis time while improving the information gained by these studies, surrogate models can be created to represent the complex output of a tool or tools within a specified tradespace. To create such a model, however, a large amount of data must be collected in a short amount of time, and the historical approach of relying on subject matter experts to generate the required data is schedule-infeasible. By implementing automation and distributed analysis, the required data can instead be generated in a fraction of the time. Previous work focused on setting up a tool called multiPOST, capable of orchestrating many simultaneous runs of an analysis tool and assessing these automated analyses using heuristics gleaned from the best practices of current subject matter experts. In this update to the previous work, elements of graph theory are included to further drive down analysis time by leveraging data previously gathered. The new approach is shown to outperform the previous method in both the time required and the quantity and quality of data produced.
Automated Fault Interpretation and Extraction using Improved Supplementary Seismic Datasets
NASA Astrophysics Data System (ADS)
Bollmann, T. A.; Shank, R.
2017-12-01
During the interpretation of seismic volumes, it is necessary to interpret faults along with horizons of interest. With the improvement of technology, the interpretation of faults can be expedited with the aid of different algorithms that create supplementary seismic attributes, such as semblance and coherency. These products highlight discontinuities, but still need a large amount of human interaction to interpret faults and are plagued by noise and stratigraphic discontinuities. Hale (2013) presents a method to improve on these datasets by creating what is referred to as a Fault Likelihood volume. In general, these volumes contain less noise and do not emphasize stratigraphic features. Instead, planar features within a specified strike and dip range are highlighted. Once a satisfactory Fault Likelihood Volume is created, extraction of fault surfaces is much easier. The extracted fault surfaces are then exported to interpretation software for QC. Numerous software packages have implemented this methodology with varying results. After investigating these platforms, we developed a preferred Automated Fault Interpretation workflow.
CFD Analysis of Flexible Thermal Protection System Shear Configuration Testing in the LCAT Facility
NASA Technical Reports Server (NTRS)
Ferlemann, Paul G.
2014-01-01
This paper documents results of computational analysis performed after flexible thermal protection system shear configuration testing in the LCAT facility. The primary objectives were to predict the shear force on the sample and the sensitivity of all surface properties to the shape of the sample. Bumps of 0.05, 0.10, and 0.15 inches were created to approximate the shape of some fabric samples during testing. A large amount of information was extracted from the CFD solutions for comparison between runs and also with current or future flight simulations.
Hα and Gaia-RVS domain spectroscopy of Be stars and interacting binaries with Ondřejov 2m telescope
NASA Astrophysics Data System (ADS)
Koubský, P.; Kotková, L.; Votruba, V.
2011-12-01
A long term project to investigate the spectral appearance over the Gaia RVS domain of a large sample of Be stars and interacting binaries has been undertaken. The aim of the Ondřejov project is to create sufficient amounts of training data in the RVS wavelength domain to complement the Bp/Rp classification of Be stars which may be observed with Gaia. The project's current status is described and sample spectra in both the Hα and RVS wavelength domains are presented and discussed.
NASA Astrophysics Data System (ADS)
Polsterer, K. L.; Gieseke, F.; Igel, C.
2015-09-01
In the last decades, more and more all-sky surveys have created an enormous amount of data that is publicly available on the Internet. Crowd-sourcing projects such as Galaxy-Zoo and Radio-Galaxy-Zoo encouraged users from all over the world to manually conduct various classification tasks. The combined pattern-recognition capabilities of thousands of volunteers enabled scientists to finish the data analysis within acceptable time. For upcoming surveys with billions of sources, however, this approach is not feasible anymore. In this work, we present an unsupervised method that can automatically process large amounts of galaxy data and generates a set of prototypes. The resulting model can be used both to visualize the given galaxy data and to classify so-far-unseen images.
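The abstract does not name the specific algorithm, so the following is only a generic illustration of prototype-based unsupervised learning on image data, using k-means from scikit-learn (an assumption) and random arrays standing in for galaxy cutouts; the authors' actual method may differ substantially.

```python
import numpy as np
from sklearn.cluster import KMeans  # assumption: any prototype learner would serve here

# Toy stand-in for a catalogue of small galaxy cutouts (N images of 32x32 pixels).
rng = np.random.default_rng(0)
images = rng.random((1000, 32, 32))
features = images.reshape(len(images), -1)   # flatten each cutout to a vector

# Learn a small set of prototypes; each cluster centre is one prototype image.
model = KMeans(n_clusters=16, n_init=10, random_state=0).fit(features)
prototypes = model.cluster_centers_.reshape(16, 32, 32)

# The prototype index of an unseen image can serve as a coarse morphological label.
new_image = rng.random((1, 32 * 32))
print(model.predict(new_image))   # index of the closest prototype
```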
Dynamic permeability in fault damage zones induced by repeated coseismic fracturing events
NASA Astrophysics Data System (ADS)
Aben, F. M.; Doan, M. L.; Mitchell, T. M.
2017-12-01
Off-fault fracture damage in upper crustal fault zones changes the fault zone properties and affects various co- and interseismic processes. One of these properties is the permeability of the fault damage zone rocks, which is generally higher than that of the surrounding host rock. This allows large-scale fluid flow through the fault zone, which affects fault healing and promotes mineral transformation processes. Moreover, it might play an important role in thermal fluid pressurization during an earthquake rupture. The damage zone permeability is dynamic due to coseismic damaging. It is crucial for earthquake mechanics and for longer-term processes to understand what the dynamic permeability structure of a fault looks like and how it evolves with repeated earthquakes. To better detail coseismically induced permeability, we have performed uniaxial split Hopkinson pressure bar experiments on quartz-monzonite rock samples. Two sample sets were created and analyzed: single-loaded samples subjected to varying loading intensities, with damage varying from apparently intact to pulverized, and samples loaded at a constant intensity but with a varying number of repeated loadings. The first set resembles a dynamic permeability structure created by a single large earthquake; the second set resembles a permeability structure created by several earthquakes. Afterwards, the permeability and acoustic velocities were measured as a function of confining pressure. The permeability in both datasets shows a large and non-linear increase over several orders of magnitude (from 10⁻²⁰ up to 10⁻¹⁴ m²) with an increasing amount of fracture damage. This, combined with microstructural analyses of the varying degrees of damage, suggests a percolation threshold. The percolation threshold does not coincide with the pulverization threshold. With increasing confining pressure, the permeability may drop by up to two orders of magnitude, which supports the possibility of large coseismic fluid pulses over relatively large distances along a fault. Also, a relatively small threshold could potentially increase permeability in a large volume of rock, given that previous earthquakes have already damaged these rocks.
A Framework for the Design of Effective Graphics for Scientific Visualization
NASA Technical Reports Server (NTRS)
Miceli, Kristina D.
1992-01-01
This proposal presents a visualization framework, based on a data model, that supports the production of effective graphics for scientific visualization. Visual representations are effective only if they augment comprehension of the increasing amounts of data being generated by modern computer simulations. These representations are created by taking into account the goals and capabilities of the scientist, the type of data to be displayed, and software and hardware considerations. This framework is embodied in an assistant-based visualization system that guides the scientist through the visualization process. This will improve the quality of the visualizations and decrease the time the scientist is required to spend generating them. I intend to prove that such a framework will create a more productive environment for the analysis and interpretation of large, complex data sets.
Degassing of molten alloys with the assistance of ultrasonic vibration
Han, Qingyou; Xu, Hanbing; Meek, Thomas T.
2010-03-23
An apparatus and method are disclosed in which ultrasonic vibration is used to assist the degassing of molten metals or metal alloys thereby reducing gas content in the molten metals or alloys. High-intensity ultrasonic vibration is applied to a radiator that creates cavitation bubbles, induces acoustic streaming in the melt, and breaks up purge gas (e.g., argon or nitrogen) which is intentionally introduced in a small amount into the melt in order to collect the cavitation bubbles and to make the cavitation bubbles survive in the melt. The molten metal or alloy in one version of the invention is an aluminum alloy. The ultrasonic vibrations create cavitation bubbles and break up the large purge gas bubbles into small bubbles and disperse the bubbles in the molten metal or alloy more uniformly, resulting in a fast and clean degassing.
CD-ROM technology at the EROS data center
Madigan, Michael E.; Weinheimer, Mary C.
1993-01-01
The vast amount of digital spatial data often required by a single user has created a demand for media alternatives to 1/2" magnetic tape. One such medium that has been recently adopted at the U.S. Geological Survey's EROS Data Center is the compact disc (CD). CD's are a versatile, dynamic, and low-cost method for providing a variety of data on a single media device and are compatible with various computer platforms. CD drives are available for personal computers, UNIX workstations, and mainframe systems, either directly connected, or through a network. This medium furnishes a quick method of reproducing and distributing large amounts of data on a single CD. Several data sets are already available on CD's, including collections of historical Landsat multispectral scanner data and biweekly composites of Advanced Very High Resolution Radiometer data for the conterminous United States. The EROS Data Center intends to provide even more data sets on CD's. Plans include specific data sets on a customized disc to fulfill individual requests, and mass production of unique data sets for large-scale distribution. Requests for a single compact disc-read only memory (CD-ROM) containing a large volume of data either for archiving or for one-time distribution can be addressed with a CD-write once (CD-WO) unit. Mass production and large-scale distribution will require CD-ROM replication and mastering.
NASA Astrophysics Data System (ADS)
Andersen, G.; Dearborn, M.; Hcharg, G.
2010-09-01
We are investigating new technologies for creating ultra-large apertures (>20 m) for space-based imagery. Our approach has been to create diffractive primaries in flat membranes deployed from compact payloads. These structures are attractive in that they are much simpler to fabricate, launch, and deploy than conventional three-dimensional optics. In this case the flat focusing element is a photon sieve, which consists of a large number of holes in an otherwise opaque substrate, located according to an underlying Fresnel Zone Plate (FZP) geometry. The advantages over the FZP are that there are no support struts, which lead to diffraction spikes in the far field and to non-uniform tension that can cause wrinkling of the substrate. Furthermore, with modifications in hole size and distribution we can achieve improved resolution and contrast over conventional optics. The trade-offs in using diffractive optics are large amounts of dispersion and decreased efficiency. We present both theoretical and experimental results from small-scale prototypes. Several key solutions to the issues of limited bandwidth and efficiency have been addressed. Along with these we have studied the materials aspects in order to optimize performance and achieve a scalable solution for an on-orbit demonstrator. Our current efforts are being directed towards an on-orbit 1 m solar observatory demonstration deployed from a CubeSat bus.
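The underlying FZP geometry can be made concrete: the outer radius of the n-th zone for focal length f and wavelength λ is r_n = sqrt(nλf + n²λ²/4), and a photon sieve places holes within the transmissive zones. The sketch below reports zone radii and widths for illustrative parameters only (not values from the paper).

```python
import math

def zone_radius(n: int, wavelength: float, focal_length: float) -> float:
    """Outer radius of the n-th Fresnel zone for a zone plate of given focal length."""
    return math.sqrt(n * wavelength * focal_length + (n * wavelength / 2.0) ** 2)

# Illustrative numbers: visible light, 1 m focal length.
wavelength = 550e-9          # m
focal_length = 1.0           # m

# A photon sieve replaces each transmissive ring with a set of holes whose diameters
# are tied to the local zone width; here we only report the zone geometry itself.
for n in (1, 100, 10000):
    r_outer = zone_radius(n, wavelength, focal_length)
    r_inner = zone_radius(n - 1, wavelength, focal_length) if n > 1 else 0.0
    print(f"zone {n}: radius ~ {r_outer * 1e3:.3f} mm, width ~ {(r_outer - r_inner) * 1e6:.2f} um")
```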
Polo, John A.; Hallgren, S.W.; Leslie,, David M.
2013-01-01
Dead woody material, long ignored or viewed as a nuisance for forest management, has gained appreciation for its many roles in the forest, including wildlife habitat, nutrient storage and cycling, energy for trophic webs, protection of soil, fuel for fire, and carbon storage. The growing interest in managing dead woody material has created strong demand for greater understanding of the factors controlling amounts and turnover. Prescribed burning, an important management tool, may have strong effects on dead woody material given fire's capacity to create and consume dead woody material. We determined the effects of long-term understory prescribed burning on standing and down woody material in upland oak forests in south-central North America. We hypothesized that as the frequency of fire increased in these stands, the amount of deadwood would decrease and the fine woody material would decrease more rapidly than coarse woody material. The study was conducted in forests dominated by post oak (Quercus stellata) and blackjack oak (Quercus marilandica) in wildlife management areas where understory prescribed burning had been practiced for over 20 years and the range of burn frequencies was 0 (unburned) fires per decade (FPD) to 4.6 FPD. The amount of deadwood was low compared with more productive forests in southeastern North America. The biomass (24.7 Mg ha⁻¹) and carbon stocks (11.7 Mg ha⁻¹) were distributed among standing dead (22%), coarse woody debris (CWD, dia. > 7.5 cm, 12%), fine woody debris (FWD, dia. < 7.5 cm, 23%), and forest floor (43%). There was no evidence that understory prescribed burning influenced the amount and size distribution of standing and down dead woody material. There are two explanations for the lack of a detectable effect. First, a high incidence of severe weather, including ice storms and strong winds that produce large amounts of deadwood intermittently in an irregular pattern across the landscape, may preclude detecting a strong effect of understory prescribed burning. Second, fire suppression during the first half of the 20th century may have led to encroachment of woody plants into forest gaps and savannas, creating a patchwork of young and old stands that produced deadwood of different sizes and at different rates.
26 CFR 1.666(a)-1 - Amount allocated.
Code of Federal Regulations, 2010 CFR
2010-04-01
...,000 in 1958, none in 1957, $1,000 in 1956. Example 3. A trust is created in 1952 under the laws of...) INCOME TAXES Treatment of Excess Distributions of Trusts Applicable to Taxable Years Beginning Before January 1, 1969 § 1.666(a)-1 Amount allocated. (a)(1) If a trust other than a foreign trust created by a U...
The influece of forest gaps on some properties of humus in a managed beech forest, northern Iran
NASA Astrophysics Data System (ADS)
Vajari, K. A.
2015-10-01
The present research focuses on the effect of eight-year-old artificially created gaps on some properties of humus in a managed beech-dominated stand in the Hyrcanian forest of northern Iran. In this study, sixteen gaps were sampled on site and classified into four size classes (small, medium, large, and very large) with four replications each. Humus sampling was carried out at the centre and at the cardinal points within each gap, as well as in the adjacent closed stand, as separate composite samples. Organic carbon, P, K, pH, and total N were measured for each sample. It was found that gap size had a significant effect only on total N (%) and organic carbon (%) in the beech stand. The amount of potassium clearly differed among the three positions in the beech forest: the adjacent stand had significantly higher potassium than the centre and edge of the gaps, and different amounts of potassium were detected at the gap centre and gap edge. Comparison of humus properties between the gaps and the adjacent stand pointed to a higher amount of potassium in the adjacent stand, but there was no difference between them regarding the other humus properties. According to the results, it can be concluded that conditions are relatively similar among gaps and the closed adjacent stand in terms of humus properties eight years after logging in the beech stand.
Eleuthera Island, Bahamas seen from STS-66
1994-11-14
The striking views provided by the Bahama Islands lend insights into the important problems of limestone (CaCO3) production and transport. This photograph includes the southern part of Eleuthera Island in the northern Bahamas. The hook-shaped island encloses a relatively shallow platform (light blue) which is surrounded by deep water (dark blue). The feathery patterns along the western edge of Eleuthera's platform are sand bars and sand channels created by tidal currents sweeping on and off the platform. The channels serve to funnel large amounts of CaCO3 off the platform and into the deeper water.
The Kepler End-to-End Model: Creating High-Fidelity Simulations to Test Kepler Ground Processing
NASA Technical Reports Server (NTRS)
Bryson, Stephen T.; Jenkins, Jon M.; Peters, Dan J.; Tenenbaum, Peter P.; Klaus, Todd C.; Gunter, Jay P.; Cote, Miles T.; Caldwell, Douglas A.
2010-01-01
The Kepler mission is designed to detect the transit of Earth-like planets around Sun-like stars by observing 100,000 stellar targets. Developing and testing the Kepler ground-segment processing system, in particular the data analysis pipeline, requires high-fidelity simulated data. This simulated data is provided by the Kepler End-to-End Model (ETEM). ETEM simulates the astrophysics of planetary transits and other phenomena, properties of the Kepler spacecraft and the format of the downlinked data. Major challenges addressed by ETEM include the rapid production of large amounts of simulated data, extensibility and maintainability.
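A toy illustration of the kind of signal such a simulator must inject (this is not ETEM itself, and the period, depth, and cadence below are invented): a box-shaped transit dip added to a noisy stellar light curve. For reference, an Earth analog transiting a Sun-like star produces a dip of only about 84 ppm, so an exaggerated depth is used here.

```python
import numpy as np

def box_transit(time, period, epoch, duration, depth):
    """Toy box-shaped transit model: flux drops by `depth` while in transit."""
    phase = ((time - epoch) + 0.5 * period) % period - 0.5 * period
    in_transit = np.abs(phase) < 0.5 * duration
    return np.where(in_transit, 1.0 - depth, 1.0)

# Illustrative values: 30-minute cadence over ~90 days, 10.5-day period, 0.1% depth.
t = np.arange(0.0, 90.0, 30.0 / 60.0 / 24.0)          # time in days
flux = box_transit(t, period=10.5, epoch=2.3, duration=0.3, depth=0.001)
flux_noisy = flux + np.random.default_rng(1).normal(0.0, 2e-4, size=t.size)
print(flux_noisy.min(), flux_noisy.mean())
```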
Using Mosix for Wide-Area Compuational Resources
Maddox, Brian G.
2004-01-01
One of the problems with using traditional Beowulf-type distributed processing clusters is that they require an investment in dedicated computer resources. These resources are usually needed in addition to pre-existing ones such as desktop computers and file servers. Mosix is a series of modifications to the Linux kernel that creates a virtual computer, featuring automatic load balancing by migrating processes from heavily loaded nodes to less used ones. An extension of the Beowulf concept is to run a Mosix-enabled Linux kernel on a large number of computer resources in an organization. This configuration would provide a very large amount of computational resources based on pre-existing equipment. The advantage of this method is that it provides much more processing power than a traditional Beowulf cluster without the added costs of dedicating resources.
Exhaust after-treatment system with in-cylinder addition of unburnt hydrocarbons
Coleman, Gerald N.; Kesse, Mary L.
2007-10-30
Certain exhaust after-treatment devices, at least periodically, require the addition of unburnt hydrocarbons in order to create reductant-rich exhaust conditions. The present disclosure adds unburnt hydrocarbons to exhaust from at least one combustion chamber by positioning, at least partially within a combustion chamber, a mixed-mode fuel injector operable to inject fuel into the combustion chamber in a first spray pattern with a small average angle relative to a centerline of the combustion chamber and a second spray pattern with a large average angle relative to the centerline of the combustion chamber. An amount of fuel is injected in the first spray pattern into a non-combustible environment within the at least one combustion chamber during at least one of an expansion stroke and exhaust stroke. The exhaust with the unburnt amount of fuel is moved into an exhaust passage via an exhaust valve.
Protestant Clergy and the Culture Wars: An Empirical Test of Hunter's Thesis.
Uecker, Jeremy E; Lucke, Glenn
2011-12-01
This study instead focuses on culture wars among religious elites (clergy) and tests three aspects of the culture wars thesis: (1) whether culture wars exist at all among religious elites, (2) whether clergy attitudes are polarized on these issues, and (3) whether religious authority or religious affiliation is more salient in creating culture wars cleavages. Using data from a large random sample of Protestant clergy, we find a substantial amount of engagement in culture wars by all types of Protestant clergy. The amount of polarization is more attributable to views of religious authority (i.e., biblical inerrancy) than to religious tradition. Moreover, polarization among clergy is somewhat more evident on culture wars issues than on other social and political issues. These findings are generally supportive of the culture wars thesis and should help return examinations of culture wars back to where they were originally theorized to be waged: among elites.
Charging and Discharging of Lichtenberg Electrets
NASA Astrophysics Data System (ADS)
Wood, Monika
The research presented here describes a unique way to deposit a large amount of charge onto the surface of a thin dielectric sheet to create a Lichtenberg electret that can be discharged elsewhere to form spectacular Lichtenberg figures. This study examines how the amount of charge deposited onto the surface, the geometry of the probes, and the type of material used can all impact the formation of the Lichtenberg figures. Photographs of the Lichtenberg figures were taken and used to determine the voltage, current, and energy released during each discharge. It was found that a single discharge can release 0.49 J of energy in 1.24 μs for a Lichtenberg figure that covers approximately 500 cm². Lichtenberg figures can be used to characterize high-voltage surges on power lines, to diagnose lightning strike victims, to analyze electrical breakdown of insulating materials, for artistic purposes, and for similar applications where pulsed capacitors are commonly used.
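For scale, the quoted single-discharge figures imply an average power during the discharge of roughly

\[
P_{\mathrm{avg}} \approx \frac{E}{\Delta t} = \frac{0.49\ \mathrm{J}}{1.24\ \mu\mathrm{s}} \approx 4.0 \times 10^{5}\ \mathrm{W} \approx 0.4\ \mathrm{MW}.
\]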
Challenges in disposing of anthrax waste.
Lesperance, Ann M; Stein, Steve; Upton, Jaki F; Toomey, Chris
2011-09-01
Disasters often create large amounts of waste that must be managed as part of both immediate response and long-term recovery. While many federal, state, and local agencies have debris management plans, these plans often do not address chemical, biological, and radiological contamination. The Interagency Biological Restoration Demonstration's (IBRD) purpose was to holistically assess all aspects of an anthrax incident and assist in the development of a plan for long-term recovery. In the case of wide-area anthrax contamination and the follow-on response and recovery activities, a significant amount of material would require decontamination and disposal. Accordingly, IBRD facilitated the development of debris management plans to address contaminated waste through a series of interviews and workshops with local, state, and federal representatives. The outcome of these discussions was the identification of 3 primary topical areas that must be addressed: planning, unresolved research questions, and resolving regulatory issues.
Highly entangled states with almost no secrecy.
Christandl, Matthias; Schuch, Norbert; Winter, Andreas
2010-06-18
In this Letter we illuminate the relation between entanglement and secrecy by providing the first example of a quantum state that is highly entangled, but from which, nevertheless, almost no secrecy can be extracted. More precisely, we provide two bounds on the bipartite entanglement of the totally antisymmetric state in dimension d×d. First, we show that the amount of secrecy that can be extracted from the state is low; to be precise, it is bounded by O(1/d). Second, we show that the state is highly entangled in the sense that a large number of singlets is needed to create it: its entanglement cost is larger than a constant, independent of d. In order to obtain our results we use representation theory, linear programming, and the entanglement measure known as squashed entanglement. Our findings also clarify the relation between the squashed entanglement and the relative entropy of entanglement.
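In standard notation (K_D for the distillable key rate and E_C for the entanglement cost), the two bounds on the d×d antisymmetric state ρ_a can be summarized compactly, with the constant left implicit as in the abstract:

\[
K_D(\rho_a) = O\!\left(\tfrac{1}{d}\right), \qquad E_C(\rho_a) \ge c > 0 \quad \text{(independent of } d\text{)}.
\]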
DOE Office of Scientific and Technical Information (OSTI.GOV)
Waltman, Melanie J.
2010-05-01
Explosives detection is a necessary and widespread field of research. From large shipping containers to airline luggage, numerous items are tested for explosives every day. In the area of trace explosives detection, ion mobility spectrometry (IMS) is the technique employed most often because it is a quick, simple, and accurate way to test many items in a short amount of time. Detection by IMS is based on the difference in drift times of product ions through the drift region of an IMS instrument. The product ions are created when the explosive compounds introduced to the instrument are chemically ionized through interactions with the reactant ions. The identity of the reactant ions determines the outcome of the ionization process. This research investigated the reactant ions created by various ionization sources and looked into ways to manipulate the chemistry occurring in the sources.
Fe2O3-Au hybrid nanoparticles for sensing applications via SERS analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Murph, Simona Hunyadi; Searles, Emily
2017-06-25
Nanoparticles with large amounts of surface area and characteristics distinct from those of the bulk material offer an interesting route to enhancing inelastic scattering signals. Surface Enhanced Raman Spectroscopy (SERS) strives to increase the Raman scattering effect when chemical species of interest are in close proximity to metallic nanostructures. Gold nanoparticles of various shapes have been used for sensing applications via SERS, as they demonstrate the strongest plasmonic behavior in the visible to near-IR region of the spectrum. When coupled with other nanoparticles, namely iron oxide nanoparticles, hybrid structures with increased functionality are produced. Multifunctional iron oxide-gold hybrid nanostructures have been created via solution chemistries and investigated for the detection of a model analyte. By exploiting their magnetic properties, nanogaps or "hot spots" were rationally created and evaluated in SERS enhancement studies.
RE-PLAN: An Extensible Software Architecture to Facilitate Disaster Response Planning
O’Neill, Martin; Mikler, Armin R.; Indrakanti, Saratchandra; Tiwari, Chetan; Jimenez, Tamara
2014-01-01
Computational tools are needed to make data-driven disaster mitigation planning accessible to planners and policymakers without the need for programming or GIS expertise. To address this problem, we have created modules to facilitate quantitative analyses pertinent to a variety of different disaster scenarios. These modules, which comprise the REsponse PLan ANalyzer (RE-PLAN) framework, may be used to create tools for specific disaster scenarios that allow planners to harness large amounts of disparate data and execute computational models through a point-and-click interface. Bio-E, a user-friendly tool built using this framework, was designed to develop and analyze the feasibility of ad hoc clinics for treating populations following a biological emergency event. In this article, the design and implementation of the RE-PLAN framework are described, and the functionality of the modules used in the Bio-E biological emergency mitigation tool are demonstrated. PMID:25419503
A knowledge-based, concept-oriented view generation system for clinical data.
Zeng, Q; Cimino, J J
2001-04-01
Information overload is a well-known problem for clinicians who must review large amounts of data in patient records. Concept-oriented views, which organize patient data around clinical concepts such as diagnostic strategies and therapeutic goals, may offer a solution to the problem of information overload. However, although concept-oriented views are desirable, they are difficult to create and maintain. We have developed a general-purpose, knowledge-based approach to the generation of concept-oriented views and have developed a system to test our approach. The system creates concept-oriented views through automated identification of relevant patient data. The knowledge in the system is represented by both a semantic network and rules. The key relevant data identification function is accomplished by a rule-based traversal of the semantic network. This paper focuses on the design and implementation of the system; an evaluation of the system is reported separately.
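A minimal sketch of the traversal idea only (the concepts, link types, and rules below are invented for illustration and are not the system's actual knowledge base): starting from a clinical concept of interest, follow only the semantic links that a rule permits and collect the patient-data items attached to the concepts reached.

```python
from collections import deque

# Illustrative semantic network: concept -> list of (link_type, target_concept).
NETWORK = {
    "diabetes-management": [("assesses", "HbA1c"), ("treats-with", "insulin")],
    "HbA1c": [("measured-by", "lab-result:HbA1c")],
    "insulin": [("monitored-by", "lab-result:glucose")],
}

# Illustrative rule set: which link types may be followed when building the view.
ALLOWED_LINKS = {"assesses", "treats-with", "measured-by", "monitored-by"}

def relevant_items(start_concept: str) -> list:
    """Traverse the network from the starting concept, following only allowed
    links, and return the data items (here: nodes prefixed 'lab-result:')."""
    seen, items, queue = {start_concept}, [], deque([start_concept])
    while queue:
        concept = queue.popleft()
        for link, target in NETWORK.get(concept, []):
            if link in ALLOWED_LINKS and target not in seen:
                seen.add(target)
                if target.startswith("lab-result:"):
                    items.append(target)   # a fuller system would attach patient values here
                else:
                    queue.append(target)
    return items

print(relevant_items("diabetes-management"))  # ['lab-result:HbA1c', 'lab-result:glucose']
```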
Donner, D.M.; Ribic, C.A.; Probst, J.R.
2009-01-01
Forest planners must evaluate how spatiotemporal changes in habitat amount and configuration across the landscape as a result of timber management will affect species' persistence. However, there are few long-term programs available for evaluation. We investigated the response of male Kirtland's Warbler (Dendroica kirtlandii) to 26 years of changing patch and landscape structure during a large, 26-year forestry-habitat restoration program within the warbler's primary breeding range. We found that the average density of male Kirtland's Warblers was related to a different combination of patch and landscape attributes depending on the species' regional population level and habitat amounts on the landscape (early succession jack pine (Pinus banksiana) forests; 15-42% habitat cover). Specifically, patch age and habitat regeneration type were important at low male population and total habitat amounts, while patch age and distance to an occupied patch were important at relatively high population and habitat amounts. Patch age and size were more important at increasing population levels and an intermediate amount of habitat. The importance of patch age to average male density during all periods reflects the temporal buildup and decline of male numbers as habitat suitability within the patch changed with succession. Habitat selection (i.e., preference for wildfire-regenerated habitat) and availability may explain the importance of habitat type and patch size during lower population and habitat levels. The relationship between male density and distance when there was the most habitat on the landscape and the male population was large and still increasing may be explained by the widening spatial dispersion of the increasing male population at the regional scale. Because creating or preserving habitat is not a random process, management efforts would benefit from more investigations of managed population responses to changes in spatial structure that occur through habitat gain rather than habitat loss to further our empirical understanding of general principles of the fragmentation process and habitat cover threshold effects within dynamic landscapes.
NASA Technical Reports Server (NTRS)
Matty, Christopher M.
2010-01-01
Crewed space vehicles have a common requirement to remove the carbon dioxide (CO2) created by the metabolic processes of the crew. The space shuttle [Space Transportation System (STS)] and International Space Station (ISS) each have systems in place that allow control and removal of CO2 from the habitable cabin environment. During periods in which the space shuttle is docked to the ISS, known as "joint docked operations," the space shuttle and ISS share a common atmosphere environment. During this period, an elevated amount of CO2 is produced through the combined metabolic activity of the STS and ISS crews. This elevated CO2 production, together with the large effective atmosphere created by collective volumes of the docked vehicles, creates a unique set of requirements for CO2 removal. This paper will describe individual CO2 control plans implemented by STS and ISS engineering teams, as well as the integrated plans used when both vehicles are docked. The paper will also discuss some of the issues and anomalies experienced by both engineering teams.
Development of a Two-Stage Microalgae Dewatering Process – A Life Cycle Assessment Approach
Soomro, Rizwan R.; Zeng, Xianhai; Lu, Yinghua; Lin, Lu; Danquah, Michael K.
2016-01-01
Even though microalgal biomass is leading third-generation biofuel research, significant effort is required to establish an economically viable commercial-scale microalgal biofuel production system. Whilst a significant amount of work has been reported on large-scale cultivation of microalgae using photo-bioreactors and pond systems, research focused on establishing high-performance downstream dewatering operations for large-scale processing under optimal economy is limited. The enormous amount of energy and associated cost required for dewatering large-volume microalgal cultures has been the primary hindrance to the development of the needed biomass quantity for industrial-scale microalgal biofuels production. The extremely dilute nature of large-volume microalgal suspensions and the small size of microalgae cells in suspension create a significant processing cost during dewatering, and this has raised major concerns about the economic success of commercial-scale microalgal biofuel production as an alternative to conventional petroleum fuels. This article reports an effective framework to assess the performance of different dewatering technologies as the basis to establish an effective two-stage dewatering system. Bioflocculation coupled with tangential flow filtration (TFF) emerged as a promising technique with a total energy input of 0.041 kWh, 0.05 kg of CO2 emissions, and a cost of $0.0043 for producing 1 kg of microalgal biomass. A streamlined process for the operational analysis of the two-stage microalgae dewatering technique, encompassing energy input, carbon dioxide emission, and process cost, is presented. PMID:26904075
The origin of volatiles in the Earth's mantle
NASA Astrophysics Data System (ADS)
Hier-Majumder, Saswata; Hirschmann, Marc M.
2017-08-01
The Earth's deep interior contains significant reservoirs of volatiles such as H, C, and N. Due to the incompatible nature of these volatile species, it has been difficult to reconcile their storage in the residual mantle immediately following crystallization of the terrestrial magma ocean (MO). As the magma ocean freezes, it is commonly assumed that very small amounts of melt are retained in the residual mantle, limiting the trapped volatile concentration in the primordial mantle. In this article, we show that inefficient melt drainage out of the freezing front can retain large amounts of volatiles hosted in the trapped melt in the residual mantle while creating a thick early atmosphere. Using a two-phase flow model, we demonstrate that compaction within the moving freezing front is inefficient over time scales characteristic of magma ocean solidification. We employ a scaling relation between the trapped melt fraction, the rate of compaction, and the rate of freezing in our magma ocean evolution model. For cosmochemically plausible fractions of volatiles delivered during the later stages of accretion, our calculations suggest that up to 77% of total H2O and 12% of CO2 could have been trapped in the mantle during magma ocean crystallization. The assumption of a constant trapped melt fraction underestimates the mass of volatiles in the residual mantle by more than an order of magnitude.
NASA Astrophysics Data System (ADS)
Harris, B.; McDougall, K.; Barry, M.
2012-07-01
Digital Elevation Models (DEMs) allow for the efficient and consistent creation of waterways and catchment boundaries over large areas. Studies of waterway delineation from DEMs are usually undertaken over small or single catchment areas due to the nature of the problems being investigated. Improvements in Geographic Information Systems (GIS) techniques, software, hardware and data allow for analysis of larger data sets and also facilitate a consistent tool for the creation and analysis of waterways over extensive areas. However, such analyses are rarely carried out over large regional areas because of the lack of available raw data sets and the amount of work required to create the underlying DEMs. This paper examines the definition of waterways and catchments over an area of approximately 25,000 km2 to establish the optimal DEM scale required for waterway delineation over large regional projects. The comparative study analysed multi-scale DEMs over two test areas (the Wivenhoe catchment, 543 km2, and a detailed 13 km2 area within the Wivenhoe catchment) including various data types, scales, quality, and variable catchment input parameters. Historic and available DEM data were compared to high-resolution Lidar-based DEMs to assess variations in the formation of stream networks. The results identified that, particularly in areas of high elevation change, DEMs at 20 m cell size created from broad-scale 1:25,000 data (combined with more detailed data or manual delineation in flat areas) are adequate for the creation of waterways and catchments at a regional scale.
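The GIS toolchain used in the study is not detailed in the abstract; as a rough illustration of the underlying delineation technique, the following sketch (Python/NumPy, assuming a small depression-free DEM held in a 2-D array) computes D8 flow directions and flow accumulation, after which a stream network can be extracted by thresholding the accumulation grid.

```python
import numpy as np

# Eight D8 neighbor offsets (row, col) and their grid distances.
OFFSETS = [(-1, -1), (-1, 0), (-1, 1), (0, -1),
           (0, 1), (1, -1), (1, 0), (1, 1)]
DISTS = [np.sqrt(2), 1, np.sqrt(2), 1, 1, np.sqrt(2), 1, np.sqrt(2)]

def d8_flow_accumulation(dem, cell_size=20.0):
    """Return a flow-accumulation grid (cell counts) for a depression-free DEM."""
    nrow, ncol = dem.shape
    receiver = -np.ones((nrow, ncol, 2), dtype=int)  # downstream cell of each cell

    # 1. D8 flow direction: each cell drains to its steepest downslope neighbor.
    for r in range(nrow):
        for c in range(ncol):
            best_slope = 0.0
            for (dr, dc), dist in zip(OFFSETS, DISTS):
                rr, cc = r + dr, c + dc
                if 0 <= rr < nrow and 0 <= cc < ncol:
                    slope = (dem[r, c] - dem[rr, cc]) / (dist * cell_size)
                    if slope > best_slope:
                        best_slope = slope
                        receiver[r, c] = (rr, cc)

    # 2. Flow accumulation: visit cells from highest to lowest elevation and
    #    pass each cell's accumulated count to its receiver.
    acc = np.ones((nrow, ncol))
    order = np.dstack(np.unravel_index(np.argsort(dem, axis=None)[::-1], dem.shape))[0]
    for r, c in order:
        rr, cc = receiver[r, c]
        if rr >= 0:
            acc[rr, cc] += acc[r, c]
    return acc

# Cells whose accumulation exceeds a chosen threshold form the stream network.
dem = np.array([[9., 8., 7.],
                [8., 6., 5.],
                [7., 5., 3.]])
streams = d8_flow_accumulation(dem) >= 3
```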
NASA Astrophysics Data System (ADS)
Duffy, D.; Maxwell, T. P.; Doutriaux, C.; Williams, D. N.; Chaudhary, A.; Ames, S.
2015-12-01
As the size of remote sensing observations and model output data grows, the volume of the data has become overwhelming, even to many scientific experts. As societies are forced to better understand, mitigate, and adapt to climate changes, the combination of Earth observation data and global climate model projections is crucial not only to scientists but to policy makers, downstream applications, and even the public. Scientific progress on understanding climate is critically dependent on the availability of a reliable infrastructure that promotes data access, management, and provenance. The Earth System Grid Federation (ESGF) has created such an environment for the Intergovernmental Panel on Climate Change (IPCC). ESGF provides a federated global cyberinfrastructure for data access and management of model outputs generated for the IPCC Assessment Reports (AR). The current generation of the ESGF federated grid allows consumers of the data to find and download data with limited capabilities for server-side processing. Since the amount of data for future ARs is expected to grow dramatically, ESGF is working on integrating server-side analytics throughout the federation. The ESGF Compute Working Team (CWT) has created a Web Processing Service (WPS) Application Programming Interface (API) to enable access to scalable computational resources. The API is the exposure point to high performance computing resources across the federation. Specifically, the API allows users to execute simple operations, such as maximum, minimum, average, and anomalies, on ESGF data without having to download the data. These operations are executed at the ESGF data node site with access to large amounts of parallel computing capabilities. This presentation will highlight the WPS API and its capabilities, provide implementation details, and discuss future developments.
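The actual endpoint names and parameter conventions of the ESGF CWT WPS API are not given above, so the sketch below is only a generic illustration of how a client might request a server-side average without downloading the data; the URL, operation identifier, and data-input keys are hypothetical placeholders, written with Python's requests library.

```python
import requests

# Hypothetical ESGF data-node WPS endpoint; a real deployment will differ.
WPS_URL = "https://esgf-node.example.org/wps"

# A generic WPS 1.0.0 Execute request asking the server to average a variable
# over a region, instead of downloading the full files to the client.
params = {
    "service": "WPS",
    "version": "1.0.0",
    "request": "Execute",
    "identifier": "average",                      # hypothetical operation name
    "datainputs": (
        "variable=tas;"                           # near-surface air temperature
        "domain=latitude:-90,90|longitude:0,360;" # whole globe (placeholder syntax)
        "dataset=esgf://some/cmip/dataset"        # placeholder dataset reference
    ),
}

response = requests.get(WPS_URL, params=params, timeout=60)
print(response.status_code)
print(response.text[:500])  # the WPS response document (XML) describing the result
```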
Meeting global policy commitments carbon sequestration and southern pine forests
Kurt H. Johnsen; David N. Wear; R. Oren; R.O. Teskey; Felipe Sanchez; Rodney E. Will; John Butnor; D. Markewitz; D. Richter; T. Rials; H.L. Allen; J. Seiler; D. Ellsworth; Christopher Maier; G. Katul; P.M. Dougherty
2001-01-01
In managed forests, the amount of carbon further sequestered will be determined by (1) the increased amount of carbon in standing biomass (resulting from land-use changes and increased productivity); (2) the amount of recalcitrant carbon remaining below ground at the end of rotations; and (3) the amount of carbon sequestered in products created from harvested wood....
NASA Astrophysics Data System (ADS)
Richardson, R.; Legleiter, C. J.; Harrison, L.
2015-12-01
Salmonids are threatened with extinction across the world by the fragmentation of riverine ecosystems by dams and diversions. In California, the effort to expand the range of spawnable habitat for native salmon by transporting fish around reservoirs is a potentially species-saving idea, but strong scientific evidence of the amount of high-quality habitat is required to make these difficult management decisions. Remote sensing has long been used in fluvial settings to identify the physical parameters that drive the quality of aquatic habitat; however, its true strength, coverage of large spatial extents, has not been applied at resolutions relevant to salmonids. This project utilizes hyperspectral data from over 250 km of the Tuolumne and Merced Rivers to extract depth and bed slope from the wetted channel, and NIR LiDAR for the surrounding topography. Optimal Band Ratio Analysis (OBRA) has proven to be an effective tool for creating bathymetric maps of river channels in ideal settings with clear water, high bottom reflectance, and depths of less than 3 meters over short distances. Results from this study show that OBRA can be applied over larger riverscapes at high resolutions (0.5 m). The depth and bed slope estimates are used to classify habitat units that are crucial to quantifying the quality and amount of habitat in these rivers, which once produced large populations of native salmonids. As more managers look to expand habitat for these threatened species, the tools developed here will be cost-effective over the large extents that salmon migrate to spawn.
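The core of OBRA is a regression of field-measured depths against the log-transformed ratio of two spectral bands, repeated over all band pairs to find the best predictor; a minimal sketch of that regression step is shown below, with made-up reflectance and depth values standing in for the hyperspectral pixels and survey data.

```python
import numpy as np

# Field-calibration data: reflectance in two candidate bands at locations
# where depth was surveyed (values here are illustrative placeholders).
band1 = np.array([0.12, 0.10, 0.08, 0.06, 0.05])       # shorter-wavelength band
band2 = np.array([0.050, 0.048, 0.047, 0.046, 0.046])  # longer-wavelength band
depth = np.array([0.4, 0.8, 1.3, 1.9, 2.4])            # measured depths (m)

# OBRA image-derived quantity: X = ln(R(lambda1) / R(lambda2)).
X = np.log(band1 / band2)

# Linear calibration d = a0 + a1 * X, fit by least squares; OBRA repeats this
# for every band pair and keeps the pair with the highest R^2.
a1, a0 = np.polyfit(X, depth, 1)
pred = a0 + a1 * X
r2 = 1 - np.sum((depth - pred) ** 2) / np.sum((depth - depth.mean()) ** 2)
print(f"d = {a0:.2f} + {a1:.2f} * X, R^2 = {r2:.3f}")
```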
ORIGAMI Automator Primer. Automated ORIGEN Source Terms and Spent Fuel Storage Pool Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wieselquist, William A.; Thompson, Adam B.; Bowman, Stephen M.
2016-04-01
Source terms and spent nuclear fuel (SNF) storage pool decay heat load analyses for operating nuclear power plants require a large number of Oak Ridge Isotope Generation and Depletion (ORIGEN) calculations. SNF source term calculations also require a significant amount of bookkeeping to track quantities such as core and assembly operating histories, spent fuel pool (SFP) residence times, heavy metal masses, and enrichments. The ORIGEN Assembly Isotopics (ORIGAMI) module in the SCALE code system provides a simple scheme for entering these data. However, given the large scope of the analysis, extensive scripting is necessary to convert formats and process data to create thousands of ORIGAMI input files (one per assembly) and to process the results into formats readily usable by follow-on analysis tools. This primer describes a project within the SCALE Fulcrum graphical user interface (GUI) called ORIGAMI Automator that was developed to automate the scripting and bookkeeping in large-scale source term analyses. The ORIGAMI Automator enables the analyst to (1) easily create, view, and edit the reactor site and assembly information, (2) automatically create and run ORIGAMI inputs, and (3) analyze the results from ORIGAMI. ORIGAMI Automator uses the standard ORIGEN binary concentrations files produced by ORIGAMI, with concentrations available at all time points in each assembly’s life. The GUI plots results such as mass, concentration, activity, and decay heat using a powerful new ORIGEN Post-Processing Utility for SCALE (OPUS) GUI component. This document includes a description and user guide for the GUI, a step-by-step tutorial for a simplified scenario, and appendices that document the file structures used.
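The bookkeeping problem that the ORIGAMI Automator GUI solves, turning a per-assembly table into thousands of input files, can be illustrated with a short script; note that the CSV columns and the input-file template below are hypothetical stand-ins and not the actual ORIGAMI input syntax.

```python
import csv
from pathlib import Path

# Hypothetical per-assembly bookkeeping table (one row per fuel assembly) with
# columns: assembly_id, enrichment, hm_mass, burnup, discharge_date.
TEMPLATE = """\
' hypothetical ORIGAMI-style input for assembly {asm_id}
=origami
  enrichment={enrich}
  hm_mass={mass}
  burnup={burnup}
  discharge={discharge}
end
"""

def write_inputs(csv_path: str, out_dir: str) -> int:
    """Write one input file per CSV row; returns the number of files created."""
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    n = 0
    with open(csv_path, newline="") as fh:
        for row in csv.DictReader(fh):
            text = TEMPLATE.format(
                asm_id=row["assembly_id"], enrich=row["enrichment"],
                mass=row["hm_mass"], burnup=row["burnup"],
                discharge=row["discharge_date"])
            (out / f"{row['assembly_id']}.inp").write_text(text)
            n += 1
    return n

# write_inputs("assemblies.csv", "origami_inputs") would emit one file per row.
```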
Oil recovery by alkaline waterflooding
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cooke, C.E. Jr.; Williams, R.E.; Kolodzie, P.A.
1974-01-01
Flooding of oil containing organic acids with alkaline water under favorable conditions can result in recovery of around 50% of the residual oil left in a watered-out model. A high recovery efficiency results from the formation of a bank of viscous water-in-oil emulsion as surface active agents (soaps) are created by reactions of base in the water with the organic acids in the oil. The type and amount of organic acids in the oil, the pH and salt content of the water, and the amount of fines in the porous medium are the primary factors which determine the amount of additional oil recovered by this method. Interaction of alkaline water with reservoir rock largely determines the amount of chemical needed to flood a reservoir. Laboratory investigations using synthetic oils and crude oils show the importance of oil-water and liquid-solid interfacial properties to the results of an alkaline waterflood. A small field test demonstrated that emulsion banks can be formed in the reservoir and that chemical costs can be reasonable in selected reservoirs. Although studies have provided many qualitative guidelines for evaluating the feasibility of alkaline waterflooding, the economic attractiveness of the process must be considered on an individual reservoir basis.
Phylogenetic search through partial tree mixing
2012-01-01
Background: Recent advances in sequencing technology have created large data sets upon which phylogenetic inference can be performed. Current research is limited by the prohibitive time necessary to perform tree search on a reasonable number of individuals. This research develops new phylogenetic algorithms that can operate on tens of thousands of species in a reasonable amount of time through several innovative search techniques. Results: When compared to popular phylogenetic search algorithms, better trees are found much more quickly for large data sets. These algorithms are incorporated in the PSODA application, available at http://dna.cs.byu.edu/psoda. Conclusions: The use of Partial Tree Mixing in a partition-based tree space allows the algorithm to quickly converge on near-optimal tree regions. These regions can then be searched in a methodical way to determine the overall optimal phylogenetic solution. PMID:23320449
Tunneling into fuzzball states
NASA Astrophysics Data System (ADS)
Mathur, Samir D.
2010-01-01
String theory suggests that black hole microstates are quantum, horizon-sized 'fuzzballs', rather than smooth geometries with a horizon. Radiation from fuzzballs can carry information and does not lead to information loss. But if we let a shell of matter collapse, then it creates a horizon, and it seems that subsequent radiation will lead to information loss. We argue that the resolution to this problem is that the shell can tunnel to the fuzzball configurations. The amplitude for tunneling is small because we are relating two macroscopically different configurations, but the number of states that we can tunnel to, given through the Bekenstein entropy, is very large. These small and large numbers can cancel each other, making it possible for the shell to tunnel into fuzzball states before a significant amount of radiation has been emitted. This offers a way to resolve the information paradox.
Biofuels done right: land efficient animal feeds enable large environmental and energy benefits.
Dale, Bruce E; Bals, Bryan D; Kim, Seungdo; Eranki, Pragnya
2010-11-15
There is an intense ongoing debate regarding the potential scale of biofuel production without creating adverse effects on food supply. We explore the possibility of three land-efficient technologies for producing food (actually animal feed), including leaf protein concentrates, pretreated forages, and double crops to increase the total amount of plant biomass available for biofuels. Using less than 30% of total U.S. cropland, pasture, and range, 400 billion liters of ethanol can be produced annually without decreasing domestic food production or agricultural exports. This approach also reduces U.S. greenhouse gas emissions by 670 Tg CO₂-equivalent per year, or over 10% of total U.S. annual emissions, while increasing soil fertility and promoting biodiversity. Thus we can replace a large fraction of U.S. petroleum consumption without indirect land use change.
A Bootstrap Approach to Martian Manufacturing
NASA Technical Reports Server (NTRS)
Dorais, Gregory A.
2004-01-01
In-Situ Resource Utilization (ISRU) is an essential element of any affordable strategy for a sustained human presence on Mars. Ideally, Martian habitats would be extremely massive to allow plenty of room to comfortably live and work, as well as to protect the occupants from the environment. Moreover, transportation and power generation systems would also require significant mass if affordable. For our approach to ISRU, we use the industrialization of the U.S. as a metaphor. The 19th century started with small blacksmith shops and ended with massive steel mills primarily accomplished by blacksmiths increasing their production capacity and product size to create larger shops, which produced small mills, which produced the large steel mills that industrialized the country. Most of the mass of a steel mill is comprised of steel in simple shapes, which are produced and repaired with few pieces of equipment also mostly made of steel in basic shapes. Due to this simplicity, we expect that the 19th century manufacturing growth can be repeated on Mars in the 21st century using robots as the primary labor force. We suggest a "bootstrap" approach to manufacturing on Mars that uses a "seed" manufacturing system that uses regolith to create major structural components and spare parts. The regolith would be melted, foamed, and sintered as needed to fabricate parts using casting and solid freeform fabrication techniques. Complex components, such as electronics, would be brought from Earth and integrated as needed. These parts would be assembled to create additional manufacturing systems, which can be both more capable and higher capacity. These subsequent manufacturing systems could refine vast amounts of raw materials to create large components, as well as assemble equipment, habitats, pressure vessels, cranes, pipelines, railways, trains, power generation stations, and other facilities needed to economically maintain a sustained human presence on Mars.
The effects of space radiation on flight film
NASA Technical Reports Server (NTRS)
Holly, Mark H.
1995-01-01
The Shuttle and its cargo are occasionally exposed to an amount of radiation large enough to create non-image-forming exposures (fog) on photographic flight film. The television/photography working group proposed a test plan to quantify the sensitivity of photographic films to space radiation. This plan was flown on STS-37 and was later incorporated into a detailed supplementary objective (DSO) that was flown on STS-48. This DSO addressed the effects of significant space radiation on representative samples of six highly sensitive flight films. In addition, a lead-lined bag was evaluated as a potential shield for flight film against space radiation.
ASCOT: A Collaborative Platform for the Virtual Observatory
NASA Astrophysics Data System (ADS)
Marcos, D.; Connolly, A. J.; Krughoff, K. S.; Smith, I.; Wallace, S. C.
2012-09-01
Digital networks are changing the way knowledge is created, structured, curated, consumed, archived, and referenced. Projects like Wikipedia, GitHub, or Galaxy Zoo have shown the potential of online communities to develop and communicate ideas. ASCOT is a web-based framework that facilitates collaboration among astronomers, providing a simple way to share, explore, interact with, and analyze large amounts of data from the broad range of sources available through the Virtual Observatory (VO). Designed with a strong emphasis on usability, ASCOT takes advantage of the latest generation of web standards and cloud technologies to implement an extendable and customizable stack of web tools and services.
Lommen, Arjen
2009-04-15
Hyphenated full-scan MS technology creates large amounts of data. A versatile, easy-to-use automation tool that aids in data analysis is very important for handling such a data stream. The MetAlign software described in this manuscript handles a broad range of accurate-mass and nominal-mass GC/MS and LC/MS data. It is capable of automatic format conversions, accurate mass calculations, baseline corrections, peak picking, saturation and mass-peak artifact filtering, as well as alignment of up to 1000 data sets. A 100- to 1000-fold data reduction is achieved. MetAlign software output is compatible with most multivariate statistics programs.
Chen, Mingyang; Stott, Amanda C; Li, Shenggang; Dixon, David A
2012-04-01
A robust metadata database called the Collaborative Chemistry Database Tool (CCDBT) for massive amounts of computational chemistry raw data has been designed and implemented. It performs data synchronization and simultaneously extracts the metadata. Computational chemistry data in various formats from different computing sources, software packages, and users can be parsed into uniform metadata for storage in a MySQL database. Parsing is performed by a parsing pyramid, including parsers written for different levels of data types and sets created by the parser loader after loading parser engines and configurations. Copyright © 2011 Elsevier Inc. All rights reserved.
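The parsing step is only described at a high level above; a toy version of the idea, extracting a few metadata fields from a raw output file and packaging them as a parameterized MySQL insert, might look like the following sketch, in which the output format, field names, and table layout are all assumptions for illustration.

```python
import re

# Toy "parser engine": pull a few metadata fields out of a raw computational
# chemistry output file. The patterns below assume a made-up output format.
PATTERNS = {
    "method":       re.compile(r"^Method:\s+(\S+)", re.M),
    "basis_set":    re.compile(r"^Basis:\s+(\S+)", re.M),
    "total_energy": re.compile(r"^Final energy:\s+(-?\d+\.\d+)", re.M),
}

def parse_metadata(raw_text: str) -> dict:
    """Extract uniform metadata fields from one raw output file."""
    meta = {}
    for field, pattern in PATTERNS.items():
        match = pattern.search(raw_text)
        meta[field] = match.group(1) if match else None
    return meta

def to_insert(meta: dict, table: str = "calculations") -> tuple:
    """Build a parameterized INSERT statement suitable for a MySQL driver."""
    cols = ", ".join(meta)
    marks = ", ".join(["%s"] * len(meta))
    return f"INSERT INTO {table} ({cols}) VALUES ({marks})", tuple(meta.values())

raw = "Method: B3LYP\nBasis: 6-31G*\nFinal energy: -154.123456\n"
sql, values = to_insert(parse_metadata(raw))
print(sql, values)
```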
PHENIX Measurements of Heavy Flavor in Small Systems
NASA Astrophysics Data System (ADS)
Lebedev, Alexandre
2018-01-01
The study of heavy flavor production in proton-nucleus and nucleus-nucleus collisions is a sensitive probe of the hot and dense matter created in such collisions. The installation of silicon vertex detectors in the PHENIX experiment and the increased performance of the BNL RHIC collider allowed the collection of a large amount of data on heavy flavor production in small colliding systems. In this talk we will present recent PHENIX results on open heavy flavor and quarkonia production in p+p, p+A, d+A, and 3He+A colliding systems over a broad rapidity range, and discuss how these measurements help us to better understand all stages of nuclear collisions at high energy.
The money laundering control act and proposed amendments: Its impact on the casino industry.
Mills, J
1991-12-01
In their efforts to track unreported income, Congress passed the Money Laundering Control Act in 1985. Because they are often involved in large cash transactions, casinos were required to report on cash transactions in amounts of $10,000 or more in much the same manner as banks and other financial institutions. However, because of the unique nature of cash and chip transactions within modern casinos, the Act, or state variants of it, have created significant compliance costs for casinos. This analysis examines the implications of the Act for the casino gaming industry, and evaluates some of the recent suggested Amendments to the Act.
Lin, Yiqing; Li, Weiyong; Xu, Jin; Boulas, Pierre
2015-07-05
The aim of this study is to develop an at-line near infrared (NIR) method for the rapid and simultaneous determination of four structurally similar active pharmaceutical ingredients (APIs) in powder blends intended for the manufacturing of tablets. Two of the four APIs in the formula are present in relatively small amounts, one at 0.95% and the other at 0.57%. Such small amounts in addition to the similarity in structures add significant complexity to the blend uniformity analysis. The NIR method is developed using spectra from six laboratory-created calibration samples augmented by a small set of spectra from a large-scale blending sample. Applying the quality by design (QbD) principles, the calibration design included concentration variations of the four APIs and a main excipient, microcrystalline cellulose. A bench-top FT-NIR instrument was used to acquire the spectra. The obtained NIR spectra were analyzed by applying principal component analysis (PCA) before calibration model development. Score patterns from the PCA were analyzed to reveal relationship between latent variables and concentration variations of the APIs. In calibration model development, both PLS-1 and PLS-2 models were created and evaluated for their effectiveness in predicting API concentrations in the blending samples. The final NIR method shows satisfactory specificity and accuracy. Copyright © 2015 Elsevier B.V. All rights reserved.
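The calibration models described (PLS-1 per API and PLS-2 across all four) are standard chemometric tools; a minimal sketch of the PLS-2 variant using scikit-learn is given below, with randomly generated spectra standing in for the measured NIR data and an arbitrarily chosen number of latent variables.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)

# Stand-ins for NIR calibration data: 30 blend samples x 600 wavelength points,
# and the known concentrations of the four APIs in each sample (% w/w).
X_cal = rng.normal(size=(30, 600))
Y_cal = rng.uniform(low=[20, 10, 0.5, 0.3],
                    high=[30, 15, 1.4, 0.9], size=(30, 4))

# PLS-2: one model predicting all four API concentrations simultaneously.
pls2 = PLSRegression(n_components=5)            # component count is an assumption
pls2.fit(X_cal, Y_cal)

# Predict API concentrations for new blend spectra acquired at-line.
X_new = rng.normal(size=(3, 600))
print(pls2.predict(X_new))                      # shape (3, 4): one column per API

# PLS-1 would instead fit a separate PLSRegression to each column of Y_cal.
```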
Assessing the ERP-SAP implementation strategy from cultural perspectives
NASA Astrophysics Data System (ADS)
Wang, Gunawan; Syaiful, Bakhri; Sfenrianto; Nurul, Fajar Ahmad
2017-09-01
Implementing ERP-SAP projects in large Indonesian enterprises frequently creates headaches for the consultants, since there is always a large gap between the outcomes of SAP and the expected results. Indonesian enterprises have experienced huge investments that ended up with minor benefits. Despite its unprecedented benefits, the SAP strategy is still considered a mandatory enterprise system for every enterprise competing in the marketplace. The article examines SAP implementation from cultural perspectives to present a new horizon that is commonly ignored by major Indonesian enterprises. The article applies multiple case studies of three large Indonesian enterprises: KS, the largest steel producer; GEM, a subsidiary of a conglomerate operating in the mining industry; and HS, a subsidiary of the largest retailer in Asia with more than 700 stores in Indonesia. The outcome of the article is expected to provide a comprehensive analysis, from cultural perspectives, of common problems faced by SAP consultants.
26 CFR 1.263(a)-4 - Amounts paid to acquire or create intangibles.
Code of Federal Regulations, 2010 CFR
2010-04-01
...). (v) Creation of package design. Amounts paid to develop a package design are treated as amounts that...). For purposes of this section, the term package design means the specific graphic arrangement or design... design of a container with respect to its shape or function. (4) Coordination with other provisions of...
Polarization of the prompt gamma-ray emission from the gamma-ray burst of 6 December 2002.
Coburn, Wayne; Boggs, Steven E
2003-05-22
Observations of the afterglows of gamma-ray bursts (GRBs) have revealed that they lie at cosmological distances, and so correspond to the release of an enormous amount of energy. The nature of the central engine that powers these events and the prompt gamma-ray emission mechanism itself remain enigmatic because, once a relativistic fireball is created, the physics of the afterglow is insensitive to the nature of the progenitor. Here we report the discovery of linear polarization in the prompt gamma-ray emission from GRB021206, which indicates that it is synchrotron emission from relativistic electrons in a strong magnetic field. The polarization is at the theoretical maximum, which requires a uniform, large-scale magnetic field over the gamma-ray emission region. A large-scale magnetic field constrains possible progenitors to those either having or producing organized fields. We suggest that the large magnetic energy densities in the progenitor environment (comparable to the kinetic energy densities of the fireball), combined with the large-scale structure of the field, indicate that magnetic fields drive the GRB explosion.
Modeling The Atmosphere In The Era Of Big Data From Extremely Wide Field-Of-View Telescopes
NASA Astrophysics Data System (ADS)
Gonzalez Quiles, Junellie; Nordin, Jakob
2018-01-01
Surveys like the Sloan Digital Sky Survey (SDSS), Pan-STARRS, and the Palomar Transient Factory (PTF) collect large amounts of data, which need to be processed and calibrated in order to correct for various factors. One of the limiting factors in obtaining high-quality data is the atmosphere, and it is therefore essential to find an appropriate calibration for atmospheric extinction. A physical atmospheric model is expected to be more effective at calibrating atmospheric extinction than the photometric calibration currently used by PTF, because it can account for rapid atmospheric fluctuations and for objects of different colors. We focused on creating tools to model atmospheric extinction for the upcoming Zwicky Transient Facility (ZTF). In order to model the atmosphere, we created a program that combines input data and catalogue values and handles them efficiently. Then, using PTF data and the SDSS catalogue, we created several models to fit the data and tested the quality of the fits by chi-square minimization. This will allow us to optimize atmospheric extinction corrections for the upcoming ZTF in the near future.
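The abstract does not spell out the form of the extinction model, so the sketch below assumes a common minimal one: observed minus catalogue magnitude modeled as a zero point plus a linear extinction coefficient times airmass plus a color term, fit by least squares (chi-square minimization with uniform errors); all star values are illustrative placeholders.

```python
import numpy as np

# Per-star calibration data (illustrative placeholders): instrumental magnitude,
# catalogue magnitude, airmass at the time of observation, and catalogue color.
m_inst  = np.array([15.21, 14.87, 16.02, 15.55, 14.40])
m_cat   = np.array([14.90, 14.61, 15.63, 15.20, 14.11])
airmass = np.array([1.05, 1.30, 1.80, 1.15, 1.55])
color   = np.array([0.45, 0.60, 0.30, 0.52, 0.71])   # e.g. g - r

# Assumed model: m_inst - m_cat = ZP + k * airmass + c * color
# Solving this linear system by least squares minimizes chi-square for equal errors.
A = np.column_stack([np.ones_like(airmass), airmass, color])
y = m_inst - m_cat
(zp, k, c), *_ = np.linalg.lstsq(A, y, rcond=None)
print(f"zero point = {zp:.3f}, extinction k = {k:.3f} mag/airmass, color term = {c:.3f}")
```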
CWDM for very-short-reach and optical-backplane interconnections
NASA Astrophysics Data System (ADS)
Laha, Michael J.
2002-06-01
Coarse Wavelength Division Multiplexing (CWDM) provides access to next-generation optical interconnect data rates by utilizing conventional electro-optical components that are widely available in the market today. This is achieved through the use of CWDM multiplexers and demultiplexers that integrate commodity-type active components, lasers and photodiodes, into small optical subassemblies. In contrast to dense wavelength division multiplexing (DWDM), in which multiple serial data streams are combined to create aggregate data pipes perhaps 100s of gigabits wide, CWDM uses multiple laser sources contained in one module to create a serial-equivalent data stream. For example, four 2.5 Gb/s lasers are multiplexed to create a 10 Gb/s data pipe. The advantages of CWDM over traditional serial optical interconnects include lower module power consumption, smaller packaging, and a superior electrical interface. This discussion will detail the concept of CWDM and the design parameters that are considered when productizing a CWDM module into an industry-standard optical interconnect. Additionally, a scalable parallel CWDM hybrid architecture will be described that allows the transport of large amounts of data from rack to rack in an economical fashion. This particular solution is targeted at solving optical backplane bottleneck problems predicted for the next generation of terabit and petabit routers.
A replacement for islet equivalents with improved reliability and validity.
Huang, Han-Hung; Ramachandran, Karthik; Stehno-Bittel, Lisa
2013-10-01
Islet equivalent (IE), the standard estimate of isolated islet volume, is an essential measure to determine the amount of transplanted islet tissue in the clinic and is used in research laboratories to normalize results, yet it is based on the false assumption that all islets are spherical. Here, we developed and tested a new easy-to-use method to quantify islet volume with greater accuracy. Isolated rat islets were dissociated into single cells, and the total cell number per islet was determined by using computer-assisted cytometry. Based on the cell number per islet, we created a regression model to convert islet diameter to cell number with a high R2 value (0.8) and good validity and reliability with the same model applicable to young and old rats and males or females. Conventional IE measurements overestimated the tissue volume of islets. To compare results obtained using IE or our new method, we compared Glut2 protein levels determined by Western Blot and proinsulin content via ELISA between small (diameter≤100 μm) and large (diameter≥200 μm) islets. When normalized by IE, large islets showed significantly lower Glut2 level and proinsulin content. However, when normalized by cell number, large and small islets had no difference in Glut2 levels, but large islets contained more proinsulin. In conclusion, normalizing islet volume by IE overestimated the tissue volume, which may lead to erroneous results. Normalizing by cell number is a more accurate method to quantify tissue amounts used in islet transplantation and research.
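The published regression model is not reproduced in the abstract; purely to illustrate converting measured islet diameters to cell numbers, the sketch below fits a power law N = a*d^b on log-transformed data, where both the functional form and the calibration numbers are assumptions rather than the authors' model.

```python
import numpy as np

# Illustrative calibration data: islet diameter (um) and dissociated cell count
# per islet as measured by computer-assisted cytometry (values are made up).
diameter = np.array([60, 90, 120, 150, 200, 250], dtype=float)
cells    = np.array([320, 900, 1900, 3300, 7200, 13000], dtype=float)

# Assumed functional form N = a * d**b, fit as a line in log-log space.
b, log_a = np.polyfit(np.log(diameter), np.log(cells), 1)
a = np.exp(log_a)

def cells_from_diameter(d_um):
    """Convert an islet diameter (um) to an estimated cell number."""
    return a * d_um ** b

# Normalizing assay results by estimated cell number rather than islet
# equivalents avoids assuming every islet is a sphere of standard volume.
print(round(cells_from_diameter(100)), round(cells_from_diameter(200)))
```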
A Parallel Multiclassification Algorithm for Big Data Using an Extreme Learning Machine.
Duan, Mingxing; Li, Kenli; Liao, Xiangke; Li, Keqin
2018-06-01
As data sets become larger and more complicated, an extreme learning machine (ELM) that runs in a traditional serial environment cannot realize its ability to be fast and effective. Although a parallel ELM (PELM) based on MapReduce to process large-scale data shows more efficient learning speed than identical ELM algorithms in a serial environment, some operations, such as intermediate results stored on disks and multiple copies for each task, are indispensable, and these operations create a large amount of extra overhead and degrade the learning speed and efficiency of the PELMs. In this paper, an efficient ELM based on the Spark framework (SELM), which includes three parallel subalgorithms, is proposed for big data classification. By partitioning the corresponding data sets reasonably, the hidden layer output matrix calculation and matrix decomposition subalgorithms perform most of the computations locally. At the same time, they retain the intermediate results in distributed memory and cache the diagonal matrix as broadcast variables instead of making several copies for each task, reducing a large amount of the cost, and these actions strengthen the learning ability of the SELM. Finally, we implement our SELM algorithm to classify large data sets. Extensive experiments have been conducted to validate the effectiveness of the proposed algorithms. As shown, our SELM achieves increasing speedups on clusters of 10, 15, 20, 25, 30, and 35 nodes.
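The serial core that the SELM parallelizes is compact: random input weights, a hidden-layer output matrix H, and output weights obtained from a pseudo-inverse. A minimal NumPy sketch of that serial baseline is shown below; the Spark partitioning and caching themselves are not reproduced.

```python
import numpy as np

def train_elm(X, T, n_hidden=50, seed=0):
    """Train a basic ELM classifier; T is one-hot targets of shape (n, classes)."""
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(X.shape[1], n_hidden))   # random input weights (not trained)
    b = rng.normal(size=n_hidden)                 # random hidden biases
    H = np.tanh(X @ W + b)                        # hidden-layer output matrix
    beta = np.linalg.pinv(H) @ T                  # output weights via pseudo-inverse
    return W, b, beta

def predict_elm(X, W, b, beta):
    return np.argmax(np.tanh(X @ W + b) @ beta, axis=1)

# Toy multiclass problem: three Gaussian blobs.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(loc=m, size=(50, 2)) for m in (0.0, 3.0, 6.0)])
y = np.repeat([0, 1, 2], 50)
T = np.eye(3)[y]                                  # one-hot targets

W, b, beta = train_elm(X, T)
print("training accuracy:", (predict_elm(X, W, b, beta) == y).mean())
# The SELM in the paper distributes the computation of H and the matrix
# decomposition across Spark partitions, caching intermediates in memory.
```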
LoyalTracker: Visualizing Loyalty Dynamics in Search Engines.
Shi, Conglei; Wu, Yingcai; Liu, Shixia; Zhou, Hong; Qu, Huamin
2014-12-01
The huge amount of user log data collected by search engine providers creates new opportunities to understand user loyalty and defection behavior at an unprecedented scale. However, this also poses a great challenge to analyze the behavior and glean insights into the complex, large data. In this paper, we introduce LoyalTracker, a visual analytics system to track user loyalty and switching behavior towards multiple search engines from the vast amount of user log data. We propose a new interactive visualization technique (flow view) based on a flow metaphor, which conveys a proper visual summary of the dynamics of user loyalty of thousands of users over time. Two other visualization techniques, a density map and a word cloud, are integrated to enable analysts to gain further insights into the patterns identified by the flow view. Case studies and the interview with domain experts are conducted to demonstrate the usefulness of our technique in understanding user loyalty and switching behavior in search engines.
BANNER: an executable survey of advances in biomedical named entity recognition.
Leaman, Robert; Gonzalez, Graciela
2008-01-01
There has been an increasing amount of research on biomedical named entity recognition, the most basic text extraction problem, resulting in significant progress by different research teams around the world. This has created a need for a freely-available, open source system implementing the advances described in the literature. In this paper we present BANNER, an open-source, executable survey of advances in biomedical named entity recognition, intended to serve as a benchmark for the field. BANNER is implemented in Java as a machine-learning system based on conditional random fields and includes a wide survey of the best techniques recently described in the literature. It is designed to maximize domain independence by not employing brittle semantic features or rule-based processing steps, and achieves significantly better performance than existing baseline systems. It is therefore useful to developers as an extensible NER implementation, to researchers as a standard for comparing innovative techniques, and to biologists requiring the ability to find novel entities in large amounts of text.
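BANNER itself is a Java system; to illustrate the underlying approach, a conditional random field over simple token-level features, the sketch below uses the separate sklearn-crfsuite package with a deliberately tiny hand-labelled example, and does not reproduce the BANNER feature set.

```python
import sklearn_crfsuite

def token_features(sentence, i):
    """Simple, non-semantic features of the kind a CRF-based NER tagger uses."""
    word = sentence[i]
    return {
        "lower": word.lower(),
        "is_upper": word.isupper(),
        "is_title": word.istitle(),
        "has_digit": any(ch.isdigit() for ch in word),
        "suffix3": word[-3:],
        "prev": sentence[i - 1].lower() if i > 0 else "<BOS>",
        "next": sentence[i + 1].lower() if i < len(sentence) - 1 else "<EOS>",
    }

# Tiny hand-labelled training set (IOB tags for gene/protein mentions).
sentences = [["BRCA1", "regulates", "DNA", "repair", "."],
             ["Expression", "of", "TP53", "was", "reduced", "."]]
labels = [["B-GENE", "O", "O", "O", "O"],
          ["O", "O", "B-GENE", "O", "O", "O"]]

X = [[token_features(s, i) for i in range(len(s))] for s in sentences]

crf = sklearn_crfsuite.CRF(algorithm="lbfgs", c1=0.1, c2=0.1, max_iterations=50)
crf.fit(X, labels)

test = ["BRCA2", "interacts", "with", "RAD51", "."]
print(crf.predict([[token_features(test, i) for i in range(len(test))]]))
```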
Big Data Challenges in Climate Science: Improving the Next-Generation Cyberinfrastructure
NASA Technical Reports Server (NTRS)
Schnase, John L.; Lee, Tsengdar J.; Mattmann, Chris A.; Lynnes, Christopher S.; Cinquini, Luca; Ramirez, Paul M.; Hart, Andre F.; Williams, Dean N.; Waliser, Duane; Rinsland, Pamela;
2016-01-01
The knowledge we gain from research in climate science depends on the generation, dissemination, and analysis of high-quality data. This work comprises technical practice as well as social practice, both of which are distinguished by their massive scale and global reach. As a result, the amount of data involved in climate research is growing at an unprecedented rate. Climate model intercomparison (CMIP) experiments, the integration of observational data and climate reanalysis data with climate model outputs, as seen in the Obs4MIPs, Ana4MIPs, and CREATE-IP activities, and the collaborative work of the Intergovernmental Panel on Climate Change (IPCC) provide examples of the types of activities that increasingly require an improved cyberinfrastructure for dealing with large amounts of critical scientific data. This paper provides an overview of some of climate science's big data problems and the technical solutions being developed to advance data publication, climate analytics as a service, and interoperability within the Earth System Grid Federation (ESGF), the primary cyberinfrastructure currently supporting global climate research activities.
How do dragonflies recover from falling upside down?
NASA Astrophysics Data System (ADS)
Wang, Z. Jane; Melfi, James, Jr.; Leonardo, Anthony
2014-11-01
We release dragonflies from a magnetic tether so that they fall from an initially upside-down orientation. To recover, the dragonflies roll their body 180 degrees every time. This setup offers an effective method for eliciting a stereotypical turn so that we can collect a large amount of data on the same turn. From the wing and body kinematics, we can tease out the strategy dragonflies use to roll their body. We record these flights with three zoomed-in high-speed video cameras. By filming at 4000 to 8000 fps, we measure the wing twist along each of the four wings as part of the 3D wing kinematics. The shape of the wing twist depends on the interaction between the aerodynamic torque and the torque exerted by muscles, therefore providing clues about which of their four wings actively participate in creating the turn. By applying dynamic calculations to the measured kinematics, we further deduce the amount of torque dragonflies exert in order to turn.
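As a rough illustration of the dynamic-calculation step, a net roll torque can be estimated from a measured roll-angle time series by double differentiation under a rigid-body assumption; in the sketch below the frame rate follows the abstract, while the roll trajectory and moment of inertia are invented placeholders, and aerodynamic damping and wing inertia are ignored.

```python
import numpy as np

fps = 4000.0                     # camera frame rate from the abstract
dt = 1.0 / fps

# Hypothetical measured body roll angle (rad) over a short recovery manoeuvre:
# a smooth 180-degree roll completed in about 0.15 s.
t = np.arange(0, 0.15, dt)
roll = np.pi * 0.5 * (1 - np.cos(np.pi * t / 0.15))   # ramps from 0 to pi

# Angular velocity and acceleration by central finite differences.
omega = np.gradient(roll, dt)
alpha = np.gradient(omega, dt)

# Rigid-body estimate of net roll torque, tau = I * alpha, using an assumed
# roll moment of inertia for a dragonfly body (order-of-magnitude placeholder).
I_roll = 2e-9                    # kg m^2 (assumption, not a measured value)
tau = I_roll * alpha

print(f"peak roll rate  : {np.max(np.abs(omega)):.1f} rad/s")
print(f"peak net torque : {np.max(np.abs(tau)):.2e} N m")
```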
Studies on the treatment of urine by the biological purification-UV photocatalytic oxidation
NASA Astrophysics Data System (ADS)
Liu, Ch. Ch; Liu, R. D.; Liu, X. S.; Chen, M.; Bian, Z. L.; Hu, J. Ch.
The amount of water consumed during long-term space navigation is large. In order to reduce the burden of supplying water from Earth, the space station needs to resolve the problem of water supply on board. For this reason, the recovery, regeneration, and reuse of crew urine are of key importance, and many investigations of this problem have been reported. In our research, a treatment system based on biological absorption-purification and UV photocatalytic oxidation techniques was created for the comprehensive treatment of fresh crew urine. In this system, the urine solution was used as the nutrient solution for the biological components of the ecological life-support system; after the nutrients were absorbed, the urine was decomposed, metabolized, and partially purified, creating favorable conditions for the follow-up treatment by UV photocatalytic oxidation. After these two processes, the treated urine solution met the GB5749-85 water quality standard.
Electron Impact Ionization: A New Parameterization for 100 eV to 1 MeV Electrons
NASA Technical Reports Server (NTRS)
Fang, Xiaohua; Randall, Cora E.; Lummerzheim, Dirk; Solomon, Stanley C.; Mills, Michael J.; Marsh, Daniel; Jackman, Charles H.; Wang, Wenbin; Lu, Gang
2008-01-01
Low, medium and high energy electrons can penetrate to the thermosphere (90-400 km; 55-240 miles) and mesosphere (50-90 km; 30-55 miles). These precipitating electrons ionize that region of the atmosphere, creating positively charged atoms and molecules and knocking off other negatively charged electrons. The precipitating electrons also create nitrogen-containing compounds along with other constituents. Since the electron precipitation amounts change within minutes, it is necessary to have a rapid method of computing the ionization and production of nitrogen-containing compounds for inclusion in computationally-demanding global models. A new methodology has been developed, which has parameterized a more detailed model computation of the ionizing impact of precipitating electrons over the very large range of 100 eV up to 1,000,000 eV. This new parameterization method is more accurate than a previous parameterization scheme, when compared with the more detailed model computation. Global models at the National Center for Atmospheric Research will use this new parameterization method in the near future.
Problems and Issues of High Rise Low Cost Housing in Malaysia
NASA Astrophysics Data System (ADS)
Wahi, Noraziah; Mohamad Zin, Rosli; Munikanan, Vikneswaran; Mohamad, Ismail; Junaini, Syahrizan
2018-03-01
Major cities in developing countries are experiencing an enormous migration of people from rural regions. This migration is driven mostly by the pursuit of careers and the expectation of higher salaries. Consequently, the large number of immigrants from the countryside to the cities each year has created a great demand for urban housing. As a result, Kuala Lumpur, Selangor, and the surrounding areas are now crowded with low-income groups who cannot afford to own an affordable house. The government of Malaysia is aware of this situation and has therefore created low-cost housing, especially for the urban poor. However, many issues and problems have arisen regarding low-cost housing in Malaysia, especially in urban areas. This research is a study of the problems and issues of high-rise low-cost housing in Malaysia. The problems associated with high-rise low-cost housing need to be examined to ensure the success of future low-cost housing development in Malaysia.
Faunce, Thomas; Urbas, Gregor; Skillen, Lesley; Smith, Marc
2010-12-01
The Australian Federal Government expends increasingly large amounts of money on pharmaceuticals and medical devices. It is likely, given government experience in other jurisdictions, that a significant proportion of this expenditure is paid as a result of fraudulent claims presented by corporations. In the United States, legislation such as the False Claims Act 1986 (US), the Fraud Enforcement and Recovery Act 2009 (US), the Stark (Physician Self-Referral) Statute 1995 (US), the Anti-Kickback Statute 1972 (US), the Food, Drug and Cosmetic Act 1938 (US), the Social Security Act 1965 (US), and the Patient Protection and Affordable Care Act 2010 (US) has created systematic processes allowing the United States Federal Government to recover billions of dollars in fraudulently made claims in the health and procurement areas. The crucial component involves the creation of financial incentives for information about fraud to be revealed from within the corporate sector to the appropriate state officials. This article explores the opportunities for creating a similar system in Australia in the health care setting.
Using Online Lectures to Make Time for Active Learning
Prunuske, Amy J.; Batzli, Janet; Howell, Evelyn; Miller, Sarah
2012-01-01
To make time in class for group activities devoted to critical thinking, we integrated a series of short online lectures into the homework assignments of a large, introductory biology course at a research university. The majority of students viewed the online lectures before coming to class and reported that the online lectures helped them to complete the in-class activity and did not increase the amount of time they devoted to the course. In addition, students who viewed the online lecture performed better on clicker questions designed to test lower-order cognitive skills. The in-class activities then gave the students practice analyzing the information in groups and provided the instructor with feedback about the students’ understanding of the material. On the basis of the results of this study, we support creating hybrid course models that allow students to learn the fundamental information outside of class time, thereby creating time during the class period to be dedicated toward the conceptual understanding of the material. PMID:22714412
Information Fusion of Conflicting Input Data.
Mönks, Uwe; Dörksen, Helene; Lohweg, Volker; Hübner, Michael
2016-10-29
Sensors, and also actuators or external sources such as databases, serve as data sources in order to realise condition monitoring of industrial applications or the acquisition of characteristic parameters like production speed or reject rate. Modern facilities create such a large amount of complex data that a machine operator is unable to comprehend and process the information contained in the data. Thus, information fusion mechanisms gain increasing importance. Besides the management of large amounts of data, further challenges towards the fusion algorithms arise from epistemic uncertainties (incomplete knowledge) in the input signals as well as conflicts between them. These aspects must be considered during information processing to obtain reliable results, which are in accordance with the real world. The analysis of the scientific state of the art shows that current solutions fulfil said requirements at most only partly. This article proposes the multilayered information fusion system MACRO (multilayer attribute-based conflict-reducing observation) employing the μ BalTLCS (fuzzified balanced two-layer conflict solving) fusion algorithm to reduce the impact of conflicts on the fusion result. The performance of the contribution is shown by its evaluation in the scope of a machine condition monitoring application under laboratory conditions. Here, the MACRO system yields the best results compared to state-of-the-art fusion mechanisms. The utilised data is published and freely accessible.
Information Fusion of Conflicting Input Data
Mönks, Uwe; Dörksen, Helene; Lohweg, Volker; Hübner, Michael
2016-01-01
Sensors, and also actuators or external sources such as databases, serve as data sources in order to realise condition monitoring of industrial applications or the acquisition of characteristic parameters like production speed or reject rate. Modern facilities create such a large amount of complex data that a machine operator is unable to comprehend and process the information contained in the data. Thus, information fusion mechanisms gain increasing importance. Besides the management of large amounts of data, further challenges towards the fusion algorithms arise from epistemic uncertainties (incomplete knowledge) in the input signals as well as conflicts between them. These aspects must be considered during information processing to obtain reliable results, which are in accordance with the real world. The analysis of the scientific state of the art shows that current solutions fulfil said requirements at most only partly. This article proposes the multilayered information fusion system MACRO (multilayer attribute-based conflict-reducing observation) employing the μBalTLCS (fuzzified balanced two-layer conflict solving) fusion algorithm to reduce the impact of conflicts on the fusion result. The performance of the contribution is shown by its evaluation in the scope of a machine condition monitoring application under laboratory conditions. Here, the MACRO system yields the best results compared to state-of-the-art fusion mechanisms. The utilised data is published and freely accessible. PMID:27801874
Extra-metabolic energy use and the rise in human hyper-density
NASA Astrophysics Data System (ADS)
Burger, Joseph R.; Weinberger, Vanessa P.; Marquet, Pablo A.
2017-03-01
Humans, like all organisms, are subject to fundamental biophysical laws. Van Valen predicted that, because of zero-sum dynamics, all populations of all species in a given environment flux the same amount of energy on average. Damuth’s ‘energetic equivalence rule’ supported Van Valen’s conjecture by showing a tradeoff between few big animals per area with high individual metabolic rates compared to abundant small species with low energy requirements. We use metabolic scaling theory to compare variation in densities and individual energy use in human societies to other land mammals. We show that hunter-gatherers occurred at densities lower than the average for a mammal of our size. Most modern humans, in contrast, concentrate in large cities at densities up to four orders of magnitude greater than hunter-gatherers, yet consume up to two orders of magnitude more energy per capita. Today, cities across the globe flux greater energy than net primary productivity on a per area basis. This is possible by importing enormous amounts of energy and materials required to sustain hyper-dense, modern humans. The metabolic rift with nature created by modern cities fueled largely by fossil energy poses formidable challenges for establishing a sustainable relationship on a rapidly urbanizing, yet finite planet.
Extra-metabolic energy use and the rise in human hyper-density.
Burger, Joseph R; Weinberger, Vanessa P; Marquet, Pablo A
2017-03-02
Humans, like all organisms, are subject to fundamental biophysical laws. Van Valen predicted that, because of zero-sum dynamics, all populations of all species in a given environment flux the same amount of energy on average. Damuth's 'energetic equivalence rule' supported Van Valen's conjecture by showing a tradeoff between few big animals per area with high individual metabolic rates compared to abundant small species with low energy requirements. We use metabolic scaling theory to compare variation in densities and individual energy use in human societies to other land mammals. We show that hunter-gatherers occurred at densities lower than the average for a mammal of our size. Most modern humans, in contrast, concentrate in large cities at densities up to four orders of magnitude greater than hunter-gatherers, yet consume up to two orders of magnitude more energy per capita. Today, cities across the globe flux greater energy than net primary productivity on a per area basis. This is possible by importing enormous amounts of energy and materials required to sustain hyper-dense, modern humans. The metabolic rift with nature created by modern cities fueled largely by fossil energy poses formidable challenges for establishing a sustainable relationship on a rapidly urbanizing, yet finite planet.
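The energetic equivalence argument rests on simple allometric arithmetic: if individual metabolic rate scales as M^0.75 and population density as M^-0.75, their product, the energy flux per unit area, is independent of body mass. The short sketch below works through that cancellation with arbitrary normalization constants that are illustrative only.

```python
# Energetic equivalence in two lines of allometry: population energy flux per
# area is B * D, with B ~ M^0.75 and D ~ M^-0.75, so the mass terms cancel.
def population_flux(mass_kg, b0=4.0, d0=100.0):
    """Flux per unit area (arbitrary units); b0 and d0 are illustrative constants."""
    metabolic_rate = b0 * mass_kg ** 0.75      # individual metabolic rate
    density = d0 * mass_kg ** -0.75            # individuals per unit area
    return metabolic_rate * density

for mass in (0.01, 1.0, 100.0):                # shrew-, rabbit-, and human-sized
    print(f"M = {mass:6.2f} kg -> flux = {population_flux(mass):.1f}")
# Identical fluxes across body masses; modern urban humans break the pattern by
# importing extra-metabolic energy rather than by changing their biology.
```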
The Opportunity and Challenge of The Age of Big Data
NASA Astrophysics Data System (ADS)
Yunguo, Hong
2017-11-01
The arrival of the big data age has gradually expanded the scale of China's information industry, creating favorable conditions for the expansion of information technology and computer networks. Based on big data, computer system services are becoming more complete and data processing within these systems more efficient, providing an important guarantee for the implementation of production plans across industries. At the same time, the rapid development of fields such as the Internet of Things, social tools, and cloud computing, together with widening information channels, is increasing the amount of data and expanding the influence of the big data age. We need to face the opportunities and challenges of the big data age correctly and use data resources effectively. On this basis, this paper studies the opportunities and challenges of the era of big data.
Power feasibility of implantable digital spike sorting circuits for neural prosthetic systems.
Zumsteg, Zachary S; Kemere, Caleb; O'Driscoll, Stephen; Santhanam, Gopal; Ahmed, Rizwan E; Shenoy, Krishna V; Meng, Teresa H
2005-09-01
A new class of neural prosthetic systems aims to assist disabled patients by translating cortical neural activity into control signals for prosthetic devices. Based on the success of proof-of-concept systems in the laboratory, there is now considerable interest in increasing system performance and creating implantable electronics for use in clinical systems. A critical question that impacts system performance and the overall architecture of these systems is whether it is possible to identify the neural source of each action potential (spike sorting) in real-time and with low power. Low power is essential both for power supply considerations and heat dissipation in the brain. In this paper we report that state-of-the-art spike sorting algorithms are not only feasible using modern complementary metal oxide semiconductor very large scale integration processes, but may represent the best option for extracting large amounts of data in implantable neural prosthetic interfaces.
The role of digital cartographic data in the geosciences
Guptill, S.C.
1983-01-01
The increasing demand of the Nation's natural resource developers for the manipulation, analysis, and display of large quantities of earth-science data has necessitated the use of computers and the building of geoscience information systems. These systems require, in digital form, the spatial data on map products. The basic cartographic data shown on quadrangle maps provide a foundation for the addition of geological and geophysical data. If geoscience information systems are to realize their full potential, large amounts of digital cartographic base data must be available. A major goal of the U.S. Geological Survey is to create, maintain, manage, and distribute a national cartographic and geographic digital database. This unified database will contain numerous categories (hydrography, hypsography, land use, etc.) that, through the use of standardized data-element definitions and formats, can be used easily and flexibly to prepare cartographic products and perform geoscience analysis. © 1983.
Costs of migratory decisions: A comparison across eight white stork populations
Flack, Andrea; Fiedler, Wolfgang; Blas, Julio; Pokrovsky, Ivan; Kaatz, Michael; Mitropolsky, Maxim; Aghababyan, Karen; Fakriadis, Ioannis; Makrigianni, Eleni; Jerzak, Leszek; Azafzaf, Hichem; Feltrup-Azafzaf, Claudia; Rotics, Shay; Mokotjomela, Thabiso M.; Nathan, Ran; Wikelski, Martin
2016-01-01
Annual migratory movements can range from a few tens to thousands of kilometers, creating unique energetic requirements for each specific species and journey. Even within the same species, migration costs can vary largely because of flexible, opportunistic life history strategies. We uncover the large extent of variation in the lifetime migratory decisions of young white storks originating from eight populations. Not only did juvenile storks differ in their geographically distinct wintering locations, their diverse migration patterns also affected the amount of energy individuals invested for locomotion during the first months of their life. Overwintering in areas with higher human population reduced the stork’s overall energy expenditure because of shorter daily foraging trips, closer wintering grounds, or a complete suppression of migration. Because migrants can change ecological processes in several distinct communities simultaneously, understanding their life history decisions helps not only to protect migratory species but also to conserve stable ecosystems. PMID:26844294
Quantum ensembles of quantum classifiers.
Schuld, Maria; Petruccione, Francesco
2018-02-09
Quantum machine learning witnesses an increasing amount of quantum algorithms for data-driven decision making, a problem with potential applications ranging from automated image recognition to medical diagnosis. Many of those algorithms are implementations of quantum classifiers, or models for the classification of data inputs with a quantum computer. Following the success of collective decision making with ensembles in classical machine learning, this paper introduces the concept of quantum ensembles of quantum classifiers. Creating the ensemble corresponds to a state preparation routine, after which the quantum classifiers are evaluated in parallel and their combined decision is accessed by a single-qubit measurement. This framework naturally allows for exponentially large ensembles in which - similar to Bayesian learning - the individual classifiers do not have to be trained. As an example, we analyse an exponentially large quantum ensemble in which each classifier is weighed according to its performance in classifying the training data, leading to new results for quantum as well as classical machine learning.
Modeling volcanic ash dispersal
DOE Office of Scientific and Technical Information (OSTI.GOV)
Macedonio, Giovanni
2010-10-22
Explosive volcanic eruptions inject into the atmosphere large amounts of volcanic material (ash, blocks and lapilli). Blocks and larger lapilli follow ballistic and non-ballistic trajectories and fall rapidly close to the volcano. In contrast, very fine ashes can remain entrapped in the atmosphere for months to years, and may affect the global climate in the case of large eruptions. Particles having sizes between these two end-members remain airborne from hours to days and can cover wide areas downwind. Such volcanic fallout entails a serious threat to aircraft safety and can create many undesirable effects to the communities located around the volcano. The assessment of volcanic fallout hazard is an important scientific, economic, and political issue, especially in densely populated areas. From a scientific point of view, considerable progress has been made during the last two decades through the use of increasingly powerful computational models and capabilities. Nowadays, models are used to quantify hazard scenarios and/or to give short-term forecasts during emergency situations. This talk will be focused on the main aspects related to modeling volcanic ash dispersal and fallout with application to the well known problem created by the Eyjafjöll volcano in Iceland. Moreover, a short description of the main volcanic monitoring techniques is presented.
Modeling volcanic ash dispersal
Macedonio, Giovanni
2018-05-22
Explosive volcanic eruptions inject into the atmosphere large amounts of volcanic material (ash, blocks and lapilli). Blocks and larger lapilli follow ballistic and non-ballistic trajectories and fall rapidly close to the volcano. In contrast, very fine ashes can remain entrapped in the atmosphere for months to years, and may affect the global climate in the case of large eruptions. Particles having sizes between these two end-members remain airborne from hours to days and can cover wide areas downwind. Such volcanic fallout entails a serious threat to aircraft safety and can create many undesirable effects to the communities located around the volcano. The assessment of volcanic fallout hazard is an important scientific, economic, and political issue, especially in densely populated areas. From a scientific point of view, considerable progress has been made during the last two decades through the use of increasingly powerful computational models and capabilities. Nowadays, models are used to quantify hazard scenarios and/or to give short-term forecasts during emergency situations. This talk will be focused on the main aspects related to modeling volcanic ash dispersal and fallout with application to the well known problem created by the Eyjafjöll volcano in Iceland. Moreover, a short description of the main volcanic monitoring techniques is presented.
Runaway electrons as a source of impurity and reduced fusion yield in the dense plasma focus
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lerner, Eric J.; Yousefi, Hamid R.
2014-10-15
Impurities produced by the vaporization of metals in the electrodes may be a major cause of reduced fusion yields in high-current dense plasma focus devices. We propose here that a major, but hitherto-overlooked, cause of such impurities is vaporization by runaway electrons during the breakdown process at the beginning of the current pulse. This process is sufficient to account for the large amount of erosion observed in many dense plasma focus devices on the anode very near to the insulator. The erosion is expected to become worse with lower pressures, typical of machines with large electrode radii, and would explain the plateauing of fusion yield observed in such machines at higher peak currents. Such runaway electron vaporization can be eliminated by the proper choice of electrode material, by reducing electrode radii and thus increasing fill gas pressure, or by using pre-ionization to eliminate the large fields that create runaway electrons. If these steps are combined with monolithic electrodes to eliminate arcing erosion, large reductions in impurities and large increases in fusion yield may be obtained, as the I^4 scaling is extended to higher currents.
Human Growth Hormone Adsorption Kinetics and Conformation on Self-Assembled Monolayers
Buijs, Jos; Britt, David W.; Hlady, Vladimir
2012-01-01
The adsorption process of the recombinant human growth hormone on organic films, created by self-assembly of octadecyltrichlorosilane, arachidic acid, and dipalmitoylphosphatidylcholine, is investigated and compared to adsorption on silica and methylated silica substrates. Information on the adsorption process of human growth hormone (hGH) is obtained by using total internal reflection fluorescence (TIRF). The intensity, spectra, and quenching of the intrinsic fluorescence emitted by the growth hormone’s single tryptophan are monitored and related to adsorption kinetics and protein conformation. For the various alkylated hydrophobic surfaces with differences in surface density and conformational freedom it is observed that the adsorbed amount of growth hormone is relatively large if the alkyl chains are in an ordered structure while the amounts adsorbed are considerably lower for adsorption onto less ordered alkyl chains of fatty acid and phospholipid layers. Adsorption on methylated surfaces results in a relatively large conformational change in the growth hormone’s structure, as displayed by a 7 nm blue shift in emission wavelength and a large increase in the effectiveness of fluorescence quenching. Conformational changes are less evident for hGH adsorption onto the fatty acid and phospholipid alkyl chains. Adsorption kinetics on the hydrophilic head groups of the self-assembled monolayers are similar to those on solid hydrophilic surfaces. The relatively small conformational changes in the hGH structure observed for adsorption on silica are even further reduced for adsorption on fatty acid head groups. PMID:25125795
Using Unplanned Fires to Help Suppressing Future Large Fires in Mediterranean Forests
Regos, Adrián; Aquilué, Núria; Retana, Javier; De Cáceres, Miquel; Brotons, Lluís
2014-01-01
Despite the huge resources invested in fire suppression, the impact of wildfires has considerably increased across the Mediterranean region since the second half of the 20th century. Modulating fire suppression efforts in mild weather conditions is an appealing but hotly-debated strategy to use unplanned fires and associated fuel reduction to create opportunities for suppression of large fires in future adverse weather conditions. Using a spatially-explicit fire–succession model developed for Catalonia (Spain), we assessed this opportunistic policy by using two fire suppression strategies that reproduce how firefighters in extreme weather conditions exploit previous fire scars as firefighting opportunities. We designed scenarios by combining different levels of fire suppression efficiency and climatic severity for a 50-year period (2000–2050). An opportunistic fire suppression policy induced large-scale changes in fire regimes and decreased the area burnt under extreme climate conditions, but only accounted for up to 18–22% of the area to be burnt in reference scenarios. The area suppressed in adverse years tended to increase in scenarios with increasing amounts of area burnt during years dominated by mild weather. Climate change had counterintuitive effects on opportunistic fire suppression strategies. Climate warming increased the incidence of large fires under uncontrolled conditions but also indirectly increased opportunities for enhanced fire suppression. Therefore, to shift fire suppression opportunities from adverse to mild years, we would require a disproportionately large amount of area burnt in mild years. We conclude that the strategic planning of fire suppression resources has the potential to become an important cost-effective fuel-reduction strategy at large spatial scale. We do however suggest that this strategy should probably be accompanied by other fuel-reduction treatments applied at broad scales if large-scale changes in fire regimes are to be achieved, especially in the wider context of climate change. PMID:24727853
Automated Meteor Detection by All-Sky Digital Camera Systems
NASA Astrophysics Data System (ADS)
Suk, Tomáš; Šimberová, Stanislava
2017-12-01
We have developed a set of methods to detect meteor light traces captured by all-sky CCD cameras. Operating at small automatic observatories (stations), these cameras create a network spread over a large territory. Image data coming from these stations are merged in one central node. Since a vast amount of data is collected by the stations in a single night, robotic storage and analysis are essential to processing. The proposed methodology is adapted to data from a network of automatic stations equipped with digital fish-eye cameras and includes data capturing, preparation, pre-processing, analysis, and finally recognition of objects in time sequences. In our experiments we utilized real observed data from two stations.
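As an illustration of the kind of pre-processing such a pipeline might perform, here is a toy frame-differencing step on synthetic frames; it is not the authors' detection method, and the threshold rule is an assumption.

    # Toy illustration: frame differencing plus thresholding to flag bright-streak
    # candidates (synthetic frames, not the network's actual data or pipeline).
    import numpy as np

    rng = np.random.default_rng(1)
    prev_frame = rng.poisson(10, size=(512, 512)).astype(float)    # sky background
    curr_frame = prev_frame + rng.normal(0, 1, size=(512, 512))
    curr_frame[200, 100:180] += 50.0                               # fake bright streak

    diff = curr_frame - prev_frame
    threshold = diff.mean() + 5 * diff.std()          # assumed 5-sigma rule
    candidate_pixels = np.argwhere(diff > threshold)
    print(f"{len(candidate_pixels)} candidate trail pixels above threshold")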
Exploring Protein Function Using the Saccharomyces Genome Database.
Wong, Edith D
2017-01-01
Elucidating the function of individual proteins will help to create a comprehensive picture of cell biology, as well as shed light on human disease mechanisms, possible treatments, and cures. Due to its compact genome, and extensive history of experimentation and annotation, the budding yeast Saccharomyces cerevisiae is an ideal model organism in which to determine protein function. This information can then be leveraged to infer functions of human homologs. Despite the large amount of research and biological data about S. cerevisiae, many proteins' functions remain unknown. Here, we explore ways to use the Saccharomyces Genome Database (SGD; http://www.yeastgenome.org ) to predict the function of proteins and gain insight into their roles in various cellular processes.
Dynamic Creation of Social Networks for Syndromic Surveillance Using Information Fusion
NASA Astrophysics Data System (ADS)
Holsopple, Jared; Yang, Shanchieh; Sudit, Moises; Stotz, Adam
To enhance the effectiveness of health care, many medical institutions have started transitioning to electronic health and medical records and sharing these records between institutions. The large amount of complex and diverse data makes it difficult to identify and track relationships and trends, such as disease outbreaks, from the data points. INFERD: Information Fusion Engine for Real-Time Decision-Making is an information fusion tool that dynamically correlates and tracks event progressions. This paper presents a methodology that utilizes the efficient and flexible structure of INFERD to create social networks representing progressions of disease outbreaks. Individual symptoms are treated as features, allowing multiple hypotheses to be tracked and analyzed for effective and comprehensive syndromic surveillance.
Selection Rule of Preferred Doping Site for n-Type Oxides
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, C.; Li, J.; Li, S. S.
2012-06-25
Using first-principles calculations and analysis, we show that to create shallow n-type dopants in oxides, anion site doping is preferred for more covalent oxides such as SnO2 and cation site doping is preferred for more ionic oxides such as ZnO. This is because for more ionic oxides, the conduction band minimum (CBM) state actually contains a considerable amount of O 3s orbitals, thus anion site doping can cause large perturbation on the CBM and consequently produces deeper donor levels. We also show that whether it is cation site doping or anion site doping, the oxygen-poor condition should always be used.
Big Data and machine learning in radiation oncology: State of the art and future prospects.
Bibault, Jean-Emmanuel; Giraud, Philippe; Burgun, Anita
2016-11-01
Precision medicine relies on an increasing amount of heterogeneous data. Advances in radiation oncology, through the use of CT scans, dosimetry, and imaging performed before each fraction, have generated a considerable flow of data that needs to be integrated. At the same time, electronic health records now provide phenotypic profiles of large cohorts of patients that could be correlated to this information. In this review, we describe methods that could be used to create integrative predictive models in radiation oncology. Potential uses of machine learning methods such as support vector machines, artificial neural networks, and deep learning are also discussed. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
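A minimal sketch of one model class mentioned in the review (a support vector machine), using scikit-learn on synthetic covariates; the features, outcome and parameters are placeholders, not anything from the review.

    # Hedged sketch: an SVM on synthetic "dosimetric/clinical" features, only to
    # illustrate the class of model discussed (scikit-learn assumed available).
    import numpy as np
    from sklearn.svm import SVC
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(42)
    X = rng.normal(size=(200, 5))                     # synthetic covariates
    y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 0.5, 200) > 0).astype(int)  # toy outcome

    model = SVC(kernel="rbf", C=1.0, gamma="scale")
    scores = cross_val_score(model, X, y, cv=5)
    print("cross-validated accuracy:", scores.mean().round(3))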
NASA Astrophysics Data System (ADS)
Rochelo, Mark
Urbanization is a fundamental reality in developed and developing countries around the world, creating large concentrations of population centered on cities and urban centers. Cities can offer many opportunities for those residing there, including infrastructure, health services, rescue services and more. The living density of cities creates opportunities for more effective and environmentally friendly housing, transportation and resource use. Cities play a vital role in generating economic production, both as entities in themselves and as parts of larger urban complexes. These benefits can reach an extraordinary number of people, but only if proper planning and consideration are undertaken. Global urbanization is a progressive evolution, unique in each spatial location while consistent with an overall growth pattern and trend. Remotely sensing these patterns with the last forty years of spaceborne satellite observations, to understand how urbanization has developed, is important for understanding past growth as well as planning for the future. Imagery from the Landsat program provides the temporal component: the first Landsat satellite was launched in 1972, and the program offers the spatial resolution needed to cover a large metropolitan statistical area and monitor urban growth and change at large scale. This research maps urban spatial and population growth over the Miami - Fort Lauderdale - West Palm Beach Metropolitan Statistical Area (MSA), covering Miami-Dade, Broward, and Palm Beach counties in Southeast Florida, from 1974 to 2010 using Landsat imagery. Supervised maximum likelihood classification was performed with a combination of spectral and textural training fields in ERDAS IMAGINE 2014 to classify the images into urban and non-urban areas. Dasymetric mapping of the classification results was combined with census tract data to create a coherent depiction of the Miami - Fort Lauderdale - West Palm Beach MSA. Static maps and animated files were created from the final datasets for enhanced visualization and understanding of the MSA's evolution from 60-meter resolution remotely sensed Landsat images. The simplified methodology will create a database for urban planning and population growth studies as well as future work in this area.
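A minimal sketch of the supervised maximum likelihood decision rule underlying such a classification, using synthetic band statistics rather than the study's Landsat data; the class means and covariances below are invented for illustration.

    # Sketch of the supervised maximum likelihood rule for urban/non-urban mapping
    # (synthetic band values; not the study's imagery or training fields).
    import numpy as np
    from scipy.stats import multivariate_normal

    rng = np.random.default_rng(7)
    urban_train    = rng.normal([0.30, 0.25, 0.20], 0.03, size=(300, 3))   # fake band reflectances
    nonurban_train = rng.normal([0.10, 0.35, 0.40], 0.03, size=(300, 3))

    classes = {
        "urban":     multivariate_normal(urban_train.mean(0),    np.cov(urban_train.T)),
        "non-urban": multivariate_normal(nonurban_train.mean(0), np.cov(nonurban_train.T)),
    }

    pixel = np.array([0.28, 0.26, 0.22])
    label = max(classes, key=lambda c: classes[c].logpdf(pixel))   # highest likelihood wins
    print(label)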
31 CFR 344.2 - What general provisions apply to SLGS securities?
Code of Federal Regulations, 2010 CFR
2010-07-01
... program to create a cost-free option; (ii) To purchase a SLGS security with any amount received from the... would result in the SLGS program being used to create a cost-free option. In addition, this practice is... ability to cancel in these circumstances would result in the SLGS program being used to create a cost-free...
31 CFR 344.2 - What general provisions apply to SLGS securities?
Code of Federal Regulations, 2013 CFR
2013-07-01
... program to create a cost-free option; (ii) To purchase a SLGS security with any amount received from the... would result in the SLGS program being used to create a cost-free option. In addition, this practice is... ability to cancel in these circumstances would result in the SLGS program being used to create a cost-free...
31 CFR 344.2 - What general provisions apply to SLGS securities?
Code of Federal Regulations, 2011 CFR
2011-07-01
... program to create a cost-free option; (ii) To purchase a SLGS security with any amount received from the... would result in the SLGS program being used to create a cost-free option. In addition, this practice is... ability to cancel in these circumstances would result in the SLGS program being used to create a cost-free...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Weber, Paula D.; Flores, Karen A.; Lord, David L.
Bryan Mound 5 (BM5) and West Hackberry 9 (WH9) have the potential to create a significant amount of new storage space should the caverns be deemed "leach-ready". This study discusses the original drilling history of the caverns, surrounding geology, current stability, and, based on this culmination of data, makes a preliminary assessment of the leach potential for the cavern. The risks associated with leaching BM5 present substantial problems for the SPR. The odd shape and large amount of insoluble material make it difficult to determine whether a targeted leach would have the desired effect and create useable ullage or further distort the shape with preferential leaching. The likelihood of salt falls and damaged or severed casing string is significant. In addition, a targeted leach would require the relocation of approximately 27 MMB of oil. Due to the abundance of unknown factors associated with this cavern, a targeted leach of BM5 is not recommended. A targeted leaching of the neck of WH9 could potentially eliminate or diminish the mid-cavern ledge, resulting in a more stable cavern with a more favorable shape. A better understanding of the composition of the surrounding salt and a less complicated leaching history yields more confidence in the ability to successfully leach this region. A targeted leach of WH9 can be recommended upon the completion of a full leach plan with consideration of the impacts upon nearby caverns.
Orzol, Leonard L.; McGrath, Timothy S.
1992-01-01
This report documents modifications to the U.S. Geological Survey modular, three-dimensional, finite-difference, ground-water flow model, commonly called MODFLOW, so that it can read and write files used by a geographic information system (GIS). The modified model program is called MODFLOWARC. Simulation programs such as MODFLOW generally require large amounts of input data and produce large amounts of output data. Viewing data graphically, generating head contours, and creating or editing model data arrays such as hydraulic conductivity are examples of tasks that currently are performed either by the use of independent software packages or by tedious manual editing, manipulating, and transferring data. Programs such as GIS programs are commonly used to facilitate preparation of the model input data and analyze model output data; however, auxiliary programs are frequently required to translate data between programs. Data translations are required when different programs use different data formats. Thus, the user might use GIS techniques to create model input data, run a translation program to convert input data into a format compatible with the ground-water flow model, run the model, run a translation program to convert the model output into the correct format for GIS, and use GIS to display and analyze this output. MODFLOWARC, avoids the two translation steps and transfers data directly to and from the ground-water-flow model. This report documents the design and use of MODFLOWARC and includes instructions for data input/output of the Basic, Block-centered flow, River, Recharge, Well, Drain, Evapotranspiration, General-head boundary, and Streamflow-routing packages. The modification to MODFLOW and the Streamflow-Routing package was minimized. Flow charts and computer-program code describe the modifications to the original computer codes for each of these packages. Appendix A contains a discussion on the operation of MODFLOWARC using a sample problem.
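To illustrate the kind of manual model-to-GIS translation step that MODFLOWARC removes, the sketch below writes a head array to an ESRI ASCII grid, a plain-text raster format many GIS packages read; the grid origin, cell size and no-data value are hypothetical.

    # Illustration only: exporting a 2-D head array to an ESRI ASCII grid that a
    # GIS can read. Header values below are hypothetical.
    import numpy as np

    heads = np.full((4, 5), 101.5)           # stand-in for a MODFLOW head array
    xll, yll, cellsize, nodata = 500000.0, 4000000.0, 100.0, -9999.0

    with open("heads_layer1.asc", "w") as f:
        f.write(f"ncols {heads.shape[1]}\n")
        f.write(f"nrows {heads.shape[0]}\n")
        f.write(f"xllcorner {xll}\n")
        f.write(f"yllcorner {yll}\n")
        f.write(f"cellsize {cellsize}\n")
        f.write(f"NODATA_value {nodata}\n")
        for row in heads:
            f.write(" ".join(f"{v:.3f}" for v in row) + "\n")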
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kesner, A; Poli, G; Beykan, S
Purpose: As the field of Nuclear Medicine moves forward with efforts to integrate radiation dosimetry into clinical practice, we can identify the challenge posed by the lack of standardized dose calculation methods and protocols. All personalized internal dosimetry is derived by projecting biodistribution measurements into dosimetry calculations. In an effort to standardize organization of data and its reporting, we have developed, as a sequel to the EANM recommendation of “Good Dosimetry Reporting”, a freely available biodistribution template, which can be used to create a common point of reference for dosimetry data. It can be disseminated, interpreted, and used for method development widely across the field. Methods: A generalized biodistribution template was built in a comma-delimited format (.csv) to be completed by users performing biodistribution measurements. The template is available for free download. The download site includes instructions and other usage details on the template. Results: This is a new resource developed for the community. It is our hope that users will consider integrating it into their dosimetry operations. Having biodistribution data available and easily accessible for all patients processed is a strategy for organizing large amounts of information. It may enable users to create their own databases that can be analyzed for multiple aspects of dosimetry operations. Furthermore, it enables population data to easily be reprocessed using different dosimetry methodologies. With respect to dosimetry-related research and publications, the biodistribution template can be included as supplementary material, and will allow others in the community to better compare calculations and results achieved. Conclusion: As dosimetry in nuclear medicine becomes more routinely applied in clinical applications, we, as a field, need to develop the infrastructure for handling large amounts of data. Our organ-level biodistribution template can be used as a standard format for data collection and organization, as well as for dosimetry research and software development.
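A hedged sketch of how such a comma-delimited biodistribution table might be consumed downstream; the column names are hypothetical (not the template's actual layout), and the trapezoidal time-integration shown is a generic dosimetry step, not part of the template itself.

    # Reading a biodistribution table (hypothetical columns) and integrating a
    # time-activity curve with the trapezoidal rule, a typical downstream step.
    import io, csv

    csv_text = """organ,time_h,percent_injected_activity
    liver,1,12.0
    liver,4,10.5
    liver,24,6.2
    liver,48,3.1
    """

    times, activities = [], []
    for row in csv.DictReader(io.StringIO(csv_text)):
        if row["organ"].strip() == "liver":
            times.append(float(row["time_h"]))
            activities.append(float(row["percent_injected_activity"]))

    # Time-integrated activity (in %IA*h) by the trapezoidal rule.
    tia = sum((activities[i] + activities[i + 1]) / 2 * (times[i + 1] - times[i])
              for i in range(len(times) - 1))
    print(f"liver time-integrated activity: {tia:.1f} %IA*h")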
How Choice, Co-Creation, and Culture Are Changing What It Means to Be Net Savvy
ERIC Educational Resources Information Center
Lorenzo, George; Oblinger, Diana; Dziuban, Charles
2007-01-01
The vast amount of readily available information is just one reason for transforming the way in conducting research and acquiring knowledge. The nature of information itself has changed. In text and other formats, information is not just created by experts--it is created and co-created by amateurs. More than ever before, people can choose what,…
Personalized direct marketing using digital publishing
NASA Astrophysics Data System (ADS)
Kutty, Cheeniyil L.; Prabhakaran, Jayasree K.
2006-02-01
In today's cost-conscious business climate, marketing and customer service decision makers are increasingly concerned with how to increase customer response and retention rates. Companies spend large amounts of money on Customer Relationship Management (CRM) solutions and data acquisition but they don't know how to use the information stored in these systems to improve the effectiveness of their direct marketing campaigns. By leveraging the customer information they already have, companies can create personalized, printed direct mail programs that generate high response rates, greater returns, and stronger customer loyalty, while gaining a significant edge over their competitors. To reach the promised land of one-to-one direct marketing (personalized direct marketing - PDM), companies need an end-to-end solution for creating, managing, printing, and distributing personalized direct mail "on demand." Having access to digital printing is just one piece of the solution. A more complete approach includes leveraging personalization technology into a useful direct marketing tool that provides true one-to-one marketing, allowing variable images and text in a personalized direct mail. This paper discusses integration of CRM with a Print-on-Demand solution so as to create truly personalized printed marketing campaigns for one or many individuals based on the profile information, preferences and purchase history stored in the CRM.
NASA Technical Reports Server (NTRS)
Ferencz, Donald C.; Viterna, Larry A.
1991-01-01
ALPS is a computer program which can be used to solve general linear program (optimization) problems. ALPS was designed for those who have minimal linear programming (LP) knowledge and features a menu-driven scheme to guide the user through the process of creating and solving LP formulations. Once created, the problems can be edited and stored in standard DOS ASCII files to provide portability to various word processors or even other linear programming packages. Unlike many math-oriented LP solvers, ALPS contains an LP parser that reads through the LP formulation and reports several types of errors to the user. ALPS provides a large amount of solution data which is often useful in problem solving. In addition to pure linear programs, ALPS can solve for integer, mixed integer, and binary type problems. Pure linear programs are solved with the revised simplex method. Integer or mixed integer programs are solved initially with the revised simplex method and then completed using the branch-and-bound technique. Binary programs are solved with the method of implicit enumeration. This manual describes how to use ALPS to create, edit, and solve linear programming problems. Instructions for installing ALPS on a PC compatible computer are included in the appendices along with a general introduction to linear programming. A programmer's guide is also included for assistance in modifying and maintaining the program.
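The following is not ALPS itself, but a small LP of the kind ALPS handles, expressed with SciPy's linprog to illustrate the formulation (objective, inequality constraints, bounds); the coefficients are arbitrary.

    # A small LP: maximize 3x + 2y subject to x + y <= 4 and x + 3y <= 6, x, y >= 0.
    from scipy.optimize import linprog

    # linprog minimizes, so negate the objective to maximize.
    c = [-3.0, -2.0]
    A_ub = [[1.0, 1.0],       # x + y  <= 4
            [1.0, 3.0]]       # x + 3y <= 6
    b_ub = [4.0, 6.0]

    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)], method="highs")
    print(res.x, -res.fun)    # optimal point and maximized objective value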
Reduce, reuse and recycle: a green solution to Canada's medical isotope shortage.
Galea, R; Ross, C; Wells, R G
2014-05-01
Due to the unforeseen maintenance issues at the National Research Universal (NRU) reactor at Chalk River and coincidental shutdowns of other international reactors, a global shortage of medical isotopes (in particular technetium-99m, Tc-99m) occurred in 2009. The operation of these research reactors is expensive, their age creates concerns about their continued maintenance and the process results in a large amount of long-lived nuclear waste, whose storage cost has been subsidized by governments. While the NRU has since revived its operations, it is scheduled to cease isotope production in 2016. The Canadian government created the Non-reactor based medical Isotope Supply Program (NISP) to promote research into alternative methods for producing medical isotopes. The NRC was a member of a collaboration looking into the use of electron linear accelerators (LINAC) to produce molybdenum-99 (Mo-99), the parent isotope of Tc-99m. This paper outlines NRC's involvement in every step of this process, from the production, chemical processing, recycling and preliminary animal studies to demonstrate the equivalence of LINAC Tc-99m with the existing supply. This process stems from reusing an old idea, reduces the nuclear waste to virtually zero and recycles material to create a green solution to Canada's medical isotope shortage. © 2013 Published by Elsevier Ltd.
Occupational cancer in the European part of the Commonwealth of Independent States.
Bulbulyan, M A; Boffetta, P
1999-01-01
Precise information on the number of workers currently exposed to carcinogens in the Commonwealth of Independent States (CIS) is lacking. However, the large number of workers employed in high-risk industries such as the chemical and metal industries suggests that the number of workers potentially exposed to carcinogens may be large. In the CIS, women account for almost 50% of the industrial work force. Although no precise data are available on the number of cancers caused by occupational exposures, indirect evidence suggests that the magnitude of the problem is comparable to that observed in Western Europe, representing some 20,000 cases per year. The large number of women employed in the past and at present in industries that create potential exposure to carcinogens is a special characteristic of the CIS. In recent years an increasing amount of high-quality research has been conducted on occupational cancer in the CIS; there is, however, room for further improvement. International training programs should be established, and funds from international research and development programs should be devoted to this area. In recent years, following privatization of many large-scale industries, access to employment and exposure data is becoming increasingly difficult. PMID:10350512
NASA Technical Reports Server (NTRS)
Kolesar, C. E.
1987-01-01
Research activity on an airfoil designed for a large airplane capable of very long endurance times at a low Mach number of 0.22 is examined. Airplane mission objectives and design optimization resulted in requirements for a very high design lift coefficient and a large amount of laminar flow at high Reynolds number to increase the lift/drag ratio and reduce the loiter lift coefficient. Natural laminar flow was selected instead of distributed mechanical suction for the measurement technique. A design lift coefficient of 1.5 was identified as the highest which could be achieved with a large extent of laminar flow. A single element airfoil was designed using an inverse boundary layer solution and inverse airfoil design computer codes to create an airfoil section that would achieve performance goals. The design process and results, including airfoil shape, pressure distributions, and aerodynamic characteristics are presented. A two dimensional wind tunnel model was constructed and tested in a NASA Low Turbulence Pressure Tunnel which enabled testing at full scale design Reynolds number. A comparison is made between theoretical and measured results to establish accuracy and quality of the airfoil design technique.
NASA Astrophysics Data System (ADS)
Welch, H.; Coupe, R.; Aulenbach, B.
2012-04-01
Extreme hydrologic events, such as floods, can overwhelm a surface water system's ability to process chemicals and can move large amounts of material downstream to larger surface water bodies. The Mississippi River is the third largest river in the world, behind the Amazon in South America and the Congo in Africa. The Mississippi-Atchafalaya River basin grows much of the country's corn, soybeans, rice, cotton, pigs, and chickens. This is large-scale, modern-day agriculture with large inputs of nutrients to increase yields and large applied amounts of crop protection chemicals, such as pesticides. The basin drains approximately 41% of the conterminous United States and is the largest contributor of nutrients to the Gulf of Mexico each spring. The amount of water and nutrients discharged from the Mississippi River has been related to the size of the low dissolved oxygen area that forms off the coast of Louisiana and Texas each summer. From March through April 2011, the upper Mississippi River basin received more than five times the normal precipitation, which, combined with snowmelt from the Missouri River basin, created a historic flood event that lasted from April through July. The U.S. Geological Survey, as part of the National Stream Quality Accounting Network (NASQAN), collected samples from April through July at six sites in the lower Mississippi-Atchafalaya River basin, as well as at the three flow-diversion structures, or floodways: the Birds Point-New Madrid in Missouri and the Morganza and Bonnet Carré in Louisiana. Samples were analyzed for nutrients, pesticides, suspended sediments, and particle size; results were used to determine the water quality of the river during the 2011 flood. Monthly loads for nitrate, phosphorus, pesticides (atrazine, glyphosate, fluometuron, and metolachlor), and sediment were calculated to quantify the movement of agricultural chemicals and sediment into the Gulf of Mexico. Nutrient loads were compared to historic loads to assess the effect of the flood on the zone of hypoxia that formed in the Gulf of Mexico during the spring of 2011.
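The monthly load arithmetic reduces to concentration times discharge integrated over time; the sketch below illustrates that calculation with placeholder numbers, not the measured 2011 values.

    # Load arithmetic of the kind used for the monthly estimates
    # (numbers are placeholders, not the measured 2011 values).
    SECONDS_PER_DAY = 86400

    def monthly_load_tonnes(conc_mg_per_L, discharge_m3_per_s, days_in_month):
        # 1 mg/L == 1 g/m3, so load in g/s = concentration * discharge;
        # multiply by the month's duration and convert grams to metric tons.
        grams = conc_mg_per_L * discharge_m3_per_s * SECONDS_PER_DAY * days_in_month
        return grams / 1.0e6

    # Hypothetical May values for nitrate at a lower-river station.
    print(monthly_load_tonnes(conc_mg_per_L=1.8, discharge_m3_per_s=45000, days_in_month=31))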
Chuartzman, Silvia G; Schuldiner, Maya
2018-03-25
In the last decade several collections of Saccharomyces cerevisiae yeast strains have been created. In these collections every gene is modified in a similar manner such as by a deletion or the addition of a protein tag. Such libraries have enabled a diversity of systematic screens, giving rise to large amounts of information regarding gene functions. However, often papers describing such screens focus on a single gene or a small set of genes and all other loci affecting the phenotype of choice ('hits') are only mentioned in tables that are provided as supplementary material and are often hard to retrieve or search. To help unify and make such data accessible, we have created a Database of High Throughput Screening Hits (dHITS). The dHITS database enables information to be obtained about screens in which genes of interest were found as well as the other genes that came up in that screen - all in a readily accessible and downloadable format. The ability to query large lists of genes at the same time provides a platform to easily analyse hits obtained from transcriptional analyses or other screens. We hope that this platform will serve as a tool to facilitate investigation of protein functions to the yeast community. © 2018 The Authors Yeast Published by John Wiley & Sons Ltd.
On the relationships between sprite production and convective evolution
NASA Astrophysics Data System (ADS)
Lang, T. J.
2017-12-01
Sprites can occur in the upper atmosphere when powerful lightning creates a large charge moment change (CMC) within a thunderstorm. A growing body of research supports the inference that sprite production and convective vigor are inversely related in mature storms. In the most typical scenario, long-lived organized convection first creates an adjacent region of stratiform precipitation filled with horizontally broad layers of charge. Once the main convective region enters a weakening phase, spatially larger lightning flashes become more prevalent, and these are subsequently more likely to tap the stratiform charge. This makes the occurrence of large-CMC cloud-to-ground (CG) lightning and thus sprites more likely. This process is stochastic, however. For instance, ionospheric conditions are themselves variable and can influence the likelihood of sprites. In addition, convective morphology and microphysical/electrical structure can modulate lightning characteristics, including the frequency and location of CG occurrence, flash polarity, the amount of continuing current, the altitudes of charge layers tapped, etc. This can lead to a broad variety of sprite-producing storms, including anomalously charged convection (i.e., dominant positive charge near -20 Celsius rather than the more typical negative), abnormally small convective systems producing sprites, wintertime sprites, and other interesting examples. A review of past and present research into these and other relationships between sprites and convection will be presented, and future opportunities to study these relationships (including from spaceborne platforms) will be highlighted.
Helium stars: Towards an understanding of Wolf-Rayet evolution
NASA Astrophysics Data System (ADS)
McClelland, Liam A. S.; Eldridge, J. J.
2017-11-01
Recent observational modelling of the atmospheres of hydrogen-free Wolf-Rayet stars have indicated that their stellar surfaces are cooler than those predicted by the latest stellar evolution models. We have created a large grid of pure helium star models to investigate the dependence of the surface temperatures on factors such as the rate of mass loss and the amount of clumping in the outer convection zone. Upon comparing our results with Galactic and LMC WR observations, we find that the outer convection zones should be clumped and that the mass-loss rates need to be slightly reduced. We discuss the implications of these findings in terms of the detectability of Type Ibc supernovae progenitors, and in terms of refining the Conti scenario.
Injection and trapping of tunnel-ionized electrons into laser-produced wakes.
Pak, A; Marsh, K A; Martins, S F; Lu, W; Mori, W B; Joshi, C
2010-01-15
A method, which utilizes the large difference in ionization potentials between successive ionization states of trace atoms, for injecting electrons into a laser-driven wakefield is presented. Here a mixture of helium and trace amounts of nitrogen gas was used. Electrons from the K shell of nitrogen were tunnel ionized near the peak of the laser pulse and were injected into and trapped by the wake created by electrons from majority helium atoms and the L shell of nitrogen. The spectrum of the accelerated electrons, the threshold intensity at which trapping occurs, the forward transmitted laser spectrum, and the beam divergence are all consistent with this injection process. The experimental measurements are supported by theory and 3D OSIRIS simulations.
An Advice Mechanism for Heterogeneous Robot Teams
NASA Astrophysics Data System (ADS)
Daniluk, Steven
The use of reinforcement learning for robot teams has enabled complex tasks to be performed, but at the cost of requiring a large amount of exploration. Exchanging information between robots in the form of advice is one method to accelerate performance improvements. This thesis presents an advice mechanism for robot teams that utilizes advice from heterogeneous advisers via a method guaranteeing convergence to an optimal policy. The presented mechanism has the capability to use multiple advisers at each time step, and to decide when advice should be requested and accepted, such that the use of advice decreases over time. Additionally, collective, collaborative, and cooperative behavioural algorithms are integrated into a robot team architecture to create a new framework that provides fault tolerance and modularity for robot teams.
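A toy sketch of the advice-acceptance idea (not the thesis mechanism): a tabular Q-learner that takes an adviser's action with a probability that decays over time, so reliance on advice tapers off; the adviser and transition below are invented.

    # Toy Q-learner with decaying acceptance of an adviser's suggestions.
    import random

    n_states, n_actions = 5, 2
    Q = [[0.0] * n_actions for _ in range(n_states)]
    alpha, gamma = 0.1, 0.9

    def adviser(state):                        # hypothetical heterogeneous adviser
        return state % n_actions

    def select_action(state, step):
        p_accept = 1.0 / (1.0 + 0.01 * step)   # reliance on advice decreases over time
        if random.random() < p_accept:
            return adviser(state)
        return max(range(n_actions), key=lambda a: Q[state][a])

    def update(state, action, reward, next_state):
        best_next = max(Q[next_state])
        Q[state][action] += alpha * (reward + gamma * best_next - Q[state][action])

    # One illustrative interaction step on a made-up transition.
    a = select_action(0, step=1)
    update(0, a, reward=1.0, next_state=1)
    print(Q[0])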
NASA Astrophysics Data System (ADS)
Takeuchi, Kenji; Fujishige, Masatsugu; Ishida, Nobuaki; Kunieda, Yoshihiro; Kato, Yosuke; Tanaka, Yusuke; Ochi, Toshiyuki; Shirotori, Hisashi; Uzuhashi, Yuji; Ito, Suguru; Oshida, Kyo-ichi; Endo, Morinobu
2018-07-01
Carbonization and post-activation of polysaccharides (utilized as food residue) created new bio-nanocarbons for the electrodes of electric double layer capacitors (EDLC). Large specific capacitance (46.1 F/g, 26.4 F/cm3) and high rate performance were confirmed under optimized conditions of carbonization temperature (600 °C) and the amount of sodium hydroxide supplied in the NaOH-activation process (250 wt %). The capacitance and rate performance were larger than the reported values (42.9 F/g, 19.7 F/cm3) of the currently used activated carbon MSP-20. The fact that NaOH is usable as the activation agent, instead of KOH, is advantageous for reducing the cost of EDLCs.
Omics approaches in food safety: fulfilling the promise?
Bergholz, Teresa M.; Moreno Switt, Andrea I.; Wiedmann, Martin
2014-01-01
Genomics, transcriptomics, and proteomics are rapidly transforming our approaches to detection, prevention and treatment of foodborne pathogens. Microbial genome sequencing in particular has evolved from a research tool into an approach that can be used to characterize foodborne pathogen isolates as part of routine surveillance systems. Genome sequencing efforts will not only improve outbreak detection and source tracking, but will also create large amounts of foodborne pathogen genome sequence data, which will be available for data mining efforts that could facilitate better source attribution and provide new insights into foodborne pathogen biology and transmission. While practical uses and application of metagenomics, transcriptomics, and proteomics data and associated tools are less prominent, these tools are also starting to yield practical food safety solutions. PMID:24572764
Hamilton, Marc T
2018-04-15
A shared goal of many researchers has been to discover how to improve health and prevent disease, through safely replacing a large amount of daily sedentary time with physical activity in everyone, regardless of age and current health status. This involves contrasting how different muscle contractile activity patterns regulate the underlying molecular and physiological responses impacting health-related processes. It also requires an equal attention to behavioural feasibility studies in extremely unfit and sedentary people. A sound scientific principle is that the body is constantly sensing and responding to changes in skeletal muscle metabolism induced by contractile activity. Because of that, the rapid time course of health-related responses to physical inactivity/activity patterns are caused in large part directly because of the variable amounts of muscle inactivity/activity throughout the day. However, traditional modes and doses of exercise fall far short of replacing most of the sedentary time in the modern lifestyle, because both the weekly frequency and the weekly duration of exercise time are an order of magnitude less than those for people sitting inactive. This can explain why high amounts of sedentary time produce distinct metabolic and cardiovascular responses through inactivity physiology that are not sufficiently prevented by low doses of exercise. For these reasons, we hypothesize that maintaining a high metabolic rate over the majority of the day, through safe and sustainable types of muscular activity, will be the optimal way to create a healthy active lifestyle over the whole lifespan. © 2017 The Authors. The Journal of Physiology published by John Wiley & Sons Ltd on behalf of The Physiological Society.
Big Data Application in Biomedical Research and Health Care: A Literature Review
Luo, Jake; Wu, Min; Gopukumar, Deepika; Zhao, Yiqing
2016-01-01
Big data technologies are increasingly used for biomedical and health-care informatics research. Large amounts of biological and clinical data have been generated and collected at an unprecedented speed and scale. For example, the new generation of sequencing technologies enables the processing of billions of DNA sequence data per day, and the application of electronic health records (EHRs) is documenting large amounts of patient data. The cost of acquiring and analyzing biomedical data is expected to decrease dramatically with the help of technology upgrades, such as the emergence of new sequencing machines, the development of novel hardware and software for parallel computing, and the extensive expansion of EHRs. Big data applications present new opportunities to discover new knowledge and create novel methods to improve the quality of health care. The application of big data in health care is a fast-growing field, with many new discoveries and methodologies published in the last five years. In this paper, we review and discuss big data application in four major biomedical subdisciplines: (1) bioinformatics, (2) clinical informatics, (3) imaging informatics, and (4) public health informatics. Specifically, in bioinformatics, high-throughput experiments facilitate the research of new genome-wide association studies of diseases, and with clinical informatics, the clinical field benefits from the vast amount of collected patient data for making intelligent decisions. Imaging informatics is now more rapidly integrated with cloud platforms to share medical image data and workflows, and public health informatics leverages big data techniques for predicting and monitoring infectious disease outbreaks, such as Ebola. In this paper, we review the recent progress and breakthroughs of big data applications in these health-care domains and summarize the challenges, gaps, and opportunities to improve and advance big data applications in health care. PMID:26843812
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Xiaoma; Zhou, Yuyu; Asrar, Ghassem R.
High spatiotemporal land surface temperature (LST) datasets are increasingly needed in a variety of fields such as ecology, hydrology, meteorology, epidemiology, and energy systems. Moderate Resolution Imaging Spectroradiometer (MODIS) LST is one such widely used high spatiotemporal dataset. However, it has a large amount of missing values, primarily because of clouds. Gapfilling the missing values is an important approach to creating high spatiotemporal LST datasets. However, current gapfilling methods have limitations in terms of accuracy and the time required to assemble the data over large areas (e.g., national and continental levels). In this study, we developed a 3-step hybrid method, integrating a combination of daily merging, spatiotemporal gapfilling, and temporal interpolation, to create a high spatiotemporal LST dataset using the four daily LST observations from the two MODIS instruments on the Terra and Aqua satellites. We applied this method to urban and surrounding areas of the conterminous U.S. in 2010. The evaluation of the gapfilled LST product indicates that its root mean squared error (RMSE) is 3.3 K for mid-daytime (1:30 pm) and 2.7 K for mid-nighttime (1:30 am) observations. The method can be easily extended to other years and regions and is also applicable to other satellite products. This seamless daily (mid-daytime and mid-nighttime) LST product with 1 km spatial resolution is of great value for studying effects of urbanization (e.g., urban heat island) and the related impacts on people, ecosystems, energy systems and other infrastructure for cities.
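A simplified stand-in for the temporal-interpolation step alone (the full method also merges the four daily overpasses and performs spatiotemporal gapfilling); the pixel time series below is synthetic.

    # Linear temporal interpolation of cloud gaps for a single pixel (synthetic data).
    import numpy as np

    lst = np.array([290.1, np.nan, np.nan, 293.4, 294.0, np.nan, 291.2])  # daily LST in K
    days = np.arange(lst.size)

    valid = ~np.isnan(lst)
    filled = lst.copy()
    filled[~valid] = np.interp(days[~valid], days[valid], lst[valid])    # linear in time
    print(filled.round(1))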
Asbestos. LC Science Tracer Bullet.
ERIC Educational Resources Information Center
Evans, Joanna, Comp.
Asbestos is a generic term that refers to several silicate materials occurring naturally as fibrous rocks. Insignificant amounts of asbestos fiber can be found in ambient air, but this, and hard materials containing asbestos, usually do not create problems. Soft materials, however, can release high amounts of asbestos fibers into the air, and…
Carbonate Sediment Deposits on the Reef Front Around Oahu, Hawaii
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hampton, M A.; Blay, Charles T.; Murray, Christopher J.
2004-06-01
Large sediment deposits on the reef front around Oahu are a possible resource for replenishing eroded beaches. High-resolution subbottom profiles clearly depict the deposits in three study areas: Kailua Bay off the windward coast, Makua to Kahe Point off the leeward coast, and Camp Erdman to Waimea off the north coast. Most of the sediment is in water depths between 20 and 100 m, resting on submerged shelves created during lowstands of sea level. The mapped deposits have a volume of about 400 million cubic meters in water depths less than 100 m, being thickest off the mouth of channels carved into the modern insular shelf, from which most of the sediment issues. Vibracore samples contain various amounts of sediment of similar size to the sand on Oahu beaches, with the most compatible prospects located off Makaha, Haleiwa, and Camp Erdman and the least compatible ones located in Kailua Bay. Laboratory tests show a positive correlation of abrasion with Halimeda content; samples from Kailua Bay suffered high amounts of attrition but others were comparable to tested beach samples.
Waste Management Options for Long-Duration Space Missions: When to Reject, Reuse, or Recycle
NASA Technical Reports Server (NTRS)
Linne, Diane L.; Palaszewski, Bryan A.; Gokoglu, Suleyman; Gallo, Christopher A.; Balasubramaniam, Ramaswamy; Hegde, Uday G.
2014-01-01
The amount of waste generated on long-duration space missions away from Earth orbit creates the daunting challenge of how to manage the waste through reuse, rejection, or recycle. The option to merely dispose of the solid waste through an airlock to space was studied for both Earth-moon libration point missions and crewed Mars missions. Although the unique dynamic characteristics of an orbit around L2 might allow some discarded waste to intersect the lunar surface before re-impacting the spacecraft, the large amount of waste needed to be managed and potential hazards associated with volatiles recondensing on the spacecraft surfaces make this option problematic. A second option evaluated is to process the waste into useful gases to be either vented to space or used in various propulsion systems. These propellants could then be used to provide the yearly station-keeping needs at an L2 orbit, or if processed into oxygen and methane propellants, could be used to augment science exploration by enabling lunar mini landers to the far side of the moon.
Kizinievič, Olga; Balkevičius, Valdas; Pranckevičienė, Jolanta; Kizinievič, Viktor
2014-08-01
Large amounts of centrifuging waste of mineral wool melt (CMWW) are created during the production of mineral wool. CMWW is a technogenic aluminum silicate raw material, formed from particles of undefibred melt (60-70%) and mineral wool fibers (30-40%). The material also contains 0.3-0.6% organic binder, which includes phenol and formaldehyde. The objective of the research is to investigate the possibility of using CMWW as an additive in the production of ceramic products, while neutralising the phenol and formaldehyde present in CMWW. Formation masses were prepared by incorporating 10%, 20% and 30% of the CMWW additive and fired at various temperatures. It was identified that a 10-30% CMWW additive influences the following physical and mechanical properties of the ceramic body: it lowers drying and firing shrinkage and density, and increases compressive strength and water absorption. The investigations carried out show that CMWW can be used for the production of ceramic products for various purposes. Copyright © 2014 Elsevier Ltd. All rights reserved.
Cost containment: the Middle East. Israel.
Stern, Z; Altholz, J; Sprung, C L
1994-08-01
The Israeli Health Service was established with the intent of providing an equal standard of care to the entire Israeli population. The Health Service has dealt with changes over the years, including the governing of large populations of Judea, Samaria, and Gaza. In 1990, mass immigration brought 500,000 more individuals to Israel, putting an additional burden on medical services. ICUs in Israel began to emerge after the Six Day War in 1967. The government's Ministry of Health has approved a limited number of ICU beds. Beyond this set number, hospital directors decide whether to establish additional ICU beds, weighing departmental pressures from within the hospital to create beds against the knowledge that the hospital will not be reimbursed more than the per diem rate of an ordinary hospital bed ($US 265). Hospital directors and administrators, knowing that the average daily cost of an ICU bed is close to $US 800, turn to their supporting organization to finance the uncontrollable deficit, seek aid from the Ministry of Health to make the per diem rates or diagnosis-related group reimbursements more realistic, and/or implement hospital policies aimed at cutting costs and personnel.
Computational biology in the cloud: methods and new insights from computing at scale.
Kasson, Peter M
2013-01-01
The past few years have seen both explosions in the size of biological data sets and the proliferation of new, highly flexible on-demand computing capabilities. The sheer amount of information available from genomic and metagenomic sequencing, high-throughput proteomics, experimental and simulation datasets on molecular structure and dynamics affords an opportunity for greatly expanded insight, but it creates new challenges of scale for computation, storage, and interpretation of petascale data. Cloud computing resources have the potential to help solve these problems by offering a utility model of computing and storage: near-unlimited capacity, the ability to burst usage, and cheap and flexible payment models. Effective use of cloud computing on large biological datasets requires dealing with non-trivial problems of scale and robustness, since performance-limiting factors can change substantially when a dataset grows by a factor of 10,000 or more. New computing paradigms are thus often needed. The use of cloud platforms also creates new opportunities to share data, reduce duplication, and to provide easy reproducibility by making the datasets and computational methods easily available.
Simple, robust storage of drops and fluids in a microfluidic device.
Boukellal, Hakim; Selimović, Seila; Jia, Yanwei; Cristobal, Galder; Fraden, Seth
2009-01-21
We describe a single microfluidic device and two methods for the passive storage of aqueous drops in a continuous stream of oil without any external control but hydrodynamic flow. Advantages of this device are that it is simple to manufacture, robust under operation, and drops never come into contact with each other, making it unnecessary to stabilize drops against coalescence. In one method the device can be used to store drops that are created upstream from the storage zone. In the second method the same device can be used to simultaneously create and store drops from a single large continuous fluid stream without resorting to the usual flow focusing or T-junction drop generation processes. Additionally, this device stores all the fluid introduced, including the first amount, with zero waste. Transport of drops in this device depends, however, on whether or not the aqueous drops wet the device walls. Analysis of drop transport in these two cases is presented. Finally, a method for extraction of the drops from the device is also presented, which works best when drops do not wet the walls of the chip.
NASA Astrophysics Data System (ADS)
Prisbrey, Shon; Park, Hye-Sook; Huntington, Channing; McNaney, James; Smith, Raym; Wehrenberg, Christopher; Swift, Damian; Panas, Cynthia; Lord, Dawn; Arsenlis, Athanasios
2017-10-01
Strength can be inferred by the amount a Rayleigh-Taylor surface deviates from classical growth when subjected to acceleration. If the acceleration is great enough, even materials highly resistant to deformation will flow. We use the National Ignition Facility (NIF) to create an acceleration profile that will cause sample metals, such as Mo or Cu, to reach peak pressures of 10 Mbar without inducing shock melt. To create such a profile we shock release a stepped density reservoir across a large gap with the stagnation of the reservoir on the far side of the gap resulting in the desired pressure drive history. Low density steps (foams) are a necessary part of this design and have been studied in the last several years on the Omega and NIF facilities. We will present computational and experimental progress that has been made on the 10 Mbar drive designs - including recent drive shots carried out at the NIF. This work was performed under the auspices of the Lawrence Livermore National Security, LLC, (LLNS) under Contract No. DE-AC52-07NA27344. LLNL-ABS-734781.
Combination of real options and game-theoretic approach in investment analysis
NASA Astrophysics Data System (ADS)
Arasteh, Abdollah
2016-09-01
Investments in technology account for a large share of capital spending by major companies, and assessing such investment projects is critical to the efficient allocation of resources. Viewing investment projects as real options, this paper develops a method for assessing technology investment decisions in the presence of both uncertainty and competition. It combines game-theoretic models of strategic market interactions with a real options approach. Several key characteristics underlie the model. First, the study shows how investment strategies depend on competitive interactions: under competitive pressure, firms rush to exercise their options early, and the resulting "hurry equilibrium" destroys the option value of waiting and leads to aggressive investment behavior. Second, the model yields optimal investment policies and critical investment entry thresholds, suggesting that integration will be unavoidable in some information product markets. The model offers new intuition about the forces that shape market behavior observed in the information technology industry, and it can be used to specify optimal investment policies for technology innovations and adoptions, multistage R&D, and investment projects in information technology.
Chauvin, Christine; Le Bouar, Gilbert
2007-01-01
This study compares two databases of occupational injuries in the French sea fishing industry: one created between 1977 and 1980, and one created between 1996 and 2001. Both were compiled from report forms filled out by fishermen after an accident. The study focuses on accidents occurring while the vessel is actively fishing. In the 1980s, as well as today, handling the fishing gear appears to be a very dangerous task, associated with the risk of being "struck by, swept along, pinned" by elements of the rigging and with the risk of serious injuries. Processing and handling the catch also cause a large number of accidents; during these tasks, fishermen face two main risks: being "cut or pricked" and making an "excessive effort, awkward movement". Neither the rate nor the features of occupational accidents show notable change between the two periods. These findings lead us to question the prevention measures implemented in France over the last few decades and to propose new prevention directions.
How much control is enough? Influence of unreliable input on user experience.
van de Laar, Bram; Plass-Oude Bos, Danny; Reuderink, Boris; Poel, Mannes; Nijholt, Anton
2013-12-01
Brain–computer interfaces (BCI) provide a valuable new input modality within human–computer interaction systems. However, like other body-based inputs such as gesture or gaze based systems, the system recognition of input commands is still far from perfect. This raises important questions, such as what level of control should such an interface be able to provide. What is the relationship between actual and perceived control? And in the case of applications for entertainment in which fun is an important part of user experience, should we even aim for the highest level of control, or is the optimum elsewhere? In this paper, we evaluate whether we can modulate the amount of control and if a game can be fun with less than perfect control. In the experiment users (n = 158) played a simple game in which a hamster has to be guided to the exit of a maze. The amount of control the user has over the hamster is varied. The variation of control through confusion matrices makes it possible to simulate the experience of using a BCI, while using the traditional keyboard for input. After each session the user completed a short questionnaire on user experience and perceived control. Analysis of the data showed that the perceived control of the user could largely be explained by the amount of control in the respective session. As expected, user frustration decreases with increasing control. Moreover, the results indicate that the relation between fun and control is not linear. Although at lower levels of control fun does increase with improved control, the level of fun drops just before perfect control is reached (with an optimum around 96%). This poses new insights for developers of games who want to incorporate some form of BCI or other modality with unreliable input in their game: for creating a fun game, unreliable input can be used to create a challenge for the user.
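The control manipulation described above can be sketched with a confusion matrix that stochastically corrupts keyboard commands. This is a minimal illustration, not the study's actual software; the four-command set, accuracy level, and uniform error distribution are assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)
commands = ["up", "down", "left", "right"]

def make_confusion_matrix(accuracy, n=4):
    """Rows: intended command, columns: executed command.
    Off-diagonal mass is spread evenly over the wrong commands."""
    m = np.full((n, n), (1.0 - accuracy) / (n - 1))
    np.fill_diagonal(m, accuracy)
    return m

def execute(intended_idx, confusion):
    """Sample the command the game actually executes for a given key press."""
    return rng.choice(len(commands), p=confusion[intended_idx])

# Simulate one session at 80% control and check the observed reliability.
conf80 = make_confusion_matrix(0.80)
presses = rng.integers(0, 4, size=1000)            # keys the player pressed
executed = np.array([execute(i, conf80) for i in presses])
print("observed accuracy:", (presses == executed).mean())
```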
Deciding to Decide: How Decisions Are Made and How Some Forces Affect the Process.
McConnell, Charles R
There is a decision-making pattern that applies in all situations, large or small, although in small decisions, the steps are not especially evident. The steps are gathering information, analyzing information and creating alternatives, selecting and implementing an alternative, and following up on implementation. The amount of effort applied in any decision situation should be consistent with the potential consequences of the decision. Essentially, all decisions are subject to certain limitations or constraints, forces, or circumstances that limit one's range of choices. Follow-up on implementation is the phase of decision making most often neglected, yet it is frequently the phase that determines success or failure. Risk and uncertainty are always present in a decision situation, and the application of human judgment is always necessary. In addition, there are often emotional forces at work that can at times unwittingly steer one away from that which is best or most workable under the circumstances and toward a suboptimal result based largely on the desires of the decision maker.
Expansive Northern Volcanic Plains
2015-04-16
Mercury's northern region is dominated by expansive smooth plains, created by huge amounts of volcanic material flooding across Mercury's surface in the past, as seen by NASA's MESSENGER spacecraft. The volcanic lava flows buried craters, leaving only traces of their rims visible. Such craters are called ghost craters, and many are visible in this image, including a large one near the center. Wrinkle ridges cross the scene, and small troughs, formed as the lava cooled, are visible within the ghost craters. The northern plains are described as smooth because their surface has fewer impact craters and thus has been less battered by impacts. This indicates that these volcanic plains are younger than Mercury's rougher surfaces. Instrument: Mercury Dual Imaging System (MDIS). Center Latitude: 60.31° N. Center Longitude: 36.87° E. Scale: the large ghost crater at the center of the image is approximately 103 kilometers (64 miles) in diameter. http://photojournal.jpl.nasa.gov/catalog/PIA19415
Creation of a Unified Set of Core-Collapse Supernovae for Training of Photometric Classifiers
NASA Astrophysics Data System (ADS)
D'Arcy Kenworthy, William; Scolnic, Daniel; Kessler, Richard
2017-01-01
One of the key tasks for future supernova cosmology analyses is to photometrically distinguish type Ia supernovae (SNe) from their core-collapse (CC) counterparts. In order to train programs for this purpose, it is necessary to train on a large number of core-collapse SNe; however, only a handful are used in current programs. We plan to use the large number of CC lightcurves available in the Open Supernova Catalog (OSC). Since these data are scraped from many different surveys, they are given in a number of photometric systems with different calibrations and filters. We therefore created a program to fit smooth lightcurves (as a function of time) to photometric observations of arbitrary SNe. The Supercal method is then used to translate the smoothed lightcurves to a single photometric system. We can thus compile a training set of 782 supernovae, of which 127 are not type Ia. These smoothed lightcurves are also being contributed upstream to the OSC as derived data.
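The abstract does not specify the smoothing model, so the sketch below uses an error-weighted smoothing spline as one plausible stand-in for fitting a continuous lightcurve to heterogeneous photometry; the times, magnitudes, and uncertainties are invented placeholders, not OSC data.

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

# Hypothetical photometry for one supernova in one band.
t = np.array([0., 3., 6., 10., 15., 22., 30., 45., 60.])        # days
mag = np.array([19.8, 19.2, 18.9, 18.8, 19.0, 19.5, 20.1, 20.9, 21.5])
err = np.array([0.08, 0.05, 0.05, 0.04, 0.05, 0.06, 0.08, 0.12, 0.15])

# Weight by inverse uncertainty; the smoothing factor s is set so the fit
# roughly matches the number of points rather than chasing noise.
spline = UnivariateSpline(t, mag, w=1.0 / err, s=len(t))

t_grid = np.linspace(t.min(), t.max(), 200)
smooth_lc = spline(t_grid)   # continuous lightcurve, ready for resampling
```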
Development of Armenian-Georgian Virtual Observatory
NASA Astrophysics Data System (ADS)
Mickaelian, Areg; Kochiashvili, Nino; Astsatryan, Hrach; Harutyunian, Haik; Magakyan, Tigran; Chargeishvili, Ketevan; Natsvlishvili, Rezo; Kukhianidze, Vasil; Ramishvili, Giorgi; Sargsyan, Lusine; Sinamyan, Parandzem; Kochiashvili, Ia; Mikayelyan, Gor
2009-10-01
The Armenian-Georgian Virtual Observatory (ArGVO) project is the first initiative in the world to create a regional VO infrastructure based on national VO projects and a regional Grid. The Byurakan and Abastumani Astrophysical Observatories have been scientific partners since 1946, following the establishment of the Byurakan observatory. The Armenian VO project (ArVO) has been under development since 2005 and is a part of the International Virtual Observatory Alliance (IVOA). It is based on the Digitized First Byurakan Survey (DFBS, the digitized version of the famous Markarian survey) and other Armenian archival data. Similarly, the Georgian VO will be created to serve as a research environment for the digitized Georgian plate archives. Therefore, one of the main goals for creation of the regional VO is the digitization of the large number of plates preserved in the plate stacks of these two observatories. The total number of plates is more than 100,000. Observational programs of high importance have been selected, and some 3,000 plates will be digitized during the next two years; the priority is defined by the usefulness of the material for future science projects, such as searches for new objects, optical identifications of radio, IR, and X-ray sources, and studies of variability and proper motions. Once the digitized material is available in VO standards, a VO database will be accessible through the regional Grid infrastructure. This partnership is being carried out in the framework of the ISTC project A-1606 "Development of Armenian-Georgian Grid Infrastructure and Applications in the Fields of High Energy Physics, Astrophysics and Quantum Physics".
Destruction of Navy Hazardous Wastes by Supercritical Water Oxidation
1994-08-01
Waste sources include cleaning and derusting (nitrite and citric acid solutions), electroplating (acids and metal-bearing solutions), and electronics and refrigeration operations. Waste streams that contain a large amount of mineral-acid-forming chemical species or a large amount of dissolved solids present a challenge to current SCWO technology.
Empirical relationships between tree fall and landscape-level amounts of logging and fire.
Lindenmayer, David B; Blanchard, Wade; Blair, David; McBurney, Lachlan; Stein, John; Banks, Sam C
2018-01-01
Large old trees are critically important keystone structures in forest ecosystems globally. Populations of these trees are also in rapid decline in many forest ecosystems, making it important to quantify the factors that influence their dynamics at different spatial scales. Large old trees often occur in forest landscapes also subject to fire and logging. However, the effects on the risk of collapse of large old trees of the amount of logging and fire in the surrounding landscape are not well understood. Using an 18-year study in the Mountain Ash (Eucalyptus regnans) forests of the Central Highlands of Victoria, we quantify relationships between the probability of collapse of large old hollow-bearing trees at a site and the amount of logging and the amount of fire in the surrounding landscape. We found the probability of collapse increased with an increasing amount of logged forest in the surrounding landscape. It also increased with a greater amount of burned area in the surrounding landscape, particularly for trees in highly advanced stages of decay. The most likely explanation for elevated tree fall with an increasing amount of logged or burned areas in the surrounding landscape is change in wind movement patterns associated with cutblocks or burned areas. Previous studies show that large old hollow-bearing trees are already at high risk of collapse in our study area. New analyses presented here indicate that additional logging operations in the surrounding landscape will further elevate that risk. Current logging prescriptions require the protection of large old hollow-bearing trees on cutblocks. We suggest that efforts to reduce the probability of collapse of large old hollow-bearing trees on unlogged sites will demand careful landscape planning to limit the amount of timber harvesting in the surrounding landscape.
The Origin of Life in a Terrestrial Hydrothermal Pool? The Importance of a Catalytic Surface
NASA Astrophysics Data System (ADS)
Sydow, L. A.; Bennett, P.
2013-12-01
A premise of one chemoautotrophic theory for the origin of life is that a recurring reaction catalyzed on the charged surfaces of pyrite served as the first metabolism and was later enveloped by a primitive cellular membrane. This proposed 'surface metabolism' is analogous to the reductive acetyl-CoA pathway (Wächtershäuser 1988) and requires the abiotic formation of methanethiol (CH3SH), the simplest of the alkyl thiols, which would serve the role of coenzyme-A in the surface metabolism. Abiogenic CH3SH has not previously been identified in terrestrial hot springs, but it has been produced in the laboratory under hydrothermal conditions in the presence of a catalyst, usually FeS. Its formation would occur via the following reactions, with reaction 2 requiring catalysis: CO2 + 2H2S --> CS2 + 2H2O (1) CS2 + 3H2 --> CH3SH + H2S (2) We have identified CH3SH in Cinder Pool, an acid-sulfate-chloride hot spring in Yellowstone National Park. This spring is unusual in that it contains a subaqueous molten sulfur layer (~18 m depth) and thousands of iron-sulfur-spherules floating on the surface, which are created by gas bubbling through the molten floor of the spring. Analysis with EDS has shown that cinder material, largely composed of elemental sulfur, also contains trace iron sulfide minerals, meaning it could serve as a reactive and catalytic surface for abiogenic CH3SH formation in Cinder Pool. Furthermore, the cinders themselves are highly porous, and these void spaces could trap necessary reactants near the catalytic surface. Gas samples were collected from Cinder pool in fall of 2011 using the bubble strip method. One sample contained measurable quantities of CH3SH, and all samples contained related reactant sulfur gases such as large amounts of H2S, and smaller amounts of CS2 and dimethyl disulfide. Laboratory microcosm experiments were conducted to replicate these findings in a sterile environment to ensure CH3SH generation was abiotic. Analog Cinder Pool water and either FeS, FeS2, or cinders collected from the pool itself were incubated with H2, CO2, and CS2 as reaction gases for over a week at pool temperatures. An experiment was also conducted without CS2 to see if the solid materials could act as the sole source of sulfur. All of the experimental solids were capable of catalyzing CH3SH production when CS2 was added. Without added CS2, however, only cinders produced CH3SH, while FeS and FeS2 bottles could only make small amounts of H2S. Presumably, the large amount of elemental sulfur in the cinders creates enough H2S for the subsequent generation of CS2 then CH3SH, which is catalyzed by the trace iron-sulfide they also contain. Cinders act as an excellent reaction surface for CH3SH generation, and a similar material may have played a significant part in the advent of life on earth.
Assessment and management of dead-wood habitat
Hagar, Joan
2007-01-01
The Bureau of Land Management (BLM) is in the process of revising its resource management plans for six districts in western and southern Oregon as the result of the settlement of a lawsuit brought by the American Forest Resource Council. A range of management alternatives is being considered and evaluated including at least one that will minimize reserves on O&C lands. In order to develop the bases for evaluating management alternatives, the agency needs to derive a reasonable range of objectives for key issues and resources. Dead-wood habitat for wildlife has been identified as a key resource for which decision-making tools and techniques need to be refined and clarified. Under the Northwest Forest Plan, reserves were to play an important role in providing habitat for species associated with dead wood (U.S. Department of Agriculture Forest Service and U.S. Department of the Interior Bureau of Land Management, 1994). Thus, the BLM needs to: 1) address the question of how dead wood will be provided if reserves are not included as a management strategy in the revised Resource Management Plan, and 2) be able to evaluate the effects of alternative land management approaches. Dead wood has become an increasingly important conservation issue in managed forests, as awareness of its function in providing wildlife habitat and in basic ecological processes has dramatically increased over the last several decades (Laudenslayer et al., 2002). A major concern of forest managers is providing dead wood habitat for terrestrial wildlife. Wildlife in Pacific Northwest forests have evolved with disturbances that create large amounts of dead wood; so, it is not surprising that many species are closely associated with standing (snags) or down, dead wood. In general, the occurrence or abundance of one-quarter to one-third of forest-dwelling vertebrate wildlife species, is strongly associated with availability of suitable dead-wood habitat (Bunnell et al., 1999; Rose et al., 2001). In Oregon and Washington, approximately 150 species of wildlife are reported to use dead wood in forests (O’Neil et al., 2001). Forty-seven sensitive and special-status species are associated with dead wood (Appendix A). These are key species for management consideration because concern over small or declining populations is often related to loss of suitable dead-wood habitat (Marshall et al., 1996). Primary excavators (woodpeckers) also are often the focus of dead-wood management, because they perform keystone functions in forest ecosystems by creating cavities for secondary cavity-nesters (Martin and Eadie, 1999; Aubry and Raley, 2002). A diverse guild of secondary cavity-users (including swallows, bluebirds, several species of ducks and owls, ash-throated flycatcher, flying squirrel, bats, and many other species) is unable to excavate dead wood, and therefore relies on cavities created by woodpeckers for nesting sites. Suitable nest cavities are essential for reproduction, and their availability limits population size (Newton, 1994). Thus, populations of secondary cavity-nesters are tightly linked to the habitat requirements of primary excavators. Although managers often focus on decaying wood as habitat for wildlife, the integral role dead wood plays in ecological processes is an equally important consideration for management. Rose et al. (2001) provide a thorough review of the ecological functions of dead wood in Pacific Northwest forests, briefly summarized here. 
Decaying wood functions in: soil development and productivity, nutrient cycling, nitrogen fixation, and carbon storage. From ridge tops, to headwater streams, to estuaries and coastal marine ecosystems, decaying wood is fundamental to diverse terrestrial and aquatic food webs. Wildlife species that use dead wood for cover or feeding are linked to these ecosystem processes through a broad array of functional roles, including facilitation of decay and trophic interactions with other organisms (Marcot, 2002; Marcot, 2003). For example, by puncturing bark and fragmenting sapwood, woodpeckers create sites favorable for wood-decaying organisms (Farris et al., 2004), which in turn create habitat for other species and facilitate nutrient cycling. Small mammals that use down wood for cover function in the dispersal of plant seeds and fungal spores (Carey et al., 1999). Resident cavitynesting birds may regulate insect populations by preying on overwintering arthropods (Jackson, 1979; Kroll and Fleet, 1979). These examples illustrate how dead wood not only directly provides habitat for a large number of wildlife species, but also forms the foundation of functional webs that critically influence forest ecosystems (Marcot, 2002; Marcot, 2003). The important and far-reaching implications of management of decaying wood highlight the need for conservation of dead-wood resources in managed forests. Consideration of the key ecological functions of species associated with dead wood can help guide management of dead wood in a framework consistent with the paradigm of ecosystem management (Marcot and Vander Heyden, 2001; Marcot, 2002.) As more information is revealed about the ecological and habitat values of decaying wood, concern has increased over a reduction in the current amounts of dead wood relative to historic levels (Ohmann and Waddell, 2002). Past management practices have tended to severely reduce amounts of dead wood throughout all stages of forest development (Hansen et al., 1991). The large amounts of legacy wood that characterize young post-disturbance forests are not realized in managed stands, because most of the wood volume is removed at harvest for economic and safety reasons. Mid-rotation thinning is used to “salvage” some mortality that might otherwise occur due to suppression, so fewer snags are recruited in mid-seral stages. Harvest rotations of 80 years or less truncate tree size in managed stands, and thus limit the production of large-diameter wood. As a consequence of these practices, dead wood has been reduced by as much as 90% after two rotations of managed Douglas-fir (Rose et al., 2001). Large legacy deadwood is becoming a scarce, critical habitat that will take decades to centuries to replace. Furthermore, management continues to have important direct and indirect effects on the amount and distribution of dead wood in forests. Current guidelines for managing dead wood may be inadequate to maintain habitat for all associated species because they largely focus on a single use of dead wood (nesting habitat) by a small suite of species (cavity-nesting birds), and may under represent the sizes and amounts of dead wood used by many wildlife species (Rose et al., 2001, Wilhere, 2003).
Magma ocean formation due to giant impacts
NASA Technical Reports Server (NTRS)
Tonks, W. B.; Melosh, H. J.
1992-01-01
The effect of giant impacts on the initial chemical and thermal states of the terrestrial planets is just now being explored. A large high speed impact creates an approximately hemispherical melt region with a radius that depends on the projectile's radius and impact speed. It is shown that giant impacts on large planets can create large, intact melt regions containing melt volumes up to a few times the volume of the projectile. These large melt regions are not created on asteroid sized bodies. If extruded to the surface, these regions contain enough melt to create a magma ocean of considerable depth, depending on the impact speed, projectile radius, and gravity of the target planet.
A simple biosynthetic pathway for large product generation from small substrate amounts
NASA Astrophysics Data System (ADS)
Djordjevic, Marko; Djordjevic, Magdalena
2012-10-01
A recently emerging discipline of synthetic biology has the aim of constructing new biosynthetic pathways with useful biological functions. A major application of these pathways is generating a large amount of the desired product. However, toxicity due to the possible presence of toxic precursors is one of the main problems for such production. We consider here the problem of generating a large amount of product from a potentially toxic substrate. To address this, we propose a simple biosynthetic pathway, which can be induced in order to produce a large number of the product molecules, by keeping the substrate amount at low levels. Surprisingly, we show that the large product generation crucially depends on fast non-specific degradation of the substrate molecules. We derive an optimal induction strategy, which allows as much as three orders of magnitude increase in the product amount through biologically realistic parameter values. We point to a recently discovered bacterial immune system (CRISPR/Cas in E. coli) as a putative example of the pathway analysed here. We also argue that the scheme proposed here can be used not only as a stand-alone pathway, but also as a strategy to produce a large amount of the desired molecules with small perturbations of endogenous biosynthetic pathways.
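A minimal mass-action sketch of the scheme described above, in which induced production feeds a product pool while fast non-specific degradation keeps the potentially toxic substrate low. The rate constants and the step induction profile are illustrative assumptions; this toy model only tracks the two pools and does not reproduce the paper's full pathway analysis.

```python
import numpy as np
from scipy.integrate import odeint

def pathway(y, t, alpha, delta, k):
    s, p = y                                   # substrate, product
    induction = alpha if t > 5.0 else 0.0      # production switched on at t = 5
    ds = induction - delta * s - k * s         # delta: fast non-specific degradation
    dp = k * s                                 # product generated from the substrate
    return [ds, dp]

t = np.linspace(0.0, 50.0, 500)
# delta >> k keeps the substrate pool small while product still accumulates.
sol = odeint(pathway, [0.0, 0.0], t, args=(10.0, 5.0, 0.5))
print("peak substrate:", sol[:, 0].max(), "final product:", sol[-1, 1])
```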
Potential carbon emissions dominated by carbon dioxide from thawed permafrost soils
Schädel, Christina; Bader, Martin K.-F.; Schuur, Edward A.G.; Biasi, Christina; Bracho, Rosvel; Čapek, Petr; De Baets, Sarah; Diáková, Kateřina; Ernakovich, Jessica; Estop-Aragones, Cristian; Graham, David E.; Hartley, Iain P.; Iversen, Colleen M.; Kane, Evan S.; Knoblauch, Christian; Lupascu, Massimo; Martikainen, Pertti J.; Natali, Susan M.; Norby, Richard J.; O'Donnell, Jonathan A.; Roy Chowdhury, Taniya; Šantrůčková, Hana; Shaver, Gaius; Sloan, Victoria L.; Treat, Claire C.; Turetsky, Merritt R.; Waldrop, Mark P.; Wickland, Kimberly P.
2016-01-01
Increasing temperatures in northern high latitudes are causing permafrost to thaw, making large amounts of previously frozen organic matter vulnerable to microbial decomposition. Permafrost thaw also creates a fragmented landscape of drier and wetter soil conditions that determine the amount and form (carbon dioxide (CO2), or methane (CH4)) of carbon (C) released to the atmosphere. The rate and form of C release control the magnitude of the permafrost C feedback, so their relative contribution with a warming climate remains unclear. We quantified the effect of increasing temperature and changes from aerobic to anaerobic soil conditions using 25 soil incubation studies from the permafrost zone. Here we show, using two separate meta-analyses, that a 10 °C increase in incubation temperature increased C release by a factor of 2.0 (95% confidence interval (CI), 1.8 to 2.2). Under aerobic incubation conditions, soils released 3.4 (95% CI, 2.2 to 5.2) times more C than under anaerobic conditions. Even when accounting for the higher heat trapping capacity of CH4, soils released 2.3 (95% CI, 1.5 to 3.4) times more C under aerobic conditions. These results imply that permafrost ecosystems thawing under aerobic conditions and releasing CO2 will strengthen the permafrost C feedback more than waterlogged systems releasing CO2 and CH4 for a given amount of C.
Schopflocher, Donald; VanSpronsen, Eric; Spence, John C; Vallianatos, Helen; Raine, Kim D; Plotnikoff, Ronald C; Nykiforuk, Candace I J
2012-07-26
Detailed assessments of the built environment often resist data reduction and summarization. This project sought to develop a method of reducing built environment data to an extent that they can be effectively communicated to researchers and community stakeholders. We aim to help in an understanding of how these data can be used to create neighbourhood groupings based on built environment characteristics and how the process of discussing these neighbourhoods with community stakeholders can result in the development of community-informed health promotion interventions. We used the Irvine Minnesota Inventory (IMI) to assess 296 segments of a semi-rural community in Alberta. Expert raters "created" neighbourhoods by examining the data. Then, a consensus grouping was developed using cluster analysis, and the number of IMI variables to characterize the neighbourhoods was reduced by multiple discriminant function analysis. The 296 segments were reduced to a consensus set of 10 neighbourhoods, which could be separated from each other by 9 functions constructed from 24 IMI variables. Biplots of these functions were an effective means of summarizing and presenting the results of the community assessment, and stimulated community action. It is possible to use principled quantitative methods to reduce large amounts of information about the built environment into meaningful summaries. These summaries, or built environment neighbourhoods, were useful in catalyzing action with community stakeholders and led to the development of health-promoting built environment interventions.
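The data-reduction pipeline (grouping segments into neighbourhoods, then finding discriminant functions on a reduced variable set) can be sketched as follows. KMeans and linear discriminant analysis from scikit-learn stand in for the authors' consensus cluster analysis and multiple discriminant function analysis, and the segment data here are random placeholders, not IMI scores.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
segments = rng.normal(size=(296, 160))   # 296 street segments x IMI-style variables

# Step 1: group segments into candidate "neighbourhoods".
labels = KMeans(n_clusters=10, n_init=10, random_state=0).fit_predict(segments)

# Step 2: find the discriminant functions that best separate the groups;
# with 10 groups, at most 9 functions are available.
lda = LinearDiscriminantAnalysis(n_components=9).fit(segments, labels)
scores = lda.transform(segments)         # segment coordinates on the functions

# A biplot would plot scores[:, 0] against scores[:, 1], coloured by label.
print("variance explained by the functions:", lda.explained_variance_ratio_)
```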
Recent Advances in Geospatial Visualization with the New Google Earth
NASA Astrophysics Data System (ADS)
Anderson, J. C.; Poyart, E.; Yan, S.; Sargent, R.
2017-12-01
Google Earth's detailed, world-wide imagery and terrain data provide a rich backdrop for geospatial visualization at multiple scales, from global to local. The Keyhole Markup Language (KML) is an open standard that has been the primary way for users to author and share data visualizations in Google Earth. Despite its ease of use and flexibility for relatively small amounts of data, users can quickly run into difficulties and limitations working with large-scale or time-varying datasets using KML in Google Earth. Recognizing these challenges, we present our recent work toward extending Google Earth to be a more powerful data visualization platform. We describe a new KML extension to simplify the display of multi-resolution map tile pyramids - which can be created by analysis platforms like Google Earth Engine, or by a variety of other map tile production pipelines. We also describe how this implementation can pave the way to creating novel data visualizations by leveraging custom graphics shaders. Finally, we present our investigations into native support in Google Earth for data storage and transport formats that are well-suited for big raster and vector data visualization. Taken together, these capabilities make it easier to create and share new scientific data visualization experiences using Google Earth, and simplify the integration of Google Earth with existing map data products, services, and analysis pipelines.
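For context, the multi-resolution tile pyramids referred to above are usually addressed with the standard Web Mercator tiling scheme; the sketch below shows that addressing math under the common slippy-map convention and is not part of the KML extension itself.

```python
import math

def lonlat_to_tile(lon_deg, lat_deg, zoom):
    """Return the (x, y) index of the Web Mercator tile containing a point."""
    n = 2 ** zoom                              # tiles per axis at this zoom level
    x = int((lon_deg + 180.0) / 360.0 * n)
    lat = math.radians(lat_deg)
    y = int((1.0 - math.log(math.tan(lat) + 1.0 / math.cos(lat)) / math.pi) / 2.0 * n)
    return x, y

# Example: the tile covering Mountain View, CA at zoom level 12.
print(lonlat_to_tile(-122.08, 37.39, 12))
```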
Deep dielectric charging of the lunar regolith within permanently shadowed regions
NASA Astrophysics Data System (ADS)
Jordan, A.; Stubbs, T. J.; Joyce, C. J.; Schwadron, N.; Smith, S. S.; Spence, H.; Wilson, J. K.
2013-12-01
Galactic cosmic rays (GCRs) and solar energetic particles (SEPs) can penetrate within the lunar regolith, causing deep dielectric charging. The discharging timescale depends on the regolith's electrical conductivity and permittivity. In permanently shadowed regions (PSRs) near the lunar poles, this timescale is on the order of a lunation (~20 days). To estimate the resulting electric fields within the regolith, we develop a data-driven, one-dimensional, time-dependent model. For model inputs, we use GCR data from the Cosmic Ray Telescope for the Effects of Radiation (CRaTER) on board the Lunar Reconnaissance Orbiter (LRO) and SEP data from the Electron, Proton, and Alpha Monitor (EPAM) on the Advanced Composition Explorer (ACE). We find that, during the recent solar minimum, GCRs create persistent electric fields up to 700 V/m. We also find that large SEP events create sporadic but strong fields (>10^6 V/m) that may induce dielectric breakdown. Meteoritic gardening limits the amount of time the regolith can spend close enough to the surface to be charged by SEPs, and we find that the gardened regolith within PSRs has likely experienced >10^6 breakdown-inducing events. Since dielectric breakdown typically creates cracks along the boundaries of changes in dielectric constant, we predict repeated breakdown to have fragmented a fraction of the regolith within PSRs into its mineralogical components.
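The charging balance behind these estimates can be sketched as a competition between a deposited current density and Ohmic discharge, dE/dt = (J - sigma*E)/epsilon, with relaxation time tau = epsilon/sigma of order the quoted ~20 days. All numerical values below are illustrative assumptions, not the CRaTER- and EPAM-driven inputs used in the paper.

```python
import numpy as np

eps = 4.0 * 8.854e-12        # assumed regolith permittivity (F/m)
tau = 20 * 86400.0           # assumed discharge timescale ~ one lunation (s)
sigma = eps / tau            # implied conductivity (S/m)

J = 1.5e-14                  # assumed steady GCR current density (A/m^2)
dt, t_end = 3600.0, 200 * 86400.0
E = 0.0
for _ in range(int(t_end / dt)):             # explicit Euler integration
    E += dt * (J - sigma * E) / eps
print(f"steady-state field J/sigma = {J / sigma:.0f} V/m, model value = {E:.0f} V/m")
```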
The Secret of Safety Lies in Danger.
ERIC Educational Resources Information Center
Wildavsky, Aaron
In creating and maintaining public safety, risks which entail some amount of danger are necessary. These risks must be rated regarding the amount of benefit and danger they would bring in order to ascertain the worthiness of the risk. It is important to realize that risks can often bring greater safety, that many safety devices themselves involve…
Scott, Timothy C.; Wham, Robert M.
1988-01-01
A method and system for solvent extraction where droplets are shattered by a high intensity electric field. These shattered droplets form a plurality of smaller droplets which have a greater combined surface area than the original droplet. Dispersion, coalescence and phase separation are accomplished in one vessel through the use of the single pulsing high intensity electric field. Electric field conditions are chosen so that simultaneous dispersion and coalescence are taking place in the emulsion formed in the electric field. The electric field creates a large amount of interfacial surface area for solvent extraction when the droplet is disintegrated and is capable of controlling droplet size and thus droplet stability. These operations take place in the presence of a counter current flow of the continuous phase.
Hydrogen Fueling Station Using Thermal Compression: a techno-economic analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kriha, Kenneth; Petitpas, Guillaume; Melchionda, Michael
The goal of this project was to demonstrate the technical and economic feasibility of using thermal compression to create the hydrogen pressure necessary to operate vehicle hydrogen fueling stations. The concept of utilizing the exergy within liquid hydrogen to build pressure rather than mechanical components such as compressors or cryogenic liquid pumps has several advantages. In theory, the compressor-less hydrogen station will have lower operating and maintenance costs because the compressors found in conventional stations require large amounts of electricity to run and are prone to mechanical breakdowns. The thermal compression station also utilizes some of the energy used to liquefy the hydrogen as work to build pressure; in conventional stations this energy is lost as heat to the environment.
A laboratory information management system for DNA barcoding workflows.
Vu, Thuy Duong; Eberhardt, Ursula; Szöke, Szániszló; Groenewald, Marizeth; Robert, Vincent
2012-07-01
This paper presents a laboratory information management system for DNA sequences (LIMS) created and based on the needs of a DNA barcoding project at the CBS-KNAW Fungal Biodiversity Centre (Utrecht, the Netherlands). DNA barcoding is a global initiative for species identification through simple DNA sequence markers. We aim at generating barcode data for all strains (or specimens) included in the collection (currently ca. 80 k). The LIMS has been developed to better manage large amounts of sequence data and to keep track of the whole experimental procedure. The system has allowed us to classify strains more efficiently as the quality of sequence data has improved, and as a result, up-to-date taxonomic names have been given to strains and more accurate correlation analyses have been carried out.
26 CFR 20.2044-1 - Certain property for which marital deduction was previously allowed.
Code of Federal Regulations, 2011 CFR
2011-04-01
... decedent had a qualifying income interest for life and for which a deduction was allowed under section 2056... reduced by the amount of any section 2503(b) exclusion that applied to the transfer creating the interest.... Similarly, the executor could establish that the transfer creating the decedent's qualifying income interest...
Benchmark of Client and Server-Side Catchment Delineation Approaches on Web-Based Systems
NASA Astrophysics Data System (ADS)
Demir, I.; Sermet, M. Y.; Sit, M. A.
2016-12-01
Recent advances in internet and cyberinfrastructure technologies have provided the capability to acquire large-scale spatial data from various gauges and sensor networks. The collection of environmental data has increased demand for applications capable of managing and processing large-scale, high-resolution data sets. Given the amount and resolution of the data sets provided, one of the challenging tasks for organizing and customizing hydrological data sets is delineation of watersheds on demand. Watershed delineation is a process for creating a boundary that represents the contributing area for a specific control point or water outlet, with the intent of characterizing and analyzing portions of a study area. Although many GIS tools and software packages for watershed analysis are available on desktop systems, there is a need for web-based and client-side techniques that create a dynamic and interactive environment for exploring hydrological data. In this project, we demonstrated several watershed delineation techniques on the web, implemented on the client side using JavaScript and WebGL and on the server side using Python and C++. We also developed a client-side GPGPU (general-purpose graphics processing unit) algorithm to analyze high-resolution terrain data for watershed delineation, which allows parallelization on the GPU. Web-based real-time analysis of watershed segmentation can be helpful for decision-makers and interested stakeholders while eliminating the need to install complex software packages and deal with large-scale data sets. Utilization of client-side hardware resources also reduces the need for servers, owing to the crowdsourced nature of the approach. Our goal for future work is to improve other hydrologic analysis methods, such as rain-flow tracking, by adapting the presented approaches.
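As a point of reference for what such delineation services compute, here is a compact sketch of outlet-based delineation on a D8 flow-direction grid: every cell whose flow path reaches the chosen outlet belongs to the watershed. The grid and outlet are toy inputs; the project's JavaScript, WebGL, and GPGPU implementations are not reproduced here.

```python
from collections import deque
import numpy as np

# D8 flow directions encoded 0..7 as E, SE, S, SW, W, NW, N, NE offsets (dr, dc).
OFFSETS = [(0, 1), (1, 1), (1, 0), (1, -1), (0, -1), (-1, -1), (-1, 0), (-1, 1)]

def delineate(flow_dir, outlet):
    """Return a boolean mask of all cells draining to `outlet` (row, col)."""
    rows, cols = flow_dir.shape
    basin = np.zeros_like(flow_dir, dtype=bool)
    basin[outlet] = True
    queue = deque([outlet])
    while queue:
        r, c = queue.popleft()
        # A neighbour is upstream if its D8 pointer leads back into (r, c).
        for k, (dr, dc) in enumerate(OFFSETS):
            nr, nc = r - dr, c - dc            # cell that would flow along direction k
            if 0 <= nr < rows and 0 <= nc < cols and not basin[nr, nc] \
                    and flow_dir[nr, nc] == k:
                basin[nr, nc] = True
                queue.append((nr, nc))
    return basin

# Toy 3x3 grid in which every cell drains toward the centre cell (1, 1).
fd = np.array([[1, 2, 3],
               [0, 0, 4],
               [7, 6, 5]])
print(delineate(fd, (1, 1)).sum(), "cells in the watershed")
```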
2015-07-20
This dramatic image shows the NASA/ESA Hubble Space Telescope's view of the dwarf galaxy NGC 1140, which lies 60 million light-years away in the constellation of Eridanus. As can be seen in this image, NGC 1140 has an irregular form, much like the Large Magellanic Cloud, a small galaxy that orbits the Milky Way. This small galaxy is undergoing what is known as a starburst. Despite being almost ten times smaller than the Milky Way, it is creating stars at about the same rate, with the equivalent of one star the size of the Sun being created per year. This is clearly visible in the image, which shows the galaxy illuminated by bright, blue-white, young stars. Galaxies like NGC 1140, which are small, starbursting, and contain large amounts of primordial gas with far fewer elements heavier than hydrogen and helium than are present in our Sun, are of particular interest to astronomers. Their composition makes them similar to the intensely star-forming galaxies in the early Universe, which were the building blocks of present-day large galaxies like our own Milky Way. Because those early-Universe galaxies are so far away they are hard to study, so closer starbursting galaxies like NGC 1140 are a good substitute for learning more about galaxy evolution. The vigorous star formation will have a very destructive effect on this small dwarf galaxy in its future. When the larger stars in the galaxy die and explode as supernovae, gas is blown into space and may easily escape the gravitational pull of the galaxy. The ejection of gas from the galaxy means it is throwing away the raw material for future stars, as this gas is one of the building blocks of star formation. NGC 1140's starburst cannot last for long.
An effective online data monitoring and saving strategy for large-scale climate simulations
Xian, Xiaochen; Archibald, Rick; Mayer, Benjamin; ...
2018-01-22
Large-scale climate simulation models have been developed and widely used to generate historical data and study future climate scenarios. These simulation models often have to run for a couple of months to understand the changes in the global climate over the course of decades. This long-duration simulation process creates a huge amount of data with both high temporal and spatial resolution information; however, how to effectively monitor and record the climate changes based on these large-scale simulation results that are continuously produced in real time still remains to be resolved. Due to the slow process of writing data to disk, the current practice is to save a snapshot of the simulation results at a constant, slow rate even though the data generation process runs at a very high speed. This study proposes an effective online data monitoring and saving strategy over the temporal and spatial domains with the consideration of practical storage and memory capacity constraints. Our proposed method is able to intelligently select and record the most informative extreme values in the raw data generated from real-time simulations in the context of better monitoring climate changes.
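One simple way to realize "record only the most informative extremes under a fixed budget" is a bounded min-heap that keeps the k largest values seen so far. This is a generic sketch under that assumption, not the selection criterion developed in the study.

```python
import heapq
import random

class ExtremeRecorder:
    """Keep the k largest values seen in a stream, with their timestamps."""
    def __init__(self, budget):
        self.budget = budget
        self.heap = []                     # min-heap of (value, timestep)

    def observe(self, value, timestep):
        if len(self.heap) < self.budget:
            heapq.heappush(self.heap, (value, timestep))
        elif value > self.heap[0][0]:      # beats the weakest stored extreme
            heapq.heapreplace(self.heap, (value, timestep))

recorder = ExtremeRecorder(budget=10)
for t in range(100_000):                   # stand-in for streaming simulation output
    recorder.observe(random.gauss(0.0, 1.0), t)
print(sorted(recorder.heap, reverse=True)[:3])
```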
NASA Astrophysics Data System (ADS)
Ozturk, D.; Chaudhary, A.; Votava, P.; Kotfila, C.
2016-12-01
Jointly developed by Kitware and NASA Ames, GeoNotebook is an open source tool designed to give the maximum amount of flexibility to analysts while dramatically simplifying the process of exploring geospatially indexed datasets. Packages like Fiona (backed by GDAL), Shapely, Descartes, Geopandas, and PySAL provide a stack of technologies for reading, transforming, and analyzing geospatial data. Combined with the Jupyter notebook and libraries like matplotlib/Basemap, they make it possible to generate detailed geospatial visualizations. Unfortunately, the visualizations generated are either static or do not perform well for very large datasets, and this setup requires a great deal of boilerplate code to create and maintain. Other extensions exist to remedy these problems, but they provide a separate map for each input cell and do not support map interactions that feed back into the Python environment. To support interactive data exploration and visualization on large datasets, we have developed an extension to the Jupyter notebook that provides a single dynamic map that can be managed from the Python environment and that can communicate back with a server, which can perform operations such as data subsetting on a cloud-based cluster.
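For comparison, the static notebook workflow that such an extension improves on typically looks like the short geopandas/matplotlib recipe below; the file name and attribute column are hypothetical, and GeoNotebook's own API is not shown.

```python
import geopandas as gpd
import matplotlib.pyplot as plt

# Hypothetical vector layer; any GDAL-readable format works here.
gdf = gpd.read_file("watersheds.geojson")

# "area_km2" is an assumed attribute column in the hypothetical file.
ax = gdf.to_crs(epsg=3857).plot(column="area_km2", legend=True, figsize=(8, 6))
ax.set_title("Static map rendered in a notebook cell")
plt.show()
```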
ANDERSON, JR; MOHAMMED, S; GRIMM, B; JONES, BW; KOSHEVOY, P; TASDIZEN, T; WHITAKER, R; MARC, RE
2011-01-01
Modern microscope automation permits the collection of vast amounts of continuous anatomical imagery in both two and three dimensions. These large data sets present significant challenges for data storage, access, viewing, annotation and analysis. The cost and overhead of collecting and storing the data can be extremely high. Large data sets quickly exceed an individual's capability for timely analysis and present challenges in efficiently applying transforms, if needed. Finally annotated anatomical data sets can represent a significant investment of resources and should be easily accessible to the scientific community. The Viking application was our solution created to view and annotate a 16.5 TB ultrastructural retinal connectome volume and we demonstrate its utility in reconstructing neural networks for a distinctive retinal amacrine cell class. Viking has several key features. (1) It works over the internet using HTTP and supports many concurrent users limited only by hardware. (2) It supports a multi-user, collaborative annotation strategy. (3) It cleanly demarcates viewing and analysis from data collection and hosting. (4) It is capable of applying transformations in real-time. (5) It has an easily extensible user interface, allowing addition of specialized modules without rewriting the viewer. PMID:21118201
Synthesis of Monolayer MoS2 by Chemical Vapor Deposition
NASA Astrophysics Data System (ADS)
Withanage, Sajeevi; Lopez, Mike; Dumas, Kenneth; Jung, Yeonwoong; Khondaker, Saiful
The finite and layer-tunable band gap of transition metal dichalcogenides (TMDs), including molybdenum disulfide (MoS2), gives them an advantage over zero-band-gap graphene in various semiconductor applications. The weak interlayer van der Waals bonding of bulk MoS2 allows few-layer to single-layer MoS2 to be cleaved using top-down methods such as mechanical and chemical exfoliation; however, the few-micron size of these flakes limits MoS2 applications to fundamental research. Bottom-up approaches, including the sulfurization of molybdenum (Mo) thin films and the co-evaporation of Mo and sulfur precursors, have received attention due to their potential for large-area synthesis. We synthesized monolayer MoS2 on Si/SiO2 substrates by atmospheric-pressure chemical vapor deposition (CVD) using sulfur and molybdenum trioxide (MoO3) as precursors. Several growth conditions were tested, including precursor amounts, growth temperature, growth time, and flow rate. Raman spectroscopy, photoluminescence (PL), and atomic force microscopy (AFM) confirmed that monolayer islands merging to create large areas were obtained, with grain sizes up to 70 μm, without using any seeds or seeding promoters. These studies provide in-depth knowledge for synthesizing high-quality, large-area MoS2 for prospective electronics applications.
Student profiling on university co-curriculum activities using data visualization tools
NASA Astrophysics Data System (ADS)
Jamil, Jastini Mohd.; Shaharanee, Izwan Nizal Mohd
2017-11-01
Co-curricular activities play a vital role in the development of a holistic student. The co-curriculum can be described as an extension of the formal learning experiences in a course or academic program. There are many co-curricular activities, such as participation in sports, volunteerism, leadership, entrepreneurship, uniformed bodies, the student council, and other social events. The number of students involved in co-curricular activities is large, creating an enormous volume of data including demographic facts, academic performance, and co-curriculum types. Discovering and analyzing this information becomes increasingly difficult. Data visualization offers a better way of handling large volumes of information, and an understanding of these various co-curricular activities and their effect on student performance is essential. Visualizing this information can help stakeholders become aware of hidden and interesting patterns in the large amounts of student data they hold. The main objective of this study is to provide a clearer understanding of the trends hidden in student co-curriculum activity data in relation to activities and academic performance. Data visualization software was used to visualize the data extracted from the database.
NASA Astrophysics Data System (ADS)
Rabien, Sebastian; Barl, Lothar; Beckmann, Udo; Bonaglia, Marco; Borelli, José Luis; Brynnel, Joar; Buschkamp, Peter; Busoni, Lorenzo; Christou, Julian; Connot, Claus; Davies, Richard; Deysenroth, Matthias; Esposito, Simone; Gässler, Wolfgang; Gemperlein, Hans; Hart, Michael; Kulas, Martin; Lefebvre, Michael; Lehmitz, Michael; Mazzoni, Tommaso; Nussbaum, Edmund; Orban de Xivry, Gilles; Peter, Diethard; Quirrenbach, Andreas; Raab, Walfried; Rahmer, Gustavo; Storm, Jesper; Ziegleder, Julian
2014-07-01
ARGOS is the laser guide star and wavefront-sensing facility for the Large Binocular Telescope. With first laser light on sky in 2013, the system is currently undergoing commissioning at the telescope. We present the overall status and design, as well as first results on sky. Aiming for wide-field ground-layer correction, ARGOS is designed as a multi-Rayleigh-beacon adaptive optics system. A total of six powerful pulsed lasers create the laser guide star constellations above each of the LBT's primary mirrors. With range-gated detection in the wavefront sensors and adaptive correction by the deformable secondaries, we expect ARGOS to enhance image quality over a large range of seeing conditions. With the two wide-field imaging and spectroscopic instruments LUCI1 and LUCI2 as receivers, a wide range of scientific programs will benefit from ARGOS: with increased resolution and higher encircled energy, both imaging and MOS spectroscopy will gain a large amount of signal to noise. Apart from the wide-field correction ARGOS delivers in its ground-layer mode, we already foresee the implementation of a hybrid sodium and Rayleigh beacon combination for diffraction-limited AO performance.
Aqueous Two-Phase Systems at Large Scale: Challenges and Opportunities.
Torres-Acosta, Mario A; Mayolo-Deloisa, Karla; González-Valdez, José; Rito-Palomares, Marco
2018-06-07
Aqueous two-phase systems (ATPS) have proved to be an efficient and integrative operation to enhance recovery of industrially relevant bioproducts. After ATPS discovery, a variety of works have been published regarding their scaling from 10 to 1000 L. Although ATPS have achieved high recovery and purity yields, there is still a gap between their bench-scale use and potential industrial applications. In this context, this review paper critically analyzes ATPS scale-up strategies to enhance the potential industrial adoption. In particular, large-scale operation considerations, different phase separation procedures, the available optimization techniques (univariate, response surface methodology, and genetic algorithms) to maximize recovery and purity and economic modeling to predict large-scale costs, are discussed. ATPS intensification to increase the amount of sample to process at each system, developing recycling strategies and creating highly efficient predictive models, are still areas of great significance that can be further exploited with the use of high-throughput techniques. Moreover, the development of novel ATPS can maximize their specificity increasing the possibilities for the future industry adoption of ATPS. This review work attempts to present the areas of opportunity to increase ATPS attractiveness at industrial levels. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Grandmothers' productivity and the HIV/AIDS pandemic in sub-Saharan Africa.
Bock, John; Johnson, Sara E
2008-06-01
The human immunodeficiency virus (HIV)/acquired immune deficiency syndrome (AIDS) pandemic has left large numbers of orphans in sub-Saharan Africa. Botswana has an HIV prevalence rate of approximately 40% in adults. Morbidity and mortality are high, and in a population of a 1.3 million there are nearly 50,000 children who have lost one or both parents to HIV/AIDS. The extended family, particularly grandparents, absorbs much of the childrearing responsibilities. This creates large amounts of additional work for grandmothers especially. The embodied capital model and the grandmother hypothesis are both derived from life history theory within evolutionary ecology, and both predict that one important factor in the evolution of the human extended family structure is that postreproductive individuals such as grandmothers provide substantial support to their grandchildren's survival. Data collected in the pre-pandemic context in a traditional multi-ethnic community in the Okavango Delta of Botswana are analyzed to calculate the amount of work effort provided to a household by women of different ages. Results show that the contributions of older and younger women to the household in term of both productivity and childrearing are qualitatively and quantitatively different. These results indicate that it is unrealistic to expect older women to be able to compensate for the loss of younger women's contributions to the household, and that interventions be specifically designed to support older women based on the type of activities in which they engage that affect child survival, growth, and development.
Altitude Wind Tunnel Operating at Night
1945-04-21
The Altitude Wind Tunnel (AWT) during one of its overnight runs at the National Advisory Committee for Aeronautics (NACA) Aircraft Engine Research Laboratory in Cleveland, Ohio. The AWT was run during night hours so that its massive power loads were handled when regional electric demands were lowest. At the time the AWT was among the most complex wind tunnels ever designed. In order to simulate conditions at high altitudes, NACA engineers designed innovative new systems that required tremendous amounts of electricity. The NACA had an agreement with the local electric company that it would run its larger facilities overnight when local demand was at its lowest. In return the utility discounted its rates for the NACA during those hours. The AWT could produce wind speeds up to 500 miles per hour through its 20-foot-diameter test section at the standard operating altitude of 30,000 feet. The airflow was created by a large fan that was driven by an 18,000-horsepower General Electric induction motor. The altitude simulation was accomplished by large exhauster and refrigeration systems. The cold temperatures were created by 14 Carrier compressors and the thin atmosphere by four 1750-horsepower exhausters. The first and second shifts usually set up and broke down the test articles, while the third shift ran the actual tests. Engineers would often have to work all day, then operate the tunnel overnight, and analyze the data the next day. The night crew usually briefed the dayshift on the tests during morning staff meetings.
Using Geomorphic Change Detection to Understand Restoration Project Success Relative to Stream Size
NASA Astrophysics Data System (ADS)
Yeager, A.; Segura, C.
2017-12-01
Large wood (LW) jams have long been utilized as a stream restoration strategy to create fish habitat, with a strong focus on Coho salmon in the Pacific Northwest. These projects continue to be implemented despite limited understanding of their success in streams of different size. In this study, we assessed the changes triggered by LW introductions in 10 alluvial plane bed reaches with varying drainage areas (3.9-22 km²) and bankfull widths (6.4-14.7 m) in one Oregon Coast Range basin. In this basin, LW was added in an effort to improve winter rearing habitat for Coho salmon. We used detailed topographic mapping (0.5 m² resolution) to describe the local stream and floodplain geometry. Pebble counts were used to monitor changes in average substrate size after the LW addition. Field surveys were conducted immediately after the LW were installed, in the summer of 2016, and one year after installation, in the summer of 2017. We used geomorphic change detection analysis to quantify the amount of scour and deposition at each site along with changes in average bankfull width. Then we determined the relative amount of change among all sites to identify which size stream changed the most. We also modeled fluctuations in water surface elevation at each site, correlating frequency and inundation of the LW with geomorphic changes detected from the topographic surveys. Preliminary results show an increase in channel width and floodplain connectivity at all sites, indicating an increase in off-channel habitat for juvenile Coho salmon. Bankfull widths increased up to 75% in small sites and up to 25% in large sites. Median grain size became coarser in large streams (increased up to 20%), while we saw a similar amount of fining at smaller sites. The overall increase in channel width is compensated by an overall decrease in bed elevation at both large and small sites, suggesting the maintenance of overall geomorphic equilibrium. Further work will include quantifying these geomorphic changes in the context of critical salmon habitat factors. By identifying which size stream changes the most after LW introduction, and linking this change to salmon habitat metrics, we will provide information to aid in optimizing future LW stream restoration efforts that focus on stream reaches likely to experience the greatest increase in fish habitat.
ERIC Educational Resources Information Center
Choi, Gi Woong; Pursel, Barton K.; Stubbs, Chris
2017-01-01
Interest towards implementing educational gaming into courses within higher education continues to increase, but it requires extensive amounts of resources to create individual games for each course. This paper is a description of a university's effort to create a custom educational game engine to streamline the game development process within the…
ERIC Educational Resources Information Center
Parra, Sergio
2016-01-01
The use of video podcasts in education has emerged as a phenomenon that has gained a considerable amount of attention over the last few years. Although video podcasting is becoming a well-established technology in higher education, new multimedia instructional strategies such as student-created video podcasts in grades K-12 are under-researched.…
Carbon sequestration in two created riverine wetlands in the midwestern United States.
Bernal, Blanca; Mitsch, William J
2013-07-01
Wetlands have the ability to accumulate significant amounts of carbon (C) and thus could provide an effective approach to mitigate greenhouse gas accumulation in the atmosphere. Wetland hydrology, age, and management can affect primary productivity, decomposition, and ultimately C sequestration in riverine wetlands, but these aspects of wetland biogeochemistry have not been adequately investigated, especially in created wetlands. In this study we investigate the ability of created freshwater wetlands to sequester C by determining the sediment accretion and soil C accumulation of two 15-yr-old created wetlands in central Ohio, one planted and one naturally colonized. We measured the amount of sediment and soil C accumulated over the parent material and found that these created wetlands accumulated an average of 242 g C m⁻² yr⁻¹, 70% more than a similar natural wetland in the region and 26% more than the rate estimated for these same wetlands 5 yr before this study. The C sequestration of the naturally colonized wetland was 22% higher than that of the planted wetland (267 ± 17 vs. 219 ± 15 g C m⁻² yr⁻¹, respectively). Soil C accrual accounted for 66% of the aboveground net primary productivity on average. Open water communities had the highest C accumulation rates in both wetlands. This study shows that created wetlands can be natural, cost-effective tools to sequester C to mitigate the effect of greenhouse gas emissions. Copyright © by the American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America, Inc.
Potteiger, Kelly; Pitney, William A; Cappaert, Thomas A; Wolfe, Angela
2017-12-01
Environmental sustainability is a critical concern in health care. Similar to other professions, the practice of athletic training necessitates the use of a large quantity of natural and manufactured resources. To examine the perceptions of the waste produced by the practice of athletic training and the green practices currently used by athletic trainers (ATs) to combat this waste. Mixed-methods study. Field setting. A total of 442 ATs completed the study. Sixteen individuals participated in the qualitative portion. Data from sections 2 and 3 of the Athletic Training Environmental Impact Survey were analyzed. Focus groups and individual interviews were used to determine participants' views of waste and the efforts used to combat waste. Descriptive statistics were used to examine types of waste. Independent t tests, χ² tests, and 1-way analyses of variance were calculated to identify any differences between the knowledge and use of green techniques. Interviews and focus groups were transcribed verbatim and analyzed inductively. Participants reported moderate knowledge of green techniques (3.18 ± 0.53 on a 5-point Likert scale). Fifty-eight percent (n = 260) of survey participants perceived that a substantial amount of waste was produced by the practice of athletic training. Ninety-two percent (n = 408) admitted they thought about the waste produced in their daily practice. The types of waste reported most frequently were plastics (n = 111, 29%), water (n = 88, 23%), and paper for administrative use (n = 81, 21%). Fifty-two percent (n = 234) agreed this waste directly affected the environment. The qualitative aspect of the study reinforced recognition of the large amount of waste produced by the practice of athletic training. Types of conservation practices used by ATs were also explored. Participants reported concern regarding the waste produced by athletic training. The amount of waste varies depending on practice size and setting. Future researchers should use direct measures to determine the amount of waste created by the practice of athletic training.
Handspinning Enabled Highly Concentrated Carbon Nanotubes with Controlled Orientation in Nanofibers
Lee, Hoik; Watanabe, Kei; Kim, Myungwoong; Gopiraman, Mayakrishnan; Song, Kyung-Hun; Lee, Jung Soon; Kim, Ick Soo
2016-01-01
The novel method of handspinning (HS) was invented by mimicking methods commonly observed in our daily lives. The use of HS allows us to fabricate carbon nanotube-reinforced nanofibers (CNT-reinforced nanofibers) by addressing three significant challenges: (i) the difficulty of forming nanofibers at high concentrations of CNTs, (ii) aggregation of the CNTs, and (iii) control of the orientation of the CNTs. The handspun nanofibers showed better physical properties than fibers fabricated by conventional methods, such as electrospinning. Handspun nanofibers retain a larger amount of CNTs than electrospun nanofibers, and the CNTs are easily aligned uniaxially. We attributed these improvements provided by the HS process to the simple mechanical stretching force, which orients the nanofillers along the force direction without agglomeration, leading to increased contact area between the CNTs and the polymer matrix and thereby providing enhanced interactions. HS is a simple and straightforward method: it does not require an electric field, and hence any kind of polymer or solvent can be used. Furthermore, it is feasible to retain a large amount of various nanofillers in the fibers to enhance their physical and chemical properties. Therefore, HS provides an effective pathway to create new types of reinforced nanofibers with outstanding properties. PMID:27876892
Volcanic eruptions; energy and size
de la Cruz-Reyna, S.
1991-01-01
The Earth is a dynamic planet. Many different processes are continuously developing, creating a delicate balance between the energy stored and generated in its interior and the heat lost into space. The heat is continuously transferred through complex self-regulating convection mechanisms on a planetary scale. The distribution of terrestrial heat flow reveals some of the fine structure of the energy transport mechanisms in the outer layers of the Earth. Of these mechanisms, volcanism is indeed the most remarkable, for it allows energy to be transported in rapid bursts to the surface. In order to maintain the subtle balance of the terrestrial heat machine, one may expect that some law or principle restricts the ways in which these volcanic bursts affect the overall energy transfer of the Earth. For instance, we know that the geothermal flux of the planet amounts to 10²⁸ erg/year. On the other hand, a single large event like the Lava Creek Tuff eruption that formed Yellowstone caldera over half a million years ago may release the same amount of energy in a very small area, over a short period of time.
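A quick back-of-envelope check of that comparison is sketched below. The eruption volume, magma density, and specific heat release are illustrative order-of-magnitude assumptions, not figures taken from the text; only the 10²⁸ erg/yr geothermal flux comes from the abstract.

```python
# Back-of-envelope comparison: one year of global geothermal flux vs. the
# thermal energy of a caldera-forming eruption. All eruption parameters below
# are order-of-magnitude assumptions for a Lava Creek-scale event.

ERG_PER_JOULE = 1e7

geothermal_flux_erg_per_yr = 1e28          # figure quoted in the abstract

magma_volume_km3 = 1000.0                  # assumed erupted magma volume
density_kg_m3 = 2500.0                     # assumed silicic magma density
specific_energy_J_per_kg = 1.2e6           # assumed latent + sensible heat released

mass_kg = magma_volume_km3 * 1e9 * density_kg_m3
thermal_energy_erg = mass_kg * specific_energy_J_per_kg * ERG_PER_JOULE

print(f"Eruption thermal energy ~{thermal_energy_erg:.1e} erg")
print(f"Ratio to one year of global geothermal flux: "
      f"{thermal_energy_erg / geothermal_flux_erg_per_yr:.1f}")
```

With these assumed values the single eruption releases energy of the same order as one year of global geothermal flux, consistent with the point made in the abstract.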
High Conduction Neutron Absorber to Simulate Fast Reactor Environment in an Existing Test Reactor
DOE Office of Scientific and Technical Information (OSTI.GOV)
Guillen, Donna; Greenwood, Lawrence R.; Parry, James
2014-06-22
A need was determined for a thermal neutron absorbing material that could be cooled in a gas reactor environment without using large amounts of a coolant that would thermalize the neutron flux. A new neutron absorbing material was developed that provided high conduction so a small amount of water would be sufficient for cooling, thereby thermalizing the flux as little as possible. An irradiation experiment was performed to assess the effects of radiation and the performance of a new neutron absorbing material. Neutron fluence monitors were placed inside specially fabricated holders within a set of drop-in capsules and irradiated for up to four cycles in the Advanced Test Reactor. Following irradiation, the neutron fluence monitor wires were analyzed by gamma and x-ray spectrometry to determine the activities of the activation products. The adjusted neutron fluences were calculated and grouped into three bins – thermal, epithermal and fast – to evaluate the spectral shift created by the new material. Fluence monitors were evaluated after four different irradiation periods to evaluate the effects of burn-up in the absorbing material. Additionally, activities of the three highest activity isotopes present in the specimens are given.
Implantation of Martian Materials in the Inner Solar System by a Mega Impact on Mars
NASA Astrophysics Data System (ADS)
Hyodo, Ryuki; Genda, Hidenori
2018-04-01
Observations and meteorites indicate that the Martian materials are enigmatically distributed within the inner solar system. A mega impact on Mars creating a Martian hemispheric dichotomy and the Martian moons can potentially eject Martian materials. A recent work has shown that the mega-impact-induced debris is potentially captured as the Martian Trojans and implanted in the asteroid belt. However, the amount, distribution, and composition of the debris have not been studied. Here, using hydrodynamic simulations, we report that a large amount of debris (∼1% of Mars' mass), including Martian crust/mantle and the impactor's materials (∼20:80), is ejected by a dichotomy-forming impact and distributed between ∼0.5 and 3.0 au. Our result indicates that unmelted Martian mantle debris (∼0.02% of Mars' mass) can be the source of Martian Trojans, olivine-rich asteroids in the Hungarian region and the main asteroid belt, and some even hit the early Earth. The evidence of a mega impact on Mars would be recorded as a spike of ⁴⁰Ar–³⁹Ar ages in meteorites. A mega impact can naturally implant Martian mantle materials within the inner solar system.
Small unmanned aerial vehicles (micro-UAVs, drones) in plant ecology.
Cruzan, Mitchell B; Weinstein, Ben G; Grasty, Monica R; Kohrn, Brendan F; Hendrickson, Elizabeth C; Arredondo, Tina M; Thompson, Pamela G
2016-09-01
Low-elevation surveys with small aerial drones (micro-unmanned aerial vehicles [UAVs]) may be used for a wide variety of applications in plant ecology, including mapping vegetation over small- to medium-sized regions. We provide an overview of methods and procedures for conducting surveys and illustrate some of these applications. Aerial images were obtained by flying a small drone along transects over the area of interest. Images were used to create a composite image (orthomosaic) and a digital surface model (DSM). Vegetation classification was conducted manually and using an automated routine. Coverage of an individual species was estimated from aerial images. We created a vegetation map for the entire region from the orthomosaic and DSM, and mapped the density of one species. Comparison of our manual and automated habitat classification confirmed that our mapping methods were accurate. A species with high contrast to the background matrix allowed adequate estimate of its coverage. The example surveys demonstrate that small aerial drones are capable of gathering large amounts of information on the distribution of vegetation and individual species with minimal impact to sensitive habitats. Low-elevation aerial surveys have potential for a wide range of applications in plant ecology.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Okamura, M., E-mail: okamura@bnl.gov; Nishina Center for Accelerator-Based Science, RIKEN, Saitama; Palm, K.
Calcium and lithium ion beams are required by the NASA Space Radiation Laboratory at Brookhaven National Laboratory to simulate the effects of cosmic radiation. To identify the difficulties in providing such highly reactive materials as laser targets, both species were experimentally tested. Plate-shaped lithium and calcium targets were fabricated to create ablation plasmas with a 6 ns, 1064 nm neodymium-doped yttrium aluminum garnet laser. We found significant oxygen contamination in both the Ca and Li high charge state beams due to the rapid oxidation of the surfaces. A large spot size, low power density laser was used to create low charge state beams without scanning the targets. The low charge state Ca beam did not have any apparent oxygen contamination, showing the potential to clean the target entirely of oxide with a low power beam once in the chamber. The Li target was clearly still oxidizing in the chamber after each low power shot. To measure the rate of oxidation, we shot the low power laser at the target repeatedly at 10 s, 30 s, 60 s, and 120 s interval lengths, showing a linear relation between the interval time and the amount of oxygen in the beam.
Canyons and Mesas of Aureum Chaos
NASA Technical Reports Server (NTRS)
2002-01-01
(Released 17 June 2002) This image contains a portion of Aureum Chaos located just south of the Martian equator. This fractured landscape contains canyons and mesas with two large impact craters in the upper left. The largest crater is older than the one above it. This is readily evident because a landslide deposit created by the smaller crater's impact is seen on the larger crater's floor. The overall scene has a rather muted appearance due to mantling by dust. Some small dark streaks can also be seen in this scene. These small dark streaks suggest that the materials covering this area occasionally become unstable and slide. Ridges of resistant material also can be observed in the walls of the canyons. The wall rock seen in the upper part of the cliffs appears to be layered. Classic spur-and-gully topography created by differing amounts of erosion and possibly different rock types is also visible here. One important observation to be made in this region is that there are no gullies apparent on the slopes such as those seen in Gorgonum Chaos (June 11th daily image). Latitude appears to play a major role in gully occurrence and distribution, with the gullies being predominantly found poleward of 30°.
Sensing in the collaborative Internet of Things.
Borges Neto, João B; Silva, Thiago H; Assunção, Renato Martins; Mini, Raquel A F; Loureiro, Antonio A F
2015-03-19
We are entering a new era of computing technology, the era of Internet of Things (IoT). An important element for this popularization is the large use of off-the-shelf sensors. Most of those sensors will be deployed by different owners, generally common users, creating what we call the Collaborative IoT. This collaborative IoT helps to increase considerably the amount and availability of collected data for different purposes, creating new interesting opportunities, but also several challenges. For example, it is very challenging to search for and select a desired sensor or a group of sensors when there is no description about the provided sensed data or when it is imprecise. Given that, in this work we characterize the properties of the sensed data in the Internet of Things, mainly the sensed data contributed by several sources, including sensors from common users. We conclude that, in order to safely use data available in the IoT, we need a filtering process to increase the data reliability. In this direction, we propose a new simple and powerful approach that helps to select reliable sensors. We tested our method for different types of sensed data, and the results reveal the effectiveness in the correct selection of sensor data.
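The abstract argues for a filtering process to increase the reliability of crowd-contributed sensor data but does not describe the proposed method; the sketch below is only one simple possibility (a median/MAD outlier test), included as an assumption-laden illustration and not the approach from the paper.

```python
import statistics

def select_reliable_readings(readings, k=3.0):
    """Keep sensor readings consistent with the group consensus.

    NOT the method proposed in the paper (which is not specified here); a
    minimal illustration of a reliability filter: readings more than k robust
    standard deviations from the median are discarded.
    """
    median = statistics.median(readings)
    mad = statistics.median(abs(r - median) for r in readings)
    scale = 1.4826 * mad or 1e-9          # MAD -> approx. std dev; avoid divide-by-zero
    return [r for r in readings if abs(r - median) / scale <= k]

# Example: five collaborative temperature sensors, one of them faulty.
print(select_reliable_readings([21.3, 21.7, 21.5, 35.0, 21.4]))   # drops 35.0
```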
Challenges in Developing XML-Based Learning Repositories
NASA Astrophysics Data System (ADS)
Auksztol, Jerzy; Przechlewski, Tomasz
There is no doubt that modular design has many advantages, including the most important ones: reusability and cost-effectiveness. In e-learning community parlance the modules are termed Learning Objects (LOs) [11]. An increasing number of learning objects have been created and published online, several standards have been established, and multiple repositories have been developed for them. For example Cisco Systems, Inc., "recognizes a need to move from creating and delivering large inflexible training courses, to database-driven objects that can be reused, searched, and modified independent of their delivery media" [6]. The learning object paradigm of education resources authoring is promoted mainly to reduce the cost of content development and to increase its quality. A frequently used metaphor compares Learning Objects to Lego blocks or to objects in object-oriented program design [25]. However, a metaphor is only an abstract idea, which should be turned into something more concrete to be usable. The problem is that many papers on LOs end up solely in metaphors. In our opinion the Lego or OO metaphors are a gross oversimplification of the problem, as it is much easier to assemble a Lego set or design objects in an OO program than to develop truly interoperable, context-free learning content.
Querying Large Biological Network Datasets
ERIC Educational Resources Information Center
Gulsoy, Gunhan
2013-01-01
New experimental methods have resulted in increasing amounts of genetic interaction data being generated every day. Biological networks are used to store the genetic interaction data gathered. The increasing amount of available data requires fast, large-scale analysis methods. Therefore, we address the problem of querying large biological network datasets.…
New Evidence from Silica Debris Exo-Systems for Planet Building Hypervelocity Impacts
NASA Astrophysics Data System (ADS)
Lisse, Carey
2010-05-01
There is abundant inferential evidence for massive collisions in the early solar system [1]: Mercury's high density; Venus' retrograde spin; Earth's Moon; Mars' North/South hemispherical cratering anisotropy; Vesta's igneous origin [2]; brecciation in meteorites [3]; and Uranus' spin axis located near the plane of the ecliptic. Recent work [4] analyzing Spitzer mid-IR spectra has demonstrated the presence of large amounts of amorphous silica and SiO gas produced by a recent (within 10³ - 10⁴ yrs) large (M_Excess > M_Pluto) hypervelocity impact collision around the young (~12 Myr old) nearby star HD172555, at the right age to form rocky planets. Many questions still remain concerning the location, lifetime, and source of the detected silica/SiO gas, which should not be stable in orbit at the estimated 5.8 AU from the HD172555 A5V primary for more than a few decades, yet it is also highly unlikely that we are fortuitously observing these systems immediately after silica formation. A tabulation of the atom counts in the fine silica dust is decidedly Fe- and Mg-atom poor compared to solar [4]. Three possible origins for the observed silica/SiO gas seem currently plausible: (1) A single hypervelocity impact (>10 km/s, in order to produce silica and vaporize SiO at impact) creating an optically thick circumplanetary debris ring which is overflowing or releasing silica-rich material from its Hill sphere. Like terrestrial tektites, the Fe/Mg-poor amorphous silica rubble is formed from quick-quenched molten/vaporized rock created during the impact. The amount of dust detected in the HD172555 system is easily enough to fill and overflow the Hill sphere radius of 0.03 AU for a Pluto-sized body at 5.8 AU from an A5 star, unless it is optically thick (> 1 cm in physical depth). Such a disk would provide a substantial fraction of the observed IR flux, and will be dense enough to self-shield its SiO gas, greatly extending its photolytic lifetime. The lifetime for such a system versus re-condensation into a solid body like the Moon is short, though, ~10³ to 10⁴ yrs [5]. Credence is lent to this scenario by observations of the Jovian impact in July 2009 [6], where absorption features due to silica have been found superimposed on those of hot ammonia at the > 60 km/s impact site (Fig. 1). (2) Ongoing multiple small hypervelocity impacts continuously grinding down a distribution of large circumstellar particles above the blowout size limit (the 'rubble' identified in [4]) and releasing silica-rich material and SiO gas. This model would require a massive (>1 M_Moon) belt of 10 μm - 1 cm particles with inclinations spread out over at least ±45° [4] or dust on highly eccentric orbits [7]. The amount of material implied by the relative amplitude of the rubble spectral feature is consistent with the amount needed to collisionally produce the fine silica dust [4, 8]. A body rapidly re-accreting in a debris ring after collisional disruption (like the Moon) would have similar behavior (lots of impacts for some time, producing gas and little melt droplets). (3) A single impact onto a silica-rich object with already highly differentiated surface layers. For a very young system at 10 - 20 Myr, when we expect planets to be rapidly accreting, a Mercury- or larger-sized rocky body covered in an SiO-rich magma ocean is very likely by the Jeans energy criterion [9], even without considering additional heating input by ²⁶Al and other radioactives.
For the lowest expected impact velocities, v_escape(Mercury) = 4 km/s, a pre-existing magma ocean in equilibrium with a surrounding SiO atmosphere would be required; at higher velocities the impacting body could be the formative mechanism for the magma ocean [10]. Further evidence for excess circumstellar emission due to silica dust has now been found. The youngest of these systems, HD154263, at ~20 Myr, shows evidence for SiO gas and amorphous + crystalline silica. The two older systems, HD23514 at ~100 Myr and HD15407 at ~2 Gyr, conspicuously do not show any evidence for SiO gas while exhibiting strong features mainly due to crystalline silica. HD23514 also shows evidence for large amounts of amorphous carbon, PAHs, and nanodiamonds, due to a strongly enhanced C-atom abundance in the impactor or impactee. HD15407, the oldest system, also does not show any conclusive evidence for the presence of large dark particles ('rubble').
Effects of sampling design on age ratios of migrants captured at stopover sites
Jeffrey F. Kelly; Deborah M. Finch
2000-01-01
Age classes of migrant songbirds often differ in migration timing. This difference creates the potential for age-ratios recorded at stopover sites to vary with the amount and distribution of sampling effort used. To test for these biases, we sub-sampled migrant capture data from the Middle Rio Grande Valley of New Mexico. We created data sets that reflected the age...
ERIC Educational Resources Information Center
Rasskazov, Philipp Dementievich; Muller, Olga Yurievna
2017-01-01
Few universities in Russia are currently engaged in the education of students with disabilities. Methods have been created and experience in the sphere of inclusive education has been gained, but mechanisms that help to develop the inclusive process within an educational institution are absent. Teachers have problems with the determination of approaches and…
Northwest Climate Risk Assessment
NASA Astrophysics Data System (ADS)
Mote, P.; Dalton, M. M.; Snover, A. K.
2012-12-01
As part of the US National Climate Assessment, the Northwest region undertook a process of climate risk assessment. This process included an expert evaluation of previously identified impacts, their likelihoods, and consequences, and engaged experts from both academia and natural resource management practice (federal, tribal, state, local, private, and non-profit) in a workshop setting. An important input was a list of 11 risks compiled by state agencies in Oregon and similar adaptation efforts in Washington. By considering jointly the likelihoods, consequences, and adaptive capacity, participants arrived at an approximately ranked list of risks which was further assessed and prioritized through a series of risk scoring exercises to arrive at the top three climate risks facing the Northwest: 1) changes in amount and timing of streamflow related to snowmelt, causing far-reaching ecological and socioeconomic consequences; 2) coastal erosion and inundation, and changing ocean acidity, combined with low adaptive capacity in the coastal zone to create large risks; and 3) the combined effects of wildfire, insect outbreaks, and diseases will cause large areas of forest mortality and long-term transformation of forest landscapes.
Assessment of Heavy Metal Pollution in Sediments of Inflow Rivers to Lake Taihu, China.
Niu, Yong; Niu, Yuan; Pang, Yong; Yu, Hui
2015-11-01
Lake Taihu, the third-largest freshwater body in China, has many functions, including drinking water supply, flood control, cultivation, navigation, and tourism. In this study, sediment samples were collected at 31 sites from 11 inflow rivers in 2012, to investigate the distribution and concentration of heavy metals copper (Cu), zinc (Zn), lead (Pb), nickel (Ni), and chromium (Cr), and to assess their potential ecological risk. The highest mean concentration was found for Zn, followed by Cu, Cr, Pb, and Ni. Generally, heavy metal pollution was more serious in Wu Jingang River and Caoqiao River, probably because they receive large amounts of wastewater from various local industrial enterprises. The potential ecological risk values of the heavy metals were larger than 120 in more than 25.8% of the sediment samples, indicating a very high risk. The largest ecological risk was due to copper. Furthermore, the results of a principal component analysis and subsequent analysis of variance showed that heavy metal concentrations in the sediment of inflow rivers were higher than those of the lake, which created a large hazard for the aquatic ecosystems of Lake Taihu.
Introducing the Big Knowledge to Use (BK2U) challenge.
Perl, Yehoshua; Geller, James; Halper, Michael; Ochs, Christopher; Zheng, Ling; Kapusnik-Uner, Joan
2017-01-01
The purpose of the Big Data to Knowledge initiative is to develop methods for discovering new knowledge from large amounts of data. However, if the resulting knowledge is so large that it resists comprehension, referred to here as Big Knowledge (BK), how can it be used properly and creatively? We call this secondary challenge, Big Knowledge to Use. Without a high-level mental representation of the kinds of knowledge in a BK knowledgebase, effective or innovative use of the knowledge may be limited. We describe summarization and visualization techniques that capture the big picture of a BK knowledgebase, possibly created from Big Data. In this research, we distinguish between assertion BK and rule-based BK (rule BK) and demonstrate the usefulness of summarization and visualization techniques of assertion BK for clinical phenotyping. As an example, we illustrate how a summary of many intracranial bleeding concepts can improve phenotyping, compared to the traditional approach. We also demonstrate the usefulness of summarization and visualization techniques of rule BK for drug-drug interaction discovery. © 2016 New York Academy of Sciences.
Dawn of Advanced Molecular Medicine: Nanotechnological Advancements in Cancer Imaging and Therapy
Kaittanis, Charalambos; Shaffer, Travis M.; Thorek, Daniel L. J.; Grimm, Jan
2014-01-01
Nanotechnology plays an increasingly important role not only in our everyday life (with all its benefits and dangers) but also in medicine. Nanoparticles are to date the most intriguing option to deliver high concentrations of agents specifically and directly to cancer cells; therefore, a wide variety of these nanomaterials has been developed and explored. These span the range from simple nanoagents to sophisticated smart devices for drug delivery or imaging. Nanomaterials usually provide a large surface area, allowing for decoration with a large amount of moieties on the surface for either additional functionalities or targeting. Besides using particles solely for imaging purposes, they can also carry as a payload a therapeutic agent. If both are combined within the same particle, a theranostic agent is created. The sophistication of highly developed nanotechnology targeting approaches provides a promising means for many clinical implementations and can provide improved applications for otherwise suboptimal formulations. In this review we will explore nanotechnology both for imaging and therapy to provide a general overview of the field and its impact on cancer imaging and therapy. PMID:25271430
Introducing the Big Knowledge to Use (BK2U) challenge
Perl, Yehoshua; Geller, James; Halper, Michael; Ochs, Christopher; Zheng, Ling; Kapusnik-Uner, Joan
2016-01-01
The purpose of the Big Data to Knowledge (BD2K) initiative is to develop methods for discovering new knowledge from large amounts of data. However, if the resulting knowledge is so large that it resists comprehension, referred to here as Big Knowledge (BK), how can it be used properly and creatively? We call this secondary challenge, Big Knowledge to Use (BK2U). Without a high-level mental representation of the kinds of knowledge in a BK knowledgebase, effective or innovative use of the knowledge may be limited. We describe summarization and visualization techniques that capture the big picture of a BK knowledgebase, possibly created from Big Data. In this research, we distinguish between assertion BK and rule-based BK and demonstrate the usefulness of summarization and visualization techniques of assertion BK for clinical phenotyping. As an example, we illustrate how a summary of many intracranial bleeding concepts can improve phenotyping, compared to the traditional approach. We also demonstrate the usefulness of summarization and visualization techniques of rule-based BK for drug–drug interaction discovery. PMID:27750400
Ultrafast entanglement of trapped ions
NASA Astrophysics Data System (ADS)
Neyenhuis, Brian; Mizrahi, Jonathan; Johnson, Kale; Monroe, Christopher
2013-05-01
We have demonstrated ultrafast spin-motion entanglement of a single atomic ion using a short train of intense laser pulses. This pulse train gives the ion a spin-dependent kick, where each spin state receives a discrete momentum kick in opposite directions. Using a series of these spin-dependent kicks we can realize a two-qubit gate. In contrast to gates using spectroscopically resolved motional sidebands, these gates may be performed faster than the trap oscillation period, making them potentially less sensitive to noise, independent of temperature, and more easily scalable to large crystals of ions. We show that multiple kicks can be strung together to create a "Schrödinger cat"-like state, where the large separation between the two parts of the wavepacket allows us to accumulate the phase shift necessary for a gate in a shorter amount of time. We will present a realistic pulse scheme for a two-ion gate, and our progress towards its realization. This work is supported by grants from the U.S. Army Research Office with funding from the DARPA OLE program, IARPA, and the MURI program; and the NSF Physics Frontier Center at JQI.
Reduction of Nitrogen Oxides Emissions from a Coal-Fired Boiler Unit
NASA Astrophysics Data System (ADS)
Zhuikov, Andrey V.; Feoktistov, Dmitry V.; Koshurnikova, Natalya N.; Zlenko, Lyudmila V.
2016-02-01
During the combustion of fossil fuels, a large amount of harmful substances is discharged into the atmospheres of cities by industrial heating boiler houses. The most harmful substances among them are nitrogen oxides. The paper presents one of the most effective technological solutions for suppressing nitrogen oxides: the arrangement of a circulation process with an additional nozzle directed into the bottom of the ash hopper. When brown high-moisture coals are burnt in medium-power boilers, mainly fuel nitrogen oxides are produced. It is possible to reduce their production in two ways: lowering the temperature in the core of the torch or decreasing the excess-air factor in the boiler furnace. The proposed solution arranges the burning process with an additional nozzle installed in the lower part of the ash hopper. Air supplied from this nozzle creates a vortex that involves large unburned fuel particles in multiple circulations, thereby prolonging their residence time in the combustion zone. The findings describe the results of the proposed solution, and recommendations for the use of this technological method are given for other boilers.
Moral Duties of Genomics Researchers: Why Personalized Medicine Requires a Collective Approach.
Vos, Shoko; van Delden, Johannes J M; van Diest, Paul J; Bredenoord, Annelien L
2017-02-01
Advances in genome sequencing together with the introduction of personalized medicine offer promising new avenues for research and precision treatment, particularly in the field of oncology. At the same time, the convergence of genomics, bioinformatics, and the collection of human tissues and patient data creates novel moral duties for researchers. After all, unprecedented amounts of potentially sensitive information are being generated. Over time, traditional research ethics principles aimed at protecting individual participants have become supplemented with social obligations related to the interests of society and the research enterprise at large, illustrating that genomic medicine is also a social endeavor. In this review we provide a comprehensive assembly of moral duties that have been attributed to genomics researchers and offer suggestions for responsible advancement of personalized genomic cancer care. Copyright © 2016 Elsevier Ltd. All rights reserved.
An efficient approach to imaging underground hydraulic networks
NASA Astrophysics Data System (ADS)
Kumar, Mohi
2012-07-01
To better locate natural resources, treat pollution, and monitor underground networks associated with geothermal plants, nuclear waste repositories, and carbon dioxide sequestration sites, scientists need to be able to accurately characterize and image fluid seepage pathways below ground. With these images, scientists can gain knowledge of soil moisture content, the porosity of geologic formations, concentrations and locations of dissolved pollutants, and the locations of oil fields or buried liquid contaminants. Creating images of the unknown hydraulic environments underfoot is a difficult task that has typically relied on broad extrapolations from characteristics and tests of rock units penetrated by sparsely positioned boreholes. Such methods, however, cannot identify small-scale features and are very expensive to reproduce over a broad area. Further, the techniques through which information is extrapolated rely on clunky and mathematically complex statistical approaches requiring large amounts of computational power.
NASA Technical Reports Server (NTRS)
Prasad, S. S.
1979-01-01
Large amounts of long-lived N2(A³Σ) are created by the energy degradation of precipitating solar particles. Laboratory data suggest that in the stratosphere N2(A³Σ) is efficiently converted into N2O. Through reactions with O(¹D), N2O may gradually release NO and thereby influence the long-term aspects of the stratospheric chemical response. During the daytime, negative ions may transform active NOx into inactive HNO3. At night both negative and positive ion chemistry generate HOx. Omission of ionic chemistry results in considerable underestimation of O3 depletion during the initial phases of solar particle events, and thereby introduces significant error in the estimation of the nature of the prompt response.
Intensity-invariant coding in the auditory system.
Barbour, Dennis L
2011-11-01
The auditory system faithfully represents sufficient details from sound sources such that downstream cognitive processes are capable of acting upon this information effectively even in the face of signal uncertainty, degradation or interference. This robust sound source representation leads to an invariance in perception vital for animals to interact effectively with their environment. Due to unique nonlinearities in the cochlea, sound representations early in the auditory system exhibit a large amount of variability as a function of stimulus intensity. In other words, changes in stimulus intensity, such as for sound sources at differing distances, create a unique challenge for the auditory system to encode sounds invariantly across the intensity dimension. This challenge and some strategies available to sensory systems to eliminate intensity as an encoding variable are discussed, with a special emphasis upon sound encoding. Copyright © 2011 Elsevier Ltd. All rights reserved.
Macnamara, Brooke N; Hambrick, David Z; Oswald, Frederick L
2014-08-01
More than 20 years ago, researchers proposed that individual differences in performance in such domains as music, sports, and games largely reflect individual differences in amount of deliberate practice, which was defined as engagement in structured activities created specifically to improve performance in a domain. This view is a frequent topic of popular-science writing-but is it supported by empirical evidence? To answer this question, we conducted a meta-analysis covering all major domains in which deliberate practice has been investigated. We found that deliberate practice explained 26% of the variance in performance for games, 21% for music, 18% for sports, 4% for education, and less than 1% for professions. We conclude that deliberate practice is important, but not as important as has been argued. © The Author(s) 2014.
Data transmission protocol for Pi-of-the-Sky cameras
NASA Astrophysics Data System (ADS)
Uzycki, J.; Kasprowicz, G.; Mankiewicz, M.; Nawrocki, K.; Sitek, P.; Sokolowski, M.; Sulej, R.; Tlaczala, W.
2006-10-01
The large amount of data collected by automatic astronomical cameras has to be transferred to fast computers in a reliable way. The method chosen should ensure data streaming in both directions, but in a nonsymmetrical way. The Ethernet interface is a very good choice because of its popularity and proven performance. However, it requires a TCP/IP stack implementation in devices like cameras for full compliance with existing networks and operating systems. This paper describes the NUDP protocol, which was designed as a supplement to the standard UDP protocol and can be used as a simple network protocol. NUDP does not need a TCP protocol implementation and makes it possible to run the Ethernet network with simple devices based on microcontroller and/or FPGA chips. The data transmission idea was created especially for the "Pi of the Sky" project.
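The NUDP packet format itself is not given in the abstract; the sketch below only illustrates, under assumed field layout, address, and timeout values, the kind of lightweight acknowledged-UDP exchange such a protocol supplies so that simple microcontroller/FPGA devices can avoid a full TCP/IP stack.

```python
import socket, struct

# Illustrative only: a generic "reliable-ish" UDP sender with a sequence number
# and retransmission on timeout. The real NUDP packet format is not described
# here, so the header layout, camera address, and timeout are assumptions.

CAMERA_ADDR = ("192.0.2.10", 5000)   # hypothetical camera address

def send_chunk(sock, seq, payload, retries=5, timeout=0.5):
    packet = struct.pack("!I", seq) + payload      # 4-byte sequence number header
    sock.settimeout(timeout)
    for _ in range(retries):
        sock.sendto(packet, CAMERA_ADDR)
        try:
            ack, _ = sock.recvfrom(8)
            if struct.unpack("!I", ack[:4])[0] == seq:
                return True                         # acknowledged by the device
        except socket.timeout:
            continue                                # retransmit on timeout
    return False

if __name__ == "__main__":
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    ok = send_chunk(s, seq=1, payload=b"EXPOSURE_START")
    print("delivered" if ok else "gave up")
```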
NASA Astrophysics Data System (ADS)
Seamon, E.; Gessler, P. E.; Flathers, E.
2015-12-01
The creation and use of large amounts of data in scientific investigations has become common practice. Data collection and analysis for large scientific computing efforts are increasing not only in volume but also in number, and the methods and analysis procedures are evolving toward greater complexity (Bell, 2009; Clarke, 2009; Maimon, 2010). In addition, the growth of diverse data-intensive scientific computing efforts (Soni, 2011; Turner, 2014; Wu, 2008) has demonstrated the value of supporting scientific data integration. Efforts to bridge this gap between the above perspectives have been attempted, in varying degrees, with modular scientific computing analysis regimes implemented with a modest amount of success (Perez, 2009). This constellation of effects - 1) an increasing growth in the volume and amount of data, 2) a growing data-intensive science base that has challenging needs, and 3) disparate data organization and integration efforts - has created a critical gap. Namely, systems of scientific data organization and management typically do not effectively enable integrated data collaboration or data-intensive science-based communications. Our research attempts to address this gap by developing a modular technology framework for data science integration efforts, with climate variation as the focus. The intention is that this model, if successful, could be generalized to other application areas. Our research aim focused on the design and implementation of a modular, deployable technology architecture for data integration. Developed using aspects of R, interactive Python, SciDB, THREDDS, JavaScript, and varied data mining and machine learning techniques, the Modular Data Response Framework (MDRF) was implemented to explore case scenarios for bioclimatic variation as they relate to Pacific Northwest ecosystem regions. Our preliminary results, using historical NetCDF climate data for calibration purposes across the inland Pacific Northwest region (Abatzoglou, Brown, 2011), show clear ecosystem shifts over a ten-year period (2001-2011), based on multiple supervised classifier methods for bioclimatic indicators.
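As a hedged illustration of the general workflow described (gridded NetCDF climate variables fed to supervised classifiers), the sketch below uses placeholder file names, variable names, and labels; it is not MDRF code.

```python
# Minimal sketch: load gridded NetCDF climate variables, then run a supervised
# classifier over grid cells. File names, variable names, and the label file
# are placeholders, not MDRF specifics.
import numpy as np
from netCDF4 import Dataset
from sklearn.ensemble import RandomForestClassifier

with Dataset("pnw_climate_2001_2011.nc") as nc:           # hypothetical file
    tmax = nc.variables["tmax"][:]                         # (time, lat, lon)
    precip = nc.variables["precip"][:]

# One feature row per grid cell: mean temperature and precipitation.
features = np.column_stack([
    tmax.mean(axis=0).ravel(),
    precip.mean(axis=0).ravel(),
])

labels = np.load("ecoregion_labels.npy")                   # hypothetical training labels

clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(features, labels)
predicted = clf.predict(features).reshape(tmax.shape[1:])  # map of predicted ecoregions
```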
Measuring Sizes & Shapes of Galaxies
NASA Astrophysics Data System (ADS)
Kusmic, Samir; Willemn Holwerda, Benne
2018-01-01
Software is used to compute galaxy morphometrics, cutting down on the time needed to categorize galaxies. However, new surveys coming in the next decade are expected to count upwards of a thousand times more galaxies than current surveys, greatly increasing the time needed just to process the data. In this research, we looked into how we can reduce the time it takes to obtain morphometric parameters in order to classify galaxies, and also how precise we can be relative to other findings. The software of choice is Source Extractor, known for its short run time and recently updated to compute morphometric parameters. The test is done by running CANDELS data, five fields in the J and H filters, through Source Extractor and then cross-correlating the new catalog with one created with GALFIT, obtained from van der Wel et al. 2014, and then with spectroscopic redshift data. With Source Extractor, we look at how many galaxies are counted, how precise the computation is, how to classify morphometry, and how the results stand against other findings. The run time was approximately 10 hours when cross-correlated with GALFIT and approximately 8 hours with the spectroscopic redshifts; these were expected times, as Source Extractor is already faster than GALFIT's run time by a large factor. Source Extractor's recovery was also large: 79.24% of GALFIT's count. However, the precision is highly variable. We created two thresholds to see which would better combat this and ended up picking an unbiased isophotal-area threshold as the better choice. Still, with such a threshold, the spread was relatively wide. Comparing the parameters with redshift showed broadly consistent findings, though not necessarily in numerical value. From these results, we see Source Extractor as a good first-look tool, to be followed up by other software.
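The cross-correlation step described above is essentially a positional cross-match between two catalogs; a minimal sketch using astropy is shown below, with the file names, column names, and 1-arcsecond matching radius as assumptions rather than the study's actual settings.

```python
# Positional cross-match between two catalogs, as one might do when comparing
# Source Extractor output with a GALFIT catalog. File names, column names, and
# the 1" radius are illustrative assumptions.
import astropy.units as u
from astropy.coordinates import SkyCoord
from astropy.table import Table

se_cat = Table.read("sextractor_catalog.fits")       # hypothetical files
gf_cat = Table.read("galfit_catalog.fits")

se_coords = SkyCoord(ra=se_cat["ALPHA_J2000"] * u.deg, dec=se_cat["DELTA_J2000"] * u.deg)
gf_coords = SkyCoord(ra=gf_cat["ra"] * u.deg, dec=gf_cat["dec"] * u.deg)

# For every Source Extractor detection, find the nearest GALFIT source.
idx, sep2d, _ = se_coords.match_to_catalog_sky(gf_coords)
matched = sep2d < 1.0 * u.arcsec                      # keep pairs closer than 1 arcsec

print(f"Matched {matched.sum()} of {len(se_cat)} Source Extractor detections "
      f"({100.0 * matched.sum() / len(se_cat):.2f}%) to a GALFIT source")
```

A matched fraction computed this way is the kind of recovery statistic quoted in the abstract.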
NASA Astrophysics Data System (ADS)
Wong, Jianhui; Lim, Yun Seng; Morris, Stella; Morris, Ezra; Chua, Kein Huat
2017-04-01
The amount of small-scale renewable energy sources is anticipated to increase on low-voltage distribution networks for the improvement of energy efficiency and the reduction of greenhouse gas emissions. The growth of PV systems on low-voltage distribution networks can create voltage unbalance, voltage rise, and reverse power flow. Usually these issues arise with little fluctuation; however, in Malaysia, a region with a low clear-sky index, they tend to fluctuate severely. Large amounts of cloud often pass over the country, making the solar irradiance highly scattered, and therefore the PV power output fluctuates substantially. These issues can lead to the malfunction of electronic-based equipment, reduction in network efficiency, and improper operation of the power protection system. Under current practice, the amount of PV capacity installed on the distribution network is constrained by the utility company, which in turn limits the reduction of the carbon footprint. Therefore, an energy storage system is proposed as a solution to these power quality issues. To ensure effective operation of a distribution network with PV, a fuzzy control system is developed and implemented to govern the operation of the energy storage system. The fuzzy-driven energy storage system is able to mitigate the fluctuating voltage rise and voltage unbalance on the electrical grid by actively manipulating the flow of real power between the grid and the batteries. To verify the effectiveness of the proposed fuzzy-driven energy storage system, an experimental network integrated with a 7.2 kWp PV system was set up. Several case studies are performed to evaluate the response of the proposed solution in mitigating voltage rise and voltage unbalance and reducing the amount of reverse power flow under highly intermittent PV power output.
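As a minimal sketch of the kind of fuzzy rule-based decision described, the code below maps a voltage deviation to a battery charge/discharge setpoint; the membership functions, rule base, nominal voltage, and power limit are invented for illustration and are not the authors' controller.

```python
# Toy fuzzy controller: battery charge/discharge power from grid voltage
# deviation. Membership functions, rules, the 230 V nominal voltage, and the
# 3 kW limit are illustrative assumptions, not the controller from the paper.

def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def battery_power_kw(voltage, nominal=230.0, max_power=3.0):
    dv = voltage - nominal
    # Fuzzify the voltage deviation (volts).
    low    = tri(dv, -15.0, -8.0, 0.0)    # voltage sag  -> discharge battery
    normal = tri(dv,  -8.0,  0.0, 8.0)    # near nominal -> idle
    high   = tri(dv,   0.0,  8.0, 15.0)   # voltage rise -> charge battery
    # Defuzzify with a weighted average of the rule outputs
    # (+max_power = charge, -max_power = discharge, 0 = idle).
    weights = low + normal + high
    if weights == 0.0:
        return max_power if dv > 0 else -max_power   # saturate outside the universe
    return (high * max_power + normal * 0.0 + low * -max_power) / weights

for v in (224.0, 230.0, 238.0):
    print(f"{v:.0f} V -> battery setpoint {battery_power_kw(v):+.2f} kW")
```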
Space Electric Research Test in the Electric Propulsion Laboratory
1964-06-21
Technicians prepare the Space Electric Research Test (SERT-I) payload for a test in Tank Number 5 of the Electric Propulsion Laboratory at the National Aeronautics and Space Administration (NASA) Lewis Research Center. Lewis researchers had been studying different methods of electric rocket propulsion since the mid-1950s. Harold Kaufman created the first successful engine, the electron bombardment ion engine, in the early 1960s. These electric engines created and accelerated small particles of propellant material to high exhaust velocities. Electric engines have a very small amount of thrust, but once lofted into orbit by workhorse chemical rockets, they are capable of small, continuous thrust for periods up to several years. The electron bombardment thruster operated at a 90-percent efficiency during testing in the Electric Propulsion Laboratory. The package was rapidly rotated in a vacuum to simulate its behavior in space. The SERT-I mission, launched from Wallops Island, Virginia, was the first flight test of Kaufman’s ion engine. SERT-I had one cesium engine and one mercury engine. The suborbital flight was only 50 minutes in duration but proved that the ion engine could operate in space. The Electric Propulsion Laboratory included two large space simulation chambers, one of which is seen here. Each uses twenty 2.6-foot diameter diffusion pumps, blowers, and roughing pumps to remove the air inside the tank to create the thin atmosphere. A helium refrigeration system simulates the cold temperatures of space.
Scalability improvements to NRLMOL for DFT calculations of large molecules
NASA Astrophysics Data System (ADS)
Diaz, Carlos Manuel
Advances in high performance computing (HPC) have provided a way to treat large, computationally demanding tasks using thousands of processors. With the development of more powerful HPC architectures, the need to create efficient and scalable code has grown more important. Electronic structure calculations are valuable in understanding experimental observations and are routinely used for new materials predictions. For electronic structure calculations, the memory and computation time grow with the number of atoms; memory requirements for these calculations scale as N², where N is the number of atoms. While the recent advances in HPC offer platforms with large numbers of cores, the limited amount of memory available on a given node and the poor scalability of the electronic structure code hinder efficient usage of these platforms. This thesis presents developments to overcome these bottlenecks in order to study large systems. These developments, which are implemented in the NRLMOL electronic structure code, involve the use of sparse matrix storage formats and linear algebra with sparse and distributed matrices. These developments, along with other related work, now allow ground state density functional calculations using up to 25,000 basis functions and excited state calculations using up to 17,000 basis functions while utilizing all cores on a node. An example on a light-harvesting triad molecule is described. Finally, future plans to further improve the scalability are presented.
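To illustrate the memory argument behind sparse storage, the generic SciPy sketch below compares dense and compressed sparse row (CSR) storage under an assumed matrix size and sparsity; it is not NRLMOL code.

```python
# A CSR matrix stores only the nonzero entries, so memory and matrix-vector
# cost scale with the number of nonzeros instead of N^2. Size and density
# below are assumed for illustration.
import numpy as np
from scipy.sparse import random as sparse_random

n = 20000
overlap = sparse_random(n, n, density=1e-4, format="csr", random_state=0)

dense_bytes = n * n * 8                               # dense double-precision storage
sparse_bytes = overlap.data.nbytes + overlap.indices.nbytes + overlap.indptr.nbytes

print(f"dense : {dense_bytes / 1e9:6.2f} GB")
print(f"CSR   : {sparse_bytes / 1e6:6.2f} MB")

v = np.ones(n)
w = overlap @ v        # sparse matrix-vector product, O(nnz) work
print(f"nnz = {overlap.nnz}, ||A v||_1 = {np.abs(w).sum():.3f}")
```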
Peptide library synthesis on spectrally encoded beads for multiplexed protein/peptide bioassays
NASA Astrophysics Data System (ADS)
Nguyen, Huy Q.; Brower, Kara; Harink, Björn; Baxter, Brian; Thorn, Kurt S.; Fordyce, Polly M.
2017-02-01
Protein-peptide interactions are essential for cellular responses. Despite their importance, these interactions remain largely uncharacterized due to experimental challenges associated with their measurement. Current techniques (e.g. surface plasmon resonance, fluorescence polarization, and isothermal calorimetry) either require large amounts of purified material or direct fluorescent labeling, making high-throughput measurements laborious and expensive. In this report, we present a new technology for measuring antibody-peptide interactions in vitro that leverages spectrally encoded beads for biological multiplexing. Specific peptide sequences are synthesized directly on encoded beads with a 1:1 relationship between peptide sequence and embedded code, thereby making it possible to track many peptide sequences throughout the course of an experiment within a single small volume. We demonstrate the potential of these bead-bound peptide libraries by: (1) creating a set of 46 peptides composed of 3 commonly used epitope tags (myc, FLAG, and HA) and single amino-acid scanning mutants; (2) incubating with a mixture of fluorescently-labeled antimyc, anti-FLAG, and anti-HA antibodies; and (3) imaging these bead-bound libraries to simultaneously identify the embedded spectral code (and thus the sequence of the associated peptide) and quantify the amount of each antibody bound. To our knowledge, these data demonstrate the first customized peptide library synthesized directly on spectrally encoded beads. While the implementation of the technology provided here is a high-affinity antibody/protein interaction with a small code space, we believe this platform can be broadly applicable to any range of peptide screening applications, with the capability to multiplex into libraries of hundreds to thousands of peptides in a single assay.
NASA Astrophysics Data System (ADS)
Estrada, Paul R.; Cuzzi, Jeffrey N.; Morgan, Demitri A.
2016-02-01
We model particle growth in a turbulent, viscously evolving protoplanetary nebula, incorporating sticking, bouncing, fragmentation, and mass transfer at high speeds. We treat small particles using a moments method and large particles using a traditional histogram binning, including a probability distribution function of collisional velocities. The fragmentation strength of the particles depends on their composition (icy aggregates are stronger than silicate aggregates). The particle opacity, which controls the nebula thermal structure, evolves as particles grow and mass redistributes. While growing, particles drift radially due to nebula headwind drag. Particles of different compositions evaporate at “evaporation fronts” (EFs) where the midplane temperature exceeds their respective evaporation temperatures. We track the vapor and solid phases of each component, accounting for advection and radial and vertical diffusion. We present characteristic results in evolutions lasting 2 × 10⁵ years. In general, (1) mass is transferred from the outer to the inner nebula in significant amounts, creating radial concentrations of solids at EFs; (2) particle sizes are limited by a combination of fragmentation, bouncing, and drift; (3) “lucky” large particles never represent a significant amount of mass; and (4) restricted radial zones just outside each EF become compositionally enriched in the associated volatiles. We point out implications for millimeter to submillimeter SEDs and the inference of nebula mass, radial banding, the role of opacity on new mechanisms for generating turbulence, the enrichment of meteorites in heavy oxygen isotopes, variable and nonsolar redox conditions, the primary accretion of silicate and icy planetesimals, and the makeup of Jupiter’s core.
NASA Astrophysics Data System (ADS)
Thompson, A. M.; Stauffer, R. M.; Young, G. S.
2015-12-01
Ozone (O3) trends analysis is typically performed with monthly or seasonal averages. Although this approach works well for stratospheric or total O3, uncertainties in tropospheric O3 amounts may be large due to rapid meteorological changes near the tropopause and in the lower free troposphere (LFT) where pollution has a days-weeks lifetime. We use self-organizing maps (SOM), a clustering technique, as an alternative for creating tropospheric climatologies from O3 soundings. In a previous study of 900 tropical ozonesondes, clusters representing >40% of profiles deviated > 1-sigma from mean O3. Here SOM are based on 15 years of data from four sites in the contiguous US (CONUS; Boulder, CO; Huntsville, AL; Trinidad Head, CA; Wallops Island, VA). Ozone profiles from 2 - 12 km are used to evaluate the impact of tropopause variability on climatology; 2 - 6 km O3 profile segments are used for the LFT. Near-tropopause O3 is twice the mean O3 mixing ratio in three clusters of 2 - 12 km O3, representing > 15% of profiles at each site. Large mid and lower-tropospheric O3 deviations from monthly means are found in clusters of both 2 - 12 and 2 - 6 km O3. Positive offsets result from pollution and stratosphere-to-troposphere exchange. In the LFT the lowest tropospheric O3 is associated with subtropical air. Some clusters include profiles with common seasonality but other factors, e.g., tropopause height or LFT column amount, characterize other SOM nodes. Thus, as for tropical profiles, CONUS O3 averages can be a poor choice for a climatology.
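A minimal self-organizing map in plain NumPy, trained on synthetic profiles, illustrates the clustering step referred to above; the grid size, learning schedule, and synthetic data are assumptions, not the study's configuration.

```python
# Minimal SOM sketch: each node's weight vector becomes a cluster prototype,
# and every profile is assigned to its best-matching node.
import numpy as np

rng = np.random.default_rng(0)
profiles = rng.normal(size=(900, 40))          # 900 synthetic profiles, 40 levels each

rows, cols, dim = 3, 4, profiles.shape[1]      # assumed 3x4 SOM grid
weights = rng.normal(size=(rows, cols, dim))
grid = np.stack(np.meshgrid(np.arange(rows), np.arange(cols), indexing="ij"), axis=-1)

n_iter, lr0, sigma0 = 5000, 0.5, 1.5
for t in range(n_iter):
    x = profiles[rng.integers(len(profiles))]
    # Best-matching unit: node whose weight vector is closest to the sample.
    dists = np.linalg.norm(weights - x, axis=-1)
    bmu = np.unravel_index(np.argmin(dists), dists.shape)
    # Gaussian neighborhood around the BMU, shrinking over time.
    lr = lr0 * np.exp(-t / n_iter)
    sigma = sigma0 * np.exp(-t / n_iter)
    h = np.exp(-np.sum((grid - np.array(bmu)) ** 2, axis=-1) / (2 * sigma ** 2))
    weights += lr * h[..., None] * (x - weights)

# Assign every profile to its nearest node (cluster).
flat = weights.reshape(-1, dim)
labels = np.argmin(((profiles[:, None, :] - flat[None]) ** 2).sum(-1), axis=1)
print(np.bincount(labels, minlength=rows * cols))   # profiles per SOM node
```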
Chakrabarti, Aditi; Chaudhury, Manoj K
2013-12-17
We report some experimental observations regarding a new type of long-range interaction between rigid particles that prevails when they are suspended in an ultrasoft elastic gel. A denser particle submerges itself to a considerable depth inside the gel and becomes elasto-buoyant by balancing its weight against the elastic force exerted by the surrounding medium. By virtue of a large elasto-capillary length, the surface of the gel wraps around the particle and closes to create a line singularity connecting the particle to the free surface of the gel. A substantial amount of tensile strain is thus developed in the gel network parallel to the free surface that penetrates to a significant depth inside the gel. The field of this tensile strain is rather long-range because of a large gravito-elastic correlation length and sufficiently strong to pull two submerged particles into contact. The particles move toward each other with an effective force following an inverse linear distance law. When more monomers or dimers of the particles are released inside the gel, they orient rather freely inside the capsules where they are located and attract each other to form closely packed clusters. Eventually, these clusters themselves interact and coalesce. This is an emergent phenomenon in which gravity, capillarity, and elasticity work in tandem to create a long-range interaction. We also present the results of a related experiment, in which a particle suspended inside a thickness-graded gel moves accompanied by the continuous folding and the relaxation of the gel's surface.
Winslow, Luke A.; Read, Jordan S.; Hanson, Paul C.; Stanley, Emily H.
2014-01-01
With lake abundances in the thousands to millions, creating an intuitive understanding of the distribution of morphology and processes in lakes is challenging. To improve researchers’ understanding of large-scale lake processes, we developed a parsimonious mathematical model based on the Pareto distribution to describe the distribution of lake morphology (area, perimeter and volume). While debate continues over which mathematical representation best fits any one distribution of lake morphometric characteristics, we recognize the need for a simple, flexible model to advance understanding of how the interaction between morphometry and function dictates scaling across large populations of lakes. These models make clear the relative contribution of lakes to the total amount of lake surface area, volume, and perimeter. They also highlight that the critical thresholds at which total perimeter, area and volume would be evenly distributed across lake size-classes have Pareto slopes of 0.63, 1 and 1.12, respectively. These models of morphology can be used in combination with models of process to create overarching “lake population” level models of process. To illustrate this potential, we combine the model of surface area distribution with a model of carbon mass accumulation rate. We found that even if smaller lakes contribute relatively less to total surface area than larger lakes, the increasing carbon accumulation rate with decreasing lake size is strong enough to bias the distribution of carbon mass accumulation towards smaller lakes. This analytical framework provides a relatively simple approach to upscaling morphology and process that is easily generalizable to other ecosystem processes.
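The slope thresholds quoted above can be checked with a short numerical sketch; the truncated size range and decade bins below are assumptions made only for illustration.

```python
# Sketch of the scaling claim, assuming lake areas follow a truncated Pareto
# distribution with slope c, i.e. number density n(A) ~ A^-(c+1). For c = 1 the total
# lake area is spread evenly across logarithmic size classes; c < 1 shifts area toward
# the largest lakes, c > 1 toward the smallest.
import numpy as np

def area_share_per_decade(c, a_min=1e-2, a_max=1e4, n_bins=6):
    """Fraction of total lake area contributed by each decade of lake size."""
    edges = np.logspace(np.log10(a_min), np.log10(a_max), n_bins + 1)
    # Total area in [a, b] is the integral of A * A^-(c+1) dA = integral of A^-c dA.
    if np.isclose(c, 1.0):
        bin_area = np.log(edges[1:]) - np.log(edges[:-1])
    else:
        bin_area = (edges[1:] ** (1 - c) - edges[:-1] ** (1 - c)) / (1 - c)
    return bin_area / bin_area.sum()

for slope in (0.8, 1.0, 1.2):
    print(slope, np.round(area_share_per_decade(slope), 3))
```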
Butensky, Samuel D; Sloan, Andrew P; Meyers, Eric; Carmel, Jason B
2017-07-15
Hand function is critical for independence, and neurological injury often impairs dexterity. To measure hand function in people or forelimb function in animals, sensors are employed to quantify manipulation. These sensors make assessment easier and more quantitative and allow automation of these tasks. While automated tasks improve objectivity and throughput, they also produce large amounts of data that can be burdensome to analyze. We created software called Dexterity that simplifies data analysis of automated reaching tasks. Dexterity is MATLAB software that enables quick analysis of data from forelimb tasks. Through a graphical user interface, files are loaded and data are identified and analyzed. These data can be annotated or graphed directly. Analysis is saved, and the graph and corresponding data can be exported. For additional analysis, Dexterity provides access to custom scripts created by other users. To determine the utility of Dexterity, we performed a study to evaluate the effects of task difficulty on the degree of impairment after injury. Dexterity analyzed two months of data and allowed new users to annotate the experiment, visualize results, and save and export data easily. Previous analysis of tasks was performed with custom data analysis, requiring expertise with analysis software. Dexterity made the tools required to analyze, visualize and annotate data easy to use by investigators without data science experience. Dexterity increases accessibility to automated tasks that measure dexterity by making analysis of large data intuitive, robust, and efficient. Copyright © 2017 Elsevier B.V. All rights reserved.
The NUTRA-SNACKS project: basic research and biotechnological programs on nutraceutics.
Rea, Giuseppina; Antonacci, Amina; Lambreva, Maya; Margonelli, Andrea; Ambrosi, Cecilia; Giardi, Maria Teresa
2010-01-01
The Nutra-Snacks project aims at creating novel high quality ready-to-eat foods with functional activity, useful for promoting public health. The team is composed of seven research institutes and three SMEs from different countries whose activities span from basic to applied research providing the right technological transfer to small and medium industries involved in the novel food production chain. Strategic objectives include the application of plant cell and in vitro culture systems to create very large amounts of high-value plant secondary metabolites with recognized anticancer, antilipidemic, anticholesterol, antimicrobial, antiviral, antihypertensive and anti-inflammatory properties and to include them in specific food products. To this end, the screening of a vast number of working organisms capable of accumulating the desired compounds and the characterization of their expression profiles represent fundamental steps in the research program. The information allows the identification of plant species hyper-producing metabolites and selection of those metabolites capable of specifically counteracting the oxidative stress that underlies the development of important pathologies and diseases. In addition, devising safe metabolite extraction procedures is also crucial in order to provide nutraceutical-enriched extracts compatible with human health. New biotechnological approaches are also undertaken, including the exploitation of photosynthetic algal strains in bio-farms to enhance the synthesis of antioxidant compounds and the design of novel bioreactors for small and large scale biomass production. Further outstanding objectives include the development of (i) safety and quality control protocols, (ii) biosensor techniques for the analysis of the emerging ready-to-eat food, and (iii) a contribution to define a standard for new regulations on nutraceutics.
X-rays are a type of radiation called electromagnetic waves. X-ray imaging creates pictures of the inside of ... different amounts of radiation. Calcium in bones absorbs x-rays the most, so bones look white. Fat ...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huesemann, Michael H.
The most stringent emission scenarios published by the Intergovernmental Panel on Climate Change (IPCC) would result in the stabilization of atmospheric carbon dioxide (CO2) at concentrations of approximately 550 ppm, which would produce a global temperature increase of at least 2 °C by 2100. Given the large uncertainties regarding the potential risks associated with this degree of global warming, it would be more prudent to stabilize atmospheric CO2 concentrations at or below current levels, which, in turn, would require a greater than 20-fold reduction (i.e., ≥95%) in per capita carbon emissions in industrialized nations within the next 50 to 100 years. Using the Kaya equation as a conceptual framework, this paper examines whether CO2 mitigation approaches such as energy efficiency improvements, carbon sequestration, and the development of carbon-free energy sources would be sufficient to bring about the required reduction in per capita carbon emissions without creating unforeseen negative impacts elsewhere. In terms of energy efficiency, large improvements (≈5-fold) are in principle possible given aggressive investments in R&D and if market imperfections such as corporate subsidies are removed. However, energy efficiency improvements per se will not result in a reduction in carbon emissions if, as predicted by the IPCC, the size of the global economy has expanded 12-26 fold by 2100. Terrestrial carbon sequestration via reforestation and improved agricultural soil management has many environmental advantages but has only limited CO2 mitigation potential because the global terrestrial carbon sink (ca. 200 Gt C) is small relative to the size of fossil fuel deposits (≈4000 Gt C). By contrast, very large amounts of CO2 can potentially be removed from the atmosphere via sequestration in geologic formations and oceans, but carbon storage is not permanent and is likely to create many unpredictable environmental consequences. Renewable solar energy can in theory provide large amounts of carbon-free power. However, biomass and hydroelectric energy can only be marginally expanded, and large-scale solar energy installations (i.e., wind, photovoltaics, and direct thermal) are likely to have significant negative environmental impacts. Expansion of nuclear energy is highly unlikely due to concerns over reactor safety, radioactive waste management, weapons proliferation, and cost. In view of the serious limitations and liabilities of many proposed CO2 mitigation approaches, it appears that there remain only a few no-regrets options such as drastic energy efficiency improvements, extensive terrestrial carbon sequestration, and cautious expansion of renewable energy generation. These promising CO2 mitigation technologies have the potential to bring about the required 20-fold reduction in per capita carbon emissions only if population and economic growth are halted without delay. Thus, addressing the problem of global warming requires not only technological research and development but also a reexamination of core values that mistakenly equate material consumption and economic growth with happiness and well-being.
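For readers unfamiliar with the decomposition mentioned above, one common form of the Kaya identity is sketched below; this is a generic statement of the identity, not a formula quoted from the report.

```latex
% One common form of the Kaya identity: total CO2 emissions F expressed as the product
% of population P, per-capita GDP G/P, energy intensity E/G, and carbon intensity F/E.
\[
  F \;=\; P \times \frac{G}{P} \times \frac{E}{G} \times \frac{F}{E}
\]
% Per-capita emissions are F/P = (G/P)(E/G)(F/E), so holding F/P fixed (or cutting it
% ~20-fold) while per-capita GDP grows forces the energy and/or carbon intensities down
% by the same combined factor, which is the arithmetic behind the argument above.
```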
Usage of Neural Network to Predict Aluminium Oxide Layer Thickness
Michal, Peter; Vagaská, Alena; Gombár, Miroslav; Kmec, Ján; Spišák, Emil; Kučerka, Daniel
2015-01-01
This paper examines the influence of the chemical composition of the electrolyte used, namely the amounts of sulphuric acid, oxalic acid and aluminium cations in the electrolyte, and of the operating parameters of the anodic oxidation of aluminium, namely the electrolyte temperature, the anodizing time and the voltage applied during anodizing, on the resulting thickness of the aluminium oxide layer. The impact of these variables is assessed using a central composite design of experiment for the six factors (amount of sulphuric acid, amount of oxalic acid, amount of aluminium cations, electrolyte temperature, anodizing time, and applied voltage) and a cubic neural unit with the Levenberg-Marquardt algorithm for evaluating the results. The paper also deals with current densities of 1 A·dm−2 and 3 A·dm−2 for creating the aluminium oxide layer. PMID:25922850
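As an illustration of the experimental design mentioned above, the sketch below builds a (full-factorial) central composite design in coded units for six factors and fits a quadratic response surface by ordinary least squares as a simple stand-in for the paper's cubic neural unit trained with Levenberg-Marquardt; the axial distance, centre-point count, and synthetic response values are assumptions.

```python
import itertools
import numpy as np

n_factors = 6
alpha = np.sqrt(n_factors)            # a common (rotatable-style) axial distance choice

corners = np.array(list(itertools.product([-1.0, 1.0], repeat=n_factors)))   # 64 runs
axial = np.zeros((2 * n_factors, n_factors))
for i in range(n_factors):
    axial[2 * i, i] = -alpha
    axial[2 * i + 1, i] = alpha
center = np.zeros((6, n_factors))     # replicated centre points (count assumed)
design = np.vstack([corners, axial, center])   # coded levels for the six factors

def quadratic_features(X):
    """Intercept, linear, squared, and two-factor interaction terms."""
    cols = [np.ones(len(X))] + [X[:, i] for i in range(X.shape[1])]
    cols += [X[:, i] ** 2 for i in range(X.shape[1])]
    cols += [X[:, i] * X[:, j] for i in range(X.shape[1]) for j in range(i + 1, X.shape[1])]
    return np.column_stack(cols)

rng = np.random.default_rng(1)
# Placeholder response standing in for measured oxide-layer thickness.
y = 10 + design @ rng.normal(size=n_factors) + rng.normal(scale=0.2, size=len(design))

beta, *_ = np.linalg.lstsq(quadratic_features(design), y, rcond=None)
predicted = quadratic_features(design) @ beta    # fitted response surface
```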
A Call to Arms: Wartime Blood Donor Recruitment.
Wang, Jean Cy
2018-01-01
To ensure an adequate blood supply, blood collection agencies must design campaigns to recruit and maintain an active donor pool. Such campaigns generally appeal to altruism and humanitarianism, which donors most commonly cite as their reasons for donating. However, large donor registries and the widespread recruitment campaigns that sustain them did not become a necessity until the technology for the collection, storage, and transfusion of blood had advanced to a point that enabled the establishment of transfusion services that could provide large amounts of stored blood to meet high demands. The realization of these milestones was one of the most important medical achievements of the Great War: the desperate need for blood created by war drove earlier adoption of scientific discoveries that might otherwise have been neglected. The medical advances of the Great War in turn enabled the establishment of wide-ranging transfusion services to aid combatants during the Spanish Civil War and Second World War. These services required the support of large civilian donor bases, and donor campaigns tapped into the patriotic feelings of civilians at home. This review will highlight some of the messages and media that were used to recruit blood donors in Spain, Britain, Canada, and the United States during wartime. Copyright © 2017 Elsevier Inc. All rights reserved.
Anderson, J R; Mohammed, S; Grimm, B; Jones, B W; Koshevoy, P; Tasdizen, T; Whitaker, R; Marc, R E
2011-01-01
Modern microscope automation permits the collection of vast amounts of continuous anatomical imagery in both two and three dimensions. These large data sets present significant challenges for data storage, access, viewing, annotation and analysis. The cost and overhead of collecting and storing the data can be extremely high. Large data sets quickly exceed an individual's capability for timely analysis and present challenges in efficiently applying transforms, if needed. Finally annotated anatomical data sets can represent a significant investment of resources and should be easily accessible to the scientific community. The Viking application was our solution created to view and annotate a 16.5 TB ultrastructural retinal connectome volume and we demonstrate its utility in reconstructing neural networks for a distinctive retinal amacrine cell class. Viking has several key features. (1) It works over the internet using HTTP and supports many concurrent users limited only by hardware. (2) It supports a multi-user, collaborative annotation strategy. (3) It cleanly demarcates viewing and analysis from data collection and hosting. (4) It is capable of applying transformations in real-time. (5) It has an easily extensible user interface, allowing addition of specialized modules without rewriting the viewer. © 2010 The Authors Journal of Microscopy © 2010 The Royal Microscopical Society.
NASA Astrophysics Data System (ADS)
Coogan, A.; Avanzi, F.; Akella, R.; Conklin, M. H.; Bales, R. C.; Glaser, S. D.
2017-12-01
Automatic meteorological and snow stations provide large amounts of information at dense temporal resolution, but data quality is often compromised by noise and missing values. We present a new gap-filling and cleaning procedure for networks of these stations based on Kalman filtering and expectation maximization. Our method utilizes a multi-sensor, regime-switching Kalman filter to learn a latent process that captures dependencies between nearby stations and handles sharp changes in snowfall rate. Since the latent process is inferred using observations across working stations in the network, it can be used to fill in large data gaps for a malfunctioning station. The procedure was tested on meteorological and snow data from Wireless Sensor Networks (WSN) in the American River basin of the Sierra Nevada. Data include air temperature, relative humidity, and snow depth from dense networks of 10 to 12 stations within 1 km² swaths. Both wet and dry water years have similar data issues. Data with artificially created gaps was used to quantify the method's performance. Our multi-sensor approach performs better than a single-sensor one, especially with large data gaps, as it learns and exploits the dominant underlying processes in snowpack at each site.
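A deliberately simplified sketch of the gap-filling idea: a one-dimensional random-walk Kalman filter that skips the measurement update where data are missing, followed by a Rauch-Tung-Striebel smoothing pass. The study's method is multi-sensor and regime-switching; the noise variances and the toy series below are assumptions.

```python
import numpy as np

def kalman_fill(y, q=0.05, r=0.5):
    """Fill NaN gaps in series y with smoothed state estimates (random-walk model)."""
    n = len(y)
    x_pred = np.zeros(n); p_pred = np.zeros(n)
    x_filt = np.zeros(n); p_filt = np.zeros(n)
    x, p = (y[~np.isnan(y)][0] if np.any(~np.isnan(y)) else 0.0), 1.0

    for t in range(n):
        # Predict step.
        x_pred[t], p_pred[t] = x, p + q
        if np.isnan(y[t]):
            x, p = x_pred[t], p_pred[t]          # no measurement: carry the prediction
        else:
            k = p_pred[t] / (p_pred[t] + r)      # Kalman gain
            x = x_pred[t] + k * (y[t] - x_pred[t])
            p = (1 - k) * p_pred[t]
        x_filt[t], p_filt[t] = x, p

    # Backward Rauch-Tung-Striebel smoothing pass.
    x_sm = x_filt.copy()
    for t in range(n - 2, -1, -1):
        c = p_filt[t] / p_pred[t + 1]
        x_sm[t] = x_filt[t] + c * (x_sm[t + 1] - x_pred[t + 1])
    return x_sm

depth = np.array([50.0, 52.0, np.nan, np.nan, 58.0, 60.0, np.nan, 63.0])  # toy snow depths
print(np.round(kalman_fill(depth), 1))
```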
Changes in size of deforested patches in the Brazilian Amazon.
Rosa, Isabel M D; Souza, Carlos; Ewers, Robert M
2012-10-01
Different deforestation agents, such as small farmers and large agricultural businesses, create different spatial patterns of deforestation. We analyzed the proportion of deforestation associated with different-sized clearings in the Brazilian Amazon from 2002 through 2009. We used annual deforestation maps to determine total area deforested and the size distribution of deforested patches per year. The size distribution of deforested areas changed over time in a consistent, directional manner. Large clearings (>1000 ha) comprised progressively smaller amounts of total annual deforestation. The number of smaller clearings (6.25-50.00 ha) remained unchanged over time. Small clearings accounted for 73% of all deforestation in 2009, up from 30% in 2002, whereas the proportion of deforestation attributable to large clearings decreased from 13% to 3% between 2002 and 2009. Large clearings were concentrated in Mato Grosso, but also occurred in eastern Pará and in Rondônia. In 2002 large clearings accounted for 17%, 15%, and 10% of all deforestation in Mato Grosso, Pará, and Rondônia, respectively. Even in these states, where there is a highly developed agricultural business dominated by soybean production and cattle ranching, the proportional contribution of large clearings to total deforestation declined. By 2009 large clearings accounted for 2.5%, 3.5%, and 1% of all deforestation in Mato Grosso, Pará, and Rondônia, respectively. These changes in deforestation patch size are coincident with the implementation of new conservation policies by the Brazilian government, which suggests that these policies are not effectively reducing the number of small clearings in primary forest, whether these are caused by large landholders or smallholders, but have been more effective at reducing the frequency of larger clearings. ©2012 Society for Conservation Biology.
Bone Cancer: Questions and Answers
... which uses a powerful magnet linked to a computer to create detailed pictures of areas inside the body without using x-rays. A positron emission tomography (PET) scan, in which a small amount of ...
The Global Economic Crisis: Impact on Sub-Saharan Africa and Global Policy Responses
2009-10-19
…concessional lending facilities, the Poverty Reduction and Growth Facility (PRGF) and the Exogenous Shocks Facility (ESF). [Figure 12: IMF Concessional Loans to Africa, in billions of dollars; source: International Monetary Fund; amounts are the total outstanding PRGF and ESF loans.] PRGF loans are intended to help low-income countries address balance-of-payments concerns, such as those created by the financial crisis. Unlike…
The Effects of Mass Media Advertising on U.S. Army Recruiting.
1982-03-16
…personal commitments and sacrifices which have been, all too often, ignored. ACKNOWLEDGEMENTS: I would like to thank Dr. John S. Detweiler for his…advertising to create the incentive to enlist. There has been little chance for personal contact by recruiters with the target audience in the numbers…audience. Manpower and budget restrictions limit the amount of Army personnel involved in recruiting. This, in turn, limits the amount of personal…
Nanodust released in interplanetary collisions
NASA Astrophysics Data System (ADS)
Lai, H. R.; Russell, C. T.
2018-07-01
The lifecycle of near-Earth objects (NEOs) involves a collisional cascade that produces ever smaller debris, ending with nanoscale particles which are removed from the solar system by radiation pressure and electromagnetic effects. It has been proposed that the nanodust clouds released in collisions perturb the background interplanetary magnetic field and create the interplanetary field enhancements (IFEs). Assuming that this IFE formation scenario is actually operating, we calculate the interplanetary collision rate, estimate the total debris mass carried by nanodust, and compare the collision rate with the IFE rate. We find that, to release the same amount of nanodust, the collision rate is comparable to the observed IFE rate. Besides quantitatively testing the association between collisions involving large objects and giant solar wind structures, such a study can be extended to smaller scales to investigate the source of moderate and small solar wind perturbations.
Complete Imageless solution for overlay front-end manufacturing
NASA Astrophysics Data System (ADS)
Herisson, David; LeCacheux, Virginie; Touchet, Mathieu; Vachellerie, Vincent; Lecarpentier, Laurent; Felten, Franck; Polli, Marco
2005-09-01
The Imageless option of the KLA-Tencor RDM (Recipe Data Management) system is a new method of recipe creation that uses only the mask design to define the alignment target and the measurement parameters. This technique is potentially the easiest way to improve recipe management for a large number of products in a logic fab. Overlay recipes are created without a wafer, using a synthetic image (a copy of the GDS mask file) for the alignment pattern and a target design defined by shape (frame-in-frame) and size for the measurement. A complete gauge study on a critical CMOS 90 nm gate level has been conducted to evaluate the reliability and robustness of the imageless recipe. We show that Imageless drastically limits the number of templates used for recipe creation, and improves or maintains measurement capability compared to manual (operator-dependent) recipe creation. Imageless appears to be a suitable solution for high-volume manufacturing, as shown by the results obtained on production lots.
Making big data useful for health care: a summary of the inaugural mit critical data conference.
Badawi, Omar; Brennan, Thomas; Celi, Leo Anthony; Feng, Mengling; Ghassemi, Marzyeh; Ippolito, Andrea; Johnson, Alistair; Mark, Roger G; Mayaud, Louis; Moody, George; Moses, Christopher; Naumann, Tristan; Pimentel, Marco; Pollard, Tom J; Santos, Mauro; Stone, David J; Zimolzak, Andrew
2014-08-22
With growing concerns that big data will only augment the problem of unreliable research, the Laboratory of Computational Physiology at the Massachusetts Institute of Technology organized the Critical Data Conference in January 2014. Thought leaders from academia, government, and industry across disciplines, including clinical medicine, computer science, public health, informatics, biomedical research, health technology, statistics, and epidemiology, gathered and discussed the pitfalls and challenges of big data in health care. The key message from the conference is that the value of large amounts of data hinges on the ability of researchers to share data, methodologies, and findings in an open setting. If empirical value is to be derived from the analysis of retrospective data, groups must continuously work together on similar problems to create more effective peer review. This will lead to improvement in methodology and quality, with each iteration of analysis resulting in more reliability.
Automotive HUDs: the overlooked safety issues.
Tufano, D R
1997-06-01
The transfer of tactical aviation technology into automobiles is creating information display requirements that are likely to be met by use of the head-up display (HUD). These developments are based largely on conclusions that the HUD-related safety issues raised in the aviation HUD literature can be dismissed and that the benefits of using HUDs are certain. Such conclusions either neglect relevant research or are supported by a very small amount of evidence, much of which is either irrelevant or generated within a flawed methodological paradigm. This critical review covers the issues of (a) HUD focal distance and its effect on the perception of outside objects and (b) the effects of HUD imagery on visual attention. The issues of focal distance, cognitive capture, and the inherent connection between the two may have a greater impact on safety in the automotive context than they do in aviation.
Proper use of colour schemes for image data visualization
NASA Astrophysics Data System (ADS)
Vozenilek, Vit; Vondrakova, Alena
2018-04-01
With the development of information and communication technologies, new technologies are leading to an exponential increase in the volume and types of data available. In today's information society, data is one of the most important arguments for policy making, crisis management, research and education, and many other fields. An essential task for experts is to share high-quality data providing the right information at the right time. The design of the data presentation can strongly influence the user's perception and the cognitive aspects of data interpretation. Significant amounts of data can be visualised graphically; one image can thus replace a considerable number of numeric tables and texts. The paper focuses on the accurate visualisation of data from the point of view of the colour schemes used. A bad choice of colours can easily confuse the user and lead to misinterpretation of the data. On the contrary, correctly created visualisations can make information transfer much simpler and more efficient.
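A small illustration of the point about colour schemes, using standard matplotlib colormaps on synthetic data; the specific colormap names are common choices, not ones prescribed by the paper.

```python
# The same field rendered with a perceptually uniform sequential colormap ('viridis'),
# a rainbow map ('jet') that can create spurious visual boundaries, and a diverging map
# ('RdBu_r') that is appropriate only when the data have a meaningful midpoint (here, 0).
import numpy as np
import matplotlib.pyplot as plt

x, y = np.meshgrid(np.linspace(-3, 3, 200), np.linspace(-3, 3, 200))
field = np.sin(x) * np.cos(y)                     # synthetic signed data

fig, axes = plt.subplots(1, 3, figsize=(12, 3.5))
for ax, cmap, title in zip(axes,
                           ["viridis", "jet", "RdBu_r"],
                           ["perceptually uniform", "rainbow (avoid)", "diverging, centered on 0"]):
    im = ax.imshow(field, cmap=cmap, vmin=-1, vmax=1)
    ax.set_title(title)
    fig.colorbar(im, ax=ax)
plt.tight_layout()
plt.show()
```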
Organic textile waste as a resource for sustainable agriculture in arid and semi-arid areas.
Eriksson, Bo G
2017-03-01
New vegetation in barren areas offers possibilities for sequestering carbon in the soil. Arid and semi-arid areas (ASAs) are candidates for new vegetation. The possibility of agriculture in ASAs is reviewed, revealing the potential for cultivation by covering the surface with a layer of organic fibres. This layer collects more water from humidity in the air than does the uncovered mineral surface, and creates a humid environment that promotes microbial life. One possibility is to use large amounts of organic fibres for soil enhancement in ASAs. In the context of the European Commission Waste Framework Directive, the possibility of using textile waste from Sweden is explored. The costs for using Swedish textile waste are high, but possible gains are the sale of agricultural products and increased land prices as well as environmental mitigation. The findings suggest that field research on such agriculture in ASAs should start as soon as possible.
Engine Yaw Augmentation for Hybrid-Wing-Body Aircraft via Optimal Control Allocation Techniques
NASA Technical Reports Server (NTRS)
Taylor, Brian R.; Yoo, Seung Yeun
2011-01-01
Asymmetric engine thrust was implemented in a hybrid-wing-body non-linear simulation to reduce the amount of aerodynamic surface deflection required for yaw stability and control. Hybrid-wing-body aircraft are especially susceptible to yaw surface deflection due to their decreased bare airframe yaw stability resulting from the lack of a large vertical tail aft of the center of gravity. Reduced surface deflection, especially for trim during cruise flight, could reduce the fuel consumption of future aircraft. Designed as an add-on, optimal control allocation techniques were used to create a control law that tracks total thrust and yaw moment commands with an emphasis on not degrading the baseline system. Implementation of engine yaw augmentation is shown and feasibility is demonstrated in simulation with a potential drag reduction of 2 to 4 percent. Future flight tests are planned to demonstrate feasibility in a flight environment.
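For readers unfamiliar with control allocation, the sketch below shows a generic weighted pseudo-inverse allocation of a commanded total thrust and yaw moment across two engines and a rudder, with saturation applied afterwards; the effectiveness matrix, weights, and limits are made-up numbers, and this is not the flight control law described above.

```python
import numpy as np

# Rows: [total thrust, yaw moment]; columns: [left engine, right engine, rudder].
B = np.array([[1.0, 1.0, 0.0],
              [-8.0, 8.0, 3.0]])
W = np.diag([1.0, 1.0, 10.0])          # penalize rudder use to favor engine yaw

def allocate(command, B, W, u_min, u_max):
    """Weighted minimum-norm solution of B @ u = command, clipped to effector limits."""
    W_inv = np.linalg.inv(W)
    u = W_inv @ B.T @ np.linalg.solve(B @ W_inv @ B.T, command)
    return np.clip(u, u_min, u_max)

cmd = np.array([2.0, 1.5])             # desired [thrust, yaw moment]
u = allocate(cmd, B, W,
             u_min=np.array([0.0, 0.0, -0.5]),
             u_max=np.array([2.0, 2.0, 0.5]))
# Clipping can leave a residual, which a real allocator would redistribute.
print(np.round(u, 3), "residual:", np.round(B @ u - cmd, 3))
```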
Monnier, Stéphanie; Cox, David G; Albion, Tim; Canzian, Federico
2005-01-01
Background Single Nucleotide Polymorphism (SNP) genotyping is a major activity in biomedical research. The Taqman technology is one of the most commonly used approaches. It produces large amounts of data that are difficult to process by hand. Laboratories not equipped with a Laboratory Information Management System (LIMS) need tools to organize the data flow. Results We propose a package of Visual Basic programs focused on sample management and on the parsing of input and output TaqMan files. The code is written in Visual Basic, embedded in the Microsoft Office package, and it allows anyone to have access to those tools, without any programming skills and with basic computer requirements. Conclusion We have created useful tools focused on management of TaqMan genotyping data, a critical issue in genotyping laboratories without a more sophisticated and expensive system, such as a LIMS. PMID:16221298
Efficient Sample Tracking With OpenLabFramework
List, Markus; Schmidt, Steffen; Trojnar, Jakub; Thomas, Jochen; Thomassen, Mads; Kruse, Torben A.; Tan, Qihua; Baumbach, Jan; Mollenhauer, Jan
2014-01-01
The advance of new technologies in biomedical research has led to a dramatic growth in experimental throughput. Projects therefore steadily grow in size and involve a larger number of researchers. Spreadsheets traditionally used are thus no longer suitable for keeping track of the vast amounts of samples created and need to be replaced with state-of-the-art laboratory information management systems. Such systems have been developed in large numbers, but they are often limited to specific research domains and types of data. One domain so far neglected is the management of libraries of vector clones and genetically engineered cell lines. OpenLabFramework is a newly developed web-application for sample tracking, particularly laid out to fill this gap, but with an open architecture allowing it to be extended for other biological materials and functional data. Its sample tracking mechanism is fully customizable and aids productivity further through support for mobile devices and barcoded labels. PMID:24589879
Response comment: Carbon sequestration on Mars
Edwards, Christopher; Ehlmann, Bethany L.
2016-01-01
Martian atmospheric pressure has important implications for the past and present habitability of the planet, including the timing and causes of environmental change. The ancient Martian surface is strewn with evidence for early water bound in minerals (e.g., Ehlmann and Edwards, 2014) and recorded in surface features such as large catastrophically created outflow channels (e.g., Carr, 1979), valley networks (Hynek et al., 2010; Irwin et al., 2005), and crater lakes (e.g., Fassett and Head, 2008). Using orbital spectral data sets coupled with geologic maps and a set of numerical spectral analysis models, Edwards and Ehlmann (2015) constrained the amount of atmospheric sequestration in early Martian rocks and found that the majority of this sequestration occurred prior to the formation of the early Hesperian/late Noachian valley networks (Fassett and Head, 2011; Hynek et al., 2010), thus implying the atmosphere was already thin by the time these surface-water-related features were formed.
NASA Astrophysics Data System (ADS)
Kochrekar, Sachin; Agharkar, Mahesh; Salgaonkar, Manjunath; Gharge, Mrunal; Hidouri, Slah; Azeez, Musibau A.
2015-06-01
Graphene is a two-dimensional form of graphite that has attracted great curiosity for its novel physical properties. A key challenge that has emerged is how to create large amounts of graphene at low cost. The purpose of this paper is to explore a new method of exfoliating graphite extracted from used dry batteries in a small-scale blender, in the presence of SDS surfactant, to synthesize graphene oxide, which can then be reduced to graphene. The quantity of SDS required is extremely small (one-tenth that of the graphite), and the method replaces several steps and chemicals such as KMnO4, H2O2, H2SO4 and NaNO3. In this paper, we present the new process and a preliminary characterization of the synthesized graphene oxide by Raman spectroscopy, UV-Vis absorbance spectroscopy and ATR-IR spectroscopy.
Towards sustainable design for single-use medical devices.
Hanson, Jacob J; Hitchcock, Robert W
2009-01-01
Despite their sophistication and value, single-use medical devices have become commodity items in the developed world. Cheap raw materials along with large scale manufacturing and distribution processes have combined to make many medical devices more expensive to resterilize, package and restock than to simply discard. This practice is not sustainable or scalable on a global basis. As the petrochemicals that provide raw materials become more expensive and the global reach of these devices continues into rapidly developing economies, there is a need for device designs that take into account the total life-cycle of these products, minimize the amount of non-renewable materials consumed and consider alternative hybrid reusable / disposable approaches. In this paper, we describe a methodology to perform life cycle and functional analyses to create additional design requirements for medical devices. These types of sustainable approaches can move the medical device industry even closer to the "triple bottom line"--people, planet, profit.
Study regarding the spline interpolation accuracy of the experimentally acquired data
NASA Astrophysics Data System (ADS)
Oanta, Emil M.; Danisor, Alin; Tamas, Razvan
2016-12-01
Experimental data processing is an issue that must be solved in almost all domains of science. In engineering we usually have a large amount of data, and we try to extract the useful signal that is relevant to the phenomenon under investigation. The criteria used to consider some points more relevant than others may take into account various conditions, which may be either phenomenon-dependent or general. The paper presents some of the ideas and tests regarding the identification of the best set of criteria used to filter the initial set of points in order to extract a subset which best fits the approximated function. If the function has regions where it is either constant or slowly varying, fewer discretization points may be used there. This leads to a simpler way of processing the experimental data while keeping the accuracy within reasonably good limits.
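One way to make the point-selection idea concrete is sketched below: retain samples where the local curvature is large, thin the slowly varying region, and fit a cubic spline through the retained subset; the synthetic signal and thresholds are assumptions, not the criteria studied in the paper.

```python
import numpy as np
from scipy.interpolate import CubicSpline

x = np.linspace(0.0, 10.0, 400)
y = np.where(x < 5.0, 1.0, np.cos(3.0 * (x - 5.0)))       # flat region, then oscillation

# Keep points where the second derivative (curvature) is relatively large,
# plus a coarse baseline grid and the endpoints.
curvature = np.abs(np.gradient(np.gradient(y, x), x))
keep = curvature > 0.1 * curvature.max()
keep[::40] = True
keep[[0, -1]] = True

spline = CubicSpline(x[keep], y[keep])
max_err = np.max(np.abs(spline(x) - y))
print(f"kept {keep.sum()} of {len(x)} points, max abs error = {max_err:.3e}")
```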
Mice with altered BDNF signaling as models for mood disorders and antidepressant effects
Lindholm, Jesse S. O.; Castrén, Eero
2014-01-01
Brain-derived neurotrophic factor (BDNF) and its receptor tyrosine kinase TrkB support neuronal survival during development and promote connectivity and plasticity in the adult brain. Decreased BDNF signaling is associated with the pathophysiology of depression and the mechanisms underlying the actions of antidepressant drugs (AD). Several transgenic mouse models with decreases or increases in the amount of BDNF or the activity of TrkB signaling have been created. This review summarizes the studies where various mouse models with increased or decreased BDNF levels or TrkB signaling were used to evaluate the role of BDNF signaling in depression-like behavior. Although a large number of models have been employed and several studies have been published, no clear-cut connections between BDNF levels or signaling and depression-like behavior in mice have emerged. However, it is clear that BDNF plays a critical role in the mechanisms underlying the actions of AD. PMID:24817844
NASA Astrophysics Data System (ADS)
Zhang, Junyi; Beugnon, Jerome; Nascimbene, Sylvain
We describe a protocol to prepare clusters of ultracold bosonic atoms in strongly interacting states reminiscent of fractional quantum Hall states. Our scheme consists in injecting a controlled amount of angular momentum to an atomic gas using Raman transitions carrying orbital angular momentum. By injecting one unit of angular momentum per atom, one realizes a single-vortex state, which is well described by mean-field theory for large enough particle numbers. We also present schemes to realize fractional quantum Hall states, namely, the bosonic Laughlin and Moore-Read states. We investigate the requirements for adiabatic nucleation of such topological states, in particular comparing linear Landau-Zener ramps and arbitrary ramps obtained from optimized control methods. We also show that this protocol requires excellent control over the isotropic character of the trapping potential. ERC-Synergy Grant UQUAM, ANR-10-IDEX-0001-02, DIM NanoK Atocirc project.
A Review on the Bioinformatics Tools for Neuroimaging
MAN, Mei Yen; ONG, Mei Sin; Mohamad, Mohd Saberi; DERIS, Safaai; SULONG, Ghazali; YUNUS, Jasmy; CHE HARUN, Fauzan Khairi
2015-01-01
Neuroimaging is a new technique used to create images of the structure and function of the nervous system in the human brain, and it is currently crucial in many scientific fields. Neuroimaging data are attracting increasing interest among neuroimaging experts; therefore, it is necessary to develop a large number of neuroimaging tools. This paper gives an overview of the tools that have been used to image the structure and function of the nervous system. This information can help developers, experts, and users gain insight and a better understanding of the neuroimaging tools available, enabling better decision making in choosing tools of particular research interest. Sources, links, and descriptions of the application of each tool are provided in this paper as well. Lastly, this paper presents the language implemented, system requirements, strengths, and weaknesses of the tools that have been widely used to image the structure and function of the nervous system. PMID:27006633
Reid, Jeffrey G; Carroll, Andrew; Veeraraghavan, Narayanan; Dahdouli, Mahmoud; Sundquist, Andreas; English, Adam; Bainbridge, Matthew; White, Simon; Salerno, William; Buhay, Christian; Yu, Fuli; Muzny, Donna; Daly, Richard; Duyk, Geoff; Gibbs, Richard A; Boerwinkle, Eric
2014-01-29
Massively parallel DNA sequencing generates staggering amounts of data. Decreasing cost, increasing throughput, and improved annotation have expanded the diversity of genomics applications in research and clinical practice. This expanding scale creates analytical challenges: accommodating peak compute demand, coordinating secure access for multiple analysts, and sharing validated tools and results. To address these challenges, we have developed the Mercury analysis pipeline and deployed it in local hardware and the Amazon Web Services cloud via the DNAnexus platform. Mercury is an automated, flexible, and extensible analysis workflow that provides accurate and reproducible genomic results at scales ranging from individuals to large cohorts. By taking advantage of cloud computing and with Mercury implemented on the DNAnexus platform, we have demonstrated a powerful combination of a robust and fully validated software pipeline and a scalable computational resource that, to date, we have applied to more than 10,000 whole genome and whole exome samples.
Permafrost and Subsurface Ice in the Solar System
NASA Technical Reports Server (NTRS)
Anderson, D. M.
1985-01-01
The properties and behavior of planetary permafrost are discussed with reference to the ability of such surfaces to sustain the loads characteristic of spacecraft landings and planetary bases. In most occurrences, water ice is in close proximity to, or in contact with, finely divided silicate mineral matter. When ice contacts silicate mineral surfaces, a liquid-like transition zone is created. Its thickness ranges from several hundred Angstrom units at temperatures near 0 degrees C to about three Angstrom units at -150 degrees C. When soluble substances are present, the resulting brine enlarges the interfacial zone. When clays are involved, although the interfacial zone may be small, its extent is large. The unfrozen, interfacial water may amount to 100% or more by weight at a temperature of -5 degrees C. The presence of this interfacial unfrozen water acts to confer plasticity on permafrost, enabling it to exhibit creep at all imposed levels of stress. Nucleation processes and load-bearing capacity are examined.
Enzymatic catalysis treatment method of meat industry wastewater using laccase.
Thirugnanasambandham, K; Sivakumar, V
2015-01-01
The meat industry produces a large amount of wastewater that contains high levels of colour and chemical oxygen demand (COD), so it must be pretreated before being discharged into the environment. In this paper, enzymatic catalysis (EC) was adopted to treat the meat wastewater. A Box-Behnken design (BBD), an experimental design for response surface methodology (RSM), was used to create the set of 29 experimental runs needed to optimize the operating conditions. Quadratic regression models with estimated coefficients were developed to describe the colour and COD removals. The experimental results show that EC could effectively reduce colour (95%) and COD (86%) at the optimum conditions of an enzyme dose of 110 U/L, an incubation time of 100 min, pH 7 and a temperature of 40 °C. RSM could be effectively adopted to optimize the multiple operating factors in the complex EC process.
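The 29-run design mentioned above is consistent with a standard Box-Behnken construction for four factors with five centre points; the sketch below builds such a design in coded units and decodes it to assumed physical ranges (the -1/+1 bounds are illustrative, not the paper's).

```python
from itertools import combinations
import numpy as np

def box_behnken(n_factors, n_center=5):
    """Box-Behnken design: every factor pair at (+/-1, +/-1) with the rest at 0, plus centres."""
    rows = []
    for i, j in combinations(range(n_factors), 2):
        for a in (-1.0, 1.0):
            for b in (-1.0, 1.0):
                run = [0.0] * n_factors
                run[i], run[j] = a, b
                rows.append(run)
    rows += [[0.0] * n_factors] * n_center
    return np.array(rows)

design = box_behnken(4)
print(design.shape)        # (29, 4): 6 pairs * 4 runs + 5 centre points

# Decode to physical units (assumed coded -1/+1 bounds bracketing the reported optimum):
# enzyme dose (U/L), incubation time (min), pH, temperature (deg C).
lows = np.array([60.0, 40.0, 5.0, 25.0])
highs = np.array([160.0, 160.0, 9.0, 55.0])
physical = lows + (design + 1.0) / 2.0 * (highs - lows)
```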
Large-Eddy Simulation of Wind-Plant Aerodynamics: Preprint
DOE Office of Scientific and Technical Information (OSTI.GOV)
Churchfield, M. J.; Lee, S.; Moriarty, P. J.
In this work, we present results of a large-eddy simulation of the 48 multi-megawatt turbines composing the Lillgrund wind plant. Turbulent inflow wind is created by performing an atmospheric boundary layer precursor simulation, and turbines are modeled using a rotating, variable-speed actuator line representation. The motivation for this work is that few others have done wind plant large-eddy simulations with a substantial number of turbines, and the methods for carrying out the simulations are varied. We wish to draw upon the strengths of the existing simulations and our growing atmospheric large-eddy simulation capability to create a sound methodology for performing this type of simulation. We have used the OpenFOAM CFD toolbox to create our solver.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Punsly, Brian, E-mail: brian.punsly@verizon.net, E-mail: brian.punsly@comdev-usa.com; ICRANet, Piazza della Repubblica 10, I-65100 Pescara
It has been previously determined that there is a highly significant correlation between the spectral index from 10 GHz to 1350 Å and the amount of excess luminosity in the red wing of quasar C IV λ1549 broad emission lines (BELs). Ostensibly, the prominence of the red excess is associated with the radio jet emission mechanism and is most pronounced for lines of sight close to the jet axis. Studying the scant significant differences in the UV spectra of radio-loud and radio-quiet quasars might provide vital clues to the origin of the unknown process that creates powerful relativistic jets that appear in only about 10% of quasars. In this study, the phenomenon is explored with multi-epoch observations of the Mg II λ2798 broad line in 3C 279, which has one of the largest known red wing excesses in a quasar spectrum. The amount of excess that is detected appears to be independent of all directly observed optical continuum, radio, or submillimeter properties (fluxes or polarizations). The only trend that occurs in this sparse data is: the stronger the BEL, the larger the fraction of flux that resides in the red wing. It is concluded that more monitoring is needed and spectropolarimetry with a large telescope is essential during low states to understand more.
NASA Astrophysics Data System (ADS)
Felbauer, Lucia; Pöppl, Ronald
2016-04-01
Global warming results in an ongoing retreat of glaciers in the Alps, leaving behind large amounts of easily erodible sediments. In addition, the debuttressing of rock walls and the decay of permafrost in the high mountain regions facilitate mass movements with potentially disastrous consequences, such as rock falls, landslides and debris flows. Therefore, it is highly important to quantify the amount of sediment that is supplied from the different compartments and to investigate how glacial retreat influences sediment dynamics in proglacial areas. In the presented work, glacier retreat and associated sediment dynamics were investigated in the Kromer valley (Silvretta Alps, Austria) by analyzing remote sensing data. Glacial retreat from the period of 1950 to 2012 was documented by interpreting aerial photographs. By digitizing the different stages of the glaciers for six time frames, changes in glacier area and length were mapped and quantified. In order to identify, characterize and quantify sediment dynamics in the proglacial areas, a high resolution DEM of difference (DoD) between 2007 and 2012 was created and analyzed, further differentiating between different zones (e.g. valley bottom, hillslope) and types of geomorphic processes (e.g. fluvial, gravitational). First results will be presented at the EGU General Assembly 2016.
Meng, Z X; Li, H F; Sun, Z Z; Zheng, W; Zheng, Y F
2013-03-01
Surface mineralization is an effective method to produce calcium phosphate apatite coating on the surface of bone tissue scaffold which could create an osteophilic environment similar to the natural extracellular matrix for bone cells. In this study, we prepared mineralized poly(D,L-lactide-co-glycolide) (PLGA) and PLGA/gelatin electrospun nanofibers via depositing calcium phosphate apatite coating on the surface of these nanofibers to fabricate bone tissue engineering scaffolds by concentrated simulated body fluid method, supersaturated calcification solution method and alternate soaking method. The apatite products were characterized by the scanning electron microscopy (SEM), Fourier transform-infrared spectroscopy (FT-IR), and X-ray diffractometry (XRD) methods. A large amount of calcium phosphate apatite composed of dicalcium phosphate dihydrate (DCPD), hydroxyapatite (HA) and octacalcium phosphate (OCP) was deposited on the surface of resulting nanofibers in short times via three mineralizing methods. A larger amount of calcium phosphate was deposited on the surface of PLGA/gelatin nanofibers rather than PLGA nanofibers because gelatin acted as nucleation center for the formation of calcium phosphate. The cell culture experiments revealed that the difference of morphology and components of calcium phosphate apatite did not show much influence on the cell adhesion, proliferation and activity. Copyright © 2012 Elsevier B.V. All rights reserved.
Oliva, Eduardo; Zeitoun, Philippe; Velarde, Pedro; Fajardo, Marta; Cassou, Kevin; Ros, David; Sebban, Stephan; Portillo, David; le Pape, Sebastien
2010-11-01
Plasma-based seeded soft-x-ray lasers have the potential to generate high energy and highly coherent short pulse beams. Due to their high density, plasmas created by the interaction of an intense laser with a solid target should store the highest amount of energy density among all plasma amplifiers. Our previous numerical work with a two-dimensional (2D) adaptive mesh refinement hydrodynamic code demonstrated that careful tailoring of plasma shapes leads to a dramatic enhancement of both soft-x-ray laser output energy and pumping efficiency. Benchmarking of our 2D hydrodynamic code in previous experiments demonstrated a high level of confidence, allowing us to perform a full study with the aim of paving the way for 10-100 μJ seeded soft-x-ray lasers. In this paper, we describe in detail the mechanisms that drive the hydrodynamics of plasma columns. We observed transitions from narrow plasmas, where very strong bidimensional flow prevents them from storing energy, to large plasmas that store a high amount of energy. Millimeter-sized plasmas are outstanding amplifiers, but they have the limitation of transverse lasing. In this paper, we provide a preliminary solution to this problem.
CD-ROM And Knowledge Integration
NASA Astrophysics Data System (ADS)
Rann, Leonard S.
1988-06-01
As the title of this paper suggests, it is about CD-ROM technology and the structuring of massive databases. Even more, it is about the impact CD-ROM has had on the publication of massive amounts of information, and the unique qualities of the medium that allow for the most sophisticated computer retrieval techniques that have ever been used. I am not drawing on experience as a pedant in the educational field, but rather as a software and database designer who has worked with CD-ROM since its inception. I will be giving examples from my company's current applications, as well as discussing some of the challenges that face information publishers in the future. In particular, I have a belief about what the most valuable outlet that can be created using CD-ROM will be: The CD-ROM is particularly suited for the mass delivery of information systems and databases that either require or utilize a large amount of computational preprocessing to allow a real-time or interactive response to be achieved. Until the advent of CD-ROM technology this level of sophistication in publication was virtually impossible. I will further explain this later in this paper. First, I will discuss the salient features of CD-ROM that make it unique in the world of data storage for electronic publishing.
Biologically inspired circuitry that mimics mammalian hearing
NASA Astrophysics Data System (ADS)
Hubbard, Allyn; Cohen, Howard; Karl, Christian; Freedman, David; Mountain, David; Ziph-Schatzberg, Leah; Nourzad Karl, Marianne; Kelsall, Sarah; Gore, Tyler; Pu, Yirong; Yang, Zibing; Xing, Xinyu; Deligeorges, Socrates
2009-05-01
We are developing low-power microcircuitry that implements classification and direction-finding systems of very small size and small acoustic aperture. Our approach was inspired by the fact that small mammals are able to localize sounds even though their ears may be separated by as little as a centimeter. Gerbils, in particular, are good low-frequency localizers, which is a particularly difficult task, since a wavelength at 500 Hz is on the order of two feet. Given such signals, cross-correlation-based methods to determine direction fail badly in the presence of a small amount of noise, e.g. wind noise and noise clutter common to almost any realistic environment. Circuits are being developed using both analog and digital techniques, each of which processes signals in fundamentally the same way the peripheral auditory system of mammals processes sound. A filter bank represents the filtering done by the cochlea. The auditory nerve is implemented using a combination of an envelope detector, an automatic gain stage, and a unique one-bit A/D, which creates what amounts to a neural impulse. These impulses are used to extract pitch characteristics, which we use to classify sounds such as vehicles and small and large weaponry from AK-47s to 155 mm cannon, including mortar launches and impacts. In addition to the pitchograms, we also use neural nets for classification.
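A rough software analogue of the signal chain described above, under stated assumptions: a generic Butterworth band-pass filter bank, a Hilbert envelope with crude normalization standing in for the automatic gain stage, and upward threshold crossings standing in for the one-bit "neural impulse" converter.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert

fs = 16000.0
t = np.arange(0, 0.5, 1.0 / fs)
signal = np.sin(2 * np.pi * 500 * t) + 0.3 * np.random.default_rng(0).normal(size=t.size)

center_freqs = [250, 500, 1000, 2000]          # a few illustrative channels
spike_trains = []
for fc in center_freqs:
    sos = butter(4, [fc / np.sqrt(2), fc * np.sqrt(2)], btype="bandpass", fs=fs, output="sos")
    band = sosfiltfilt(sos, signal)            # "cochlear" channel
    env = np.abs(hilbert(band))                # envelope detector
    env /= env.max() + 1e-12                   # crude automatic gain stage
    spikes = (env[1:] > 0.5) & (env[:-1] <= 0.5)   # one-bit events on upward crossings
    spike_trains.append(np.flatnonzero(spikes) / fs)

print([len(s) for s in spike_trains])          # spike counts per channel
```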
Carbonate sediment deposits on the reef front around Oahu, Hawaii
Hampton, M.A.; Blay, C.T.; Murray, C.J.
2004-01-01
Large sediment deposits on the reef front around Oahu are a possible resource for replenishing eroded beaches. High-resolution subbottom profiles clearly depict the deposits in three study areas: Kailua Bay off the windward coast, Makua to Kahe Point off the leeward coast, and Camp Erdman to Waimea off the north coast. Most of the sediment is in water depths between 20 and 100 m, resting on submerged shelves created during lowstands of sea level. The mapped deposits have a volume of about 4 × 10⁸ m³ in water depths less than 100 m, being thickest off the mouths of channels carved into the modern insular shelf, from which most of the sediment issues. Vibracore samples contain various amounts of sediment of similar size to the sand on Oahu beaches, with the most compatible prospects located off Makaha, Haleiwa, and Camp Erdman, and the least compatible ones located in Kailua Bay. Laboratory tests show a positive correlation of abrasion with Halimeda content: samples from Kailua Bay suffered high amounts of attrition, but others were comparable to tested beach samples. The common gray color of the offshore sediment, aesthetically undesirable for sand on popular tourist beaches, was diminished in the laboratory by soaking in heated hydrogen peroxide. © Taylor and Francis Inc.
Gredes, Tomasz; Kunath, Franziska; Gedrange, Tomasz; Kunert-Keil, Christiane
2016-01-01
The aim of this study was to examine the osteogenic potential of new flax covering materials. Bone defects were created on the skull of forty rats. Materials of pure PLA and PCL and their composites with flax fibers, genetically modified producing PHB (PLA-transgen, PCL-transgen) and unmodified (PLA-wt, PCL-wt), were inserted. The skulls were harvested after four weeks and subjected to histological examination. The percentage of bone regeneration by using PLA was less pronounced than after usage of pure PCL in comparison with controls. After treatment with PCL-transgen, a large amount of new formed bone could be found. In contrast, PCL-wt decreased significantly the bone regeneration, compared to the other tested groups. The bone covers made of pure PLA had substantially less influence on bone regeneration and the bone healing proceeded with a lot of connective tissue, whereas PLA-transgen and PLA-wt showed nearly comparable amount of new formed bone. Regarding the histological data, the hypothesis could be proposed that PCL and its composites have contributed to a higher quantity of the regenerated bone, compared to PLA. The histological studies showed comparable bone regeneration processes after treatment with tested covering materials, as well as in the untreated bone lesions.
The Equipment of Using AZOLLA for O2-Supplementation and its Test
NASA Astrophysics Data System (ADS)
Liu, Xia-Shi; Chen, Min; Bian, Zu-Liang; Liu, Chung-Chu
Water consumption during long-duration space missions is large. To reduce the burden of resupplying water from Earth, a space station must resolve its own water-supply problems, so the recovery, regeneration, and reuse of crew urine is of key importance, and many investigations of this problem have been reported. Our research created equipment for the comprehensive treatment of fresh crew urine based on a combined "biological absorption-purification and UV photocatalytic oxidation" technique. In this equipment, the urine solution is first used as the nutrient solution for the biological components of the ecological life-support system; after the nutrients are absorbed, the urine is partially decomposed, metabolized, and purified, creating favorable conditions for the follow-up UV photocatalytic oxidation treatment. After these two processes, the treated urine solution reached the GB5749-85 water-quality standard. The main indexes (measured value, with the corresponding standard limit in parentheses) were: chroma <5 (15); turbidity 1.20-3 NTU (5); total hardness as CaCO3 3.60 mg/L (450); nitrate-N 0.60 mg/L (20); total dissolved solids 543 mg/L (1000); total bacterial count 13 cfu/mL (100); coliform group <3 per L (3). Key words: photocatalytic oxidation, ultraviolet.
Observations of absorption lines from highly ionized atoms. [of interstellar medium
NASA Technical Reports Server (NTRS)
Jenkins, Edward B.
1987-01-01
In the ultraviolet spectra of hot stars, absorption lines can be seen from highly ionized species in the interstellar medium. Observations of these features, which have been very influential in revising the perception of the medium's various physical states, are discussed. The pervasiveness of O VI absorption lines, coupled with complementary observations of a diffuse background in soft X-rays and EUV radiation, shows that there is an extensive network of low density gas (n ≈ a few × 10⁻³ cm⁻³) existing at coronal temperatures of 5.3 ≤ log T ≤ 6.3. Shocks created by supernova explosions or mass loss from early-type stars can propagate freely through space and eventually transfer a large amount of energy to the medium. To create the coronal temperatures, the shocks must have velocities in excess of 150 km/sec; shocks at somewhat lower velocity (v = 100 km/sec) can be directly observed in the lines of Si III. Observations of other lines in the ultraviolet, such as Si IV and C IV, may highlight the widespread presence of energetic UV radiation from very hot dwarf stars. More advanced techniques in visible and X-ray astronomical spectroscopy may open up for inspection selected lines from atoms in much higher stages of ionization.
Observations of Absorption Lines from Highly Ionized Atoms
NASA Technical Reports Server (NTRS)
Jenkins, E. B.
1984-01-01
In the ultraviolet spectra of hot stars, absorption lines can be seen from highly ionized species in the interstellar medium. Observations of these features, which have been very influential in revising the perception of the medium's various physical states, are discussed. The pervasiveness of O VI absorption lines, coupled with complementary observations of a diffuse background in soft X-rays and EUV radiation, shows that there is an extensive network of low density gas (n ≈ a few × 10⁻³ cm⁻³) existing at coronal temperatures of 5.3 ≤ log T ≤ 6.3. Shocks created by supernova explosions or mass loss from early-type stars can propagate freely through space and eventually transfer a large amount of energy to the medium. To create the coronal temperatures, the shocks must have velocities in excess of 150 km/sec; shocks at somewhat lower velocity (v ≤ 100 km/sec) can be directly observed in the lines of Si III. Observations of other lines in the ultraviolet, such as Si IV and C IV, may highlight the widespread presence of energetic UV radiation from very hot dwarf stars. More advanced techniques in visible and X-ray astronomical spectroscopy may open up for inspection selected lines from atoms in much higher stages of ionization.
Small unmanned aerial vehicles (micro-UAVs, drones) in plant ecology
Cruzan, Mitchell B.; Weinstein, Ben G.; Grasty, Monica R.; Kohrn, Brendan F.; Hendrickson, Elizabeth C.; Arredondo, Tina M.; Thompson, Pamela G.
2016-01-01
Premise of the study: Low-elevation surveys with small aerial drones (micro-unmanned aerial vehicles [UAVs]) may be used for a wide variety of applications in plant ecology, including mapping vegetation over small- to medium-sized regions. We provide an overview of methods and procedures for conducting surveys and illustrate some of these applications. Methods: Aerial images were obtained by flying a small drone along transects over the area of interest. Images were used to create a composite image (orthomosaic) and a digital surface model (DSM). Vegetation classification was conducted manually and using an automated routine. Coverage of an individual species was estimated from aerial images. Results: We created a vegetation map for the entire region from the orthomosaic and DSM, and mapped the density of one species. Comparison of our manual and automated habitat classification confirmed that our mapping methods were accurate. A species with high contrast to the background matrix allowed an adequate estimate of its coverage. Discussion: The example surveys demonstrate that small aerial drones are capable of gathering large amounts of information on the distribution of vegetation and individual species with minimal impact on sensitive habitats. Low-elevation aerial surveys have potential for a wide range of applications in plant ecology. PMID:27672518
An automated tool for an analysis of compliance to evidence-based clinical guidelines.
Metfessel, B A
2001-01-01
Evidence-based clinical guidelines have been developed in an attempt to decrease practice variation and improve patient outcomes. Although a number of studies and a few commercial products have attempted to measure guideline compliance, there still exists a strong need for an automated product that can take as input large amounts of data and create systematic and detailed profiles of compliance to evidence-based guidelines. The Guideline Compliance Assessment Tool is a product presently under development in our group that will accept as input medical and pharmacy claims data and create a guideline compliance profile that assesses provider practice patterns as compared to evidence-based standards. The system components include an episode of care grouper to standardize classifications of illnesses, an evidence-based guideline knowledge base that potentially contains information on several hundred distinct conditions, a guideline compliance scoring system that emphasizes systematic guideline variance rather than random variances, and an advanced data warehouse that would allow drilling into specific areas of interest. As provider profiling begins to shift away from a primary emphasis on cost to an emphasis on quality, automated methods for measuring guideline compliance will become important in measuring provider performance and increasing guideline usage, consequently improving the standard of care and the potential for better patient outcomes.
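As a rough illustration of the compliance-scoring idea (not the Guideline Compliance Assessment Tool itself), the Python sketch below computes a per-provider compliance fraction from claims-like records against a hypothetical guideline knowledge base; the conditions, service codes, and records are invented for the example.

# Minimal sketch of guideline-compliance scoring from claims-like records.
# The guideline rules and claims below are hypothetical examples.
from collections import defaultdict

# Hypothetical guideline knowledge base: condition -> required service codes.
GUIDELINES = {
    "diabetes": {"hba1c_test", "eye_exam"},
    "asthma":   {"controller_rx"},
}

claims = [   # (provider, patient, condition, services delivered)
    ("dr_a", "p1", "diabetes", {"hba1c_test", "eye_exam"}),
    ("dr_a", "p2", "diabetes", {"hba1c_test"}),
    ("dr_b", "p3", "asthma",   set()),
]

def compliance_profile(claims):
    """Fraction of guideline-required services delivered, per provider."""
    met, required = defaultdict(int), defaultdict(int)
    for provider, _patient, condition, services in claims:
        rules = GUIDELINES.get(condition, set())
        required[provider] += len(rules)
        met[provider] += len(rules & services)
    return {p: met[p] / required[p] for p in required if required[p]}

print(compliance_profile(claims))   # e.g. {'dr_a': 0.75, 'dr_b': 0.0}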
Using Human Induced Pluripotent Stem Cells to Model Skeletal Diseases.
Barruet, Emilie; Hsiao, Edward C
2016-01-01
Musculoskeletal disorders affecting the bones and joints are major health problems among children and adults. Major challenges, such as the genetic origins of and poor diagnostics for severe skeletal diseases, hinder our understanding of human skeletal diseases. The recent advent of human induced pluripotent stem cells (human iPS cells) provides an unparalleled opportunity to create human-specific models of human skeletal diseases. iPS cells have the ability to self-renew, allowing us to obtain large amounts of starting material, and have the potential to differentiate into any cell type in the body. In addition, they can carry one or more mutations responsible for the disease of interest or be genetically corrected to create isogenic controls. Our work has focused on modeling rare musculoskeletal disorders including fibrodysplasia ossificans progressiva (FOP), a congenital disease of increased heterotopic ossification. In this review, we will discuss our experiences and protocols differentiating human iPS cells toward the osteogenic lineage and their application to model skeletal diseases. A number of critical challenges and exciting new approaches are also discussed, which will allow the skeletal biology field to harness the potential of human iPS cells as a critical model system for understanding diseases of abnormal skeletal formation and bone regeneration.
Cloud-Enabled Climate Analytics-as-a-Service using Reanalysis data: A case study.
NASA Astrophysics Data System (ADS)
Nadeau, D.; Duffy, D.; Schnase, J. L.; McInerney, M.; Tamkin, G.; Potter, G. L.; Thompson, J. H.
2014-12-01
The NASA Center for Climate Simulation (NCCS) maintains advanced data capabilities and facilities that allow researchers to access the enormous volume of data generated by weather and climate models. The NASA Climate Model Data Service (CDS) and the NCCS are merging their efforts to provide Climate Analytics-as-a-Service for the comparative study of the major reanalysis projects: ECMWF ERA-Interim, NASA/GMAO MERRA, NOAA/NCEP CFSR, NOAA/ESRL 20CR, JMA JRA25, and JRA55. These reanalyses have been repackaged into the netCDF4 file format following the CMIP5 Climate and Forecast (CF) metadata convention prior to being sequenced into the Hadoop Distributed File System (HDFS). A small set of operations that represent a common starting point in many analysis workflows was then created: min, max, sum, count, variance, and average. In this example, reanalysis data exploration was performed with the use of Hadoop MapReduce, and accessibility was achieved using the Climate Data Service (CDS) application programming interface (API) created at NCCS. This API provides a uniform treatment of large amounts of data. In this case study, we have limited our exploration to two variables, temperature and precipitation, using three operations (min, max, and avg) and 30 years of reanalysis data for three regions of the world: global, polar, and subtropical.
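The following Python sketch shows the style of computation described above, reduced to its essentials: a map step that computes partial statistics per data chunk and a reduce step that combines them into min, max, and average. It is not the NCCS CDS API or the actual Hadoop job; the data are synthetic.

# Minimal map/reduce-style reduction over chunks of a gridded field.
import numpy as np

def map_chunk(chunk):
    """Per-chunk partial statistics (the 'map' step)."""
    return {"min": chunk.min(), "max": chunk.max(),
            "sum": chunk.sum(), "count": chunk.size}

def reduce_stats(partials):
    """Combine partial results (the 'reduce' step)."""
    return {
        "min": min(p["min"] for p in partials),
        "max": max(p["max"] for p in partials),
        "avg": sum(p["sum"] for p in partials) / sum(p["count"] for p in partials),
    }

# Synthetic stand-in for a (time, lat, lon) temperature field, split by time.
field = 250 + 50 * np.random.rand(360, 90, 180)
partials = [map_chunk(field[t]) for t in range(field.shape[0])]
print(reduce_stats(partials))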
Sensing in the Collaborative Internet of Things
Borges Neto, João B.; Silva, Thiago H.; Assunção, Renato Martins; Mini, Raquel A. F.; Loureiro, Antonio A. F.
2015-01-01
We are entering a new era of computing technology, the era of Internet of Things (IoT). An important element for this popularization is the large use of off-the-shelf sensors. Most of those sensors will be deployed by different owners, generally common users, creating what we call the Collaborative IoT. This collaborative IoT helps to increase considerably the amount and availability of collected data for different purposes, creating new interesting opportunities, but also several challenges. For example, it is very challenging to search for and select a desired sensor or a group of sensors when there is no description about the provided sensed data or when it is imprecise. Given that, in this work we characterize the properties of the sensed data in the Internet of Things, mainly the sensed data contributed by several sources, including sensors from common users. We conclude that, in order to safely use data available in the IoT, we need a filtering process to increase the data reliability. In this direction, we propose a new simple and powerful approach that helps to select reliable sensors. We tested our method for different types of sensed data, and the results reveal the effectiveness in the correct selection of sensor data. PMID:25808766
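A minimal Python sketch of one possible filtering idea in the spirit of this work (not the authors' actual method) is shown below: sensors whose recent readings stay close to the consensus (median) of all contributed readings are kept, and outlying sensors are dropped. The sensor names, values, and tolerance are illustrative assumptions.

# Keep only sensors whose readings agree with the consensus of all readings.
from statistics import median

def select_reliable(readings, tolerance=2.0):
    """readings: dict sensor_id -> list of recent values."""
    consensus = median(v for values in readings.values() for v in values)
    reliable = {}
    for sensor_id, values in readings.items():
        if all(abs(v - consensus) <= tolerance for v in values):
            reliable[sensor_id] = values
    return reliable

readings = {
    "s1": [21.0, 21.2, 20.9],   # plausible temperatures
    "s2": [20.8, 21.1, 21.0],
    "s3": [35.0, 34.8, 35.2],   # likely misplaced or faulty sensor
}
print(select_reliable(readings))   # -> keeps 's1' and 's2', drops 's3'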
NASA Technical Reports Server (NTRS)
Gregory, Kyle J.; Hill, Joanne E. (Editor); Black, J. Kevin; Baumgartner, Wayne H.; Jahoda, Keith
2016-01-01
A fundamental challenge in a spaceborne application of a gas-based Time Projection Chamber (TPC) for observation of X-ray polarization is handling the large amount of data collected. The TPC polarimeter described uses the APV-25 Application Specific Integrated Circuit (ASIC) to read out a strip detector. Two-dimensional photoelectron track images are created with a time projection technique and used to determine the polarization of the incident X-rays. The detector produces a 128x30 pixel image per photon interaction, with each pixel registering 12 bits of collected charge. This creates challenging requirements for data storage and downlink bandwidth even with only a modest incidence of photons, and can have a significant impact on the overall mission cost. An approach is described for locating and isolating the photoelectron track within the detector image, yielding a much smaller data product, typically between 8x8 pixels and 20x20 pixels. This approach is implemented using a Microsemi RT-ProASIC3-3000 Field-Programmable Gate Array (FPGA), clocked at 20 MHz and utilizing 10.7k logic gates (14% of the FPGA), 20 Block RAMs (17% of the FPGA), and no external RAM. Results will be presented, demonstrating successful photoelectron track cluster detection with minimal impact on detector dead-time.
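The data-reduction step can be illustrated with a short Python sketch (not the flight FPGA logic): find the pixels belonging to the photoelectron track in a sparse 128x30 detector frame and crop a small padded bounding box around them, shrinking the data product by more than an order of magnitude. The threshold and padding are assumptions.

# Locate the track in a sparse detector image and crop a padded bounding box.
import numpy as np

def crop_track(image, threshold=50, pad=2):
    """Return the padded bounding box of pixels above threshold."""
    rows, cols = np.nonzero(image > threshold)
    if rows.size == 0:
        return None                                   # no track found
    r0, r1 = max(rows.min() - pad, 0), min(rows.max() + pad, image.shape[0] - 1)
    c0, c1 = max(cols.min() - pad, 0), min(cols.max() + pad, image.shape[1] - 1)
    return image[r0:r1 + 1, c0:c1 + 1]

frame = np.zeros((128, 30), dtype=np.uint16)          # 12-bit charge values
frame[60:70, 10:14] = 800                             # synthetic track
cropped = crop_track(frame)
print(frame.size, "->", cropped.size, "pixels")       # 3840 -> ~112 pixels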
Geomorphologic Map of Titan's Polar Terrains
NASA Astrophysics Data System (ADS)
Birch, S. P. D.; Hayes, A. G.; Malaska, M. J.; Lopes, R. M. C.; Schoenfeld, A.; Williams, D. A.
2016-06-01
Titan's lakes and seas contain vast amounts of information regarding the history and evolution of Saturn's largest moon. To understand this landscape, we created a geomorphologic map, and then used our map to develop an evolutionary model.
DES Science Portal: II- Creating Science-Ready Catalogs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fausti Neto, Angelo; et al.
We present a novel approach for creating science-ready catalogs through a software infrastructure developed for the Dark Energy Survey (DES). We integrate the data products released by the DES Data Management and additional products created by the DES collaboration in an environment known as the DES Science Portal. Each step involved in the creation of a science-ready catalog is recorded in a relational database and can be recovered at any time. We describe how the DES Science Portal automates the creation and characterization of lightweight catalogs for the DES Year 1 Annual Release, and show its flexibility in creating multiple catalogs with different inputs and configurations. Finally, we discuss the advantages of this infrastructure for large surveys such as DES and the Large Synoptic Survey Telescope. The capability of creating science-ready catalogs efficiently and with full control of the inputs and configurations used is an important asset for supporting science analysis using data from large astronomical surveys.
Automated Solvent Seaming of Large Polyimide Membranes
NASA Technical Reports Server (NTRS)
Rood, Robert; Moore, James D.; Talley, Chris; Gierow, Paul A.
2006-01-01
A solvent-based welding process enables the joining of precise, cast polyimide membranes at their edges to form larger precise membranes. The process creates a homogeneous, optical-quality seam between abutting membranes, with no overlap and with only a very localized area of figure disturbance. The seam retains 90 percent of the strength of the parent material. The process was developed for original use in the fabrication of wide-aperture membrane optics, with areal densities of less than 1 kg/m2, for lightweight telescopes, solar concentrators, antennas, and the like to be deployed in outer space. The process is just as well applicable to the fabrication of large precise polyimide membranes for flat or inflatable solar concentrators and antenna reflectors for terrestrial applications. The process is applicable to cast membranes made of CP1 (or equivalent) polyimide. The process begins with the precise fitting together and fixturing of two membrane segments. The seam is formed by applying a metered amount of a doped solution of the same polyimide along the abutting edges of the membrane segments. After the solution has been applied, the fixtured films are allowed to dry and are then cured by convective heating. The weld material is the same as the parent material, so that what is formed is a homogeneous, strong joint that is almost indistinguishable from the parent material. The success of the process is highly dependent on formulation of the seaming solution from the correct proportion of the polyimide in a suitable solvent. In addition, the formation of reliable seams depends on the deposition of a precise amount of the seaming solution along the seam line. To ensure the required precision, deposition is performed by use of an automated apparatus comprising a modified commercially available, large-format, ink-jet print head on an automated positioning table. The printing head jets the seaming solution into the seam area at a rate controlled in coordination with the movement of the positioning table.
KA-SB: from data integration to large scale reasoning
Roldán-García, María del Mar; Navas-Delgado, Ismael; Kerzazi, Amine; Chniber, Othmane; Molina-Castro, Joaquín; Aldana-Montes, José F
2009-01-01
Background: The analysis of information in the biological domain is usually focused on the analysis of data from single on-line data sources. Unfortunately, studying a biological process requires having access to disperse, heterogeneous, autonomous data sources. In this context, an analysis of the information is not possible without the integration of such data. Methods: KA-SB is a querying and analysis system for final users based on combining a data integration solution with a reasoner. Thus, the tool has been created with a process divided into two steps: 1) KOMF, the Khaos Ontology-based Mediator Framework, is used to retrieve information from heterogeneous and distributed databases; 2) the integrated information is crystallized in a (persistent and high performance) reasoner (DBOWL). This information can then be further analyzed (by means of querying and reasoning). Results: In this paper we present a novel system that combines the use of a mediation system with the reasoning capabilities of a large scale reasoner to provide a way of finding new knowledge and of analyzing the integrated information from different databases, which is retrieved as a set of ontology instances. This tool uses a graphical query interface to build user queries easily, which shows a graphical representation of the ontology and allows users to build queries by clicking on the ontology concepts. Conclusion: These kinds of systems (based on KOMF) will provide users with very large amounts of information (interpreted as ontology instances once retrieved), which cannot be managed using traditional main-memory-based reasoners. We propose a process for creating persistent and scalable knowledge bases from sets of OWL instances obtained by integrating heterogeneous data sources with KOMF. This process has been applied to develop a demo tool, which uses the BioPax Level 3 ontology as the integration schema, and integrates the UNIPROT, KEGG, CHEBI, BRENDA and SABIORK databases. PMID:19796402
Hubble Looks in on a Galactic Nursery
2017-12-08
This dramatic image shows the NASA/ESA Hubble Space Telescope’s view of the dwarf galaxy NGC 1140, which lies 60 million light-years away in the constellation of Eridanus. As can be seen in this image, NGC 1140 has an irregular form, much like the Large Magellanic Cloud — a small galaxy that orbits the Milky Way. This small galaxy is undergoing what is known as a starburst. Despite being almost ten times smaller than the Milky Way, it is creating stars at about the same rate, with the equivalent of one star the size of our sun being created per year. This is clearly visible in the image, which shows the galaxy illuminated by bright, blue-white, young stars. Galaxies like NGC 1140 — small, starbursting and containing large amounts of primordial gas with far fewer elements heavier than hydrogen and helium than are present in our sun — are of particular interest to astronomers. Their composition makes them similar to the intensely star-forming galaxies in the early Universe, and those early Universe galaxies were the building blocks of present-day large galaxies like our own, the Milky Way. But, as they are so far away, the early Universe galaxies are harder to study, so these closer starbursting galaxies are a good substitute for learning more about galaxy evolution. The vigorous star formation will have a very destructive effect on this small dwarf galaxy in its future. When the larger stars in the galaxy die and explode as supernovae, gas is blown into space and may easily escape the gravitational pull of the galaxy. By ejecting gas, the galaxy is throwing away its potential for future stars, as this gas is one of the building blocks of star formation. NGC 1140’s starburst cannot last for long. Image credit: ESA/Hubble & NASA
2008-09-01
projectiles containing small amounts of a reactive material. The mechanism is that limited deflagration of the ANFO creates sufficient pressure to...resulting pressurization of the container causes the container to rupture, thus producing a render-safe solution. Several free-field shots demonstrated...the ANFO creates sufficient pressure to rupture plastic or steel containers. 1 Introduction Vehicle-borne improvised explosive devices (VBIEDs) have
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hosking, Jonathan R. M.; Natarajan, Ramesh
The computer creates a utility demand forecast model for weather parameters by receiving a plurality of utility parameter values, wherein each received utility parameter value corresponds to a weather parameter value; determining that a range of weather parameter values lacks a sufficient amount of corresponding received utility parameter values; determining one or more utility parameter values that correspond to that range of weather parameter values; and creating a model which correlates the received and the determined utility parameter values with the corresponding weather parameter values.
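As a hedged illustration of the idea (not the patented method), the Python sketch below bins synthetic demand observations by temperature, flags temperature ranges with too few observations, fills those ranges by interpolation, and fits a simple demand model. All thresholds and data are invented for the example.

# Bin demand by temperature, fill sparse bins, and fit a simple model.
import numpy as np

temps = np.array([2, 5, 7, 21, 22, 23, 24, 30, 31, 33], dtype=float)
demand = np.array([95, 90, 88, 60, 61, 63, 64, 80, 83, 88], dtype=float)

bins = np.arange(0, 41, 5)                      # 5-degree temperature bins
idx = np.digitize(temps, bins)
bin_means, sparse = [], []
for b in range(1, len(bins)):
    values = demand[idx == b]
    if values.size < 2:                          # insufficient observations
        sparse.append(b)
        bin_means.append(np.nan)
    else:
        bin_means.append(values.mean())

centers = bins[:-1] + 2.5
bin_means = np.array(bin_means)
missing = np.isnan(bin_means)
bin_means[missing] = np.interp(centers[missing], centers[~missing],
                               bin_means[~missing])   # fill sparse ranges
coeffs = np.polyfit(centers, bin_means, deg=2)        # simple demand model
print("sparse bins:", sparse, "model coefficients:", coeffs)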
Are "New Building" Learning Gains Sustainable?
ERIC Educational Resources Information Center
Walczak, Mary M.; Van Wylen, David G. L.
2015-01-01
New science facilities have become a reality on many college campuses in the last few decades. Large time investments in creating shared programmatic vision and designing flexible spaces, partnered with large fiscal investments, have created a new generation of science building. Unfortunately, few studies provide evidence about whether the…
NASA Astrophysics Data System (ADS)
Murphy, Richard J.; Monteiro, Sildomar T.
2013-01-01
Hyperspectral imagery is used to map the distribution of iron and separate iron ore from shale (a waste product) on a vertical mine face in an open-pit mine in the Pilbara, Western Australia. Vertical mine faces have complex surface geometries which cause large spatial variations in the amount of incident and reflected light. Methods used to analyse imagery must minimise these effects whilst preserving any spectral variations between rock types and minerals. Derivative analysis of spectra to the 1st-, 2nd- and 4th-order is used to do this. To quantify the relative amounts and distribution of iron, the derivative spectrum is integrated across the visible and near infrared spectral range (430-970 nm) and over those wavelength regions containing individual peaks and troughs associated with specific iron absorption features. As a test of this methodology, results from laboratory spectra acquired from representative rock samples were compared with total amounts of iron minerals from X-ray diffraction (XRD) analysis. Relationships between derivatives integrated over the visible near-infrared range and total amounts (% weight) of iron minerals were strongest for the 4th- and 2nd-derivative (R2 = 0.77 and 0.74, respectively) and weakest for the 1st-derivative (R2 = 0.56). Integrated values of individual peaks and troughs showed moderate to strong relationships in 2nd- (R2 = 0.68-0.78) and 4th-derivative (R2 = 0.49-0.78) spectra. The weakest relationships were found for peaks or troughs towards longer wavelengths. The same derivative methods were then applied to imagery to quantify relative amounts of iron minerals on a mine face. Before analyses, predictions were made about the relative abundances of iron in the different geological zones on the mine face, as mapped from field surveys. Integration of the whole spectral curve (430-970 nm) from the 2nd- and 4th-derivative gave results which were entirely consistent with predictions. Conversely, integration of the 1st-derivative gave results that did not fit with predictions nor distinguish between zones with very large and small amounts of iron oxide. Classified maps of ore and shale were created using a simple level-slice of the 1st-derivative reflectance at 702, 765 and 809 nm. Pixels classified as shale showed a similar distribution to kaolinite (an indicator of shales in the region), as mapped by the depth of the diagnostic kaolinite absorption feature at 2196 nm. Standard statistical measures of classification performance (accuracy, precision, recall and the Kappa coefficient of agreement) indicated that nearly all of the pixels were classified correctly using 1st-derivative reflectance at 765 and 809 nm. These results indicate that data from the VNIR (430-970 nm) can be used to quantify, without a priori knowledge, the total amount of iron minerals and to distinguish ore from shale on vertical mine faces.
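The derivative-and-integration step can be sketched in a few lines of Python (a simplified illustration, not the authors' processing chain): smooth a reflectance spectrum, take its second derivative, and integrate that derivative over the full 430-970 nm range or over an individual absorption feature as a relative measure of iron-absorption strength. The spectrum here is synthetic.

# Second-derivative analysis of a synthetic reflectance spectrum.
import numpy as np
from scipy.signal import savgol_filter

wavelengths = np.arange(430, 971, 1.0)                       # nm
# Synthetic spectrum: smooth background plus a broad ~900 nm iron feature.
reflectance = (0.4 + 0.0002 * (wavelengths - 430)
               - 0.08 * np.exp(-((wavelengths - 900) / 40) ** 2))

d2 = savgol_filter(reflectance, window_length=21, polyorder=3, deriv=2)

def integrate_band(deriv, wl, lo, hi):
    """Integrate a derivative spectrum over [lo, hi] nm."""
    mask = (wl >= lo) & (wl <= hi)
    return np.trapz(np.abs(deriv[mask]), wl[mask])

print("whole VNIR (430-970 nm):", integrate_band(d2, wavelengths, 430, 970))
print("~900 nm feature only:   ", integrate_band(d2, wavelengths, 850, 950))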
Updated US Department of Agriculture Food Patterns meet goals of the 2010 dietary guidelines.
Britten, Patricia; Cleveland, Linda E; Koegel, Kristin L; Kuczynski, Kevin J; Nickols-Richardson, Sharon M
2012-10-01
The US Department of Agriculture Food Patterns were updated for the 2010 Dietary Guidelines for Americans to meet new nutrition goals and incorporate results of food pattern modeling requested by the Dietary Guidelines Advisory Committee. The purpose of this article is to describe the process used and changes in the updated patterns. Changes include renaming the Meat and Beans and Milk Groups to the Protein Foods and Dairy Groups, respectively, to be more encompassing of foods in each. Vegetable subgroups now provide more achievable intake recommendations. Calcium-fortified soymilk is now included in the Dairy Group because of its similarity to foods in that group. Increased amounts of seafoods are recommended in the Protein Foods Group, balanced by decreased amounts of meat and poultry. A limit on calories from solid fats and added sugars is included, replacing the previous discretionary calorie allowance and emphasizing the need to choose nutrient-dense forms of foods. Lacto-ovo vegetarian and vegan patterns that meet nutrition goals were created by making substitutions in the Protein Foods Group, and for vegan patterns, in the Dairy Group. Patterns identify food choices that meet nutritional needs within energy allowances and encourage choosing a variety of foods. They rely on foods in nutrient-dense forms, including a limited amount of calories from solid fats and added sugars. The Food Patterns provide a useful template for educating consumers about healthful food choices while highlighting a large gap between choices many Americans make and healthy eating patterns. Copyright © 2012 Academy of Nutrition and Dietetics. Published by Elsevier Inc. All rights reserved.
Cotney, Justin L; Noonan, James P
2015-02-02
Chromatin immunoprecipitation coupled with high-throughput sequencing (ChIP-Seq) is a powerful method used to identify genome-wide binding patterns of transcription factors and distribution of various histone modifications associated with different chromatin states. In most published studies, ChIP-Seq has been performed on cultured cells grown under controlled conditions, allowing generation of large amounts of material in a homogeneous biological state. Although such studies have provided great insight into the dynamic landscapes of animal genomes, they do not allow the examination of transcription factor binding and chromatin states in adult tissues, developing embryonic structures, or tumors. Such knowledge is critical to understanding the information required to create and maintain a complex biological tissue and to identify noncoding regions of the genome directly involved in tissues affected by complex diseases such as autism. Studying these tissue types with ChIP-Seq can be challenging due to the limited availability of tissues and the lack of complex biological states able to be achieved in culture. These inherent differences require alterations of standard cross-linking and chromatin extraction typically used in cell culture. Here we describe a general approach for using small amounts of animal tissue to perform ChIP-Seq directed at histone modifications and transcription factors. Tissue is homogenized before treatment with formaldehyde to ensure proper cross-linking, and a two-step nuclear isolation is performed to increase extraction of soluble chromatin. Small amounts of soluble chromatin are then used for immunoprecipitation (IP) and prepared for multiplexed high-throughput sequencing. © 2015 Cold Spring Harbor Laboratory Press.
Taylor, Sandra L; Ruhaak, L Renee; Kelly, Karen; Weiss, Robert H; Kim, Kyoungmi
2017-03-01
With expanded access to, and decreased costs of, mass spectrometry, investigators are collecting and analyzing multiple biological matrices from the same subject such as serum, plasma, tissue and urine to enhance biomarker discoveries, understanding of disease processes and identification of therapeutic targets. Commonly, each biological matrix is analyzed separately, but multivariate methods such as MANOVAs that combine information from multiple biological matrices are potentially more powerful. However, mass spectrometric data typically contain large amounts of missing values, and imputation is often used to create complete data sets for analysis. The effects of imputation on multiple biological matrix analyses have not been studied. We investigated the effects of seven imputation methods (half minimum substitution, mean substitution, k-nearest neighbors, local least squares regression, Bayesian principal components analysis, singular value decomposition and random forest), on the within-subject correlation of compounds between biological matrices and its consequences on MANOVA results. Through analysis of three real omics data sets and simulation studies, we found the amount of missing data and imputation method to substantially change the between-matrix correlation structure. The magnitude of the correlations was generally reduced in imputed data sets, and this effect increased with the amount of missing data. Significant results from MANOVA testing also were substantially affected. In particular, the number of false positives increased with the level of missing data for all imputation methods. No one imputation method was universally the best, but the simple substitution methods (Half Minimum and Mean) consistently performed poorly. © The Author 2016. Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.
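A minimal Python sketch of one of the imputation methods discussed (half-minimum substitution) and of its effect on a between-matrix correlation is given below; the two "matrices" and the missingness pattern are synthetic, and the sketch is only meant to illustrate how imputation can shift the correlation structure.

# Half-minimum imputation and its effect on a between-matrix correlation.
import numpy as np

rng = np.random.default_rng(0)
n = 50
serum = rng.lognormal(mean=2.0, sigma=0.4, size=n)
urine = serum * rng.lognormal(mean=0.0, sigma=0.2, size=n)   # correlated

# Introduce 30% missing values in the urine measurements.
missing = rng.random(n) < 0.3
urine_obs = urine.copy()
urine_obs[missing] = np.nan

def half_min_impute(x):
    """Replace missing values with half of the smallest observed value."""
    filled = x.copy()
    filled[np.isnan(filled)] = np.nanmin(x) / 2.0
    return filled

urine_imputed = half_min_impute(urine_obs)
r_true = np.corrcoef(serum, urine)[0, 1]
r_imputed = np.corrcoef(serum, urine_imputed)[0, 1]
print(f"correlation: complete data {r_true:.2f}, after imputation {r_imputed:.2f}")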
Earth Science Data Analytics: Preparing for Extracting Knowledge from Information
NASA Technical Reports Server (NTRS)
Kempler, Steven; Barbieri, Lindsay
2016-01-01
Data analytics is the process of examining large amounts of data of a variety of types to uncover hidden patterns, unknown correlations, and other useful information. Data analytics is a broad term that includes data analysis, as well as an understanding of the cognitive processes an analyst uses to understand problems and explore data in meaningful ways. Analytics also includes data extraction, transformation, and reduction, utilizing specific tools, techniques, and methods. Turning to data science, definitions of data science sound very similar to those of data analytics (which leads to much of the confusion between the two). But the skills needed for both (co-analyzing large amounts of heterogeneous data, understanding and utilizing relevant tools and techniques, and subject matter expertise), although similar, serve different purposes. Data analytics takes a practitioner's approach, applying expertise and skills to solve issues and gain subject knowledge. Data science is more theoretical (research in itself) in nature, providing strategic actionable insights and new innovative methodologies. Earth Science Data Analytics (ESDA) is the process of examining, preparing, reducing, and analyzing large amounts of spatial (multi-dimensional), temporal, or spectral data using a variety of data types to uncover patterns, correlations, and other information, to better understand our Earth. The large variety of datasets (temporal and spatial differences, data types, formats, etc.) invites the need for data analytics skills that cover the science domain and data preparation, reduction, and analysis techniques from a practitioner's point of view. The application of these skills to ESDA is the focus of this presentation. The Earth Science Information Partners (ESIP) Federation Earth Science Data Analytics (ESDA) Cluster was created in recognition of the practical need to facilitate the co-analysis of large amounts of data and information for Earth science. Thus, from the point of view of advancing science: on the continuum of ever-evolving data management systems, we need to understand and develop ways that allow the variety of data relationships to be examined, and information to be manipulated, such that knowledge can be enhanced to facilitate science. Recognizing the importance and potential impacts of the unlimited ways to co-analyze heterogeneous datasets, now and especially in the future, one of the objectives of the ESDA Cluster is to facilitate the preparation of individuals to understand and apply the skills needed for Earth science data analytics. Pinpointing and communicating the needed skills and expertise is new, and not easy. Information technology is just beginning to provide the tools for advancing the analysis of heterogeneous datasets in a big way, thus providing the opportunity to discover unobvious scientific relationships previously invisible to the scientific eye. And it is not easy: it takes individuals, or teams of individuals, with just the right combination of skills to understand the data and develop the methods to glean knowledge out of data and information. In addition, whereas definitions of data science and big data are (more or less) available (summarized in Reference 5), Earth science data analytics is virtually ignored in the literature, barring a few excellent sources.
A crowdsourcing model for creating preclinical medical education study tools.
Bow, Hansen C; Dattilo, Jonathan R; Jonas, Andrea M; Lehmann, Christoph U
2013-06-01
During their preclinical course work, medical students must memorize and recall substantial amounts of information. Recent trends in medical education emphasize collaboration through team-based learning. In the technology world, the trend toward collaboration has been characterized by the crowdsourcing movement. In 2011, the authors developed an innovative approach to team-based learning that combined students' use of flashcards to master large volumes of content with a crowdsourcing model, using a simple informatics system to enable those students to share in the effort of generating concise, high-yield study materials. The authors used Google Drive and developed a simple Java software program that enabled students to simultaneously access and edit sets of questions and answers in the form of flashcards. Through this crowdsourcing model, medical students in the class of 2014 at the Johns Hopkins University School of Medicine created a database of over 16,000 questions that corresponded to the Genes to Society basic science curriculum. An analysis of exam scores revealed that students in the class of 2014 outperformed those in the class of 2013, who did not have access to the flashcard system, and a survey of students demonstrated that users were generally satisfied with the system and found it a valuable study tool. In this article, the authors describe the development and implementation of their crowdsourcing model for creating study materials, emphasize its simplicity and user-friendliness, describe its impact on students' exam performance, and discuss how students in any educational discipline could implement a similar model of collaborative learning.
USDA-ARS?s Scientific Manuscript database
The amount of microarray gene expression data in public repositories has been increasing exponentially for the last couple of decades. High-throughput microarray data integration and analysis has become a critical step in exploring the large amount of expression data for biological discovery. Howeve...
Lewis, Tyler; Schmutz, Joel A.; Amundson, Courtney L.; Lindberg, Mark S.
2016-01-01
Summary 1. Wildfires are the principal disturbance in the boreal forest, and their size and frequency are increasing as the climate warms. Impacts of fires on boreal wildlife are largely unknown, especially for the tens of millions of waterfowl that breed in the region. This knowledge gap creates significant barriers to the integrative management of fires and waterfowl, leading to fire policies that largely disregard waterfowl. 2. Waterfowl populations across the western boreal forest of North America have been monitored annually since 1955 by the Waterfowl Breeding Population and Habitat Survey (BPOP), widely considered the most extensive wildlife survey in the world. Using these data, we examined impacts of forest fires on abundance of two waterfowl guilds – dabblers and divers. We modelled waterfowl abundance in relation to fire extent (i.e. amount of survey transect burned) and time since fire, examining both immediate and lagged fire impacts. 3. From 1955 to 2014, >1100 fires in the western boreal forest intersected BPOP survey transects, and many transects burned multiple times. Nonetheless, fires had no detectable impact on waterfowl abundance; annual transect counts of dabbler and diver pairs remained stable from the pre- to post-fire period. 4. The absence of fire impacts on waterfowl abundance extended from the years immediately following the fire to those more than a decade afterwards. Likewise, the amount of transect burned did not influence waterfowl abundance, with similar pair counts from the pre- to post-fire period for small (1–20% burned), medium (21–60%) and large (>60%) burns. 5. Policy implications. Waterfowl populations appear largely resilient to forest fires, providing initial evidence that current policies of limited fire suppression, which predominate throughout much of the boreal forest, have not been detrimental to waterfowl populations. Likewise, fire-related management actions, such as prescribed burning or targeted suppression, seem to have limited impacts on waterfowl abundance and productivity. For waterfowl managers, our results suggest that adaptive models of waterfowl harvest, which annually guide hunting quotas, do not need to emphasize fires when integrating climate change effects.
Open NASA Earth Exchange (OpenNEX): A Public-Private Partnership for Climate Change Research
NASA Astrophysics Data System (ADS)
Nemani, R. R.; Lee, T. J.; Michaelis, A.; Ganguly, S.; Votava, P.
2014-12-01
NASA Earth Exchange (NEX) is a data, computing and knowledge collaborative that houses satellite, climate and ancillary data, where a community of researchers can come together to share modeling and analysis codes, scientific results, knowledge and expertise on a centralized platform with access to large supercomputing resources. As part of broadening the community beyond NASA-funded researchers, NASA, through an agreement with Amazon Inc., made available to the public a large collection of climate and Earth science satellite data. The data, available through the Open NASA Earth Exchange (OpenNEX) platform hosted on the Amazon Web Services (AWS) public cloud, consist of large amounts of global land surface imaging, vegetation conditions, climate observations and climate projections. In addition to the data, users of the OpenNEX platform can also watch lectures from leading experts and learn basic access and use of the available data sets. In order to advance White House initiatives such as Open Data, Big Data and Climate Data and the Climate Action Plan, NASA over the past six months conducted the OpenNEX Challenge. The two-part challenge was designed to engage the public in creating innovative ways to use NASA data and address climate change impacts on economic growth, health and livelihood. Our intention was that the challenges would allow citizen scientists to realize the value of NASA data assets and offer NASA new ideas on how to share and use that data. The first "ideation" challenge, which closed on July 31st, attracted over 450 participants consisting of climate scientists, hobbyists, citizen scientists, IT experts and app developers. Winning ideas from the first challenge will be incorporated into the second "builder" challenge, currently targeted to launch in mid-August and close by mid-November. The winner(s) will be formally announced at AGU in December of 2014. We will share our experiences and lessons learned over the past year from OpenNEX, a public-private partnership for engaging and enabling a large community of citizen scientists to better understand global climate change and to create climate resilience.
Bolhuis, Dieuwerke P.; Lakemond, Catriona M. M.; de Wijk, Rene A.; Luning, Pieternel A.; de Graaf, Cees
2013-01-01
Background A number of studies have shown that bite and sip sizes influence the amount of food intake. Consuming with small sips instead of large sips means relatively more sips for the same amount of food to be consumed; people may believe that intake is higher which leads to faster satiation. This effect may be disturbed when people are distracted. Objective The objective of the study is to assess the effects of sip size in a focused state and a distracted state on ad libitum intake and on the estimated amount consumed. Design In this 3×2 cross-over design, 53 healthy subjects consumed ad libitum soup with small sips (5 g, 60 g/min), large sips (15 g, 60 g/min), and free sips (where sip size was determined by subjects themselves), in both a distracted and focused state. Sips were administered via a pump. There were no visual cues toward consumption. Subjects then estimated how much they had consumed by filling soup in soup bowls. Results Intake in the small-sip condition was ∼30% lower than in both the large-sip and free-sip conditions (P<0.001). In addition, subjects underestimated how much they had consumed in the large-sip and free-sip conditions (P<0.03). Distraction led to a general increase in food intake (P = 0.003), independent of sip size. Distraction did not influence sip size or estimations. Conclusions Consumption with large sips led to higher food intake, as expected. Large sips, that were either fixed or chosen by subjects themselves led to underestimations of the amount consumed. This may be a risk factor for over-consumption. Reducing sip or bite sizes may successfully lower food intake, even in a distracted state. PMID:23372657
77 FR 44289 - Notice of Permit Application Received Under the Antarctic Conservation Act of 1978
Federal Register 2010, 2011, 2012, 2013, 2014
2012-07-27
... small amount of waste created by the expedition team will be removed, including all fuel bottles, batteries, plastics, and non-combustible wastes, including perishable and nonperishable food wastes. The...
13 CFR 120.861 - Job creation or retention.
Code of Federal Regulations, 2012 CFR
2012-01-01
... Project must create or retain one Job Opportunity per an amount of 504 loan funding that will be specified by SBA from time to time in a Federal Register notice. Such Job Opportunity average remains in effect...
13 CFR 120.861 - Job creation or retention.
Code of Federal Regulations, 2013 CFR
2013-01-01
... Project must create or retain one Job Opportunity per an amount of 504 loan funding that will be specified by SBA from time to time in a Federal Register notice. Such Job Opportunity average remains in effect...
13 CFR 120.861 - Job creation or retention.
Code of Federal Regulations, 2011 CFR
2011-01-01
... Project must create or retain one Job Opportunity per an amount of 504 loan funding that will be specified by SBA from time to time in a Federal Register notice. Such Job Opportunity average remains in effect...
13 CFR 120.861 - Job creation or retention.
Code of Federal Regulations, 2010 CFR
2010-01-01
... Project must create or retain one Job Opportunity per an amount of 504 loan funding that will be specified by SBA from time to time in a Federal Register notice. Such Job Opportunity average remains in effect...
13 CFR 120.861 - Job creation or retention.
Code of Federal Regulations, 2014 CFR
2014-01-01
... Project must create or retain one Job Opportunity per an amount of 504 loan funding that will be specified by SBA from time to time in a Federal Register notice. Such Job Opportunity average remains in effect...
Large-scale Labeled Datasets to Fuel Earth Science Deep Learning Applications
NASA Astrophysics Data System (ADS)
Maskey, M.; Ramachandran, R.; Miller, J.
2017-12-01
Deep learning has revolutionized computer vision and natural language processing with various algorithms scaled using high-performance computing. However, generic large-scale labeled datasets such as the ImageNet are the fuel that drives the impressive accuracy of deep learning results. Large-scale labeled datasets already exist in domains such as medical science, but creating them in the Earth science domain is a challenge. While there are ways to apply deep learning using limited labeled datasets, there is a need in the Earth sciences for creating large-scale labeled datasets for benchmarking and scaling deep learning applications. At the NASA Marshall Space Flight Center, we are using deep learning for a variety of Earth science applications where we have encountered the need for large-scale labeled datasets. We will discuss our approaches for creating such datasets and why these datasets are just as valuable as deep learning algorithms. We will also describe successful usage of these large-scale labeled datasets with our deep learning based applications.
Color object detection using spatial-color joint probability functions.
Luo, Jiebo; Crandall, David
2006-06-01
Object detection in unconstrained images is an important image understanding problem with many potential applications. There has been little success in creating a single algorithm that can detect arbitrary objects in unconstrained images; instead, algorithms typically must be customized for each specific object. Consequently, it typically requires a large number of exemplars (for rigid objects) or a large amount of human intuition (for nonrigid objects) to develop a robust algorithm. We present a robust algorithm designed to detect a class of compound color objects given a single model image. A compound color object is defined as having a set of multiple, particular colors arranged spatially in a particular way, including flags, logos, cartoon characters, people in uniforms, etc. Our approach is based on a particular type of spatial-color joint probability function called the color edge co-occurrence histogram. In addition, our algorithm employs perceptual color naming to handle color variation, and prescreening to limit the search scope (i.e., size and location) for the object. Experimental results demonstrated that the proposed algorithm is insensitive to object rotation, scaling, partial occlusion, and folding, outperforming a closely related algorithm based on color co-occurrence histograms by a decisive margin.
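A simplified Python illustration of a spatial-color joint probability function in this spirit is shown below: a color co-occurrence histogram that counts pairs of quantized colors separated by a fixed pixel offset. This is not the authors' exact color edge co-occurrence histogram; the quantization level and offset are illustrative assumptions.

# Simplified color co-occurrence histogram over a fixed pixel offset.
import numpy as np

def color_cooccurrence_histogram(image, offset=(0, 4), levels=4):
    """image: HxWx3 uint8 array; returns a (levels**3, levels**3) histogram."""
    q = (image.astype(np.int32) * levels) // 256             # quantize channels
    codes = q[..., 0] * levels * levels + q[..., 1] * levels + q[..., 2]
    dy, dx = offset
    a = codes[:codes.shape[0] - dy, :codes.shape[1] - dx]
    b = codes[dy:, dx:]
    hist = np.zeros((levels ** 3, levels ** 3), dtype=np.int64)
    np.add.at(hist, (a.ravel(), b.ravel()), 1)                # count color pairs
    return hist / hist.sum()                                  # joint probability

img = (np.random.rand(64, 64, 3) * 255).astype(np.uint8)      # synthetic image
h = color_cooccurrence_histogram(img)
print(h.shape, h.sum())                                       # (64, 64), total ~1.0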
Mission concept and autonomy considerations for active Debris removal
NASA Astrophysics Data System (ADS)
Peters, Susanne; Pirzkall, Christoph; Fiedler, Hauke; Förstner, Roger
2016-12-01
Over the last 60 years, space debris has become one of the main challenges for the safe operation of satellites in low Earth orbit. To address this threat, guidelines that include limited debris release during normal operations, minimization of the potential for on-orbit break-ups, and post-mission disposal have begun to be implemented. However, in the long term, the amount of debris will still increase due to fragments created by collisions of objects in space. The active removal of at least five large debris objects per year is therefore recommended, but not yet included in those guidelines. Even though various technical concepts have been developed over the last years, the questions of how to make them reliable and safe, and how to finance such missions, have not been answered. This paper addresses the first two topics. With space debris representing an uncooperative and possibly tumbling target, close proximity operations become absolutely critical, especially when an uninterrupted connection to the ground station is not ensured. This paper therefore first defines a mission to remove at least five large objects and second introduces a preliminary autonomy concept suited to this mission.
'Sciencenet'--towards a global search and share engine for all scientific knowledge.
Lütjohann, Dominic S; Shah, Asmi H; Christen, Michael P; Richter, Florian; Knese, Karsten; Liebel, Urban
2011-06-15
Modern biological experiments create vast amounts of data which are geographically distributed. These datasets consist of petabytes of raw data and billions of documents. Yet to the best of our knowledge, a search engine technology that searches and cross-links all different data types in life sciences does not exist. We have developed a prototype distributed scientific search engine technology, 'Sciencenet', which facilitates rapid searching over this large data space. By 'bringing the search engine to the data', we do not require server farms. This platform also allows users to contribute to the search index and publish their large-scale data to support e-Science. Furthermore, a community-driven method guarantees that only scientific content is crawled and presented. Our peer-to-peer approach is sufficiently scalable for the science web without performance or capacity tradeoff. The free to use search portal web page and the downloadable client are accessible at: http://sciencenet.kit.edu. The web portal for index administration is implemented in ASP.NET, the 'AskMe' experiment publisher is written in Python 2.7, and the backend 'YaCy' search engine is based on Java 1.6.
Cloning single wall carbon nanotubes for hydrogen storage
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tour, James M; Kittrell, Carter
2012-08-30
The purpose of this research is to develop the technology required for producing 3-D nano-engineered frameworks for hydrogen storage based on sp2 carbon media, which will have high gravimetric and especially high volumetric uptake of hydrogen, in an aligned fibrous array that will take advantage of the exceptionally high thermal conductivity of sp2 carbon materials to speed up the fueling process while minimizing or eliminating the need for internal cooling systems. A limitation for nearly all storage media using physisorption of the hydrogen molecule is the large amount of surface area (SA) occupied by each H2 molecule due to its large zero-point vibrational energy. This creates a conundrum: in order to maximize SA, the physisorption media is made more tenuous and the density is decreased, usually well below 1 kg/L, so that there is a tradeoff between volumetric and gravimetric uptake. Our major goal was to develop a new type of media with high-density H2 uptake, which favors volumetric storage and which, in turn, has the capability to meet the ultimate DoE H2 goals.
Isosurface Extraction in Time-Varying Fields Using a Temporal Hierarchical Index Tree
NASA Technical Reports Server (NTRS)
Shen, Han-Wei; Gerald-Yamasaki, Michael (Technical Monitor)
1998-01-01
Many high-performance isosurface extraction algorithms have been proposed in the past several years as a result of intensive research efforts. When applying these algorithms to large-scale time-varying fields, the storage overhead incurred from storing the search index often becomes overwhelming. This paper proposes an algorithm for locating isosurface cells in time-varying fields. We devise a new data structure, called the Temporal Hierarchical Index Tree, which utilizes the temporal coherence that exists in a time-varying field and adaptively coalesces the cells' extreme values over time; the resulting extreme values are then used to create the isosurface cell search index. For a typical time-varying scalar data set, not only does this temporal hierarchical index tree require much less storage space, but the amount of I/O required to access the indices from the disk at different time steps is also substantially reduced. We illustrate the utility and speed of our algorithm with data from several large-scale time-varying CFD simulations. Our algorithm can achieve more than 80% of disk-space savings when compared with the existing techniques, while the isosurface extraction time is nearly optimal.
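The temporal-coalescing idea can be sketched briefly in Python (this is a simplification, not the paper's full data structure): each node stores the extreme values of a cell over an interval of time steps, and a cell is a candidate for isosurface extraction at time t only if the isovalue falls inside the node covering t. The fixed interval length below stands in for the adaptive coalescing described above.

# Coalesce a cell's values over fixed time intervals; query for candidacy.
from dataclasses import dataclass
from typing import List

@dataclass
class Node:
    t0: int
    t1: int                 # inclusive time interval covered by this node
    vmin: float
    vmax: float

def build_nodes(values_per_step: List[float], interval: int) -> List[Node]:
    """Coalesce a cell's per-time-step values into fixed-size intervals."""
    nodes = []
    for t0 in range(0, len(values_per_step), interval):
        chunk = values_per_step[t0:t0 + interval]
        nodes.append(Node(t0, t0 + len(chunk) - 1, min(chunk), max(chunk)))
    return nodes

def is_candidate(nodes: List[Node], t: int, isovalue: float) -> bool:
    """Does this cell possibly intersect the isosurface at time t?"""
    for n in nodes:
        if n.t0 <= t <= n.t1:
            return n.vmin <= isovalue <= n.vmax
    return False

cell_values = [0.1, 0.2, 0.25, 0.9, 1.1, 1.0, 0.3, 0.2]      # one cell over time
nodes = build_nodes(cell_values, interval=4)
print(is_candidate(nodes, t=4, isovalue=0.95))                # True
print(is_candidate(nodes, t=1, isovalue=0.95))                # False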
Dinc, Emine; Tian, Lijin; Roy, Laura M; Roth, Robyn; Goodenough, Ursula; Croce, Roberta
2016-07-05
To avoid photodamage, photosynthetic organisms are able to thermally dissipate the energy absorbed in excess in a process known as nonphotochemical quenching (NPQ). Although NPQ has been studied extensively, the major players and the mechanism of quenching remain debated. This is a result of the difficulty in extracting molecular information from in vivo experiments and the absence of a validation system for in vitro experiments. Here, we have created a minimal cell of the green alga Chlamydomonas reinhardtii that is able to undergo NPQ. We show that LHCII, the main light harvesting complex of algae, cannot switch to a quenched conformation in response to pH changes by itself. Instead, a small amount of the protein LHCSR1 (light-harvesting complex stress related 1) is able to induce a large, fast, and reversible pH-dependent quenching in an LHCII-containing membrane. These results strongly suggest that LHCSR1 acts as pH sensor and that it modulates the excited state lifetimes of a large array of LHCII, also explaining the NPQ observed in the LHCSR3-less mutant. The possible quenching mechanisms are discussed.
Poon, Candice C; Ebacher, Vincent; Liu, Katherine; Yong, Voon Wee; Kelly, John James Patrick
2018-05-03
Automated slide scanning and segmentation of fluorescently labeled tissues is the most efficient way to analyze whole slides or large tissue sections. Unfortunately, many researchers spend large amounts of time and resources developing and optimizing workflows that are only relevant to their own experiments. In this article, we describe a protocol that can be used by those with access to a widefield high-content analysis system (WHCAS) to image any slide-mounted tissue, with options for customization within pre-built modules found in the associated software. Although the WHCAS was not originally intended for slide scanning, the steps detailed in this article make it possible to acquire slide-scanning images in the WHCAS that can be imported into the associated software. In this example, the automated segmentation of brain tumor slides is demonstrated, but the automated segmentation of any fluorescently labeled nuclear or cytoplasmic marker is possible. Furthermore, there are a variety of other quantitative software modules, including assays for protein localization/translocation, cellular proliferation/viability/apoptosis, and angiogenesis, that can be run. This technique will save researchers time and effort and creates an automated protocol for slide analysis.
Photodiodes based in La0.7Sr0.3MnO3/single layer MoS2 hybrid vertical heterostructures
NASA Astrophysics Data System (ADS)
Niu, Yue; Frisenda, Riccardo; Svatek, Simon A.; Orfila, Gloria; Gallego, Fernando; Gant, Patricia; Agraït, Nicolás; Leon, Carlos; Rivera-Calzada, Alberto; Pérez De Lara, David; Santamaria, Jacobo; Castellanos-Gomez, Andres
2017-09-01
The fabrication of artificial materials by stacking individual two-dimensional (2D) materials is among the most promising research avenues in the field of 2D materials. Moreover, this strategy to fabricate new man-made materials can be further extended by fabricating hybrid stacks between 2D materials and other functional materials with different dimensionality, making the potential number of combinations almost infinite. Among all these possible combinations, mixing 2D materials with transition metal oxides can be especially useful because of the large number of interesting physical phenomena displayed separately by these two material families. We present a hybrid device based on the stacking of a single-layer MoS2 onto a lanthanum strontium manganite (La0.7Sr0.3MnO3) thin film, creating an atomically thin device. It shows rectifying electrical transport with a ratio of 10³, and a photovoltaic effect with an open-circuit voltage (Voc) of up to 0.4 V. The photodiode behaviour arises as a consequence of the different doping character of these two materials. This result paves the way towards combining the efforts of these two large materials science communities.
AC HTS Transmission Cable for Integration into the Future EHV Grid of the Netherlands
NASA Astrophysics Data System (ADS)
Zuijderduin, R.; Chevtchenko, O.; Smit, J. J.; Aanhaanen, G.; Melnik, I.; Geschiere, A.
Due to increasing power demand, the electricity grid of the Netherlands is changing. The future grid must be capable of transmitting all the connected power. Power generation will become more decentralized, with, for instance, wind parks connected to the grid. Furthermore, future large-scale production units are expected to be installed near coastal regions. This creates potential grid issues, such as the need to transmit large amounts of power from west to east to consumers, and concerns about grid stability. High temperature superconductors (HTS) can help solve these grid problems. The advantages of integrating HTS components at Extra High Voltage (EHV) and High Voltage (HV) levels are numerous: more power with lower losses and emissions, intrinsic fault current limiting capability, better control of power flow, reduced footprint, etc. Today's main obstacle is the relatively high price of HTS. Nevertheless, as the price goes down, initial market penetration for several HTS components is expected by the year 2015 (e.g. cables, fault current limiters). In this paper we present a design of an intrinsically compensated EHV HTS cable for future grid integration. The parameters of such a cable, providing optimal power transmission in the future network, are discussed.
NASA Astrophysics Data System (ADS)
Jose, Tony; Narayanan, Vijayakumar
2018-03-01
Radio over fiber (RoF) systems use a large number of base stations (BSs) and a number of central stations (CSs), which are interlinked to form the network. RoF systems use multiple wavelengths for communication between CSs or between CSs and BSs to accommodate the huge amount of data traffic generated by the multiple services offered to a large number of users. When erbium-doped fiber amplifiers (EDFAs) are used as amplifiers in such wavelength-division multiplexed systems, the nonuniform gain spectrum of EDFAs causes instability in some channels while providing faithful amplification to others. To avoid this inconsistency, the gain spectrum of the amplifier needs to be uniform across the whole usable range of wavelengths. A gain contouring technique is proposed to provide uniform gain to all channels irrespective of wavelength. Optical add/drop multiplexers (OADMs) and different lengths of erbium-doped fiber are used to create such a gain contouring mechanism in the optical domain itself. The effect of a cascade of nonuniform-gain amplifiers is studied, and the proposed system effectively mitigates the adverse effects caused by nonuniform gain-induced channel instability.
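The motivation for flattening the gain is easiest to see numerically: per-stage ripple in dB accumulates linearly along a cascade of identical EDFAs unless a compensating loss profile is inserted after each stage. The toy model below is purely illustrative; the ripple shape, the 20 dB flat gain, and the 10-stage cascade are assumptions, not values from the paper.

```python
import numpy as np

wavelengths = np.linspace(1530, 1560, 31)                         # nm, C-band channels
ripple_db = 1.5 * np.sin(2 * np.pi * (wavelengths - 1530) / 30)   # assumed gain ripple
flat_gain_db = 20.0

def cascade_gain(n_stages, flattening_loss_db):
    """Net per-channel gain (dB) after n_stages identical amplifier stages."""
    per_stage = flat_gain_db + ripple_db - flattening_loss_db
    return n_stages * per_stage

uncompensated = cascade_gain(10, 0.0)
compensated = cascade_gain(10, ripple_db)    # ideal per-stage flattening filter
print("channel-to-channel gain spread, no flattening: %.1f dB" % np.ptp(uncompensated))
print("channel-to-channel gain spread, flattened:     %.1f dB" % np.ptp(compensated))
```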
Major transitions in information technology
Valverde, Sergi
2016-01-01
When looking at the history of technology, we can see that not all inventions are of equal importance. Only a few technologies have the potential to start a new branching series (specifically, by increasing diversity), have a lasting impact on human life and ultimately become turning points. Technological transitions correspond to times and places in the past when a large number of novel artefact forms or behaviours appeared together or in rapid succession. Why does that happen? Is technological change continuous and gradual, or does it occur in sudden leaps and bounds? The evolution of information technology (IT) allows for a quantitative and theoretical approach to technological transitions. The value of information systems experiences sudden changes (i) when we learn how to use this technology, (ii) when we accumulate a large amount of information, and (iii) when communities of practice create and exchange free information. The coexistence between gradual improvements and discontinuous technological change is a consequence of the asymmetric relationship between complexity and hardware and software. Using a cultural evolution approach, we suggest that sudden changes in the organization of ITs depend on the high costs of maintaining and transmitting reliable information. This article is part of the themed issue ‘The major synthetic evolutionary transitions’. PMID:27431527
An intelligent tool for activity data collection.
Sarkar, A M Jehad
2011-01-01
Activity recognition systems using simple and ubiquitous sensors require a large variety of real-world sensor data, not only for evaluating their performance but also for training the systems to function better. However, a tremendous amount of effort is required to set up an environment for collecting such data. For example, expertise and resources are needed to design and install the sensors, controllers, network components, and middleware just to perform basic data collections. It is therefore desirable to have a data collection method that is inexpensive, flexible, user-friendly, and capable of providing large and diverse activity datasets. In this paper, we propose an intelligent activity data collection tool which has the ability to provide such datasets inexpensively without physically deploying the testbeds. It can be used as an inexpensive alternative technique to collect human activity data. The tool provides a set of web interfaces to create a web-based activity data collection environment. It also provides a web-based experience sampling tool to take the user's activity input. The tool generates an activity log using its activity knowledge and the user-given inputs. The activity knowledge is mined from the web. We have performed two experiments to validate the tool's performance in producing reliable datasets.
Model for fluorescence quenching in light harvesting complex II in different aggregation states.
Andreeva, Atanaska; Abarova, Silvia; Stoitchkova, Katerina; Busheva, Mira
2009-02-01
Low-temperature (77 K) steady-state fluorescence emission spectroscopy and dynamic light scattering were applied to the main chlorophyll a/b protein light harvesting complex of photosystem II (LHC II) in different aggregation states to elucidate the mechanism of fluorescence quenching within LHC II oligomers. Evidence is presented that LHC II oligomers are heterogeneous and consist of large and small particles with different fluorescence yields. At intermediate detergent concentrations the mean size of the small particles is similar to that of trimers, while the size of large particles is comparable to that of aggregated trimers without added detergent. It is suggested that in small particles and trimers the emitter is monomeric chlorophyll, whereas in large aggregates there is also another emitter, which is a poorly fluorescing chlorophyll associate. A model describing populations of antenna chlorophyll molecules in small and large aggregates in their ground and first singlet excited states is considered. The model enables us to obtain the ratio of the singlet excited-state lifetimes in small and large particles, the relative amount of chlorophyll molecules in large particles, and the amount of quenchers as a function of the degree of aggregation. These dependencies reveal that the quenching of the chl a fluorescence upon aggregation is due to the formation of large aggregates and the increase in the amount of chlorophyll molecules forming these aggregates. As a consequence, the amount of quenchers located in large aggregates increases, and their singlet excited-state lifetimes steeply decrease.
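A minimal two-population sketch (our notation, not necessarily the authors' exact formulation) shows how such a model links fluorescence yield to aggregation: if a fraction n_L(x) of the chlorophyll resides in large aggregates with singlet excited-state lifetime τ_L, and the remainder in small, trimer-like particles with lifetime τ_S, the relative yield at degree of aggregation x is

\[
\frac{F(x)}{F_{0}} \;=\; \bigl[1 - n_{L}(x)\bigr] \;+\; n_{L}(x)\,\frac{\tau_{L}}{\tau_{S}} ,
\]

so fitting F(x)/F_0 against the degree of aggregation constrains both the lifetime ratio τ_L/τ_S and the fraction of chlorophyll molecules residing in large aggregates.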
Using DoD Maps to Examine the Influence of Large Wood on Channel Morphodynamics
NASA Astrophysics Data System (ADS)
MacKenzie, L. C.; Eaton, B. C.
2012-12-01
Since the advent of logging and slash burning, many streams in British Columbia have experienced changes to the amount of large wood added to or removed from these systems, which has, in turn, influenced the storage and movement of sediment within these channels. This set of flume experiments examines and quantifies the impacts of large wood on reach-scale morphodynamics. Understanding the relation between the wood load and channel morphodynamics is important when assessing the quality of the aquatic habitat of a stream. The experiments were conducted using a fixed-bank, mobile-bed Froude-scaled physical model of Fishtrap Creek, British Columbia, built in a shallow flume that is 1.5 m wide and 11 m long. The stream table was run without wood until it reached equilibrium, at which point wood pieces of varying sizes were added to the channel. The bed morphology was surveyed using a laser profiling system at five-hour intervals. The laser profiles were then interpolated to create digital elevation models (DEMs), from which DEM of difference (DoD) maps were produced. Analysis of the DoD maps focused on quantifying and locating differences in the distribution of sediment storage, erosion, and deposition between the runs as well as those induced by the addition of large wood into the stream channel. We then assessed the typical influence of individual pieces and of jams on pool frequency, size and distribution along the channels.
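For readers unfamiliar with DoD maps, the calculation itself is a simple grid difference; the sketch below is a generic illustration (file names, cell size, and detection threshold are assumptions, not values from the experiments).

```python
# DEM of Difference (DoD): subtract two gridded bed surveys and summarize
# deposition and erosion volumes.
import numpy as np

cell_area = 0.01        # m^2 per grid cell (assumed 0.1 m x 0.1 m grid)
threshold = 0.002       # m, minimum detectable elevation change (assumed)

dem_before = np.load("dem_run1.npy")    # interpolated laser survey, earlier run
dem_after = np.load("dem_run2.npy")     # interpolated laser survey, later run

dod = dem_after - dem_before            # positive = deposition, negative = erosion
dod[np.abs(dod) < threshold] = 0.0      # mask changes below the detection limit

deposition_m3 = dod[dod > 0].sum() * cell_area
erosion_m3 = -dod[dod < 0].sum() * cell_area
print(f"deposition: {deposition_m3:.3f} m^3, erosion: {erosion_m3:.3f} m^3")
```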
Sundramoorthy, Ashok K.; Wang, Yilei; Wang, Jing; Che, Jianfei; Thong, Ya Xuan; Lu, Albert Chee W.; Chan-Park, Mary B.
2015-01-01
Graphene is a promising candidate material for transparent conductive films because of its excellent conductivity and one-carbon-atom thickness. Graphene oxide flakes prepared by the Hummers method are typically several microns in size and must be pieced together in order to create macroscopic films. We report a macro-scale thin film fabrication method which employs a three-dimensional (3-D) surfactant, 4-sulfocalix[4]arene (SCX), as a lateral aggregating agent. After electrochemical exfoliation, the partially oxidized graphene (oGr) flakes are dispersed with SCX. The SCX forms micelles, which adsorb on the oGr flakes to enhance their dispersion and also promote aggregation into large-scale thin films under vacuum filtration. A thin oGr/SCX film can be shaved off from the aggregated oGr/SCX cake by immersing the cake in water. The oGr/SCX thin film floating on the water can subsequently be lifted from the water surface with a substrate. The reduced oGr (red-oGr) films can be as thin as 10-20 nm with a transparency of >90% and sheet resistance of 890 ± 47 kΩ/sq. This method of electrochemical exfoliation, followed by SCX-assisted suspension and hydrazine reduction, avoids using large amounts of strong acid (unlike the Hummers method), is relatively simple, and can easily form a large-scale conductive and transparent film from an oGr/SCX suspension. PMID:26040436
Simulations of the flocculent spiral M33: what drives the spiral structure?
NASA Astrophysics Data System (ADS)
Dobbs, C. L.; Pettitt, A. R.; Corbelli, E.; Pringle, J. E.
2018-05-01
We perform simulations of isolated galaxies in order to investigate the likely origin of the spiral structure in M33. In our models, we find that gravitational instabilities in the stars and gas are able to reproduce the observed spiral pattern and velocity field of M33, as seen in HI, and no interaction is required. We also find that the optimum models have high levels of stellar feedback, which create large holes similar to those observed in M33, whilst lower levels of feedback tend to produce a large amount of small-scale structure and long, undisturbed filaments of high surface density gas that are hardly detected in the M33 disc. The gas component appears to have a significant role in producing the structure, so if there is little feedback, both the gas and stars organise into clear spiral arms, likely due to a lower combined Q (using gas and stars) and the ready ability of cold gas to undergo spiral shocks. By contrast, models with higher feedback have weaker spiral structure, especially in the stellar component, compared to grand design galaxies. We did not see a large difference in the behaviour of Q_stars across most of these models, however, because Q_stars stayed relatively constant unless the disc was more strongly unstable. Our models suggest that although the stars produce some underlying spiral structure, this is relatively weak, and the gas physics has a considerable role in producing the large-scale structure of the ISM in flocculent spirals.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jablonská, Jana, E-mail: jana.jablonska@vsb.cz; Kozubková, Milada, E-mail: milada.kozubkova@vsb.cz
Cavitation is today a very important problem that is addressed by means of experimental and mathematical methods. The article deals with the generation of cavitation in a convergent-divergent nozzle of rectangular cross section. Measurements of pressure, flow rate, temperature and the amount of dissolved air in the liquid, together with visualization of the cavitation area using a high-speed camera, were performed for different flow rates. The measurement results were generalized by dimensionless analysis, which allows easy detection of cavitation in the nozzle. For the numerical simulation, a multiphase mathematical model of cavitation consisting of water and vapor was created. During verification, disagreement with the measurements at higher flow rates was found; therefore, the model was extended to a three-phase mathematical model (water, vapor and air) to account for the release of dissolved air. For the mathematical modeling, the multiphase turbulence RNG k-ε model for low Reynolds number flow with vapor and air cavitation was used. Subsequently, the sizes of the cavitation area were verified. In the article, the inlet pressure and loss coefficient are evaluated as functions of the amount of air added to the mathematical model. On the basis of this approach, a methodology may be created to estimate the amount of released air added at the inlet to the modeled area.
Preparation of biochar from sewage sludge
NASA Astrophysics Data System (ADS)
Nieto, Aurora; María Méndez, Ana; Gascó, Gabriel
2013-04-01
Biomass waste materials appropriate for biochar production include crop residues (both field residues and processing residues such as nut shells, fruit pits, bagasse, etc), as well as yard, food and forestry wastes, and animal manures. Biochar can and should be made from biomass waste materials and must not contain unacceptable levels of toxins such as heavy metals, which can be found in sewage sludge and industrial or landfill waste. Making biochar from biomass waste materials should create no competition for land with any other land use option—such as food production or leaving the land in its pristine state. Large amounts of agricultural, municipal and forestry biomass are currently burned or left to decompose and release CO2 and methane back into the atmosphere. They also can pollute local ground and surface waters—a large issue for livestock wastes. Using these materials to make biochar not only removes them from a pollution cycle, but biochar can be obtained as a by-product of producing energy from this biomass. Sewage sludge is a by-product from wastewater treatment plants, and contains significant amounts of heavy metals, organic toxins and pathogenic microorganisms, which are considered to be harmful to the environment and all living organisms. Agricultural use, landfilling and incineration are commonly used as disposal methods. It was, however, reported that sewage sludge applications in agriculture give rise to an accumulation of harmful components (heavy metals and organic compounds) in soil. For this reason, pyrolysis can be considered a promising technique to treat sewage sludge, including the production of fuels. The objective of this work is to study the advantages of biochar prepared from sewage sludge.
Multifaceted re-analysis of the enigmatic Kitimat slide complex, Canada
NASA Astrophysics Data System (ADS)
Stacey, Cooper D.; Lintern, D. Gwyn; Enkin, Randolph J.
2018-07-01
Repeat submarine landslides are challenging to study due to the tendency of subsequent slides to destroy previous deposits. Repeat slides are common in fjord-head deltas, where high amounts of sediment are focused in narrow valleys. This study examines a well-known slide deposit associated with the Kitimat Delta on Canada's west coast that has been linked to tsunamigenic landslides in 1974 and 1975. For the first time we incorporate multibeam bathymetry into a multifaceted dataset including new high-resolution acoustic data and sediment cores to examine the history of submarine slides at the Kitimat Delta. Based on morphological analysis and age modelling using 210Pb and 14C data, we determine that the complex surface morphology of the slide lobe consists of at least two large slide deposits that reach 5 km from the delta: the known event that occurred in 1975 and an older event that occurred at 623 ± 83 cal BP (95% confidence interval). We demonstrate that slide deposits can be differentiated based on surface morphology and acoustic character. This is confirmed by age modelling. The 1975 slide resulted in a flow that ploughed through the seabed, creating compression and translation along a basal shear plane and resulting in deep deformation and a surface characterized by pressure ridges. The 623 ± 83 cal BP event resulted in a large amount of blocky slide material that overran the former seafloor and was transported >5 km from the delta front. Several buried events are observed at depth, one of which occurred at 2592 ± 84 cal BP, appears to be on the same order of magnitude as the 1975 event, and shows very similar acoustic characteristics. As for hazard implications, we show that submarine landslides of varying sizes have naturally occurred on this delta throughout the past several thousand years.
Langland, Michael J.
2009-01-01
The Susquehanna River transports a substantial amount of the sediment and nutrient load to the Chesapeake Bay. Upstream of the bay, three large dams and their associated reservoirs trap a large amount of the transported sediment and associated nutrients. During the fall of 2008, the U.S. Geological Survey in cooperation with the Pennsylvania Department of Environmental Protection completed bathymetric surveys of three reservoirs on the lower Susquehanna River to provide an estimate of the remaining sediment-storage capacity. Previous studies indicated the upper two reservoirs were in equilibrium with long-term sediment storage; only the most downstream reservoir retained capacity to trap sediments. A differential global positioning system (DGPS) instrument was used to provide the corresponding coordinate position. Bathymetry data were collected using a single beam 210 kHz (kilohertz) echo sounder at pre-defined transects that matched previous surveys. Final horizontal (X and Y) and vertical (Z) coordinates of the geographic positions and depth to bottom were used to create bathymetric maps of the reservoirs. Results indicated that from 1996 to 2008 about 14,700,000 tons of sediment were deposited in the three reservoirs with the majority (12,000,000 tons) being deposited in Conowingo Reservoir. Approximately 20,000 acre-feet or 30,000,000 tons of remaining storage capacity is available in Conowingo Reservoir. At current transport (3,000,000 tons per year) and deposition (2,000,000 tons per year) rates and with no occurrence of major scour events due to floods, the remaining capacity may be filled in 15 to 20 years. Once the remaining sediment-storage capacity in the reservoirs is filled, sediment and associated phosphorus loads entering the Chesapeake Bay are expected to increase.
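The quoted time to fill follows from simple arithmetic on the figures above (our restatement):

\[
t_{\mathrm{fill}} \;\approx\; \frac{\text{remaining capacity}}{\text{deposition rate}}
\;=\; \frac{3\times10^{7}\ \text{tons}}{2\times10^{6}\ \text{tons/yr}}
\;=\; 15\ \text{yr},
\]

with years of lower deposition pushing the estimate toward the upper end of the 15 to 20 year range.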
NEMO: Extraction and normalization of organization names from PubMed affiliations.
Jonnalagadda, Siddhartha Reddy; Topham, Philip
2010-10-04
Today, there are more than 18 million articles related to biomedical research indexed in MEDLINE, and information derived from them could be used effectively to save the great amount of time and resources spent by government agencies in understanding the scientific landscape, including key opinion leaders and centers of excellence. Associating biomedical articles with organization names could significantly benefit the pharmaceutical marketing industry, health care funding agencies and public health officials, and be useful for other scientists in normalizing author names, automatically creating citations, indexing articles and identifying potential resources or collaborators. A large amount of extracted information helps in disambiguating organization names using machine-learning algorithms. We propose NEMO, a system for extracting organization names from the affiliation field and normalizing them to a canonical organization name. Our parsing process involves multi-layered rule matching with multiple dictionaries. The system achieves more than 98% f-score in extracting organization names. Our normalization process involves clustering based on local sequence alignment metrics and local learning based on finding connected components. High precision was also observed in normalization. NEMO is the missing link in associating each biomedical paper and its authors to an organization name in its canonical form and the geopolitical location of the organization. This research could potentially help in analyzing large social networks of organizations for landscaping a particular topic, improving the performance of author disambiguation, adding weak links in the co-author network of authors, augmenting NLM's MARS system for correcting errors in OCR output of the affiliation field, and automatically indexing PubMed citations with the normalized organization name and country. Our system is available as a graphical user interface that can be downloaded along with this paper.
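A minimal sketch of the clustering step described above (not NEMO's actual implementation): a cheap string-similarity score stands in for the local sequence alignment metric, and name variants whose similarity exceeds a threshold are merged via connected components. The example strings and the 0.85 threshold are assumptions.

```python
from difflib import SequenceMatcher
from itertools import combinations
import networkx as nx

variants = [
    "Dept. of Biomedical Informatics, Arizona State University",
    "Department of Biomedical Informatics, Arizona State Univ.",
    "Mayo Clinic, Rochester, MN",
]

def similarity(a: str, b: str) -> float:
    """Stand-in for a normalized local alignment score in [0, 1]."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

g = nx.Graph()
g.add_nodes_from(variants)
for a, b in combinations(variants, 2):
    if similarity(a, b) > 0.85:        # assumed similarity threshold
        g.add_edge(a, b)

# Each connected component is treated as one organization; pick a canonical form.
for component in nx.connected_components(g):
    canonical = max(component, key=len)
    print(canonical, "<-", sorted(component))
```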
Silica Debris Disk Evidence for Giant Planet Forming Impacts
NASA Astrophysics Data System (ADS)
Lisse, C.
2014-04-01
Giant impacts are major formation events in the history of our solar system. The final assembly of the planets, as we understand it, had to include massive fast collision events as the planets grew to objects with large escape velocities or in regions of high Keplerian velocities (Chambers 2004; Kenyon & Bromley 2004a,b, 2006; Fegley & Schaefer 2005). These massive impact events should create large amounts of glassy silica material derived from the rapid melting, vaporization, and refreezing of normal silicate-rich primitive rocky material. We report here the detection of 4 bright silica-rich debris disks in the Spitzer IRS spectral archive, and the possible identification of 7 others. The stellar types of the system primaries span from A5V to G0V, their ages are 10 - 100 Myr, and the dust is warm, 280 - 480 K, and is located between 1.5 and 6 AU, well inside the systems' terrestrial planet regions. The minimum amount of detected 0.1 - 20 μm dust mass ranges from 10^21 - 10^23 kg; assuming < 10% dust formation efficiency (Benz 2009, 2011) this implies collisions involving impactors massing at least 10^22 - 10^24 kg, i.e. from Moon to Earth mass. We find possible trends in the mineralogy of the silica, with predominantly amorphous silica found in the 2 younger systems, and crystalline silica in the older systems. We speculate this is due to higher-velocity impacts found in younger, hotter systems, coupled with the effects of energetic photon annealing of small amorphous silica grains. All of these measures are consistent with the creation of silica-rich rubble, or construction debris, during the terrestrial planet formation era of giant impacts.
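The impactor masses quoted above follow directly from the assumed dust production efficiency (a back-of-the-envelope restatement, not an additional result):

\[
M_{\mathrm{impactor}} \;\gtrsim\; \frac{M_{\mathrm{dust}}}{\epsilon}
\;=\; \frac{10^{21}\text{--}10^{23}\ \mathrm{kg}}{0.1}
\;=\; 10^{22}\text{--}10^{24}\ \mathrm{kg},
\]

roughly spanning the mass range from the Moon (about 7 × 10^22 kg) to the Earth (about 6 × 10^24 kg).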
Renewable energy recovery through selected industrial wastes
NASA Astrophysics Data System (ADS)
Zhang, Pengchong
Typically, industrial waste treatment costs a large amount of capital and creates environmental concerns as well. A sound alternative for treating these industrial wastes is anaerobic digestion. This technique reduces environmental pollution and recovers renewable energy from the organic fraction of the selected industrial wastes, mostly in the form of biogas (methane). By applying anaerobic techniques, selected industrial wastes can be converted from cash-negative materials into economical energy feedstocks. In this study, three kinds of industrial wastes (paper mill wastes, brown grease, and corn-ethanol thin stillage) were selected; their performance in the anaerobic digestion system was studied, and their applicability was investigated as well. A pilot-scale system, including an anaerobic section (homogenization, pre-digestion, and anaerobic digestion) and an aerobic section (activated sludge), was applied to the selected waste streams. The selected waste streams were investigated in a progressive order. For paper mill effluents, since those effluents contain a large amount of recalcitrant or toxic compounds, the anaerobic-aerobic system was used to check their treatability, including organic removal efficiency, substrate utilization rate, and methane yield. The results showed the selected effluents were anaerobically treatable. For brown grease, as it is already well known as a treatable substrate, a high-rate anaerobic digester was applied to assess the economics of this substrate, including methane yield and substrate utilization rate. These pilot-scale data have the potential to be applied to a full-scale plant. For thin stillage, an anaerobic digestion system was incorporated into the traditional ethanol-making process as a gate-to-gate process. The performance of the anaerobic digester was used in a gate-to-gate life-cycle analysis to estimate the energy savings and industrial cost savings in a typical ethanol plant.
Application of identifying transmission spheres for spherical surface testing
NASA Astrophysics Data System (ADS)
Han, Christopher B.; Ye, Xin; Li, Xueyuan; Wang, Quanzhao; Tang, Shouhong; Han, Sen
2017-06-01
We developed a new application based on Microsoft Foundation Classes (MFC) to identify correct transmission spheres (TS) for Spherical Surface Testing (SST). Spherical surfaces are important optical surfaces, and the wide application and high production rate of spherical surfaces necessitate an accurate and highly reliable measuring device. A Fizeau Interferometer is an appropriate tool for SST due to its subnanometer accuracy. It measures the contour of a spherical surface using a common path, which makes it insensitive to the surrounding environment. The Fizeau Interferometer transmits a wide laser beam, creating interference fringes from re-converging light from the transmission sphere and the test surface. To make a successful measurement, the application calculates and determines the appropriate transmission sphere for the test surface. There are 3 main inputs from the test surfaces that are utilized to determine the optimal sizes and F-numbers of the transmission spheres: (1) the curvatures (concave or convex), (2) the Radii of Curvature (ROC), and (3) the aperture sizes. The application will firstly calculate the F-numbers (i.e. ROC divided by aperture) of the test surface, secondly determine the correct aperture size of a convex surface, thirdly verify that the ROC of the test surface is shorter than the ROC of the transmission sphere's reference surface, and lastly calculate the percentage of the test surface area that will be measured. However, the number of interferometers and transmission spheres should be optimized when measuring large spherical surfaces, to avoid requiring a large inventory of interferometers and transmission spheres for each test surface. Current measuring practices involve tedious and potentially inaccurate calculations. This smart application eliminates human calculation errors, optimizes the selection of transmission spheres (including the least number required) and interferometer sizes, and increases efficiency.
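A simplified sketch of the selection logic described above (our reconstruction, not the authors' MFC code); the inventory of transmission spheres and the coverage formula are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class TransmissionSphere:
    f_number: float   # F/# of the TS reference surface
    ref_roc: float    # radius of curvature of the reference surface, mm

# Hypothetical inventory; real TS catalogs differ.
INVENTORY = [
    TransmissionSphere(f_number=0.75, ref_roc=75.0),
    TransmissionSphere(f_number=1.5, ref_roc=150.0),
    TransmissionSphere(f_number=3.3, ref_roc=495.0),
]

def rank_transmission_spheres(test_roc_mm: float, test_aperture_mm: float):
    """Rank candidate TSs for a spherical test surface.

    Step 1: compute the test-surface F/# (ROC divided by aperture).
    Step 2: keep only TSs whose reference-surface ROC exceeds the test ROC.
    Step 3: estimate the measured-area fraction as min(1, F#_test / F#_TS)
            (assumed geometry: a slower TS illuminates only part of the aperture).
    """
    test_f = abs(test_roc_mm) / test_aperture_mm
    ranked = []
    for ts in INVENTORY:
        if abs(test_roc_mm) >= ts.ref_roc:
            continue
        coverage = min(1.0, test_f / ts.f_number)
        ranked.append((ts, coverage))
    return sorted(ranked, key=lambda pair: pair[1], reverse=True)

for ts, cov in rank_transmission_spheres(test_roc_mm=120.0, test_aperture_mm=60.0):
    print(f"F/{ts.f_number} TS (ROC {ts.ref_roc} mm): ~{cov:.0%} of the surface measured")
```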
Mapping patient safety: a large-scale literature review using bibliometric visualisation techniques.
Rodrigues, S P; van Eck, N J; Waltman, L; Jansen, F W
2014-03-13
The amount of scientific literature available is often overwhelming, making it difficult for researchers to have a good overview of the literature and to see relations between different developments. Visualisation techniques based on bibliometric data are helpful in obtaining an overview of the literature on complex research topics, and have been applied here to the topic of patient safety (PS). On the basis of title words and citation relations, publications in the period 2000-2010 related to PS were identified in the Scopus bibliographic database. A visualisation of the most frequently cited PS publications was produced based on direct and indirect citation relations between publications. Terms were extracted from titles and abstracts of the publications, and a visualisation of the most important terms was created. The main PS-related topics studied in the literature were identified using a technique for clustering publications and terms. A total of 8480 publications were identified, of which the 1462 most frequently cited ones were included in the visualisation. The publications were clustered into 19 clusters, which were grouped into three categories: (1) magnitude of PS problems (42% of all included publications); (2) PS risk factors (31%) and (3) implementation of solutions (19%). In the visualisation of PS-related terms, five clusters were identified: (1) medication; (2) measuring harm; (3) PS culture; (4) physician; (5) training, education and communication. Both the publication-level and the term-level analyses indicate an increasing focus on risk factors. A bibliometric visualisation approach makes it possible to analyse large amounts of literature. This approach is very useful for improving one's understanding of a complex research topic such as PS and for suggesting new research directions or alternative research priorities. For PS research, the approach suggests that more research on implementing PS improvement initiatives might be needed.
Industrial applications of hot dry rock geothermal energy
NASA Astrophysics Data System (ADS)
Duchane, D. V.
1992-07-01
Geothermal resources in the form of naturally occurring hot water or steam have been utilized for many years. While these hydrothermal resources are found in many places, the general case is that the rock at depth is hot but does not contain significant amounts of mobile fluid. An extremely large amount of geothermal energy is found around the world in this hot dry rock (HDR). Technology has been under development for more than twenty years at the Los Alamos National Laboratory in the United States and elsewhere to extract the geothermal energy from HDR in a form useful for electricity generation, space heating, or industrial processing. HDR technology is especially attractive for industrial applications because of the ubiquitous distribution of the HDR resource and the unique aspects of the process developed to recover it. In the HDR process, as developed at Los Alamos, water is pumped down a well under high pressure to open up natural joints in hot rock and create an artificial geothermal reservoir. Energy is extracted by circulating water through the reservoir. Pressurized hot water is returned to the surface through the production well, and its thermal energy is extracted for practical use. The same water is then recirculated through the system to mine more geothermal heat. Construction of a pilot HDR facility at Fenton Hill, NM, USA, has recently been completed by the Los Alamos National Laboratory. It consists of a large underground reservoir, a surface plant, and the connecting wellbores. This paper describes HDR technology and the current status of the development program. Novel industrial applications of geothermal energy based on the unique characteristics of the HDR energy extraction process are discussed.
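As a rough illustration of the energy-extraction step (our own round numbers, not figures from the Fenton Hill plant), the thermal power delivered by the circulating water is the mass flow rate times the specific heat times the temperature drop across the surface heat exchanger:

\[
P_{\mathrm{th}} \;=\; \dot m \, c_{p} \, \Delta T
\;\approx\; 50\ \mathrm{kg\,s^{-1}} \times 4.2\ \mathrm{kJ\,kg^{-1}\,K^{-1}} \times 150\ \mathrm{K}
\;\approx\; 3.2\times10^{4}\ \mathrm{kW} \;\approx\; 32\ \mathrm{MW_{th}} ,
\]

which is why even modest circulation rates through a hot reservoir can supply industrially useful amounts of heat.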
DOE Office of Scientific and Technical Information (OSTI.GOV)
Estrada, Paul R.; Cuzzi, Jeffrey N.; Morgan, Demitri A., E-mail: Paul.R.Estrada@nasa.gov
2016-02-20
We model particle growth in a turbulent, viscously evolving protoplanetary nebula, incorporating sticking, bouncing, fragmentation, and mass transfer at high speeds. We treat small particles using a moments method and large particles using a traditional histogram binning, including a probability distribution function of collisional velocities. The fragmentation strength of the particles depends on their composition (icy aggregates are stronger than silicate aggregates). The particle opacity, which controls the nebula thermal structure, evolves as particles grow and mass redistributes. While growing, particles drift radially due to nebula headwind drag. Particles of different compositions evaporate at “evaporation fronts” (EFs) where the midplane temperature exceeds their respective evaporation temperatures. We track the vapor and solid phases of each component, accounting for advection and radial and vertical diffusion. We present characteristic results in evolutions lasting 2 × 10^5 years. In general, (1) mass is transferred from the outer to the inner nebula in significant amounts, creating radial concentrations of solids at EFs; (2) particle sizes are limited by a combination of fragmentation, bouncing, and drift; (3) “lucky” large particles never represent a significant amount of mass; and (4) restricted radial zones just outside each EF become compositionally enriched in the associated volatiles. We point out implications for millimeter to submillimeter SEDs and the inference of nebula mass, radial banding, the role of opacity on new mechanisms for generating turbulence, the enrichment of meteorites in heavy oxygen isotopes, variable and nonsolar redox conditions, the primary accretion of silicate and icy planetesimals, and the makeup of Jupiter’s core.
Development of large engineered cartilage constructs from a small population of cells.
Brenner, Jillian M; Kunz, Manuela; Tse, Man Yat; Winterborn, Andrew; Bardana, Davide D; Pang, Stephen C; Waldman, Stephen D
2013-01-01
Confronted with articular cartilage's limited capacity for self-repair, joint resurfacing techniques offer an attractive treatment for damaged or diseased tissue. Although tissue engineered cartilage constructs can be created, a substantial number of cells are required to generate sufficient quantities of tissue for the repair of large defects. As routine cell expansion methods tend to elicit negative effects on chondrocyte function, we have developed an approach to generate phenotypically stable, large-sized engineered constructs (≥3 cm^2) directly from a small amount of donor tissue or cells (as little as 20,000 cells to generate a 3 cm^2 tissue construct). Using rabbit donor tissue, the bioreactor-cultivated constructs were hyaline-like in appearance and possessed a biochemical composition similar to native articular cartilage. Longer bioreactor cultivation times resulted in increased matrix deposition and improved mechanical properties determined over a 4-week period. Additionally, as the anatomy of the joint will need to be taken into account to effectively resurface large affected areas, we have also explored the possibility of generating constructs matched to the shape and surface geometry of a defect site through the use of rapid-prototyped defect tissue culture molds. Similar hyaline-like tissue constructs were developed that also possessed a high degree of shape correlation to the original defect mold. Future studies will be aimed at determining the effectiveness of this approach to the repair of cartilage defects in an animal model and the creation of large-sized osteochondral constructs. Copyright © 2012 American Institute of Chemical Engineers (AIChE).
NASA Astrophysics Data System (ADS)
Roy, Kenneth I.; Kennedy, Robert G., III; Fields, David E.
2013-02-01
The traditional concept of terraforming assumes ready availability of candidate planets with acceptable qualities: orbiting a star in its "Goldilocks zone", liquid water, enough mass, years longer than days, magnetic field, etc. But even stipulating affordable interstellar travel, we still might never find a good candidate elsewhere. Whatever we found likely would require centuries of heavy terraforming, just as Mars or Venus would here. Our increasing appreciation of the ubiquity of life suggests that any terra nova would already possess it. We would then face the dilemma of introducing alien life forms (us, our microbes) into another living world. Instead, we propose a novel method to create habitable environments for humanity by enclosing airless, sterile, otherwise useless planets, moons, and even large asteroids within engineered shells, which avoids the conundrum. These shells are subject to two opposing internal stresses: compression due to the primary's gravity, and tension from atmospheric pressure contained inside. By careful design, these two cancel each other resulting in zero net shell stress. Beneath the shell an Earth-like environment could be created similar in almost all respects to that of Home, except for gravity, regardless of the distance to the sun or other star. Englobing a small planet, moon, or even a dwarf planet like Ceres, would require astronomical amounts of material (quadrillions of tons) and energy, plus a great deal of time. It would be a quantum leap in difficulty over building Dyson Dots or industrializing our solar system, perhaps comparable to a mission across interstellar space with a living crew within their lifetime. But when accomplished, these constructs would be complete (albeit small) worlds, not merely large habitats. They could be stable across historic timescales, possibly geologic. Each would contain a full, self-sustaining ecology, which might evolve in curious directions over time. This has interesting implications for SETI as well.
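A back-of-the-envelope version of the stress-cancellation argument (our own sketch, using round numbers for a Ceres-sized body, not figures from the authors): for a thin shell of areal density σ = ρ_s t enclosing gas at pressure P on a body with surface gravity g, the outward pressure load and the inward gravitational load per unit area cancel when

\[
P \;=\; \sigma\, g
\quad\Longrightarrow\quad
\sigma \;=\; \frac{P}{g} \;\approx\; \frac{1.0\times10^{5}\ \mathrm{Pa}}{0.28\ \mathrm{m\,s^{-2}}}
\;\approx\; 3.6\times10^{5}\ \mathrm{kg\,m^{-2}} .
\]

Spread over Ceres's surface area (4πR² with R ≈ 470 km, about 2.8 × 10^12 m²), this areal density corresponds to roughly 10^18 kg of shell material, on the order of a quadrillion tonnes, consistent with the astronomical amounts of material noted above.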
Discovering semantic features in the literature: a foundation for building functional associations
Chagoyen, Monica; Carmona-Saez, Pedro; Shatkay, Hagit; Carazo, Jose M; Pascual-Montano, Alberto
2006-01-01
Background: Experimental techniques such as DNA microarray, serial analysis of gene expression (SAGE) and mass spectrometry proteomics, among others, are generating large amounts of data related to genes and proteins at different levels. As in any other experimental approach, it is necessary to analyze these data in the context of previously known information about the biological entities under study. The literature is a particularly valuable source of information for experiment validation and interpretation. Therefore, the development of automated text mining tools to assist in such interpretation is one of the main challenges in current bioinformatics research. Results: We present a method to create literature profiles for large sets of genes or proteins based on common semantic features extracted from a corpus of relevant documents. These profiles can be used to establish pair-wise similarities among genes, utilized in gene/protein classification or can be even combined with experimental measurements. Semantic features can be used by researchers to facilitate the understanding of the commonalities indicated by experimental results. Our approach is based on non-negative matrix factorization (NMF), a machine-learning algorithm for data analysis, capable of identifying local patterns that characterize a subset of the data. The literature is thus used to establish putative relationships among subsets of genes or proteins and to provide coherent justification for this clustering into subsets. We demonstrate the utility of the method by applying it to two independent and vastly different sets of genes. Conclusion: The presented method can create literature profiles from documents relevant to sets of genes. The representation of genes as additive linear combinations of semantic features allows for the exploration of functional associations as well as for clustering, suggesting a valuable methodology for the validation and interpretation of high-throughput experimental data. PMID:16438716
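A minimal sketch of the kind of decomposition described (generic scikit-learn usage, not the authors' pipeline); the pseudo-documents, gene names, and the choice of two factors are placeholders for a real corpus with many genes and more components.

```python
# Represent each gene as an additive combination of NMF "semantic features"
# learned from a term-document style matrix built over its literature.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import NMF

# One pseudo-document per gene: concatenated abstracts of papers about that gene.
gene_docs = {
    "GENE_A": "chromatin remodeling transcription nucleotide excision repair",
    "GENE_B": "apoptosis caspase signaling mitochondria cell death",
    "GENE_C": "transcription factor chromatin promoter binding",
}

vectorizer = TfidfVectorizer(stop_words="english")
X = vectorizer.fit_transform(gene_docs.values())    # genes x terms

nmf = NMF(n_components=2, init="nndsvd", random_state=0)
W = nmf.fit_transform(X)     # gene x feature weights (the literature profiles)
H = nmf.components_          # feature x term loadings

terms = vectorizer.get_feature_names_out()
for k, row in enumerate(H):
    top_terms = [terms[i] for i in row.argsort()[-3:][::-1]]
    print(f"semantic feature {k}: {', '.join(top_terms)}")
```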
Brush-Like Polymers: New Design Platforms for Soft, Dry Materials with Unique Property Relations
NASA Astrophysics Data System (ADS)
Daniel, William Francis McKemie, Jr.
Elastomers represent a unique class of engineering materials due to their light weight, low cost, and desirable combination of softness (10^5-10^7 Pa) and large extensibilities (up to 1000%). Despite these advantages, there exist applications that require a much softer modulus, greater extensibility, and stronger strain hardening for the purpose of mimicking the mechanical properties of systems such as biological tissues. Until recently, only liquid-filled gels were suitable materials for such applications, including soft robotics and implants. A considerable amount of work has been done to create gels with superior properties, but despite unique strengths they also suffer from unique weaknesses. This class of material displays fundamental limitations in the form of heterogeneous structures, solvent loss and phase transitions at extreme temperatures, and loss of liquid fraction upon high deformations. In gels the solvent fraction also introduces a large solvent/polymer interaction parameter which must be carefully considered when designing the final mechanical properties. These energetic considerations further exaggerate the capacity for inconstant mechanical properties caused by fluctuations of the solvent fraction. In order to overcome these weaknesses, a new platform for single-component materials with low modulus (<10^5 Pa) must be developed. Single-component systems do not suffer from compositional changes over time and display more stable performance in a wider variety of temperature and humidity conditions. A solvent-free system also has the potential to be homogeneous, which replaces the large energetic interactions with comparatively small architectural interaction parameters. If a solvent-free alternative to liquid-filled gels is to be created, we must first consider the fundamental barrier to softer elastomers, i.e. entanglements - intrinsic topological restraints which define a lower limit of modulus (~10^5 Pa). These entanglements are determined by chemistry-specific parameters (repeat unit volume and Kuhn segment size) in the polymer liquid (melt) prior to crosslinking. Previous solvent-free replacements for gels include elastomers end-linked in semidilute conditions. These materials are generated through crosslinking telechelic polymer chains in semidilute solutions at the onset of chain overlap. At such low polymer concentrations entanglements are greatly diluted, and once the resulting gel is dried it creates a supersoft and super-elastic network. Although such methods have successfully generated materials with moduli below the 10^5 Pa limit and high extensibilities (~1000%), they present their own limitations. Firstly, the semidilute crosslinking method uses an impractically large volume of solvent, which is unattractive in industry. Second, producing and crosslinking large monodisperse telechelic chains is a nontrivial process leading to large uncertainties in the final network architecture and properties. Specifically, telechelics have a distribution of end-to-end distances, and in semidilute solutions with an extremely low fraction of chain ends the crosslink reaction is diffusion limited, very slow, and imprecise. In order to achieve a superior solvent-free platform, we propose alteration of mechanical properties through the architectural disentanglement of brush-like polymer structures. In recent years there has been an increase in the number of synthetic conditions and crosslinking schemes available for producing brush-like structures.
This makes brush-like materials an attractive alternative to more restrictive methods such as end-linking. Standard networks have one major control factor outside of chemistry, the network strand length. Brush-like architectures are created from long strands with regularly grafted side chains, creating three characteristic length scales which may be independently manipulated. In collaboration with M. Rubinstein, we have utilized bottlebrush polymer architectures (a densely grafted brush-like polymer) to experimentally verify theoretical predictions for disentangled bottlebrush melts. By attaching well-defined side chains onto long polymer backbones, individual polymer strands are separated in space (similar to dilution with solvent), accompanied by a comparatively small increase in the rigidity of the strands. The end result is an architecturally disentangled melt with an entanglement plateau modulus as much as three orders of magnitude lower than that of typical linear polymers and a broadly expanded potential for extensibility once crosslinked.
Learn about HexSim, a program developed by the EPA that incorporates vast amounts of available data about dwindling wildlife species, such as spotted owls, to create scenarios involving virtual populations.
Reducing Soot in Diesel Exhaust
NASA Technical Reports Server (NTRS)
Bellan, J.
1984-01-01
Electrically charged fuel improves oxidation. Fuel injection system reduces amount of soot formed in diesel engines. Spray injector electrically charges fuel droplets as they enter cylinder. Charged droplets repel each other, creating a dilute fuel mist easily penetrated by oxygen in the cylinder.
IMPERVIOUS SURFACE RESEARCH IN THE MID-ATLANTIC
Anthropogenic impervious surfaces have an important relationship with non-point source pollution (NPS) in urban watersheds. These human-created surfaces include such features as roads, parking lots, rooftops, sidewalks, and driveways. The amount of impervious surface area in a ...
Portable MRI developed at Los Alamos
Espy, Michelle
2018-02-14
Scientists at Los Alamos National Laboratory are developing an ultra-low-field Magnetic Resonance Imaging (MRI) system that could be low-power and lightweight enough for forward deployment on the battlefield and to field hospitals in the World's poorest regions. "MRI technology is a powerful medical diagnostic tool," said Michelle Espy, the Battlefield MRI (bMRI) project leader, "ideally suited for imaging soft-tissue injury, particularly to the brain." But hospital-based MRI devices are big and expensive, and require considerable infrastructure, such as large quantities of cryogens like liquid nitrogen and helium, and they typically use a large amount of energy. "Standard MRI machines just can't go everywhere," said Espy. "Soldiers wounded in battle usually have to be flown to a large hospital and people in emerging nations just don't have access to MRI at all. We've been in contact with doctors who routinely work in the Third World and report that MRI would be extremely valuable in treating pediatric encephalopathy, and other serious diseases in children." So the Los Alamos team started thinking about a way to make an MRI device that could be relatively easy to transport, set up, and use in an unconventional setting. Conventional MRI machines use very large magnetic fields that align the protons in water molecules to then create magnetic resonance signals, which are detected by the machine and turned into images. The large magnetic fields create exceptionally detailed images, but they are difficult and expensive to make. Espy and her team wanted to see if images of sufficient quality could be made with ultra-low-magnetic fields, similar in strength to the Earth's magnetic field. To achieve images at such low fields they use exquisitely sensitive detectors called Superconducting Quantum Interference Devices, or SQUIDs. SQUIDs are among the most sensitive magnetic field detectors available, so interference with the signal is the primary stumbling block. "SQUIDs are so sensitive they'll respond to a truck driving by outside or a radio signal 50 miles away," said Al Urbaitis, a bMRI engineer. The team's first generation bMRI had to be built in a large metal housing in order to shield it from interference. Now the Los Alamos team is working in the open environment without the large metal housing using a lightweight series of wire coils that surround the bMRI system to compensate the Earth's magnetic field. In the future, the field compensation system will also function similar to noise-cancelling headphones to eradicate invading magnetic field signals on-the-fly.
Koopmans, M.P.; Rijpstra, W.I.C.; De Leeuw, J. W.; Lewan, M.D.; Damste, J.S.S.
1998-01-01
An immature (Ro=0.39%), S-rich (S(org)/C = 0.07), organic matter-rich (19.6 wt. % TOC) limestone from the Ghareb Formation (Upper Cretaceous) in Jordan was artificially matured by hydrous pyrolysis (200, 220, ..., 300 °C; 72 h) to study the effect of progressive diagenesis and early catagenesis on the amounts and distributions of hydrocarbons, organic sulfur compounds and S-rich geomacromolecules. The use of internal standards allowed the determination of absolute amounts. With increasing thermal maturation, large amounts of alkanes and alkylthiophenes with predominantly linear carbon skeletons are generated from the kerogen. The alkylthiophene isomer distributions do not change significantly with increasing thermal maturation, indicating the applicability of alkylthiophenes as biomarkers at relatively high levels of thermal maturity. For a given carbon skeleton, the saturated hydrocarbons, alkylthiophenes and alkylbenzo[b]thiophenes are stable forms at relatively high temperatures, whereas the alkylsulfides are not stable. The large amount of alkylthiophenes produced relative to the alkanes may be explained by the large number of monosulfide links per carbon skeleton. These results are in good agreement with those obtained previously for an artificial maturation series of an immature S-rich sample from the Gessoso-solfifera Formation.
Shanower, G A; Kantor, G J
1997-11-01
Xeroderma pigmentosum group C cells repair DNA damaged by ultraviolet radiation in an unusual pattern throughout the genome. They remove cyclobutane pyrimidine dimers only from the DNA of transcriptionally active chromatin regions and only from the strand that contains the transcribed strand. The repair proceeds in a manner that creates damage-free islands which are in some cases much larger than the active gene associated with them. For example, the small transcriptionally active beta-actin gene (3.5 kb) is repaired as part of a 50 kb single-stranded region. The repair responsible for creating these islands requires active transcription, suggesting that the two activities are coupled. A preferential repair pathway in normal human cells promotes repair of actively transcribed DNA strands and is coupled to transcription. It is not known if similar large islands, referred to as repair domains, are preferentially created as a result of the coupling. Data are presented showing that in normal cells, preferential repair in the beta-actin region is associated with the creation of a large, completely repaired region in the partially repaired genome. Repair at other genomic locations which contain inactive genes (insulin, 754) does not create similar large regions as quickly. In contrast, repair in Cockayne syndrome cells, which are defective in the preferential repair pathway but not in genome-overall repair, proceeds in the beta-actin region by a mechanism which does not create preferentially a large repaired region. Thus a correlation between the activity required to preferentially repair active genes and that required to create repaired domains is detected. We propose an involvement of the transcription-repair coupling factor in a coordinated repair pathway for removing DNA damage from entire transcription units.
NASA Astrophysics Data System (ADS)
Rajib, M. A.; Merwade, V.; Song, C.; Zhao, L.; Kim, I. L.; Zhe, S.
2014-12-01
Setting up any hydrologic model requires a large amount of effort, including compilation of all the data, creation of input files, calibration and validation. Given the amount of effort involved, it is possible that models for a watershed get created multiple times by multiple groups or organizations to accomplish different research, educational or policy goals. To reduce the duplication of effort and enable collaboration among different groups or organizations around an already existing hydrology model, a platform is needed where anyone can search for existing models, perform simple scenario analysis and visualize model results. The creator and users of a model on such a platform can then collaborate to accomplish new research or educational objectives. From this perspective, a prototype cyber-infrastructure (CI), called SWATShare, is developed for sharing, running and visualizing Soil and Water Assessment Tool (SWAT) models in an interactive GIS-enabled web environment. Users can utilize SWATShare to publish or upload their own models, search and download existing SWAT models developed by others, and run simulations, including calibration, using high-performance resources provided by XSEDE and the Cloud. Besides running and sharing, SWATShare hosts a novel spatio-temporal visualization system for SWAT model outputs. At the temporal scale, the system creates time-series plots for all the hydrology and water quality variables available along the reach as well as at the watershed level. At the spatial scale, the system can dynamically generate sub-basin-level thematic maps for any variable at any user-defined date or date range, thereby allowing users to run animations or download the data for subsequent analyses. In addition to research, SWATShare can also be used within a classroom setting as an educational tool for modeling and comparing the hydrologic processes under different geographic and climatic settings. SWATShare is publicly available at https://www.water-hub.org/swatshare.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lowe, J.T.; Carrington, D.B.
1990-09-01
The Austin Chalk is buried to a depth of only 2,100-2,500 ft and has retained primary microporosity unlike the typical deep fractured chalk reservoirs. The Van structure is a complexly faulted domal anticline created by salt intrusion and is approximately 2,000 ft higher than surrounding structures in the area. A major northwest-dipping fault acts as the primary trapping mechanism. The field has produced 0.5 billion BO from thick Woodbine sands since its discovery in 1929. Occurrence of oil in the Austin Chalk has been known since the field discovery, but prior completions were low rate oil producers. Recent development of a large fracture stimulation technique has resulted in increased production rates of up to 300 BOPD. The Austin Chalk reservoir limits were determined by isopaching feet of minimum productive resistivity having porosity above a cutoff value. The resistivity/porosity isopach showed a direct correlation between Austin Chalk productivity and the Austin Chalk structure and faulting pattern. Structural evidence along with oil typing indicate that the oil in the Austin Chalk has migrated upward along fault planes and through fault juxtaposition from the Woodbine sands 200 ft below the Austin Chalk. Thin-section and scanning electron microscopy work performed on conventional cores showed that the Van Austin Chalk formation is a very fine grained limestone composed primarily of coccoliths. Various amounts of detrital illite clay are present in the coccolith matrix. All effective porosity is micro-intergranular and ranges from 15 to 35%. Based on the core analyses, the main porosity reducing agent and therefore control on reservoir quality is the amount of detrital clay present filling the micropores. Permeability is very low with values ranging from 0.01 to 1.5 md. There is no evidence of significant natural fractures in the core. Artificial fractures are therefore required to create the permeability needed to sustain commercial production rates.
Remote sensing of global croplands for food security
Thenkabail, Prasad S.; Biradar, Chandrashekhar M.; Turral, Hugh; Lyon, John G.
2009-01-01
Increases in populations have created an increasing demand for food crops, while increases in demand for biofuels have created an increased demand for fuel crops. What has not increased is the amount of cropland and its productivity. These and many other factors, such as decreasing water resources in a changing climate, have created a crisis-like situation in global food security. Decision makers in these situations need accurate information based on science. Remote Sensing of Global Croplands for Food Security provides a comprehensive knowledge base in the use of satellite sensor-based maps and statistics that can be used to develop strategies for croplands (irrigated and rainfed) and their water use for food security.
USEPA EPIC IMPERVIOUS SURFACE RESEARCH IN THE MID-ATLANTIC
Anthropogenic impervious surfaces have an important relationship with non-point source pollution (NPS) in urban watersheds. These human-created surfaces include such features as roads, parking lots, rooftops, sidewalks, and driveways. The amount of impervious surface area in a ...
Calcium-based stabilizer induced heave in Oklahoma sulfate-bearing soils.
DOT National Transportation Integrated Search
2011-06-01
The addition of lime stabilizers can create problems in soils containing sulfates. In most cases, lime is mixed with expansive soils rendering them non-expansive; however, when a certain amount of sulfate is present naturally in expansive soils, the ...
A MODIFIED METHOD OF OBTAINING LARGE AMOUNTS OF RICKETTSIA PROWAZEKI BY ROENTGEN IRRADIATION OF RATS
Macchiavello, Atilio; Dresser, Richard
1935-01-01
The radiation method described by Zinsser and Castaneda for obtaining large amounts of Rickettsia has been carried out successfully with an ordinary radiographic machine. This allows the extension of the method to those communities which do not possess a high voltage Roentgen therapy unit as originally employed. PMID:19870416
Distributed Processing of Projections of Large Datasets: A Preliminary Study
Maddox, Brian G.
2004-01-01
Modern information needs have resulted in very large amounts of data being used in geographic information systems. Problems arise, however, when trying to project these data with reasonable speed and accuracy. Current single-threaded methods can suffer from one of two problems: fast projection with poor accuracy, or accurate projection with long processing time. A possible solution may be to combine accurate interpolation methods with distributed processing algorithms to quickly and accurately convert digital geospatial data between coordinate systems. Modern technology has made it possible to construct systems, such as Beowulf clusters, at low cost that provide access to supercomputer-class capability. Combining these techniques may result in the ability to use large amounts of geographic data in time-critical situations.
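The following sketch illustrates, under stated assumptions, the combination of accurate coordinate transformation and distributed processing described above; it is not the authors' implementation, and it uses pyproj and Python multiprocessing purely as stand-ins.

```python
# Illustrative sketch (not the paper's implementation): reproject a large set of
# coordinates in parallel by splitting them into chunks and transforming each
# chunk in a worker process.
from multiprocessing import Pool

import numpy as np
from pyproj import Transformer

def reproject_chunk(chunk):
    # Each worker builds its own transformer (WGS84 -> Web Mercator here).
    transformer = Transformer.from_crs("EPSG:4326", "EPSG:3857", always_xy=True)
    lon, lat = chunk[:, 0], chunk[:, 1]
    x, y = transformer.transform(lon, lat)
    return np.column_stack([x, y])

if __name__ == "__main__":
    # Synthetic coordinates standing in for a large geospatial dataset.
    coords = np.column_stack([
        np.random.uniform(-180, 180, 1_000_000),
        np.random.uniform(-85, 85, 1_000_000),
    ])
    chunks = np.array_split(coords, 8)
    with Pool(processes=8) as pool:
        projected = np.vstack(pool.map(reproject_chunk, chunks))
    print(projected.shape)
```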
Sakurai, Toshihiro; Aoki, Motohide; Ju, Xiaohui; Ueda, Tatsuya; Nakamura, Yasunori; Fujiwara, Shoko; Umemura, Tomonari; Tsuzuki, Mikio; Minoda, Ayumi
2016-01-01
The unicellular red alga Galdieria sulphuraria grows efficiently and produces a large amount of biomass in acidic conditions at high temperatures. It has great potential to produce biofuels and other beneficial compounds without becoming contaminated with other organisms. In G. sulphuraria, biomass measurements and glycogen and lipid analyses demonstrated that the amounts and compositions of glycogen and lipids differed when cells were grown under autotrophic, mixotrophic, and heterotrophic conditions. Maximum biomass production was obtained in the mixotrophic culture. High amounts of glycogen were obtained in the mixotrophic cultures, while the amounts of neutral lipids were similar between the mixotrophic and heterotrophic cultures. The amounts of neutral lipids were among the highest reported for red algae, including thermophiles. Glycogen structure and fatty acid composition depended largely on the growth conditions. Copyright © 2015. Published by Elsevier Ltd.
A pattern-based method to automate mask inspection files
NASA Astrophysics Data System (ADS)
Kamal Baharin, Ezni Aznida Binti; Muhsain, Mohamad Fahmi Bin; Ahmad Ibrahim, Muhamad Asraf Bin; Ahmad Noorhani, Ahmad Nurul Ihsan Bin; Sweis, Jason; Lai, Ya-Chieh; Hurat, Philippe
2017-03-01
Mask inspection is a critical step in the mask manufacturing process in order to ensure all dimensions printed are within the needed tolerances. This becomes even more challenging as the device nodes shrink and the complexity of the tapeout increases. Thus, the number of measurement points and their critical dimension (CD) types is increasing to ensure the quality of the mask. In addition to the mask quality, a significant amount of manpower is needed when the preparation and debugging of this process are not automated. By utilizing a novel pattern search technology with the ability to measure and report match region scan-line (edge) measurements, we can create a flow to find, measure and mark all metrology locations of interest and provide this automated report to the mask shop for inspection. A digital library is created based on the technology product and node which contains the test patterns to be measured. This paper will discuss how these digital libraries are generated and then utilized. As a time-critical part of the manufacturing process, this can also reduce the data preparation cycle time, minimize the amount of manual/human error in naming and measuring the various locations, reduce the risk of wrong/missing CD locations, and reduce the amount of manpower needed overall. We will also review an example pattern and how the reporting structure to the mask shop can be processed. This entire process can now be fully automated.
Data Mining Web Services for Science Data Repositories
NASA Astrophysics Data System (ADS)
Graves, S.; Ramachandran, R.; Keiser, K.; Maskey, M.; Lynnes, C.; Pham, L.
2006-12-01
The maturation of web services standards and technologies sets the stage for a distributed "Service-Oriented Architecture" (SOA) for NASA's next generation science data processing. This architecture will allow members of the scientific community to create and combine persistent distributed data processing services and make them available to other users over the Internet. NASA has initiated a project to create a suite of specialized data mining web services designed specifically for science data. The project leverages the Algorithm Development and Mining (ADaM) toolkit as its basis. The ADaM toolkit is a robust, mature and freely available science data mining toolkit that is being used by several research organizations and educational institutions worldwide. These mining services will give the scientific community a powerful and versatile data mining capability that can be used to create higher order products such as thematic maps from current and future NASA satellite data records with methods that are not currently available. The package of mining and related services are being developed using Web Services standards so that community-based measurement processing systems can access and interoperate with them. These standards-based services allow users different options for utilizing them, from direct remote invocation by a client application to deployment of a Business Process Execution Language (BPEL) solutions package where a complex data mining workflow is exposed to others as a single service. The ability to deploy and operate these services at a data archive allows the data mining algorithms to be run where the data are stored, a more efficient scenario than moving large amounts of data over the network. This will be demonstrated in a scenario in which a user uses a remote Web-Service-enabled clustering algorithm to create cloud masks from satellite imagery at the Goddard Earth Sciences Data and Information Services Center (GES DISC).
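A hypothetical client-side call to such a mining service might look like the sketch below; the endpoint URL, payload fields, and response format are assumptions for illustration, not the actual ADaM web service interface.

```python
# Hypothetical invocation of a remote clustering web service, sketched to
# illustrate the service-oriented pattern described above. The URL, payload
# fields, and response keys are assumptions, not a real ADaM interface.
import requests

payload = {
    "algorithm": "kmeans",
    "parameters": {"n_clusters": 5},
    # Reference to data already stored at the archive, so large granules
    # are not shipped over the network.
    "dataset_uri": "https://example.org/archive/modis_granule_2006.hdf",
}

response = requests.post("https://example.org/mining/cluster",
                         json=payload, timeout=300)
response.raise_for_status()
result = response.json()
print("Cloud-mask product available at:", result.get("output_uri"))
```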
NASA Astrophysics Data System (ADS)
Ekenes, K.
2017-12-01
This presentation will outline the process of creating a web application for exploring large amounts of scientific geospatial data using modern automated cartographic techniques. Traditional cartographic methods, including data classification, may inadvertently hide geospatial and statistical patterns in the underlying data. This presentation demonstrates how to use smart web APIs that quickly analyze the data when it loads and suggest the most appropriate visualizations based on the statistics of the data. Because many users never go beyond default values, it is important to provide smart default color schemes tailored to the dataset rather than static defaults. Multiple functions for automating visualizations are available in the smart APIs, along with UI elements that let users create more than one visualization for a dataset, since there is no single best way to visualize a given dataset. Because bivariate and multivariate visualizations are particularly difficult to create effectively, this automated approach takes the guesswork out of the process and provides a number of ways to generate multivariate visualizations for the same variables. This allows the user to choose which visualization is most appropriate for their presentation. The methods used in these APIs and the renderers generated by them are not available elsewhere. The presentation will show how statistics can be used as the basis for automating default visualizations of data along continuous ramps, creating more refined visualizations while revealing the spread and outliers of the data. Adding interactive components to instantaneously alter visualizations allows users to unearth spatial patterns previously unknown among one or more variables. These applications may focus on a single dataset that is frequently updated, or be configurable for a variety of datasets from multiple sources.
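One possible way to derive such statistics-driven defaults is sketched below in Python (rather than the web APIs referenced in the abstract): the color-ramp stops are anchored at the mean and one standard deviation on either side, so the spread and outliers of the data remain visible. This is a sketch of the general idea, not the presenter's actual API.

```python
# Derive default continuous color-ramp stops from the statistics of a dataset:
# anchor the ramp at the mean and +/- one standard deviation.
import numpy as np

def default_ramp_stops(values):
    values = np.asarray(values, dtype=float)
    mean, std = values.mean(), values.std()
    lo = max(values.min(), mean - std)
    hi = min(values.max(), mean + std)
    # Three stops: below-average, average, above-average.
    return {"min_stop": lo, "mid_stop": mean, "max_stop": hi}

data = np.random.lognormal(mean=2.0, sigma=0.75, size=10_000)  # skewed test data
print(default_ramp_stops(data))
```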
Openwebglobe 2: Visualization of Complex 3D-GEODATA in the (mobile) Webbrowser
NASA Astrophysics Data System (ADS)
Christen, M.
2016-06-01
Providing worldwide high resolution data for virtual globes involves compute- and storage-intensive tasks for processing data. Furthermore, rendering complex 3D geodata, such as 3D city models with an extremely high polygon count and a vast amount of textures, at interactive frame rates is still a very challenging task, especially on mobile devices. This paper presents an approach for processing, caching and serving massive geospatial data in a cloud-based environment for large-scale, out-of-core, highly scalable 3D scene rendering on a web-based virtual globe. Cloud computing is used for processing large amounts of geospatial data and for providing 2D and 3D map data to a large number of (mobile) web clients. In this paper the approach for processing, rendering and caching very large datasets in the currently developed virtual globe "OpenWebGlobe 2" is shown, which displays 3D geodata on nearly every device.
ERIC Educational Resources Information Center
Whitehead, Linda C.; Ginsberg, Stacey I.
1999-01-01
Presents suggestions for creating family-like programs in large child-care centers in three areas: (1) physical environment, incorporating cozy spaces, beauty, and space for family interaction; (2) caregiving climate, such as sharing home photographs, and serving meals family style; and (3) family involvement, including regular conversations with…
... are: Chlorhexidine gluconate Ethanol (ethyl alcohol) Hydrogen peroxide Methyl salicylate ... amounts of alcohol (drunkenness). Swallowing large amounts of methyl salicylate and hydrogen peroxide may also cause serious stomach ...
How institutions shaped the last major evolutionary transition to large-scale human societies
Powers, Simon T.; van Schaik, Carel P.; Lehmann, Laurent
2016-01-01
What drove the transition from small-scale human societies centred on kinship and personal exchange, to large-scale societies comprising cooperation and division of labour among untold numbers of unrelated individuals? We propose that the unique human capacity to negotiate institutional rules that coordinate social actions was a key driver of this transition. By creating institutions, humans have been able to move from the default ‘Hobbesian’ rules of the ‘game of life’, determined by physical/environmental constraints, into self-created rules of social organization where cooperation can be individually advantageous even in large groups of unrelated individuals. Examples include rules of food sharing in hunter–gatherers, rules for the usage of irrigation systems in agriculturalists, property rights and systems for sharing reputation between mediaeval traders. Successful institutions create rules of interaction that are self-enforcing, providing direct benefits both to individuals that follow them, and to individuals that sanction rule breakers. Forming institutions requires shared intentionality, language and other cognitive abilities largely absent in other primates. We explain how cooperative breeding likely selected for these abilities early in the Homo lineage. This allowed anatomically modern humans to create institutions that transformed the self-reliance of our primate ancestors into the division of labour of large-scale human social organization. PMID:26729937
Fuentes-Montemayor, Elisa; Watts, Kevin; Macgregor, Nicholas A; Lopez-Gallego, Zeltia; J Park, Kirsty
2017-07-01
Conservation strategies to tackle habitat loss and fragmentation require actions at the local (e.g., improving/expanding existing habitat patches) and landscape level (e.g., creating new habitat in the matrix). However, the relative importance of these actions for biodiversity is still poorly understood, leading to debate on how to prioritize conservation activities. Here, we assess the relative importance of local vs. landscape-level attributes in determining the use of woodlands by bats in fragmented landscapes; we also compare the role of habitat amount in the surrounding landscape per se vs. a combination of both habitat amount and configuration and explore whether the relative importance of these attributes varies with species mobility and landscape context. We conducted acoustic surveys in 102 woodland patches in the UK that form part of the WrEN project (www.wren-project.com), a large-scale natural experiment designed to study the effects of 160 yr of woodland creation on biodiversity and inform landscape-scale conservation. We used multivariate analysis and a model-selection approach to assess the relative importance of local (e.g., vegetation structure) and landscape-level (e.g., amount/configuration of surrounding land types) attributes on bat occurrence and activity levels. Species mobility was an important trait determining the relative importance of local vs. landscape-level attributes for different bat species. Lower mobility species were most strongly influenced by local habitat quality; the landscape became increasingly important for higher mobility species. At the landscape-scale, a combination of habitat amount and configuration appeared more important than habitat amount alone for lower mobility species, while the opposite was observed for higher mobility species. Regardless of species mobility, landscape-level attributes appeared more important for bats in a more homogeneous and intensively farmed landscape. Conservation strategies involving habitat creation and restoration should take into account the mobility of target species and prioritize landscape-level actions in more homogeneous and intensively farmed landscapes where habitat loss and fragmentation have been more severe. © 2017 by the Ecological Society of America.
NASA Technical Reports Server (NTRS)
Chatterji, Gano
2011-01-01
Conclusions: The fuel estimation procedure was validated using flight test data. A good fuel model can be created if weight and fuel data are available. An error in the assumed takeoff weight results in a similar amount of error in the fuel estimate. Fuel estimation error bounds can be determined.
De Feo, Vito; Boi, Fabio; Safaai, Houman; Onken, Arno; Panzeri, Stefano; Vato, Alessandro
2017-01-01
Brain-machine interfaces (BMIs) promise to improve the quality of life of patients suffering from sensory and motor disabilities by creating a direct communication channel between the brain and the external world. Yet, their performance is currently limited by the relatively small amount of information that can be decoded from neural activity recorded from the brain. We have recently proposed that such decoding performance may be improved when using state-dependent decoding algorithms that predict and discount the large component of the trial-to-trial variability of neural activity which is due to the dependence of neural responses on the network's current internal state. Here we tested this idea by using a bidirectional BMI to investigate the gain in performance arising from using a state-dependent decoding algorithm. This BMI, implemented in anesthetized rats, controlled the movement of a dynamical system using neural activity decoded from motor cortex and fed back to the brain the dynamical system's position by electrically microstimulating somatosensory cortex. We found that using state-dependent algorithms that tracked the dynamics of ongoing activity led to an increase in the amount of information extracted from neural activity by 22%, with a consequent increase in all of the indices measuring the BMI's performance in controlling the dynamical system. This suggests that state-dependent decoding algorithms may be used to enhance BMIs at moderate computational cost.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hirokawa, Takako; /U. Colorado, Boulder /SLAC
In this paper, we examine data acquisition in a high harmonic generation (HHG) lab and preliminary data analysis with the Cyclohexadiene Collaboration at the Linac Coherent Light Source (LCLS) at SLAC National Accelerator Laboratory. HHG experiments have a large number of parameters that need to be monitored constantly. In particular, the pressure of the target is critical to HHG yield. However, this pressure can fluctuate wildly and without a tool to monitor it, it is difficult to analyze the correlation between HHG yield and the pressure. I used the Arduino microcontroller board and created a complementary MATLAB graphical user interface (GUI), thereby enhancing the ease with which users can acquire time-stamped parameter data. Using the Arduino, it is much easier to match the pressure to the corresponding HHG yield. Collecting data by using the Arduino and the GUI is flexible, user-friendly, and cost-effective. In the future, we hope to be able to control and monitor parts of the lab with the Arduino alone. While more parameter information is needed in the HHG lab, we needed to reduce the amount of data during the cyclohexadiene collaboration. This was achieved by sorting the data into bins and filtering out unnecessary details. This method was highly effective in that it minimized the amount of data without losing any valuable information. This effective preliminary data analysis technique will continue to be used to decrease the size of the collected data.
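A minimal stand-in for the time-stamped parameter logging described above is sketched below in Python rather than the MATLAB GUI used in the lab; the serial port name and the one-reading-per-line message format are assumptions.

```python
# Log time-stamped pressure readings streamed by an Arduino over USB serial.
# Port name and message format (one numeric reading per line) are assumptions.
import csv
import time

import serial  # pyserial

with serial.Serial("/dev/ttyACM0", 9600, timeout=2) as port, \
        open("pressure_log.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["unix_time", "pressure_reading"])
    for _ in range(1000):  # log 1000 samples, then stop
        line = port.readline().decode("ascii", errors="ignore").strip()
        if not line:
            continue
        writer.writerow([time.time(), line])
```

Binning the logged samples afterward (for example, averaging readings per time interval) is one simple way to reduce data volume in the spirit of the preliminary analysis described above.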
Nutrient Budgets Calculated in Floodwaters Using a Whole-Ecosystem Experimental Manipulation
NASA Astrophysics Data System (ADS)
Talbot, C. J.; Paterson, M. J.; Xenopoulos, M. A.
2017-12-01
Flooding provides pathways for nutrients to move into surface waters and alter nutrient concentrations, thereby influencing downstream ecosystems and increasing the occurrence of eutrophication. Nutrient enrichment will likely affect water quality, primary production, and overall ecosystem function. Quantifying nutrient movement post-flood will help evaluate the risks or advantages that flooding will have on ecosystem processes. Here we constructed nutrient budgets using data collected as part of the Flooded Upland Dynamics Experiment (FLUDEX) at the Experimental Lakes Area (ELA) in northwestern Ontario. Three experimental reservoirs with varying amounts of stored carbon were created by flooding forested land from May through September annually from 1999 to 2003. Organic matter became a significant source of nutrients under flooded conditions and elevated reservoir total nitrogen (TN) and total phosphorus (TP) concentrations within one week of flooding. The highest TN (2.6 mg L-1) and TP (579 µg L-1) concentrations throughout the entire flooding experiment occurred in the medium carbon reservoir within the first two weeks of flooding in 1999. TN and TP fluxes were positive in all years of flooding. TP fluxes decreased after each flooding season; therefore, TP production may be less problematic in floodplains subject to frequent repeated flooding. However, TN fluxes remained large even after repeated flooding. Therefore, flooding, whether naturally occurring or resulting from anthropogenic flow alteration, may be responsible for producing significant amounts of nitrogen and phosphorus in aquatic ecosystems.
Oil platforms off California are among the most productive marine fish habitats globally
Claisse, Jeremy T.; Pondella, Daniel J.; Love, Milton; Zahn, Laurel A.; Williams, Chelsea M.; Williams, Jonathan P.; Bull, Ann S.
2014-01-01
Secondary (i.e., heterotrophic or animal) production is a main pathway of energy flow through an ecosystem as it makes energy available to consumers, including humans. Its estimation can play a valuable role in the examination of linkages between ecosystem functions and services. We found that oil and gas platforms off the coast of California have the highest secondary fish production per unit area of seafloor of any marine habitat that has been studied, about an order of magnitude higher than fish communities from other marine ecosystems. Most previous estimates have come from estuarine environments, generally regarded as one of the most productive ecosystems globally. High rates of fish production on these platforms ultimately result from high levels of recruitment and the subsequent growth of primarily rockfish (genus Sebastes) larvae and pelagic juveniles to the substantial amount of complex hardscape habitat created by the platform structure distributed throughout the water column. The platforms have a high ratio of structural surface area to seafloor surface area, resulting in large amounts of habitat for juvenile and adult demersal fishes over a relatively small footprint of seafloor. Understanding the biological implications of these structures will inform policy related to the decommissioning of existing (e.g., oil and gas platforms) and implementation of emerging (e.g., wind, marine hydrokinetic) energy technologies. PMID:25313050
Norm-based coding of facial identity in adults with autism spectrum disorder.
Walsh, Jennifer A; Maurer, Daphne; Vida, Mark D; Rhodes, Gillian; Jeffery, Linda; Rutherford, M D
2015-03-01
It is unclear whether reported deficits in face processing in individuals with autism spectrum disorders (ASD) can be explained by deficits in perceptual face coding mechanisms. In the current study, we examined whether adults with ASD showed evidence of norm-based opponent coding of facial identity, a perceptual process underlying the recognition of facial identity in typical adults. We began with an original face and an averaged face and then created an anti-face that differed from the averaged face in the opposite direction from the original face by a small amount (near adaptor) or a large amount (far adaptor). To test for norm-based coding, we adapted participants on different trials to the near versus far adaptor, then asked them to judge the identity of the averaged face. We varied the size of the test and adapting faces in order to reduce any contribution of low-level adaptation. Consistent with the predictions of norm-based coding, high-functioning adults with ASD (n = 27) and matched typical participants (n = 28) showed identity aftereffects that were larger for the far than the near adaptor. Unlike results with children with ASD, the strength of the aftereffects was similar in the two groups. This is the first study to demonstrate norm-based coding of facial identity in adults with ASD. Copyright © 2015 Elsevier Ltd. All rights reserved.
Exploring Cloud Computing for Large-scale Scientific Applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lin, Guang; Han, Binh; Yin, Jian
This paper explores cloud computing for large-scale data-intensive scientific applications. Cloud computing is attractive because it provides hardware and software resources on-demand, which relieves the burden of acquiring and maintaining a huge amount of resources that may be used only once by a scientific application. However, unlike typical commercial applications that often just require a moderate amount of ordinary resources, large-scale scientific applications often need to process enormous amounts of data in the terabyte or even petabyte range and require special high performance hardware with low latency connections to complete computation in a reasonable amount of time. To address these challenges, we build an infrastructure that can dynamically select high performance computing hardware across institutions and dynamically adapt the computation to the selected resources to achieve high performance. We have also demonstrated the effectiveness of our infrastructure by building a system biology application and an uncertainty quantification application for carbon sequestration, which can efficiently utilize data and computation resources across several institutions.
NASA Astrophysics Data System (ADS)
Lan, Hengxing; Derek Martin, C.; Lim, C. H.
2007-02-01
Geographic information system (GIS) modeling is used in combination with three-dimensional (3D) rockfall process modeling to assess rockfall hazards. A GIS extension, RockFall Analyst (RA), which is capable of effectively handling large amounts of geospatial information relative to rockfall behaviors, has been developed in ArcGIS using ArcObjects and C#. The 3D rockfall model considers dynamic processes on a cell plane basis. It uses inputs of distributed parameters in terms of raster and polygon features created in GIS. Two major components are included in RA: particle-based rockfall process modeling and geostatistics-based rockfall raster modeling. Rockfall process simulation results, 3D rockfall trajectories and their velocity features either for point seeders or polyline seeders are stored in 3D shape files. Distributed raster modeling, based on 3D rockfall trajectories and a spatial geostatistical technique, represents the distribution of spatial frequency, the flying and/or bouncing height, and the kinetic energy of falling rocks. A distribution of rockfall hazard can be created by taking these rockfall characteristics into account. A barrier analysis tool is also provided in RA to aid barrier design. An application of these modeling techniques to a case study is provided. The RA has been tested in ArcGIS 8.2, 8.3, 9.0 and 9.1.
AIRS Map of Carbon Monoxide Draped on Globe: Time Series from 8/1/2005 to 9/30/2005
NASA Technical Reports Server (NTRS)
2007-01-01
[figure removed for brevity, see original site] Forest fires and agricultural burning create large amounts of carbon monoxide. AIRS provides daily global maps of carbon monoxide from space, allowing scientists to follow the global transport of this gas day-to-day. In this image sequence, carbon monoxide pollution from agricultural burning blooms repeatedly over the Amazonian basin. The gas is then transported across the Atlantic Ocean. Carbon monoxide pollution from fires in sub-Saharan Africa is also apparent. The Atmospheric Infrared Sounder Experiment, with its visible, infrared, and microwave detectors, provides a three-dimensional look at Earth's weather. Working in tandem, the three instruments can make simultaneous observations all the way down to the Earth's surface, even in the presence of heavy clouds. With more than 2,000 channels sensing different regions of the atmosphere, the system creates a global, 3-D map of atmospheric temperature and humidity and provides information on clouds, greenhouse gases, and many other atmospheric phenomena. The AIRS Infrared Sounder Experiment flies onboard NASA's Aqua spacecraft and is managed by NASA's Jet Propulsion Laboratory, Pasadena, Calif., under contract to NASA. JPL is a division of the California Institute of Technology in Pasadena.
Space Radar Image of Long Valley, California in 3-D
1999-05-01
This three-dimensional perspective view of Long Valley, California was created from data taken by the Spaceborne Imaging Radar-C/X-band Synthetic Aperture Radar on board the space shuttle Endeavour. This image was constructed by overlaying a color composite SIR-C radar image on a digital elevation map. The digital elevation map was produced using radar interferometry, a process by which radar data are acquired on different passes of the space shuttle. The two data passes are compared to obtain elevation information. The interferometry data were acquired on April 13, 1994 and on October 3, 1994, during the first and second flights of the SIR-C/X-SAR instrument. The color composite radar image was taken in October and was produced by assigning red to the C-band (horizontally transmitted and vertically received) polarization; green to the C-band (vertically transmitted and received) polarization; and blue to the ratio of the two data sets. Blue areas in the image are smooth and yellow areas are rock outcrops with varying amounts of snow and vegetation. The view is looking north along the northeastern edge of the Long Valley caldera, a volcanic collapse feature created 750,000 years ago and the site of continued subsurface activity. Crowley Lake is the large dark feature in the foreground. http://photojournal.jpl.nasa.gov/catalog/PIA01769
The Sublimation Rate of CO2 Under Simulated Mars Conditions and the Possible Climatic Implications
NASA Astrophysics Data System (ADS)
Bryson, Kathryn; Chevrier, V.; Roe, L.; White, K.; Blackburn, D.
2008-09-01
In order to understand the behavior of CO2 on Mars, we have studied the sublimation of dry ice under simulated martian conditions. Our experiments resulted in an average sublimation rate for CO2 ice of 1.20 ± 0.27 mm h-1. These results are very close to the observed retreat rates of the martian polar caps, and suggest a common process for the sublimation mechanism on Mars and in our chamber. Based on these results we created a model where irradiance from the sun is the primary source of heat on the martian polar surface. Our model predicts a 32 cm offset between the amount of CO2 ice sublimated and deposited in the southern polar region. The eccentricity of the martian orbit causes the southern hemisphere to sublimate more than it deposits back during one martian year. We have compared MOC and HiRISE images from approximately the same season (Ls 285.57º and 289.5º, respectively) from three martian years apart. These images indicate an average sublimation rate of 0.43 ± 0.04 m y-1, very close to the 0.32 m y-1 predicted by our model. Due to the length of Mars’ precession cycle, 93,000 martian years, it will take an extensive amount of time for the equinoxes to change. Therefore, we predict that the CO2 of the south polar cap will migrate entirely to the northern polar cap before such changes could occur. If the CO2 ice is only a thin layer above a much thicker water ice layer, this could expose large amounts of water ice, having a drastic climatic effect.
Cumulative and episodic vaccine aluminum exposure in a population-based cohort of young children.
Glanz, Jason M; Newcomer, Sophia R; Daley, Matthew F; McClure, David L; Baxter, Roger P; Jackson, Michael L; Naleway, Allison L; Lugg, Marlene M; DeStefano, Frank
2015-11-27
In addition to antigens, vaccines contain small amounts of preservatives, adjuvants, and residual substances from the manufacturing process. Some parents have concerns about the safety of these ingredients, yet no large epidemiological studies have specifically examined associations between health outcomes and vaccine ingredients, other than thimerosal. This study examined the extent to which the Vaccine Safety Datalink (VSD) could be used to study vaccine ingredient safety in children. Children born 2004-2011 were identified in VSD data. Using immunization records, two cohorts were identified: children who were up-to-date and children who were undervaccinated before age 2 years. A database was also created linking vaccine type and manufacturer with ingredient amounts documented in vaccine package inserts. Thirty-four ingredients in two or more infant vaccines were identified. However, only amounts (in mg) for aluminum were consistently documented and commonly contained in infant vaccines. Analyses compared vaccine aluminum exposure across cohorts and determined the statistical power for studying associations between aluminum exposure and hypothetical vaccine adverse events. Among 408,608 children, mean cumulative vaccine aluminum exposure increased from 1.11 to 4.00 mg between ages 92-730 days. Up-to-date children were exposed to 11-26% more aluminum from vaccines than undervaccinated children. Power analyses demonstrated that safety studies of aluminum could detect relative risks ranging from 1.1 to 5.8 for a range of adverse event incidence. The safety of vaccine aluminum exposure can be feasibly studied in the VSD. However, possible biological mechanisms and confounding variables would need to be considered before conducting any studies. Copyright © 2015 Elsevier Ltd. All rights reserved.
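The cumulative-exposure calculation described above can be illustrated with a short sketch that sums per-dose aluminum content over a child's immunization record; the lookup amounts and vaccine names below are placeholders, not actual package-insert values.

```python
# Sketch of the cumulative-exposure calculation: sum per-dose aluminum content
# over a child's immunization record. Lookup values are illustrative placeholders.
ALUMINUM_MG_PER_DOSE = {
    ("DTaP", "ManufacturerA"): 0.33,
    ("HepB", "ManufacturerB"): 0.25,
    ("PCV", "ManufacturerC"): 0.125,
}

def cumulative_aluminum(immunization_record):
    """immunization_record: list of (vaccine_type, manufacturer, age_in_days)."""
    return sum(
        ALUMINUM_MG_PER_DOSE.get((vaccine, maker), 0.0)
        for vaccine, maker, _age in immunization_record
    )

record = [("HepB", "ManufacturerB", 1), ("DTaP", "ManufacturerA", 61),
          ("PCV", "ManufacturerC", 61)]
print(f"Cumulative aluminum exposure: {cumulative_aluminum(record):.2f} mg")
```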
Unit-Dose Bags For Formulating Intravenous Solutions
NASA Technical Reports Server (NTRS)
Finley, Mike; Kipp, Jim; Scharf, Mike; Packard, Jeff; Owens, Jim
1993-01-01
Smaller unit-dose flowthrough bags have been devised for use with large-volume parenteral (LVP) bags in preparing sterile intravenous solutions. A premeasured amount of solute stored in such a unit-dose bag is flushed by a predetermined amount of water into the LVP bag. A relatively small number of LVP bags can be used in conjunction with the smaller unit-dose bags to formulate a large number of LVP intravenous solutions in a nonsterile environment.
Taking Energy to the Physics Classroom from the Large Hadron Collider at CERN
ERIC Educational Resources Information Center
Cid, Xabier; Cid, Ramon
2009-01-01
In 2008, the greatest experiment in history began. When in full operation, the Large Hadron Collider (LHC) at CERN will generate the greatest amount of information that has ever been produced in an experiment before. It will also reveal some of the most fundamental secrets of nature. Despite the enormous amount of information available on this…
Curtis H. Flather; Michael Bevers
2002-01-01
A discrete reaction-diffusion model was used to estimate long-term equilibrium populations of a hypothetical species inhabiting patchy landscapes to examine the relative importance of habitat amount and arrangement in explaining population size. When examined over a broad range of habitat amounts and arrangements, population size was largely determined by a pure amount...
Labeling for Big Data in radiation oncology: The Radiation Oncology Structures ontology.
Bibault, Jean-Emmanuel; Zapletal, Eric; Rance, Bastien; Giraud, Philippe; Burgun, Anita
2018-01-01
Leveraging Electronic Health Records (EHR) and Oncology Information Systems (OIS) has great potential to generate hypotheses for cancer treatment, since they directly provide medical data on a large scale. In order to gather a significant number of patients with a high level of clinical detail, multicenter studies are necessary. A challenge in creating high quality Big Data studies involving several treatment centers is the lack of semantic interoperability between data sources. We present the ontology we developed to address this issue. Radiation Oncology anatomical and target volumes were categorized in anatomical and treatment planning classes. International delineation guidelines specific to radiation oncology were used for lymph nodes areas and target volumes. Hierarchical classes were created to generate The Radiation Oncology Structures (ROS) Ontology. The ROS was then applied to the data from our institution. Four hundred and seventeen classes were created, with a maximum of 14 child classes (average = 5). The ontology was then converted into a Web Ontology Language (.owl) format and made available online on BioPortal and GitHub under an Apache 2.0 License. We extracted all structures delineated in our department since the opening in 2001. 20,758 structures were exported from our "record-and-verify" system, demonstrating a significant heterogeneity within a single center. All structures were matched to the ROS ontology before integration into our clinical data warehouse (CDW). In this study we describe a new ontology, specific to radiation oncology, that reports all anatomical and treatment planning structures that can be delineated. This ontology will be used to integrate dosimetric data in the Assistance Publique-Hôpitaux de Paris CDW that stores data from 6.5 million patients (as of February 2017).
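As a rough illustration of how a hierarchical ontology can be built and exported to OWL, the sketch below uses the owlready2 library with a handful of invented class names; it is a minimal example of the technique, not the published ROS ontology itself.

```python
# Minimal sketch: define a small class hierarchy and save it as an OWL file
# using owlready2. Class names and the ontology IRI are illustrative only.
from owlready2 import get_ontology, Thing

onto = get_ontology("http://example.org/ros-demo.owl")

with onto:
    class Structure(Thing): pass
    class AnatomicalStructure(Structure): pass
    class TreatmentPlanningStructure(Structure): pass
    class LymphNodeArea(AnatomicalStructure): pass
    class ClinicalTargetVolume(TreatmentPlanningStructure): pass

# Serialize to RDF/XML so the hierarchy can be shared and reused.
onto.save(file="ros_demo.owl", format="rdfxml")
```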
Monitoring of small laboratory animal experiments by a designated web-based database.
Frenzel, T; Grohmann, C; Schumacher, U; Krüll, A
2015-10-01
Multiple-parametric small animal experiments require, by their very nature, a sufficient number of animals which may need to be large to obtain statistically significant results.(1) For this reason database-related systems are required to collect the experimental data as well as to support the later (re-) analysis of the information gained during the experiments. In particular, the monitoring of animal welfare is simplified by the inclusion of warning signals (for instance, loss in body weight >20%). Digital patient charts have been developed for human patients but are usually not able to fulfill the specific needs of animal experimentation. To address this problem a unique web-based monitoring system using standard MySQL, PHP, and nginx has been created. PHP was used to create the HTML-based user interface and outputs in a variety of proprietary file formats, namely portable document format (PDF) or spreadsheet files. This article demonstrates its fundamental features and the easy and secure access it offers to the data from any place using a web browser. This information will help other researchers create their own individual databases in a similar way. The use of QR-codes plays an important role for stress-free use of the database. We demonstrate a way to easily identify all animals and samples and data collected during the experiments. Specific ways to record animal irradiations and chemotherapy applications are shown. This new analysis tool allows the effective and detailed analysis of huge amounts of data collected through small animal experiments. It supports proper statistical evaluation of the data and provides excellent retrievable data storage. © The Author(s) 2015.
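The kind of welfare warning the system surfaces (for instance, body-weight loss greater than 20% from baseline) can be sketched as follows; the field names and example values are illustrative assumptions, not the database's actual schema.

```python
# Sketch of a welfare warning check: flag animals whose body weight has dropped
# more than 20% from baseline. Field names and values are illustrative.
def weight_loss_warning(baseline_g, current_g, threshold=0.20):
    loss_fraction = (baseline_g - current_g) / baseline_g
    return loss_fraction > threshold

animals = [
    {"id": "A001", "baseline_g": 25.0, "current_g": 24.1},
    {"id": "A002", "baseline_g": 26.3, "current_g": 20.2},  # > 20% loss
]
for animal in animals:
    if weight_loss_warning(animal["baseline_g"], animal["current_g"]):
        print(f"WARNING: {animal['id']} has lost more than 20% body weight")
```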
Labeling for Big Data in radiation oncology: The Radiation Oncology Structures ontology
Zapletal, Eric; Rance, Bastien; Giraud, Philippe; Burgun, Anita
2018-01-01
Purpose: Leveraging Electronic Health Records (EHR) and Oncology Information Systems (OIS) has great potential to generate hypotheses for cancer treatment, since they directly provide medical data on a large scale. In order to gather a significant number of patients with a high level of clinical detail, multicenter studies are necessary. A challenge in creating high quality Big Data studies involving several treatment centers is the lack of semantic interoperability between data sources. We present the ontology we developed to address this issue. Methods: Radiation Oncology anatomical and target volumes were categorized in anatomical and treatment planning classes. International delineation guidelines specific to radiation oncology were used for lymph nodes areas and target volumes. Hierarchical classes were created to generate The Radiation Oncology Structures (ROS) Ontology. The ROS was then applied to the data from our institution. Results: Four hundred and seventeen classes were created, with a maximum of 14 child classes (average = 5). The ontology was then converted into a Web Ontology Language (.owl) format and made available online on BioPortal and GitHub under an Apache 2.0 License. We extracted all structures delineated in our department since the opening in 2001. 20,758 structures were exported from our “record-and-verify” system, demonstrating a significant heterogeneity within a single center. All structures were matched to the ROS ontology before integration into our clinical data warehouse (CDW). Conclusion: In this study we describe a new ontology, specific to radiation oncology, that reports all anatomical and treatment planning structures that can be delineated. This ontology will be used to integrate dosimetric data in the Assistance Publique-Hôpitaux de Paris CDW that stores data from 6.5 million patients (as of February 2017). PMID:29351341
Information Pre-Processing using Domain Meta-Ontology and Rule Learning System
NASA Astrophysics Data System (ADS)
Ranganathan, Girish R.; Biletskiy, Yevgen
Around the globe, extraordinary amounts of documents are being created by Enterprises and by users outside these Enterprises. The documents created in the Enterprises constitute the main focus of the present chapter. These documents are used to perform a great deal of machine processing. When using these documents for machine processing, a lack of semantics of the information in these documents may cause misinterpretation of the information, thereby inhibiting the productiveness of computer-assisted analytical work. Hence, it would be profitable to the Enterprises if they used well-defined domain ontologies which would serve as rich sources of semantics for the information in the documents. These domain ontologies can be created manually, semi-automatically or fully automatically. The focus of this chapter is to propose an intermediate solution which will enable relatively easy creation of these domain ontologies. The process of extracting and capturing domain ontologies from these voluminous documents requires extensive involvement of domain experts and application of ontology learning methods that are substantially labor intensive; therefore, some intermediate solutions which would assist in capturing domain ontologies must be developed. This chapter proposes such a solution: building a meta-ontology that serves as an intermediate information source for the main domain ontology and as a rapid approach to conceptualizing a domain of interest from a huge amount of source documents. This meta-ontology can be populated with ontological concepts, attributes and relations from documents, and then refined in order to form a better domain ontology, either through automatic ontology learning methods or some other relevant ontology building approach.
Lambert, N; Plumb, J; Looise, B; Johnson, I T; Harvey, I; Wheeler, C; Robinson, M; Rolfe, P
2005-08-01
The aim of the study was to test the ability of the newly created smart card system to track the nutrient contents of foods chosen over several months by individual diners in a school cafeteria. From the food choice and food composition data sets, an Access database was created encompassing 30 diners (aged 8-11 years), 78 days and eight nutrients. Data were available for a total of 1909 meals. Based upon population mean values, the cohort were clearly choosing meals containing more than the recommended maximum amount of sugar and less than the recommended minimum amounts of fibre, iron and vitamin A. The protein and vitamin C contents of meals chosen were well above minimum requirements. Over the 1909 meals, nutrient requirements were met 41% of the time. The system created was very effective at continually monitoring the food choices of individual diners over limitless time. The data generated raised questions about the common practice of presenting nutrient intakes as population mean values calculated over a few days. The impact of heavily fortified foods on such studies in general is discussed.
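A hedged sketch of how a "requirements met" proportion like the 41% figure could be computed from meal-level nutrient totals is shown below; the nutrient thresholds and field names are placeholders, not the study's reference values.

```python
# Illustrative calculation of the proportion of meals meeting requirements,
# given meal-level nutrient totals and per-nutrient minimum/maximum limits.
# Thresholds are placeholders, not the study's actual reference values.
LIMITS = {
    "protein_g": ("min", 7.5),
    "vitamin_c_mg": ("min", 10.0),
    "fibre_g": ("min", 4.0),
    "sugar_g": ("max", 15.0),
}

def meal_meets_requirements(meal):
    for nutrient, (kind, limit) in LIMITS.items():
        value = meal[nutrient]
        if kind == "min" and value < limit:
            return False
        if kind == "max" and value > limit:
            return False
    return True

meals = [
    {"protein_g": 12.0, "vitamin_c_mg": 25.0, "fibre_g": 5.1, "sugar_g": 9.0},
    {"protein_g": 9.0, "vitamin_c_mg": 14.0, "fibre_g": 2.2, "sugar_g": 22.0},
]
met = sum(meal_meets_requirements(m) for m in meals)
print(f"Requirements met in {100 * met / len(meals):.0f}% of meals")
```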
Medical and Scientific Evaluations aboard the KC-135. Microgravity-Compatible Flow Cytometer
NASA Technical Reports Server (NTRS)
Crucian, Brian; Nelman-Gonzalez, Mayra; Sams, Clarence
2005-01-01
A spaceflight-compatible flow cytometer would be useful for the diagnosis of astronaut illness during long duration spaceflight and for conducting in-flight research to evaluate the effects of microgravity on human physiology. Until recently, the primary limitations preventing the development of a spaceflight compatible flow cytometer have been largely mechanical. Standard commercially available flow cytometers are large, complex instruments that use high-energy lasers and require significant training to operate. Standard flow cytometers function by suspending the particles to be analyzed inside a sheath fluid for analysis. This requires the presence of several liters of sheath fluid for operation, and generates a corresponding amount of liquid hazardous waste. The particles are then passed through a flow cell which uses the fluid mechanical property of hydrodynamic focusing to place the cells in single-file (laminar flow) as they pass through a laser beam for scanning and evaluation. Many spaceflight experiments have demonstrated that fluid physics is dramatically altered in microgravity (MSF [Manned Space Flight] Fluid Physics Data Sheet-August 1997) and previous studies have shown that sheath-fluid based hydrodynamic focusing may also be altered during microgravity (Crucian et al, 2000). For these reasons it is likely that any spaceflight compatible design for a flow cytometer would abandon the sheath fluid requirement. The elimination of sheath fluid would remove both the problems of weight associated with large volumes of liquids as well as the large volume of liquid waste generated. It would also create the need for a method to create laminar particle flow distinct from the standard sheath-fluid based method. The spaceflight prototype instrument is based on a recently developed commercial flow cytometer possessing a novel flow cell design that creates single-particle laser scanning and evaluation without the need for sheath-fluid based hydrodynamic focusing. This instrument also possesses a number of design advances that make it conditionally microgravity compatible: it is highly miniaturized and lightweight, uses a low energy diode laser, has a small number of moving parts, does not use sheath fluid and does not generate significant liquid waste. Although possessing certain limitations, the commercial cytometer functions operationally like a standard bench top laboratory flow cytometer, aspirating liquid particle samples and generating histogram or dot-plot data in standard FCS file format. In its current configuration however, the cytometer is limited to three parameter/two-color capability (two color PMTs + forward scatter), does not allow compensation between colors, does not allow linear analysis and is operated by rather inflexible software with limited capabilities. This is due to the fact that the cytometer has been designed and marketed as an instrument specific to a few particular assays, not as a multipurpose cytometer.
Stepped chute training wall height requirements
USDA-ARS?s Scientific Manuscript database
Stepped chutes are commonly used for overtopping protection for embankment dams. Aerated flow is commonly associated with stepped chutes if the chute has sufficient length. The aeration and turbulence of the flow can create a significant amount of splash over the training wall if not appropriately...
Sodium content in US packaged foods 2009
USDA-ARS?s Scientific Manuscript database
In 2010, the Institute of Medicine recommended food manufacturers reduce the amount of sodium in their products. Monitoring sodium in packaged foods is necessary to evaluate the impact of these efforts. Using commercially available data from Nielsen and Gladson, we created a database with sales and...
Collaboration Strategies to Reduce Technical Debt
ERIC Educational Resources Information Center
Miko, Jeffrey Allen
2017-01-01
Inadequate software development collaboration processes can allow technical debt to accumulate, increasing future maintenance costs and the chance of system failures. The purpose of this qualitative case study was to explore collaboration strategies software development leaders use to reduce the amount of technical debt created by software…
Ontology-Based Empirical Knowledge Verification for Professional Virtual Community
ERIC Educational Resources Information Center
Chen, Yuh-Jen
2011-01-01
A professional virtual community provides an interactive platform for enterprise experts to create and share their empirical knowledge cooperatively, and the platform contains a tremendous amount of hidden empirical knowledge that knowledge experts have preserved in the discussion process. Therefore, enterprise knowledge management highly…
Membranes with artificial free-volume for biofuel production
Petzetakis, Nikos; Doherty, Cara M.; Thornton, Aaron W.; Chen, X. Chelsea; Cotanda, Pepa; Hill, Anita J.; Balsara, Nitash P.
2015-01-01
Free-volume of polymers governs transport of penetrants through polymeric films. Control over free-volume is thus important for the development of better membranes for a wide variety of applications such as gas separations, pharmaceutical purifications and energy storage. To date, methodologies used to create materials with different amounts of free-volume are based primarily on chemical synthesis of new polymers. Here we report a simple methodology for generating free-volume based on the self-assembly of polyethylene-b-polydimethylsiloxane-b-polyethylene triblock copolymers. We have used this method to fabricate a series of membranes with identical compositions but with different amounts of free-volume. We use the term artificial free-volume to refer to the additional free-volume created by self-assembly. The effect of artificial free-volume on selective transport through the membranes was tested using butanol/water and ethanol/water mixtures due to their importance in biofuel production. We found that the introduction of artificial free-volume improves both alcohol permeability and selectivity. PMID:26104672
Membranes with artificial free-volume for biofuel production
NASA Astrophysics Data System (ADS)
Petzetakis, Nikos; Doherty, Cara M.; Thornton, Aaron W.; Chen, X. Chelsea; Cotanda, Pepa; Hill, Anita J.; Balsara, Nitash P.
2015-06-01
Free-volume of polymers governs transport of penetrants through polymeric films. Control over free-volume is thus important for the development of better membranes for a wide variety of applications such as gas separations, pharmaceutical purifications and energy storage. To date, methodologies used to create materials with different amounts of free-volume are based primarily on chemical synthesis of new polymers. Here we report a simple methodology for generating free-volume based on the self-assembly of polyethylene-b-polydimethylsiloxane-b-polyethylene triblock copolymers. We have used this method to fabricate a series of membranes with identical compositions but with different amounts of free-volume. We use the term artificial free-volume to refer to the additional free-volume created by self-assembly. The effect of artificial free-volume on selective transport through the membranes was tested using butanol/water and ethanol/water mixtures due to their importance in biofuel production. We found that the introduction of artificial free-volume improves both alcohol permeability and selectivity.
Membranes with artificial free-volume for biofuel production
Petzetakis, Nikos; Doherty, Cara M.; Thornton, Aaron W.; ...
2015-06-24
Free-volume of polymers governs transport of penetrants through polymeric films. Control over free-volume is thus important for the development of better membranes for a wide variety of applications such as gas separations, pharmaceutical purifications and energy storage. To date, methodologies used to create materials with different amounts of free-volume are based primarily on chemical synthesis of new polymers. Here we report a simple methodology for generating free-volume based on the self-assembly of polyethylene-b-polydimethylsiloxane-b-polyethylene triblock copolymers. Here, we have used this method to fabricate a series of membranes with identical compositions but with different amounts of free-volume. We use the term artificial free-volume to refer to the additional free-volume created by self-assembly. The effect of artificial free-volume on selective transport through the membranes was tested using butanol/water and ethanol/water mixtures due to their importance in biofuel production. Moreover, we found that the introduction of artificial free-volume improves both alcohol permeability and selectivity.
NASA Technical Reports Server (NTRS)
2001-01-01
Dr. Alexander Chernov, of the Universities Space Research Association (USRA) and based at Marshall Space Flight Center, is investigating why protein crystals grown in space are, in about 20 percent of cases, better-ordered than those grown on the ground. They are testing the idea that the amount of impurities trapped by space-grown crystals may be different than the amount trapped by crystals grown on Earth because convection is negligible in microgravity. The concentrations of impurities in many space-grown crystals turned out to be several times lower than those in the terrestrial ones, sometimes below the detection limit. The ground-based experiment also showed that the amount of impurities per unit volume of the crystals was usually higher than the amount per unit volume of the solution. This means that a growing crystal actually purifies the solution in its immediate vicinity. Here, an impurity depletion zone is created around apoferritin crystals grown in gel, imitating microgravity conditions.
2001-01-24
Dr. Alexander Chernov, of the Universities Space Research Association (USRA) and based at Marshall Space Flight Center, is investigating why protein crystals grown in space are, in about 20 percent of cases, better-ordered than those grown on the ground. They are testing the idea that the amount of impurities trapped by space-grown crystals may be different than the amount trapped by crystals grown on Earth because convection is negligible in microgravity. The concentrations of impurities in many space-grown crystals turned out to be several times lower than those in the terrestrial ones, sometimes below the detection limit. The ground-based experiment also showed that the amount of impurities per unit volume of the crystals was usually higher than the amount per unit volume of the solution. This means that a growing crystal actually purifies the solution in its immediate vicinity. Here, an impurity depletion zone is created around apoferritin crystals grown in gel, imitating microgravity conditions.
The topology of card transaction money flows
NASA Astrophysics Data System (ADS)
Zanin, Massimiliano; Papo, David; Romance, Miguel; Criado, Regino; Moral, Santiago
2016-11-01
Money flow models are essential tools to understand different economic phenomena, such as saving propensities and wealth distributions. In spite of their importance, most of them are based on synthetic transaction networks with simple topologies, e.g. random or scale-free ones, as the characterisation of real networks is made difficult by the confidentiality and sensitivity of money transaction data. Here, we present an analysis of the topology created by real credit card transactions from one of the world's biggest banks, and show how different distributions, e.g. number of transactions per card or amount, have nontrivial characteristics. We further describe a stochastic model to create transaction data sets, fed from the obtained distributions, which will allow researchers to create more realistic money flow models.
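The stochastic generator described above can be sketched in a few lines of Python. The distribution families used here (a Zipf-like count of transactions per card, log-normal amounts) and the record layout are illustrative assumptions, not the fitted distributions from the paper.

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical empirical distributions (the paper fits these to real card data).
def sample_transactions_per_card(n_cards):
    return rng.zipf(a=2.0, size=n_cards)               # heavy-tailed counts (assumed form)

def sample_amount(n):
    return rng.lognormal(mean=3.0, sigma=1.0, size=n)  # transaction amounts (assumed form)

def generate_dataset(n_cards=1000, n_merchants=200):
    """Create a synthetic card -> merchant transaction list driven by the
    sampled distributions, mimicking the kind of stochastic model described."""
    records = []
    counts = sample_transactions_per_card(n_cards)
    for card, k in enumerate(counts):
        k = min(int(k), 1000)                           # cap the heavy tail for this toy example
        merchants = rng.integers(0, n_merchants, size=k)
        amounts = sample_amount(k)
        records.extend((card, int(m), float(a)) for m, a in zip(merchants, amounts))
    return records

transactions = generate_dataset()
print(len(transactions), transactions[:3])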
Intelligent Systems: Terrestrial Observation and Prediction Using Remote Sensing Data
NASA Technical Reports Server (NTRS)
Coughlan, Joseph C.
2005-01-01
NASA has made science and technology investments to better utilize its large space-borne remote sensing data holdings of the Earth. With the launch of Terra, NASA created a data-rich environment where the challenge is to fully utilize the data collected from EOS; however, despite unprecedented amounts of observed data, there is still a need to increase the frequency, resolution, and diversity of observations. Current terrestrial models that use remote sensing data were constructed in a relatively data- and compute-limited era and do not take full advantage of on-line learning methods and assimilation techniques that can exploit these data. NASA has invested in visualization, data mining and knowledge discovery methods which have facilitated data exploitation, but these methods are insufficient for improving Earth science models that embed extensive background knowledge, nor do they refine understanding of complex processes. Investing in interdisciplinary teams that include computational scientists can lead to new models and systems for online operation and analysis of data that can autonomously improve in prediction skill over time.
Planning the Human Variome Project: The Spain Report†
Kaput, Jim; Cotton, Richard G. H.; Hardman, Lauren; Al Aqeel, Aida I.; Al-Aama, Jumana Y.; Al-Mulla, Fahd; Aretz, Stefan; Auerbach, Arleen D.; Axton, Myles; Bapat, Bharati; Bernstein, Inge T.; Bhak, Jong; Bleoo, Stacey L.; Blöcker, Helmut; Brenner, Steven E.; Burn, John; Bustamante, Mariona; Calzone, Rita; Cambon-Thomsen, Anne; Cargill, Michele; Carrera, Paola; Cavedon, Lawrence; Cho, Yoon Shin; Chung, Yeun-Jun; Claustres, Mireille; Cutting, Garry; Dalgleish, Raymond; den Dunnen, Johan T.; Díaz, Carlos; Dobrowolski, Steven; dos Santos, M. Rosário N.; Ekong, Rosemary; Flanagan, Simon B.; Flicek, Paul; Furukawa, Yoichi; Genuardi, Maurizio; Ghang, Ho; Golubenko, Maria V.; Greenblatt, Marc S.; Hamosh, Ada; Hancock, John M.; Hardison, Ross; Harrison, Terence M.; Hoffmann, Robert; Horaitis, Rania; Howard, Heather J.; Barash, Carol Isaacson; Izagirre, Neskuts; Jung, Jongsun; Kojima, Toshio; Laradi, Sandrine; Lee, Yeon-Su; Lee, Jong-Young; Gil-da-Silva-Lopes, Vera L.; Macrae, Finlay A.; Maglott, Donna; Marafie, Makia J.; Marsh, Steven G.E.; Matsubara, Yoichi; Messiaen, Ludwine M.; Möslein, Gabriela; Netea, Mihai G.; Norton, Melissa L.; Oefner, Peter J.; Oetting, William S.; O’Leary, James C.; de Ramirez, Ana Maria Oller; Paalman, Mark H.; Parboosingh, Jillian; Patrinos, George P.; Perozzi, Giuditta; Phillips, Ian R.; Povey, Sue; Prasad, Suyash; Qi, Ming; Quin, David J.; Ramesar, Rajkumar S.; Richards, C. Sue; Savige, Judith; Scheible, Dagmar G.; Scott, Rodney J.; Seminara, Daniela; Shephard, Elizabeth A.; Sijmons, Rolf H.; Smith, Timothy D.; Sobrido, María-Jesús; Tanaka, Toshihiro; Tavtigian, Sean V.; Taylor, Graham R.; Teague, Jon; Töpel, Thoralf; Ullman-Cullere, Mollie; Utsunomiya, Joji; van Kranen, Henk J.; Vihinen, Mauno; Watson, Michael; Webb, Elizabeth; Weber, Thomas K.; Yeager, Meredith; Yeom, Young I.; Yim, Seon-Hee; Yoo, Hyang-Sook
2018-01-01
The remarkable progress in characterizing the human genome sequence, exemplified by the Human Genome Project and the HapMap Consortium, has led to the perception that knowledge and the tools (e.g., microarrays) are sufficient for many if not most biomedical research efforts. A large amount of data from diverse studies proves this perception inaccurate at best, and at worst, an impediment for further efforts to characterize the variation in the human genome. Since variation in genotype and environment are the fundamental basis to understand phenotypic variability and heritability at the population level, identifying the range of human genetic variation is crucial to the development of personalized nutrition and medicine. The Human Variome Project (HVP; http://www.humanvariomeproject.org/) was proposed initially to systematically collect mutations that cause human disease and create a cyber infrastructure to link locus specific databases (LSDB). We report here the discussions and recommendations from the 2008 HVP planning meeting held in San Feliu de Guixols, Spain, in May 2008. PMID:19306394
Simple Tools to Facilitate Project Management of a Nursing Research Project.
Aycock, Dawn M; Clark, Patricia C; Thomas-Seaton, LaTeshia; Lee, Shih-Yu; Moloney, Margaret
2016-07-01
Highly organized project management facilitates rigorous study implementation. Research involves gathering large amounts of information that can be overwhelming when organizational strategies are not used. We describe a variety of project management and organizational tools used in different studies that may be particularly useful for novice researchers. The studies were a multisite study of caregivers of stroke survivors, an Internet-based diary study of women with migraines, and a pilot study testing a sleep intervention in mothers of low-birth-weight infants. Project management tools were used to facilitate enrollment, data collection, and access to results. The tools included protocol and eligibility checklists, event calendars, screening and enrollment logs, instrument scoring tables, and data summary sheets. These tools created efficiency, promoted a positive image, minimized errors, and provided researchers with a sense of control. For the studies described, there were no protocol violations, there were minimal missing data, and the integrity of data collection was maintained. © The Author(s) 2016.
Heraty, Joanne M; Ellstrand, Norman C
Contemporary germplasm conservation studies largely focus on ex situ and in situ management of diversity within centers of genetic diversity. Transnational migrants who transport and introduce landraces to new locations may catalyze a third type of conservation that combines both approaches. Resulting populations may support reduced diversity as a result of evolutionary forces such as genetic drift, selection, and gene flow, yet they may also be more diverse as a result of multiple introductions, selective breeding and cross pollination among multiple introduced varietals. In this study, we measured the amount and structure of maize molecular genetic diversity in samples collected from home gardens and community gardens maintained by immigrant farmers in Southern California. We used the same markers to measure the genetic diversity and structure of commercially available maize varieties and compared our data to previously reported genetic diversity statistics of Mesoamerican landraces. Our results reveal that transnational dispersal creates an opportunity for the maintenance of maize genetic diversity beyond its recognized centers of diversity.
Puškár, Michal; Kopas, Melichar; Puškár, Dušan; Lumnitzer, Ján; Faltinová, Eva
2018-02-01
The marine auxiliary diesel engines installed in large transoceanic ships are used to generate electricity, but at the same time they produce a significant amount of harmful exhaust gas emissions. The International Maritime Organisation (IMO) therefore concluded an agreement to control the generation of gaseous emissions in maritime transport, and for this reason alternative fuels have begun to be used in this sector. A study was performed that investigated the emissions of an auxiliary marine diesel engine running on experimental fuels. The different test fuels were created using ratios of 0%, 50%, 80% and 100% between biodiesel and ULSDF (Ultra Low Sulphur Diesel Fuel). The experimental measurements were performed at different engine loading levels and various engine speeds in order to investigate the influence of the blended fuels on the engine operational characteristics. Copyright © 2017 Elsevier Ltd. All rights reserved.
Sekely, Angela; Taylor, Graeme J; Bagby, R Michael
2018-03-17
The Toronto Structured Interview for Alexithymia (TSIA) was developed to provide a structured interview method for assessing alexithymia. One drawback of this instrument is the amount of time it takes to administer and score. The current study used item response theory (IRT) methods to analyze data from a large heterogeneous multi-language sample (N = 842) to investigate whether a subset of items could be selected to create a short version of the instrument. Samejima's (1969) graded response model was used to fit the item responses. Items providing maximum information were retained in the short model, resulting in the elimination of 12 items from the original 24. Despite the 50% reduction in the number of items, 65.22% of the information was retained. Further studies are needed to validate the short version. A short version of the TSIA is potentially of practical value to clinicians and researchers with time constraints. Copyright © 2018. Published by Elsevier B.V.
Margolis, Ronald; Derr, Leslie; Dunn, Michelle; Huerta, Michael; Larkin, Jennie; Sheehan, Jerry; Guyer, Mark; Green, Eric D
2014-01-01
Biomedical research has generated and will continue to generate large amounts of data (termed 'big data') in many formats and at all levels. Consequently, there is an increasing need to better understand and mine the data to further knowledge and foster new discovery. The National Institutes of Health (NIH) has initiated a Big Data to Knowledge (BD2K) initiative to maximize the use of biomedical big data. BD2K seeks to better define how to extract value from the data, both for the individual investigator and the overall research community, create the analytic tools needed to enhance utility of the data, provide the next generation of trained personnel, and develop data science concepts and tools that can be made available to all stakeholders. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
Planning the human variome project: the Spain report.
Kaput, Jim; Cotton, Richard G H; Hardman, Lauren; Watson, Michael; Al Aqeel, Aida I; Al-Aama, Jumana Y; Al-Mulla, Fahd; Alonso, Santos; Aretz, Stefan; Auerbach, Arleen D; Bapat, Bharati; Bernstein, Inge T; Bhak, Jong; Bleoo, Stacey L; Blöcker, Helmut; Brenner, Steven E; Burn, John; Bustamante, Mariona; Calzone, Rita; Cambon-Thomsen, Anne; Cargill, Michele; Carrera, Paola; Cavedon, Lawrence; Cho, Yoon Shin; Chung, Yeun-Jun; Claustres, Mireille; Cutting, Garry; Dalgleish, Raymond; den Dunnen, Johan T; Díaz, Carlos; Dobrowolski, Steven; dos Santos, M Rosário N; Ekong, Rosemary; Flanagan, Simon B; Flicek, Paul; Furukawa, Yoichi; Genuardi, Maurizio; Ghang, Ho; Golubenko, Maria V; Greenblatt, Marc S; Hamosh, Ada; Hancock, John M; Hardison, Ross; Harrison, Terence M; Hoffmann, Robert; Horaitis, Rania; Howard, Heather J; Barash, Carol Isaacson; Izagirre, Neskuts; Jung, Jongsun; Kojima, Toshio; Laradi, Sandrine; Lee, Yeon-Su; Lee, Jong-Young; Gil-da-Silva-Lopes, Vera L; Macrae, Finlay A; Maglott, Donna; Marafie, Makia J; Marsh, Steven G E; Matsubara, Yoichi; Messiaen, Ludwine M; Möslein, Gabriela; Netea, Mihai G; Norton, Melissa L; Oefner, Peter J; Oetting, William S; O'Leary, James C; de Ramirez, Ana Maria Oller; Paalman, Mark H; Parboosingh, Jillian; Patrinos, George P; Perozzi, Giuditta; Phillips, Ian R; Povey, Sue; Prasad, Suyash; Qi, Ming; Quin, David J; Ramesar, Rajkumar S; Richards, C Sue; Savige, Judith; Scheible, Dagmar G; Scott, Rodney J; Seminara, Daniela; Shephard, Elizabeth A; Sijmons, Rolf H; Smith, Timothy D; Sobrido, María-Jesús; Tanaka, Toshihiro; Tavtigian, Sean V; Taylor, Graham R; Teague, Jon; Töpel, Thoralf; Ullman-Cullere, Mollie; Utsunomiya, Joji; van Kranen, Henk J; Vihinen, Mauno; Webb, Elizabeth; Weber, Thomas K; Yeager, Meredith; Yeom, Young I; Yim, Seon-Hee; Yoo, Hyang-Sook
2009-04-01
The remarkable progress in characterizing the human genome sequence, exemplified by the Human Genome Project and the HapMap Consortium, has led to the perception that knowledge and the tools (e.g., microarrays) are sufficient for many if not most biomedical research efforts. A large amount of data from diverse studies proves this perception inaccurate at best, and at worst, an impediment for further efforts to characterize the variation in the human genome. Because variation in genotype and environment are the fundamental basis to understand phenotypic variability and heritability at the population level, identifying the range of human genetic variation is crucial to the development of personalized nutrition and medicine. The Human Variome Project (HVP; http://www.humanvariomeproject.org/) was proposed initially to systematically collect mutations that cause human disease and create a cyber infrastructure to link locus specific databases (LSDB). We report here the discussions and recommendations from the 2008 HVP planning meeting held in San Feliu de Guixols, Spain, in May 2008. (c) 2009 Wiley-Liss, Inc.
Clinical diabetes research using data mining: a Canadian perspective.
Shah, Baiju R; Lipscombe, Lorraine L
2015-06-01
With the advent of the digitization of large amounts of information and the computer power capable of analyzing this volume of information, data mining is increasingly being applied to medical research. Datasets created for administration of the healthcare system provide a wealth of information from different healthcare sectors, and Canadian provinces' single-payer universal healthcare systems mean that data are more comprehensive and complete in this country than in many other jurisdictions. The increasing ability to also link clinical information, such as electronic medical records, laboratory test results and disease registries, has broadened the types of data available for analysis. Data-mining methods have been used in many different areas of diabetes clinical research, including classic epidemiology, effectiveness research, population health and health services research. Although methodologic challenges and privacy concerns remain important barriers to using these techniques, data mining remains a powerful tool for clinical research. Copyright © 2015 Canadian Diabetes Association. Published by Elsevier Inc. All rights reserved.
Conceptual data sampling for breast cancer histology image classification.
Rezk, Eman; Awan, Zainab; Islam, Fahad; Jaoua, Ali; Al Maadeed, Somaya; Zhang, Nan; Das, Gautam; Rajpoot, Nasir
2017-10-01
Data analytics have become increasingly complicated as the amount of data has increased. One technique that is used to enable data analytics on large datasets is data sampling, in which a portion of the data is selected so as to preserve the data characteristics for use in data analytics. In this paper, we introduce a novel data sampling technique that is rooted in formal concept analysis theory. This technique is used to create samples reliant on the data distribution across a set of binary patterns. The proposed sampling technique is applied to classifying the regions of breast cancer histology images as malignant or benign. The performance of our method is compared to other classical sampling methods. The results indicate that our method is efficient and generates an illustrative sample of small size. It also competes with other sampling methods in terms of sample size and sample quality, as reflected in classification accuracy and F1 measure. Copyright © 2017 Elsevier Ltd. All rights reserved.
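As a rough illustration of distribution-preserving sampling over binary patterns (not the paper's formal-concept-analysis procedure), the sketch below stratifies rows by their exact binary pattern and samples each stratum proportionally; the sampling fraction and the data are made up.

import numpy as np
from collections import defaultdict

def pattern_preserving_sample(X_binary, fraction=0.1, seed=0):
    """Sample rows so that each distinct binary pattern keeps roughly the same
    share in the sample as in the full data (a simplified stand-in for the
    concept-based sampling described in the abstract)."""
    rng = np.random.default_rng(seed)
    groups = defaultdict(list)
    for i, row in enumerate(X_binary):
        groups[tuple(row)].append(i)          # group row indices by their binary pattern
    chosen = []
    for idx in groups.values():
        k = max(1, int(round(len(idx) * fraction)))
        chosen.extend(rng.choice(idx, size=k, replace=False))
    return np.array(sorted(chosen))

X = np.random.default_rng(1).integers(0, 2, size=(1000, 5))   # toy binary feature matrix
sample_idx = pattern_preserving_sample(X, fraction=0.1)
print(len(sample_idx))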
A Complex Network Approach to Stylometry
Amancio, Diego Raphael
2015-01-01
Statistical methods have been widely employed to study the fundamental properties of language. In recent years, methods from complex and dynamical systems proved useful to create several language models. Despite the large number of studies devoted to representing texts with physical models, only a limited number of studies have shown how the properties of the underlying physical systems can be employed to improve the performance of natural language processing tasks. In this paper, I address this problem by devising complex networks methods that are able to improve the performance of current statistical methods. Using a fuzzy classification strategy, I show that the topological properties extracted from texts complement the traditional textual description. In several cases, the performance obtained with hybrid approaches outperformed the results obtained when only traditional or networked methods were used. Because the proposed model is generic, the framework devised here could be straightforwardly used to study similar textual applications where the topology plays a pivotal role in the description of the interacting agents. PMID:26313921
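A minimal sketch of the network side of such a hybrid approach, assuming Python with networkx: build a word adjacency network from a token stream and extract a few topological attributes. The adjacency rule (consecutive words) and the particular metrics are illustrative choices, not the paper's exact feature set.

import networkx as nx

def text_to_network(tokens):
    """Word adjacency network: consecutive words are linked, a common
    construction in network-based stylometry."""
    g = nx.Graph()
    for a, b in zip(tokens, tokens[1:]):
        if a != b:
            g.add_edge(a, b)
    return g

def topological_features(g):
    """A few topology-derived attributes that could complement traditional
    frequency features (the choice of metrics here is illustrative)."""
    return {
        "avg_clustering": nx.average_clustering(g),
        "avg_degree": sum(dict(g.degree()).values()) / g.number_of_nodes(),
        "density": nx.density(g),
    }

tokens = "the quick brown fox jumps over the lazy dog the fox".split()
print(topological_features(text_to_network(tokens)))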
Closed-loop supply chain models with considering the environmental impact.
Mohajeri, Amir; Fallah, Mohammad
2014-01-01
Global warming and climate change created by large-scale emissions of greenhouse gases are a worldwide concern. Due to this, the issue of green supply chain management has received more attention in the last decade. In this study, a closed-loop logistic concept, which serves the purposes of recycling, reuse, and recovery required in a green supply chain, is applied to integrate environmental issues into a traditional logistic system. Here, we formulate a comprehensive closed-loop model for logistics planning considering profitability and ecological goals. In this way, we can achieve the ecological goal of reducing the overall amount of CO2 emitted from journeys. Moreover, the profitability criterion can be supported in the cyclic network with minimum costs and maximum service level. We apply three scenarios and develop problem formulations for each scenario corresponding to the specified regulations and investigate the effect of the regulation on the preferred transport mode and the emissions. To validate the models, some numerical experiments are worked out and a comparative analysis is conducted.
BioPAX – A community standard for pathway data sharing
Demir, Emek; Cary, Michael P.; Paley, Suzanne; Fukuda, Ken; Lemer, Christian; Vastrik, Imre; Wu, Guanming; D’Eustachio, Peter; Schaefer, Carl; Luciano, Joanne; Schacherer, Frank; Martinez-Flores, Irma; Hu, Zhenjun; Jimenez-Jacinto, Veronica; Joshi-Tope, Geeta; Kandasamy, Kumaran; Lopez-Fuentes, Alejandra C.; Mi, Huaiyu; Pichler, Elgar; Rodchenkov, Igor; Splendiani, Andrea; Tkachev, Sasha; Zucker, Jeremy; Gopinath, Gopal; Rajasimha, Harsha; Ramakrishnan, Ranjani; Shah, Imran; Syed, Mustafa; Anwar, Nadia; Babur, Ozgun; Blinov, Michael; Brauner, Erik; Corwin, Dan; Donaldson, Sylva; Gibbons, Frank; Goldberg, Robert; Hornbeck, Peter; Luna, Augustin; Murray-Rust, Peter; Neumann, Eric; Reubenacker, Oliver; Samwald, Matthias; van Iersel, Martijn; Wimalaratne, Sarala; Allen, Keith; Braun, Burk; Whirl-Carrillo, Michelle; Dahlquist, Kam; Finney, Andrew; Gillespie, Marc; Glass, Elizabeth; Gong, Li; Haw, Robin; Honig, Michael; Hubaut, Olivier; Kane, David; Krupa, Shiva; Kutmon, Martina; Leonard, Julie; Marks, Debbie; Merberg, David; Petri, Victoria; Pico, Alex; Ravenscroft, Dean; Ren, Liya; Shah, Nigam; Sunshine, Margot; Tang, Rebecca; Whaley, Ryan; Letovksy, Stan; Buetow, Kenneth H.; Rzhetsky, Andrey; Schachter, Vincent; Sobral, Bruno S.; Dogrusoz, Ugur; McWeeney, Shannon; Aladjem, Mirit; Birney, Ewan; Collado-Vides, Julio; Goto, Susumu; Hucka, Michael; Le Novère, Nicolas; Maltsev, Natalia; Pandey, Akhilesh; Thomas, Paul; Wingender, Edgar; Karp, Peter D.; Sander, Chris; Bader, Gary D.
2010-01-01
BioPAX (Biological Pathway Exchange) is a standard language to represent biological pathways at the molecular and cellular level. Its major use is to facilitate the exchange of pathway data (http://www.biopax.org). Pathway data captures our understanding of biological processes, but its rapid growth necessitates development of databases and computational tools to aid interpretation. However, the current fragmentation of pathway information across many databases with incompatible formats presents barriers to its effective use. BioPAX solves this problem by making pathway data substantially easier to collect, index, interpret and share. BioPAX can represent metabolic and signaling pathways, molecular and genetic interactions and gene regulation networks. BioPAX was created through a community process. Through BioPAX, millions of interactions organized into thousands of pathways across many organisms, from a growing number of sources, are available. Thus, large amounts of pathway data are available in a computable form to support visualization, analysis and biological discovery. PMID:20829833
Prilusky, Jaime; Oueillet, Eric; Ulryck, Nathalie; Pajon, Anne; Bernauer, Julie; Krimm, Isabelle; Quevillon-Cheruel, Sophie; Leulliot, Nicolas; Graille, Marc; Liger, Dominique; Trésaugues, Lionel; Sussman, Joel L; Janin, Joël; van Tilbeurgh, Herman; Poupon, Anne
2005-06-01
Structural genomics aims at the establishment of a universal protein-fold dictionary through systematic structure determination either by NMR or X-ray crystallography. In order to catch up with the explosive amount of protein sequence data, the structural biology laboratories are spurred to increase the speed of the structure-determination process. To achieve this goal, high-throughput robotic approaches are increasingly used in all the steps leading from cloning to data collection and even structure interpretation is becoming more and more automatic. The progress made in these areas has begun to have a significant impact on the more 'classical' structural biology laboratories, dramatically increasing the number of individual experiments. This automation creates the need for efficient data management. Here, a new piece of software, HalX, designed as an 'electronic lab book' that aims at (i) storage and (ii) easy access and use of all experimental data is presented. This should lead to much improved management and tracking of structural genomics experimental data.
NASA Astrophysics Data System (ADS)
Klein, Laura M.; McNamara, Laura A.
2017-05-01
In this paper, we address the needed components to create usable engineering and operational user interfaces (UIs) for airborne Synthetic Aperture Radar (SAR) systems. As airborne SAR technology gains wider acceptance in the remote sensing and Intelligence, Surveillance, and Reconnaissance (ISR) communities, the need for effective and appropriate UIs to command and control these sensors has also increased. However, despite the growing demand for SAR in operational environments, the technology still faces an adoption roadblock, in large part due to the lack of effective UIs. It is common to find operational interfaces that have barely grown beyond the disparate tools engineers and technologists developed to demonstrate an initial concept or system. While sensor usability and utility are common requirements to engineers and operators, their objectives for interacting with the sensor are different. As such, the amount and type of information presented ought to be tailored to the specific application.
NASA Astrophysics Data System (ADS)
Ali, Azhar Tareq; Warip, Mohd Nazri Mohd; Yaakob, Naimah; Abduljabbar, Waleed Khalid; Atta, Abdu Mohammed Ali
2017-11-01
Vehicular Ad-hoc Networks (VANETs) are an area of wireless technologies that is attracting a great deal of interest. Several areas of VANETs, such as security, routing protocols, and medium access control, still lack substantial research. There is also a lack of freely available simulators that can quickly and accurately simulate VANETs. The main goal of this paper is to develop a freely available VANETs simulator and to evaluate popular mobile ad-hoc network routing protocols in several VANETs scenarios. The VANETs simulator consisted of a network simulator and a traffic (mobility) simulator, and used a client-server application to keep the two simulators in sync. The VANETs simulator also models buildings to create a more realistic wireless network environment. Ad hoc On-demand Distance Vector routing (AODV), Dynamic Source Routing (DSR) and Dynamic MANET On-demand (DYMO) were initially simulated in city, country, and highway environments to provide an overall evaluation.
Spatial pattern recognition of seismic events in South West Colombia
NASA Astrophysics Data System (ADS)
Benítez, Hernán D.; Flórez, Juan F.; Duque, Diana P.; Benavides, Alberto; Lucía Baquero, Olga; Quintero, Jiber
2013-09-01
Recognition of seismogenic zones in geographical regions supports seismic hazard studies. This recognition is usually based on visual, qualitative and subjective analysis of data. Spatial pattern recognition provides a well-founded means to obtain relevant information from large amounts of data. The purpose of this work is to identify and classify spatial patterns in instrumental data from the South West Colombian seismic database. In this research, clustering tendency analysis validates whether the seismic database possesses a clustering structure. A non-supervised fuzzy clustering algorithm creates groups of seismic events. Given the sensitivity of fuzzy clustering algorithms to the initial positions of centroids, we propose a methodology to initialize centroids that generates stable partitions with respect to centroid initialization. As a result of this work, a public software tool provides the user with the routines developed for the clustering methodology. The analysis of the seismogenic zones obtained reveals meaningful spatial patterns in South West Colombia. The clustering analysis provides a quantitative location and dispersion of seismogenic zones that facilitates seismological interpretation of seismic activity in South West Colombia.
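For reference, a plain-NumPy sketch of non-supervised fuzzy clustering (standard fuzzy c-means) applied to synthetic epicentre coordinates. The fuzzifier, cluster count, and data are assumptions, and the paper's stable-initialization procedure is not reproduced here.

import numpy as np

def fuzzy_cmeans(X, c, m=2.0, iters=100, seed=0):
    """Basic fuzzy c-means: alternate centroid and membership updates.
    Returns the centroids and the n-by-c membership matrix."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    U = rng.random((n, c))
    U /= U.sum(axis=1, keepdims=True)                      # random fuzzy memberships
    for _ in range(iters):
        Um = U ** m
        centroids = (Um.T @ X) / Um.sum(axis=0)[:, None]   # weighted centroid update
        d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2) + 1e-12
        U = 1.0 / (d ** (2.0 / (m - 1.0)))                 # membership update ...
        U /= U.sum(axis=1, keepdims=True)                  # ... normalized per point
    return centroids, U

# Illustrative 2-D "epicentre" coordinates; real input would be lat/lon (and depth).
X = np.vstack([np.random.default_rng(1).normal(loc, 0.3, size=(50, 2))
               for loc in ([0, 0], [3, 3], [0, 4])])
centroids, U = fuzzy_cmeans(X, c=3)
print(np.round(centroids, 2))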
Masseroli, Marco; Stella, Andrea; Meani, Natalia; Alcalay, Myriam; Pinciroli, Francesco
2004-12-12
High-throughput technologies create the necessity to mine large amounts of gene annotations from diverse databanks, and to integrate the resulting data. Most databanks can be interrogated only via Web, for a single gene at a time, and query results are generally available only in the HTML format. Although some databanks provide batch retrieval of data via FTP, this requires expertise and resources for locally reimplementing the databank. We developed MyWEST, a tool aimed at researchers without extensive informatics skills or resources, which exploits user-defined templates to easily mine selected annotations from different Web-interfaced databanks, and aggregates and structures results in an automatically updated database. Using microarray results from a model system of retinoic acid-induced differentiation, MyWEST effectively gathered relevant annotations from various biomolecular databanks, highlighted significant biological characteristics and supported a global approach to the understanding of complex cellular mechanisms. MyWEST is freely available for non-profit use at http://www.medinfopoli.polimi.it/MyWEST/
An energy-limited model of algal biofuel production: Toward the next generation of advanced biofuels
Dunlop, Eric H.; Coaldrake, A. Kimi; Silva, Cory S.; ...
2013-10-22
Algal biofuels are increasingly important as a source of renewable energy. The absence of reliable thermodynamic and other property data, and the large amount of kinetic data that would normally be required have created a major barrier to simulation. Additionally, the absence of a generally accepted flowsheet for biofuel production means that detailed simulation of the wrong approach is a real possibility. This model of algal biofuel production estimates the necessary data and places it into a heuristic model using a commercial simulator that back-calculates the process structure required. Furthermore, complex kinetics can be obviated for now by putting the simulator into energy limitation and forcing it to solve for the missing design variables, such as bioreactor surface area, productivity, and oil content. The model does not attempt to prescribe a particular approach, but provides a guide towards a sound engineering approach to this challenging and important problem.
2014-01-01
Background Massively parallel DNA sequencing generates staggering amounts of data. Decreasing cost, increasing throughput, and improved annotation have expanded the diversity of genomics applications in research and clinical practice. This expanding scale creates analytical challenges: accommodating peak compute demand, coordinating secure access for multiple analysts, and sharing validated tools and results. Results To address these challenges, we have developed the Mercury analysis pipeline and deployed it in local hardware and the Amazon Web Services cloud via the DNAnexus platform. Mercury is an automated, flexible, and extensible analysis workflow that provides accurate and reproducible genomic results at scales ranging from individuals to large cohorts. Conclusions By taking advantage of cloud computing and with Mercury implemented on the DNAnexus platform, we have demonstrated a powerful combination of a robust and fully validated software pipeline and a scalable computational resource that, to date, we have applied to more than 10,000 whole genome and whole exome samples. PMID:24475911
Fluorometric determination of zirconium in minerals
Alford, W.C.; Shapiro, L.; White, C.E.
1951-01-01
The increasing use of zirconium in alloys and in the ceramics industry has created renewed interest in methods for its determination. It is a common constituent of many minerals, but is usually present in very small amounts. Published methods tend to be tedious, time-consuming, and uncertain as to accuracy. A new fluorometric procedure, which overcomes these objections to a large extent, is based on the blue fluorescence given by zirconium and flavonol in sulfuric acid solution. Hafnium is the only element that interferes. The sample is fused with borax glass and sodium carbonate and extracted with water. The residue is dissolved in sulfuric acid, made alkaline with sodium hydroxide to separate aluminum, and filtered. The precipitate is dissolved in sulfuric acid and electrolysed in a Melaven cell to remove iron. Flavonol is then added and the fluorescence intensity is measured with a photo-fluorometer. Analysis of seven standard mineral samples shows excellent results. The method is especially useful for minerals containing less than 0.25% zirconium oxide.
Real-time detection of hazardous materials in air
NASA Astrophysics Data System (ADS)
Schechter, Israel; Schroeder, Hartmut; Kompa, Karl L.
1994-03-01
A new detection system has been developed for real-time analysis of organic compounds in ambient air. It is based on multiphoton ionization by an unfocused laser beam in a single parallel-plate device. Thus, the ionization volume can be relatively large. The amount of laser-created ions is determined quantitatively from the induced total voltage drop between the biased plates (Q = ΔV·C). Mass information is obtained from computer analysis of the time-dependent signal. When a KrF laser (5 eV) is used, most organic compounds can be ionized in a two-photon process, but none of the standard components of atmospheric air are ionized by this process. Therefore, this instrument may be developed as a `sniffer' for organic materials. The method has been applied to benzene analysis in air. The detection limit is about 10 ppb. With a simple preconcentration technique the detection limit can be decreased to the sub-ppb range. Simple binary mixtures are also resolved.
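As a worked example of the charge-balance relation Q = ΔV·C quoted above, the short snippet below converts an assumed plate capacitance and measured voltage drop into an ion count; the numerical values are purely illustrative.

# Q = C * dV gives the total laser-created charge; dividing by the elementary
# charge gives the number of singly charged ions. Numbers are illustrative only.
ELEMENTARY_CHARGE = 1.602e-19   # coulombs
capacitance = 100e-12           # 100 pF plate capacitance (assumed)
delta_v = 2.5e-3                # 2.5 mV measured voltage drop (assumed)

charge = capacitance * delta_v          # total collected charge in coulombs
n_ions = charge / ELEMENTARY_CHARGE     # number of ion pairs created

print(f"Q = {charge:.3e} C  ->  about {n_ions:.2e} ions")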
Transforming GIS data into functional road models for large-scale traffic simulation.
Wilkie, David; Sewall, Jason; Lin, Ming C
2012-06-01
There exists a vast amount of geographic information system (GIS) data that model road networks around the world as polylines with attributes. In this form, the data are insufficient for applications such as simulation and 3D visualization-tools which will grow in power and demand as sensor data become more pervasive and as governments try to optimize their existing physical infrastructure. In this paper, we propose an efficient method for enhancing a road map from a GIS database to create a geometrically and topologically consistent 3D model to be used in real-time traffic simulation, interactive visualization of virtual worlds, and autonomous vehicle navigation. The resulting representation provides important road features for traffic simulations, including ramps, highways, overpasses, legal merge zones, and intersections with arbitrary states, and it is independent of the simulation methodologies. We test the 3D models of road networks generated by our algorithm on real-time traffic simulation using both macroscopic and microscopic techniques.
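The abstract above does not give its construction algorithm, but the core step of turning a GIS centreline polyline into lane-edge geometry can be sketched with simple per-vertex normal offsets, as below; self-intersections and road intersections are ignored, and the lane width is an assumed value.

import numpy as np

def offset_polyline(points, offset):
    """Offset a 2-D polyline sideways by `offset` metres using per-vertex
    normals -- a simplified stand-in for turning a GIS centreline into
    lane-edge geometry (no self-intersection handling)."""
    pts = np.asarray(points, dtype=float)
    d = np.gradient(pts, axis=0)                         # tangent estimate at each vertex
    normals = np.stack([-d[:, 1], d[:, 0]], axis=1)      # rotate tangents by 90 degrees
    normals /= np.linalg.norm(normals, axis=1, keepdims=True)
    return pts + offset * normals

centreline = [(0, 0), (10, 0), (20, 5), (30, 5)]
left_edge = offset_polyline(centreline, +1.75)           # half of an assumed 3.5 m lane
right_edge = offset_polyline(centreline, -1.75)
print(np.round(left_edge, 2))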
Phillips, Charles D
2015-01-01
Case-mix classification and payment systems help assure that persons with similar needs receive similar amounts of care resources, which is a major equity concern for consumers, providers, and programs. Although health service programs for adults regularly use case-mix payment systems, programs providing health services to children and youth rarely use such models. This research utilized Medicaid home care expenditures and assessment data on 2,578 children receiving home care in one large state in the USA. Using classification and regression tree analyses, a case-mix model for long-term pediatric home care was developed. The Pediatric Home Care/Expenditure Classification Model (P/ECM) grouped children and youth in the study sample into 24 groups, explaining 41% of the variance in annual home care expenditures. The P/ECM creates the possibility of a more equitable, and potentially more effective, allocation of home care resources among children and youth facing serious health care challenges.
Quinoa starch: Structure, properties, and applications.
Li, Guantian; Zhu, Fan
2018-02-01
Quinoa (Chenopodium quinoa Willd.) has gained popularity worldwide largely due to the attractive nutritional profile. It also has much potential for food security due to the great genetic diversity. Starch is the main component of quinoa grain and makes up to 70% of the dry matter. The starch plays a crucial role in functional properties of quinoa and related food products. The starch granules are rather small (∼1-3μm) with relatively low amylose contents as compared with most of the other starches. Quinoa amylopectin has significant amounts of short chains and super-long chains. These unique features have generated research interest in using the starch for food and other applications such as creating Pickering emulsions. This review summarizes the present knowledge of the isolation, composition, granular and molecular structures, physicochemical properties, modifications, and applications of quinoa starch. It becomes obvious that this starch has great potential for food and nonfood applications. Copyright © 2017 Elsevier Ltd. All rights reserved.
Jackson MSc, Richard G.; Ball, Michael; Patel, Rashmi; Hayes, Richard D.; Dobson, Richard J.B.; Stewart, Robert
2014-01-01
Observational research using data from electronic health records (EHR) is a rapidly growing area, which promises both increased sample size and data richness - therefore unprecedented study power. However, in many medical domains, large amounts of potentially valuable data are contained within the free text clinical narrative. Manually reviewing free text to obtain desired information is an inefficient use of researcher time and skill. Previous work has demonstrated the feasibility of applying Natural Language Processing (NLP) to extract information. However, in real world research environments, the demand for NLP skills outweighs supply, creating a bottleneck in the secondary exploitation of the EHR. To address this, we present TextHunter, a tool for the creation of training data, construction of concept extraction machine learning models and their application to documents. Using confidence thresholds to ensure high precision (>90%), we achieved recall measurements as high as 99% in real world use cases. PMID:25954379
Exploring Remote Sensing Products Online with Giovanni for Studying Urbanization
NASA Technical Reports Server (NTRS)
Shen, Suhung; Leptoukh, Gregory G.; Gerasimov, Irina; Kempler, Steve
2012-01-01
Recently, a large number of MODIS land products at multiple spatial resolutions have been integrated into the online system Giovanni to support studies on land cover and land use changes focused on the Northern Eurasia and Monsoon Asia regions. Giovanni (Goddard Interactive Online Visualization ANd aNalysis Infrastructure) is a Web-based application developed by the NASA Goddard Earth Sciences Data and Information Services Center (GES-DISC) providing a simple and intuitive way to visualize, analyze, and access Earth science remotely-sensed and modeled data. The customized Giovanni Web portals (Giovanni-NEESPI and Giovanni-MAIRS) were created to integrate land, atmospheric, cryospheric, and social products, enabling researchers to do quick exploration and basic analyses of land surface changes and their relationships to climate at global and regional scales. This presentation documents the MODIS land surface products in the Giovanni system. As examples, images and statistical analysis results on land surface and local climate changes associated with urbanization over the Yangtze River Delta region, China, using data in Giovanni, are shown.
Nagy, Paul G; Konewko, Ramon; Warnock, Max; Bernstein, Wendy; Seagull, Jacob; Xiao, Yan; George, Ivan; Park, Adrian
2008-03-01
Routine clinical information systems now have the ability to gather large amounts of data that surgical managers can access to create a seamless and proactive approach to streamlining operations and minimizing delays. The challenge lies in aggregating and displaying these data in an easily accessible format that provides useful, timely information on current operations. A Web-based, graphical dashboard is described in this study, which can be used to interpret clinical operational data, allow managers to see trends in data, and help identify inefficiencies that were not apparent with more traditional, paper-based approaches. The dashboard provides a visual decision support tool that assists managers in pinpointing areas for continuous quality improvement. The limitations of paper-based techniques, the development of the automated display system, and key performance indicators in analyzing aggregate delays, time, specialties, and teamwork are reviewed. Strengths, weaknesses, opportunities, and threats associated with implementing such a program in the perioperative environment are summarized.
Underground thermal generation of hydrocarbons from dry, southwestern coals
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vanderborgh, N.E.; Elliott, G.R.B.
1978-01-01
The LASL underground coal conversion concept produces intermediate-BTU fuel gas for nearby industries such as ''minemouth'' electric power plants, plus major byproducts in the form of liquid and gaseous hydrocarbons for feedstocks to chemical plants, e.g., substitute natural gas (SNG) producers. The concept involves controlling the water influx and drying the coal, generating hydrocarbons by pyrolysis, and finally gasifying the residual char with O2/CO2 or air/CO2 mixtures to produce industrial fuel gases. Underground conversion can be frustrated by uncontrolled water in the coal bed. Moisture can (a) prevent combustion, (b) preclude fuel gas formation by lowering reaction zone temperatures and creating kinetic problems, (c) ruin product gas quality by dropping temperatures into a thermodynamically unsatisfactory regime, (d) degrade an initially satisfactory fuel gas by consuming carbon monoxide, (e) waste large amounts of heat, and (f) isolate reaction zones so that the processing will bypass blocks of coal.
Dameron, Oliver J; Parke, Michael; Albins, Mark A; Brainard, Russell
2007-04-01
Large amounts of derelict fishing gear accumulate and cause damage to shallow coral reefs of the Northwestern Hawaiian Islands (NWHI). To facilitate maintenance of reefs cleaned during 1996-2005 removal efforts, we identify likely high-density debris areas by assessing reef characteristics (depth, benthic habitat type, and energy regime) that influence sub-regional debris accumulation. Previously cleaned backreef and lagoonal reefs at two NWHI locations were resurveyed for accumulated debris using two survey methods. Accumulated debris densities and weights were found to be greater in lagoonal reef areas. Sample weight-based debris densities are extrapolated to similar habitats throughout the NWHI using a spatial 'net habitat' dataset created by generalizing IKONOS satellite derivatives for depth and habitat classification. Prediction accuracy for this dataset is tested using historical debris point data. Annual NWHI debris accumulation is estimated to be 52.0 metric tonnes. For planning purposes, individual NWHI atolls/reefs are allotted a proportion of this total.
History of Chandra X-Ray Observatory
2000-03-01
The Chandra X-Ray Observatory has captured this spectacular image of G292.0+1.8, a young, oxygen-rich supernova remnant with a pulsar at its center surrounded by outflowing material. This image shows a rapidly expanding shell of gas that is 36 light-years across and contains large amounts of elements such as oxygen, neon, magnesium, silicon and sulfur. Embedded in this cloud of multimillion-degree gas is a key piece of evidence linking neutron stars and supernovae produced by the collapse of massive stars. With an age estimated at 1,600 years, G292.0+1.8 is one of three known oxygen-rich supernovae in our galaxy. These supernovae are of great interest to astronomers because they are one of the primary sources of the heavy elements necessary to form planets and people. Scattered through the image are bluish knots of emissions containing material that is highly enriched in newly created oxygen, neon, and magnesium produced deep within the original star and ejected by the supernova explosion.
Bution, Murillo L; Molina, Gustavo; Abrahão, Meissa R E; Pastore, Gláucia M
2015-01-01
Throughout human history, natural products have been the basis for the discovery and development of therapeutics, cosmetic and food compounds used in industry. Many compounds found in natural organisms are rather difficult to chemically synthesize and to extract in large amounts, and in this respect, genetic and metabolic engineering are playing an increasingly important role in the production of these compounds, such as new terpenes and terpenoids, which may potentially be used to create aromas in industry. Terpenes belong to the largest class of natural compounds, are produced by all living organisms and play a fundamental role in human nutrition, cosmetics and medicine. Recent advances in systems biology and synthetic biology are allowing us to perform metabolic engineering at the whole-cell level, thus enabling the optimal design of microorganisms for the efficient production of drugs, cosmetic and food additives. This review describes the recent advances made in the genetic and metabolic engineering of the terpenes pathway with a particular focus on systems biotechnology.
Synthesis and high-throughput processing of polymeric hydrogels for 3D cell culture.
Lowe, Stuart B; Tan, Vincent T G; Soeriyadi, Alexander H; Davis, Thomas P; Gooding, J Justin
2014-09-17
3D cell cultures have drawn a large amount of interest in the scientific community with their ability to closely mimic physiological conditions. Hydrogels have been used extensively in the development of extracellular matrix (ECM) mimics for 3D cell culture. Compounds such as collagen and fibrin are commonly used to synthesize natural ECM mimics; however they suffer from batch-to-batch variation. In this Review we explore the synthesis route of hydrogels; how they can be altered to give different chemical and physical properties; how different biomolecules such as arginylglycylaspartic acid (RGD) or vascular endothelial growth factor (VEGF) can be incorporated to give different biological cues; and how to create concentration gradients with UV light. There will also be emphasis on the types of techniques available in high-throughput processing such as nozzle and droplet-based biofabrication, photoenabled biofabrication, and microfluidics. The combination of these approaches and techniques allow the preparation of hydrogels which are capable of mimicking the ECM.
Multiple Vehicle Detection and Segmentation in Malaysia Traffic Flow
NASA Astrophysics Data System (ADS)
Fariz Hasan, Ahmad; Fikri Che Husin, Mohd; Affendi Rosli, Khairul; Norhafiz Hashim, Mohd; Faiz Zainal Abidin, Amar
2018-03-01
Vision-based systems are widely used in the field of Intelligent Transportation Systems (ITS) to extract large amounts of information for analyzing traffic scenes. The rapid growth in the number of vehicles on the road, as well as the significant increase in cameras, has dictated the need for traffic surveillance systems. Such systems can take over the burdensome task previously performed by human operators in traffic monitoring centres. The main technique proposed by this paper is a multiple-vehicle detection and segmentation approach focused on monitoring through Closed Circuit Television (CCTV) video. The system is able to automatically segment vehicles extracted from a heavy traffic scene by optical flow estimation alongside a blob analysis technique in order to detect the moving vehicles. Prior to segmentation, the blob analysis technique computes the area of the region of interest corresponding to a moving vehicle, which is used to create a bounding box on that particular vehicle. Experimental validation of the proposed system was performed, and the algorithm is demonstrated on various sets of traffic scenes.
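A hedged sketch of the optical-flow-plus-blob pipeline described above, assuming Python with OpenCV (cv2): threshold the dense flow magnitude and take connected components as candidate vehicles. The Farneback parameters, magnitude threshold, and minimum blob area are illustrative values rather than the paper's settings.

import cv2
import numpy as np

def detect_moving_vehicles(prev_gray, gray, mag_thresh=2.0, min_area=500):
    """Dense optical flow + blob analysis: threshold the flow magnitude,
    clean the mask, then return bounding boxes of the remaining blobs."""
    flow = cv2.calcOpticalFlowFarneback(prev_gray, gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    mag = np.linalg.norm(flow, axis=2)                     # per-pixel motion magnitude
    mask = (mag > mag_thresh).astype(np.uint8)
    mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, np.ones((5, 5), np.uint8))
    n, labels, stats, _ = cv2.connectedComponentsWithStats(mask)
    return [tuple(stats[i, :4]) for i in range(1, n)       # (x, y, w, h) per blob
            if stats[i, cv2.CC_STAT_AREA] >= min_area]

# usage: pass two consecutive grayscale frames, then draw each (x, y, w, h)
# box with cv2.rectangle for monitoring.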
Shinno, Yuki; Kage, Hidenori; Chino, Haruka; Inaba, Atsushi; Arakawa, Sayaka; Noguchi, Satoshi; Amano, Yosuke; Yamauchi, Yasuhiro; Tanaka, Goh; Nagase, Takahide
2018-01-01
Talc pleurodesis is commonly performed to manage refractory pleural effusion or pneumothorax. It is considered as a safe procedure as long as a limited amount of large particle size talc is used. However, acute respiratory distress syndrome (ARDS) is a rare but serious complication after talc pleurodesis. We sought to determine the risk factors for the development of ARDS after pleurodesis using a limited amount of large particle size talc. We retrospectively reviewed patients who underwent pleurodesis with talc or OK-432 at the University of Tokyo Hospital. Twenty-seven and 35 patients underwent chemical pleurodesis using large particle size talc (4 g or less) or OK-432, respectively. Four of 27 (15%) patients developed ARDS after talc pleurodesis. Patients who developed ARDS were significantly older than those who did not (median 80 vs 66 years, P = 0.02) and had a higher prevalence of underlying interstitial abnormalities on chest computed tomography (CT; 2/4 vs 1/23, P < 0.05). No patient developed ARDS after pleurodesis with OK-432. This is the first case series of ARDS after pleurodesis using a limited amount of large particle size talc. Older age and underlying interstitial abnormalities on chest CT seem to be risk factors for developing ARDS after talc pleurodesis. © 2017 Asian Pacific Society of Respirology.
9 CFR 381.412 - Reference amounts customarily consumed per eating occasion.
Code of Federal Regulations, 2013 CFR
2013-01-01
... appropriate national food consumption surveys. (2) The Reference Amounts are calculated for an infant or child... are based on data set forth in appropriate national food consumption surveys. Such Reference Amounts... child under 4 years of age. (3) An appropriate national food consumption survey includes a large sample...
9 CFR 381.412 - Reference amounts customarily consumed per eating occasion.
Code of Federal Regulations, 2012 CFR
2012-01-01
... appropriate national food consumption surveys. (2) The Reference Amounts are calculated for an infant or child... are based on data set forth in appropriate national food consumption surveys. Such Reference Amounts... child under 4 years of age. (3) An appropriate national food consumption survey includes a large sample...
9 CFR 317.312 - Reference amounts customarily consumed per eating occasion.
Code of Federal Regulations, 2013 CFR
2013-01-01
... appropriate national food consumption surveys. (2) The Reference Amounts are calculated for an infant or child... are based on data set forth in appropriate national food consumption surveys. Such Reference Amounts... child under 4 years of age. (3) An appropriate national food consumption survey includes a large sample...
9 CFR 317.312 - Reference amounts customarily consumed per eating occasion.
Code of Federal Regulations, 2014 CFR
2014-01-01
... appropriate national food consumption surveys. (2) The Reference Amounts are calculated for an infant or child... are based on data set forth in appropriate national food consumption surveys. Such Reference Amounts... child under 4 years of age. (3) An appropriate national food consumption survey includes a large sample...
9 CFR 381.412 - Reference amounts customarily consumed per eating occasion.
Code of Federal Regulations, 2014 CFR
2014-01-01
... appropriate national food consumption surveys. (2) The Reference Amounts are calculated for an infant or child... are based on data set forth in appropriate national food consumption surveys. Such Reference Amounts... child under 4 years of age. (3) An appropriate national food consumption survey includes a large sample...
9 CFR 317.312 - Reference amounts customarily consumed per eating occasion.
Code of Federal Regulations, 2012 CFR
2012-01-01
... appropriate national food consumption surveys. (2) The Reference Amounts are calculated for an infant or child... are based on data set forth in appropriate national food consumption surveys. Such Reference Amounts... child under 4 years of age. (3) An appropriate national food consumption survey includes a large sample...
Bag For Formulating And Dispersing Intravenous Solution
NASA Technical Reports Server (NTRS)
Kipp, Jim; Owens, Jim; Scharf, Mike; Finley, Mike; Dudar, Tom; Veillon, Joe; Ogle, Jim
1993-01-01
Large-volume parenteral (LVP) bag in which a predetermined amount of sterile solution is formulated by combining a premeasured, prepackaged amount of sterile solute with a predetermined amount of water. The bag is designed to hold a predetermined amount, typically 1 L, of sterile solution. Sterility of the solution is maintained during mixing by passing the water into the bag through a sterilizing filter. The system can be used in the field, in hospitals that lack proper sterile facilities, and in field research.
Low-authority control synthesis for large space structures
NASA Technical Reports Server (NTRS)
Aubrun, J. N.; Margulies, G.
1982-01-01
The control of vibrations of large space structures by distributed sensors and actuators is studied. A procedure is developed for calculating the feedback loop gains required to achieve specified amounts of damping. For moderate damping (Low Authority Control) the procedure is purely algebraic, but it can be applied iteratively when larger amounts of damping are required and is generalized for arbitrary time invariant systems.
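The paper's algebraic procedure covers many distributed sensor/actuator pairs; the sketch below shows only the single-mode, collocated rate-feedback special case, where the gain needed for a target damping ratio follows directly from the modal mass and frequency. Parameter values are illustrative assumptions.

import numpy as np

def velocity_feedback_gain(mass, stiffness, zeta_target, zeta_open=0.0):
    """Gain g of a rate-feedback loop, f = -g * xdot, needed to raise the
    damping ratio of a single mode m*xddot + c*xdot + k*x = f from
    zeta_open to zeta_target (small-gain, low-authority regime)."""
    wn = np.sqrt(stiffness / mass)                  # natural frequency of the mode
    return 2.0 * mass * wn * (zeta_target - zeta_open)

g = velocity_feedback_gain(mass=10.0, stiffness=4000.0, zeta_target=0.05)
print(round(g, 2))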
Sze, Sing-Hoi; Parrott, Jonathan J; Tarone, Aaron M
2017-12-06
While the continued development of high-throughput sequencing has facilitated studies of entire transcriptomes in non-model organisms, the incorporation of an increasing number of RNA-Seq libraries has made de novo transcriptome assembly difficult. Although algorithms that can assemble a large amount of RNA-Seq data are available, they are generally very memory-intensive and can only be used to construct small assemblies. We develop a divide-and-conquer strategy that allows these algorithms to be utilized, by subdividing a large RNA-Seq data set into small libraries. Each individual library is assembled independently by an existing algorithm, and a merging algorithm is developed to combine these assemblies by picking a subset of high-quality transcripts to form a large transcriptome. When compared to existing algorithms that return a single assembly directly, this strategy achieves accuracy comparable to or better than that of memory-efficient algorithms that can process a large amount of RNA-Seq data, and accuracy comparable to or slightly lower than that of memory-intensive algorithms that can only be used to construct small assemblies. Our divide-and-conquer strategy allows memory-intensive de novo transcriptome assembly algorithms to be utilized to construct large assemblies.
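A toy illustration of the merging step in such a divide-and-conquer strategy: assemble each library independently with any existing assembler, then greedily keep the longest transcripts and drop sequences fully contained in ones already kept. This containment rule is a simplification of the paper's quality-based transcript selection, and the sequences are fabricated stand-ins for real assembler output.

def merge_assemblies(assemblies, min_len=200):
    """Greedy merge of per-library assemblies: consider transcripts longest-first
    and drop any sequence fully contained in one already kept."""
    candidates = sorted(
        (seq for transcripts in assemblies for seq in transcripts if len(seq) >= min_len),
        key=len, reverse=True)
    kept = []
    for seq in candidates:
        if not any(seq in longer for longer in kept):
            kept.append(seq)
    return kept

# Each inner list stands for the output of one independently assembled library.
asm_a = ["ATGCGTACGTTAGC" * 20, "GGGTTTAAACCC" * 30]
asm_b = ["ATGCGTACGTTAGC" * 20, "TTACGGATCCAA" * 25]
merged = merge_assemblies([asm_a, asm_b])
print(len(merged), [len(s) for s in merged])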
ERIC Educational Resources Information Center
Weber, Jonathan
2006-01-01
Creating a digital library might seem like a task best left to a large research collection with a vast staff and generous budget. However, tools for successfully creating digital libraries are getting easier to use all the time. The explosion of people creating content for the web has led to the availability of many high-quality applications and…
A MBD-seq protocol for large-scale methylome-wide studies with (very) low amounts of DNA.
Aberg, Karolina A; Chan, Robin F; Shabalin, Andrey A; Zhao, Min; Turecki, Gustavo; Staunstrup, Nicklas Heine; Starnawska, Anna; Mors, Ole; Xie, Lin Y; van den Oord, Edwin Jcg
2017-09-01
We recently showed that, after optimization, our methyl-CpG binding domain sequencing (MBD-seq) application approximates the methylome-wide coverage obtained with whole-genome bisulfite sequencing (WGB-seq), but at a cost that enables adequately powered large-scale association studies. A prior drawback of MBD-seq is the relatively large amount of genomic DNA (ideally >1 µg) required to obtain high-quality data. Biomaterials are typically expensive to collect, provide a finite amount of DNA, and may simply not yield sufficient starting material. The ability to use low amounts of DNA will increase the breadth and number of studies that can be conducted. Therefore, we further optimized the enrichment step. With this low starting material protocol, MBD-seq performed equally well, or better, than the protocol requiring ample starting material (>1 µg). Using only 15 ng of DNA as input, there is minimal loss in data quality, achieving 93% of the coverage of WGB-seq (with standard amounts of input DNA) at similar false/positive rates. Furthermore, across a large number of genomic features, the MBD-seq methylation profiles closely tracked those observed for WGB-seq with even slightly larger effect sizes. This suggests that MBD-seq provides similar information about the methylome and classifies methylation status somewhat more accurately. Performance decreases with <15 ng DNA as starting material but, even with as little as 5 ng, MBD-seq still achieves 90% of the coverage of WGB-seq with comparable genome-wide methylation profiles. Thus, the proposed protocol is an attractive option for adequately powered and cost-effective methylome-wide investigations using (very) low amounts of DNA.
Grand Coulee - Bell 500-kV Transmission Line Project, Draft Environmental Impact Statement
DOE Office of Scientific and Technical Information (OSTI.GOV)
N/A
2002-08-09
BPA is proposing to construct a 500-kilovolt (kV) transmission line that would extend approximately 84 miles between the Grand Coulee 500-kV Switchyard, near Grand Coulee Dam, and the Bell Substation in Mead, just north of Spokane. The new line would cross portions of Douglas, Grant, Lincoln, and Spokane counties. In addition to the transmission line, new equipment would be installed at the substations at each end of the new line and at other facilities. The proposed action would remove an existing 115-kV transmission line and replace it with the new 500-kV line on existing right-of-way for most of its length. Additional right-of-way would be needed in the first 3.5 miles out of the Grand Coulee Switchyard to connect to the existing 115-kV right-of-way. Since the mid-1990s, the transmission path west of Spokane, called the West of Hatwai transmission pathway, has grown increasingly constrained. To date, BPA has been able to manage operation of the path through available operating practices, and customer needs have been met while maintaining the reliability of the path. However, in early 2001, operations showed that the amount of electricity that needs to flow from east to west along this path creates severe transmission congestion. Under these conditions, the system is at risk of overloads and violation of industry safety and reliability standards. The problem is particularly acute in the spring and summer months because of the large amount of power generated by dams east of the path. Large amounts of water cannot be spilled during that time in order for BPA to fulfill its obligation to protect threatened and endangered fish. The amount of power that needs to move through this area during these months at times could exceed the carrying capacity of the existing transmission lines. If additional capacity is not added, BPA will run a significant risk that it will not be able to continue to meet its contractual obligations to deliver power and maintain reliability standards that minimize risks to public safety and to equipment. BPA is considering two construction alternatives, the Agency Proposed Action and the Alternative Action. The Alternative Action would include all the components of the Preferred Action, except that a double-circuit line would be constructed in the Spokane area between a point about 2 miles west of the Spokane River and Bell Substation, a distance of about 9 miles. BPA is also considering the No Action Alternative.
Win-Win for Wind and Wildlife: A Vision to Facilitate Sustainable Development
Kiesecker, Joseph M.; Evans, Jeffrey S.; Fargione, Joe; Doherty, Kevin; Foresman, Kerry R.; Kunz, Thomas H.; Naugle, Dave; Nibbelink, Nathan P.; Niemuth, Neal D.
2011-01-01
Wind energy offers the potential to reduce carbon emissions while increasing energy independence and bolstering economic development. However, wind energy has a larger land footprint per Gigawatt (GW) than most other forms of energy production, making appropriate siting and mitigation particularly important. Species that require large unfragmented habitats and those known to avoid vertical structures are particularly at risk from wind development. Developing energy on disturbed lands rather than placing new developments within large and intact habitats would reduce cumulative impacts to wildlife. The U.S. Department of Energy estimates that it will take 241 GW of terrestrial-based wind development on approximately 5 million hectares to reach 20% electricity production for the U.S. by 2030. We estimate there are ∼7,700 GW of potential wind energy available across the U.S., with ∼3,500 GW on disturbed lands. In addition, a disturbance-focused development strategy would avert the development of ∼2.3 million hectares of undisturbed lands while generating the same amount of energy as development based solely on maximizing wind potential. Wind subsidies targeted at favoring low-impact developments and creating avoidance and mitigation requirements that raise the costs for projects impacting sensitive lands could improve public value for both wind energy and biodiversity conservation. PMID:21533285
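As a rough back-of-envelope check of the land-use figures quoted above (the per-GW footprint is only implied by the DOE estimate, so the arithmetic below is illustrative rather than taken from the paper):

    # Back-of-envelope arithmetic using the figures quoted in the abstract.
    doe_target_gw = 241              # terrestrial wind build-out for 20% electricity by 2030
    doe_footprint_ha = 5_000_000     # approximate hectares for that build-out

    ha_per_gw = doe_footprint_ha / doe_target_gw
    print(f"Implied footprint: ~{ha_per_gw:,.0f} ha per GW")      # roughly 20,700 ha/GW

    potential_gw_total = 7_700       # estimated total U.S. wind potential
    potential_gw_disturbed = 3_500   # portion of that potential on disturbed lands
    share = potential_gw_disturbed / potential_gw_total
    print(f"Share of potential on disturbed lands: {share:.0%}")  # roughly 45%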
Shenkin, Susan D.; Pernet, Cyril; Nichols, Thomas E.; Poline, Jean-Baptiste; Matthews, Paul M.; van der Lugt, Aad; Mackay, Clare; Lanyon, Linda; Mazoyer, Bernard; Boardman, James P.; Thompson, Paul M.; Fox, Nick; Marcus, Daniel S.; Sheikh, Aziz; Cox, Simon R.; Anblagan, Devasuda; Job, Dominic E.; Dickie, David Alexander; Rodriguez, David; Wardlaw, Joanna M.
2017-01-01
Brain imaging is now ubiquitous in clinical practice and research. The case for bringing together large amounts of image data from well-characterised healthy subjects and those with a range of common brain diseases across the life course is now compelling. This report follows a meeting of international experts from multiple disciplines, all interested in brain image biobanking. The meeting included neuroimaging experts (clinical and non-clinical), computer scientists, epidemiologists, clinicians, ethicists, and lawyers involved in creating brain image banks. The meeting followed a structured format to discuss current and emerging brain image banks; applications such as atlases; conceptual and statistical problems (e.g. defining ‘normality’); legal, ethical and technological issues (e.g. consents, potential for data linkage, data security, harmonisation, data storage and enabling of research data sharing). We summarise the lessons learned from the experiences of a wide range of individual image banks, and provide practical recommendations to enhance creation, use and reuse of neuroimaging data. Our aim is to maximise the benefit of the image data, provided voluntarily by research participants and funded by many organisations, for human health. Our ultimate vision is of a federated network of brain image biobanks accessible for large studies of brain structure and function. PMID:28232121
Large-scale control of the Arabian Sea monsoon inversion in August
NASA Astrophysics Data System (ADS)
Wu, Chi-Hua; Wang, S.-Y. Simon; Hsu, Huang-Hsiung
2017-12-01
The summer monsoon inversion in the Arabian Sea is characterized by extensive low cloud cover and peaks in August. Atmospheric stratification associated with the monsoon inversion has been considered a local system influenced by the advance of the India-Pakistan monsoon. Empirical and numerical evidence from this study suggests that the Arabian Sea monsoon inversion is linked to a broader-scale monsoon evolution across the African Sahel, South Asia, and the East Asia-Western North Pacific (WNP), rather than being a mere byproduct of the India-Pakistan monsoon progression. In August, the upper-tropospheric anticyclone over South Asia extends laterally, coinciding with enhanced precipitation in the subtropical WNP, equatorial Indian Ocean, and African Sahel, while the middle part of this anticyclone weakens over the Arabian Sea. The increased heating in the adjacent monsoon systems exerts a suppressing effect on the Arabian Sea, suggesting an apparent competition among the Africa-Asia-WNP monsoon subsystems. The peak Sahel rainfall in August, together with enhanced heating in the equatorial Indian Ocean, plays a critical role in strengthening the Arabian Sea thermal inversion. By contrast, the WNP monsoon onset, which signifies the eastward expansion of the subtropical Asian monsoon heating, might play a secondary or opposite role in the Arabian Sea monsoon inversion.
An experimental investigation of two large annular diffusers with swirling and distorted inflow
NASA Technical Reports Server (NTRS)
Eckert, W. T.; Johnston, J. P.; Simons, T. D.; Mort, K. W.; Page, V. R.
1980-01-01
Two annular diffusers downstream of a nacelle-mounted fan were tested for aerodynamic performance, measured in terms of two static pressure recovery parameters (one near the diffuser exit plane and one about three diameters downstream in the settling duct) in the presence of several inflow conditions. The two diffusers each had an inlet diameter of 1.84 m, an area ratio of 2.3, and an equivalent cone angle of 11.5°, but were distinguished by centerbodies of different lengths. The dependence of diffuser performance on various combinations of swirling, radially distorted, and/or azimuthally distorted inflow was examined. Swirling flow and distortions in the axial velocity profile in the annulus upstream of the diffuser inlet were caused by the intrinsic flow patterns downstream of a fan in a duct and by artificial intensification of the distortions. Azimuthal distortions or defects were generated by the addition of four artificial devices (screens and fences). Pressure recovery data indicated beneficial effects of both radial distortion (for a limited range of distortion levels) and inflow swirl. Small amounts of azimuthal distortion created by the artificial devices produced only small effects on diffuser performance. A large artificial distortion device was required to produce enough azimuthal flow distortion to significantly degrade the diffuser static pressure recovery.
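For context, the two static-pressure-recovery parameters referred to above are presumably of the conventional form shown below (an assumption for illustration; the report defines its own parameters), evaluated at the diffuser-exit and settling-duct measurement stations:

    C_p = \frac{\bar{p}_{\text{station}} - \bar{p}_{\text{inlet}}}{\tfrac{1}{2}\,\rho\,\bar{V}_{\text{inlet}}^{2}}

where \bar{p} is an area-averaged static pressure and \bar{V}_{\text{inlet}} is the mean axial velocity at the diffuser inlet. For an ideal, loss-free diffuser of area ratio AR, C_p approaches 1 - 1/AR^2, or about 0.81 for the area ratio of 2.3 tested here, which gives a sense of scale for the measured recoveries.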
An Intelligent Tool for Activity Data Collection
Jehad Sarkar, A. M.
2011-01-01
Activity recognition systems using simple and ubiquitous sensors require a large variety of real-world sensor data for not only evaluating their performance but also training the systems for better functioning. However, a tremendous amount of effort is required to setup an environment for collecting such data. For example, expertise and resources are needed to design and install the sensors, controllers, network components, and middleware just to perform basic data collections. It is therefore desirable to have a data collection method that is inexpensive, flexible, user-friendly, and capable of providing large and diverse activity datasets. In this paper, we propose an intelligent activity data collection tool which has the ability to provide such datasets inexpensively without physically deploying the testbeds. It can be used as an inexpensive and alternative technique to collect human activity data. The tool provides a set of web interfaces to create a web-based activity data collection environment. It also provides a web-based experience sampling tool to take the user’s activity input. The tool generates an activity log using its activity knowledge and the user-given inputs. The activity knowledge is mined from the web. We have performed two experiments to validate the tool’s performance in producing reliable datasets. PMID:22163832
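A hypothetical, much-simplified illustration of the log-generation idea described above (the toy knowledge table, field names, and timing model are invented for the sketch and are not the tool's actual schema):

    # Generate a synthetic activity log from a small "activity knowledge" table
    # plus user-supplied inputs; an illustrative sketch, not the authors' tool.
    import random
    from datetime import datetime, timedelta

    # Toy knowledge base: activity -> (typical location, typical duration in minutes).
    ACTIVITY_KNOWLEDGE = {
        "prepare breakfast": ("kitchen", 20),
        "take shower": ("bathroom", 15),
        "watch tv": ("living room", 60),
    }

    def generate_log(user_activities, start=datetime(2011, 1, 1, 7, 0), jitter=0.2):
        """Turn a user's ordered activity list into timestamped log entries."""
        log, t = [], start
        for act in user_activities:
            location, minutes = ACTIVITY_KNOWLEDGE[act]
            duration = timedelta(minutes=minutes * random.uniform(1 - jitter, 1 + jitter))
            log.append({"activity": act, "location": location,
                        "start": t.isoformat(), "end": (t + duration).isoformat()})
            t += duration
        return log

    if __name__ == "__main__":
        for event in generate_log(["prepare breakfast", "take shower", "watch tv"]):
            print(event)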
Major transitions in information technology.
Valverde, Sergi
2016-08-19
When looking at the history of technology, we can see that not all inventions are of equal importance. Only a few technologies have the potential to start a new branching series (specifically, by increasing diversity), have a lasting impact on human life, and ultimately become turning points. Technological transitions correspond to times and places in the past when a large number of novel artefact forms or behaviours appeared together or in rapid succession. Why does that happen? Is technological change continuous and gradual, or does it occur in sudden leaps and bounds? The evolution of information technology (IT) allows for a quantitative and theoretical approach to technological transitions. The value of information systems experiences sudden changes (i) when we learn how to use this technology, (ii) when we accumulate a large amount of information, and (iii) when communities of practice create and exchange free information. The coexistence between gradual improvements and discontinuous technological change is a consequence of the asymmetric relationship between complexity and hardware and software. Using a cultural evolution approach, we suggest that sudden changes in the organization of ITs depend on the high costs of maintaining and transmitting reliable information. This article is part of the themed issue 'The major synthetic evolutionary transitions'. © 2016 The Author(s).
Preserving and vouchering butterflies and moths for large-scale museum-based molecular research
Epstein, Samantha W.; Mitter, Kim; Hamilton, Chris A.; Plotkin, David; Mitter, Charles
2016-01-01
Butterflies and moths (Lepidoptera) comprise significant portions of the world’s natural history collections, but a standardized tissue preservation protocol for molecular research is largely lacking. Lepidoptera have traditionally been spread on mounting boards to display wing patterns and colors, which are often important for species identification. Many molecular phylogenetic studies have used legs from pinned specimens as the primary source for DNA in order to preserve a morphological voucher, but the amount of available tissue is often limited. Preserving an entire specimen in a cryogenic freezer is ideal for DNA preservation, but without an easily accessible voucher it can make specimen identification, verification, and morphological work difficult. Here we present a procedure that creates accessible and easily visualized “wing vouchers” of individual Lepidoptera specimens, and preserves the remainder of the insect in a cryogenic freezer for molecular research. Wings are preserved in protective holders so that both dorsal and ventral patterns and colors can be easily viewed without further damage. Our wing vouchering system has been implemented at the University of Maryland (AToL Lep Collection) and the University of Florida (Florida Museum of Natural History, McGuire Center of Lepidoptera and Biodiversity), which house two of the largest Lepidoptera molecular collections in the world. PMID:27366654
Roehl, Edwin A.; Conrads, Paul
2010-01-01
This is the second of two papers that describe how data mining can aid natural-resource managers with the difficult problem of controlling the interactions between hydrologic and man-made systems. Data mining is a new science that assists scientists in converting large databases into knowledge, and is uniquely able to leverage the large amounts of real-time, multivariate data now being collected for hydrologic systems. Part 1 gives a high-level overview of data mining, and describes several applications that have addressed major water resource issues in South Carolina. This Part 2 paper describes how various data mining methods are integrated to produce predictive models for controlling surface- and groundwater hydraulics and quality. The methods, two of which are sketched in the example that follows, include:
- signal processing to remove noise and decompose complex signals into simpler components;
- time series clustering that optimally groups hundreds of signals into "classes" that behave similarly, for data reduction and (or) divide-and-conquer problem solving;
- classification, which optimally matches new data to behavioral classes;
- artificial neural networks, which optimally fit multivariate data to create predictive models;
- model response surface visualization, which greatly aids in understanding data and physical processes; and
- decision support systems that integrate data, models, and graphics into a single package that is easy to use.
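A minimal sketch of how two of these steps (time-series clustering followed by class-specific neural-network models) might fit together, using assumed data shapes and illustrative scikit-learn settings that are not drawn from the paper:

    # Cluster many sensor time series into behavioral classes, then fit a
    # neural-network model per class; illustrative sketch, not the authors' code.
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(0)
    signals = rng.normal(size=(200, 96))        # 200 gauges, 96 time steps each (toy data)
    target = signals[:, -1] * 0.5 + rng.normal(scale=0.1, size=200)  # toy response variable

    # Step 1: group signals that behave similarly (data reduction / divide-and-conquer).
    classes = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(signals)

    # Step 2: fit a separate predictive model to each behavioral class.
    models = {}
    for c in np.unique(classes):
        X, y = signals[classes == c], target[classes == c]
        models[c] = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000,
                                 random_state=0).fit(X, y)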
Focazio, M.J.; Speiran, G.K.
1993-01-01
The groundwater-flow system of the Virginia Coastal Plain consists of areally extensive and interconnected aquifers. Large, regionally coalescing cones of depression, caused by large withdrawals of water, are found in these aquifers. Local groundwater systems are affected by regional pumping because of the interactions within the system of aquifers. Accordingly, these local systems are affected by regional groundwater flow and by spatial and temporal differences in withdrawals by various users. A geographic information system (GIS) was used to refine a regional groundwater-flow model around selected withdrawal centers. A method was developed in which drawdown maps simulated by the regional groundwater-flow model, combined with the principle of superposition, could be used to estimate drawdown at local sites. The method was applied to create drawdown maps in the Brightseat/Upper Potomac Aquifer for periods of 3, 6, 9, and 12 months for Chesapeake, Newport News, Norfolk, Portsmouth, Suffolk, and Virginia Beach, Virginia. Withdrawal rates were supplied by the individual localities and remained constant for each simulation period. This provides an efficient method by which individual local groundwater users can determine the amount of drawdown produced by their wells in a groundwater system that is a water source for multiple users and that is affected by regional-flow systems.
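A minimal sketch of drawdown superposition using the classical Theis solution (the transmissivity, storativity, well locations, and pumping rates below are invented illustrative values, not those of the Virginia Coastal Plain model):

    # Superposition of drawdowns from multiple wells via the Theis solution.
    import numpy as np
    from scipy.special import exp1   # exponential integral E1 = Theis well function W(u)

    T = 500.0      # transmissivity, m^2/day (assumed)
    S = 1e-4       # storativity, dimensionless (assumed)
    t = 90.0       # pumping time in days (roughly the 3-month period mentioned above)

    wells = [      # ((x, y) location in m, pumping rate Q in m^3/day), all assumed
        ((0.0, 0.0), 2000.0),
        ((1500.0, 500.0), 1500.0),
    ]

    def theis_drawdown(xy, well_xy, Q):
        """Drawdown at point xy due to one well pumping at constant rate Q."""
        r = np.hypot(xy[0] - well_xy[0], xy[1] - well_xy[1])
        u = r**2 * S / (4.0 * T * t)
        return Q / (4.0 * np.pi * T) * exp1(u)

    # Principle of superposition: total drawdown is the sum over all wells.
    point = (750.0, 250.0)
    total = sum(theis_drawdown(point, w_xy, Q) for w_xy, Q in wells)
    print(f"Drawdown at {point}: {total:.2f} m")

Because the governing equations are linear, the same additive logic lets each locality's contribution to drawdown be mapped separately and then summed, which is the essence of the method described above.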
NASA Technical Reports Server (NTRS)
2002-01-01
[figure removed for brevity, see original site] (Released 5 July 2002) This is an image of a crater within part of Amazonis Planitia, located at 22.9N, 152.5W. This image features a number of common features exhibited by Martian craters. The crater is sufficiently large to exhibit a central peak, which can be seen in the upper right-hand corner of the image. Also apparent are the slump blocks on the inside of the crater walls. When the crater was first formed, the crater walls were unstable and subsequently collapsed in a series of landslides that formed the hummocky terrain just inside the present crater wall. While these cratering features are common to craters formed on other planetary bodies, such as the Moon, the ejecta blanket surrounding the crater displays a morphology that is more characteristic of Mars. The lobate morphology implies that the ejecta blanket was emplaced in an almost fluid fashion rather than by traditional ballistic ejecta emplacement. This crater morphology occurs on Mars where water ice is suspected to be present just beneath the surface. The impact that created the crater would have had enough energy to melt large amounts of subsurface ice, producing water that could form the mud or debris flows that characterize the ejecta morphology seen in this image.
Patterson, C.J.; Boerboom, Terrence
1999-01-01
Minnesota is largely underlain by Precambrian crystalline bedrock that was weathered to an average depth of 30 m prior to Late Cretaceous time. The fresh-rock-weathered-rock interface is irregular, with as much as 45 m of relief. Weathering exploited joints, locally isolating meter-sized volumes of rock known as corestones. Variable amounts of residuum were removed through glaciation to leave (1) saprolite overlain by an in-situ Late Cretaceous soil profile; (2) partially eroded saprolite; and (3) undulating fresh rock surfaces (commonly mantled by rounded boulders) that display striae and glacial or fluvial polish. Significant subglacial erosion of fresh bedrock is not required to form smoothly undulating bedrock surfaces with closed depressions; they may also form through removal of weathered bedrock and exposure of the weathering front. Large rounded boulders are not always shaped during transport; they may represent chemically rounded corestones resting at or near the bedrock source. Unambiguous evidence for glacial erosion includes striae and streamlining of bedrock parallel to striae. Polish on rock can be created fluvially, and smoothed grooves and ridges in the rock may be chemically produced. Many rounded boulders found in glacial till and strewn on bedrock surfaces probably originated as corestones.
Enterprise 2.0: An Extended Technology Acceptance Model
ERIC Educational Resources Information Center
Kurz, James M.
2012-01-01
The amount of information that people produce is changing, especially as social networking becomes more commonplace and globalization inefficiencies continue to swamp enterprise. Companies are rising to the challenge to create a collaborative approach for information management, but according to many leading technology advisory firms, they have…