A Small District's Big Innovator
ERIC Educational Resources Information Center
Butler, Kevin
2010-01-01
This article profiles Cashton (Wisconsin) Public Schools Superintendent Brad Saron, who has brought his personal passion for technology to bear in his job leading the 584-student Cashton Public Schools. As a principal and then as superintendent, he introduced iPod Touches, iPads, wireless…
Flint, Lorraine E.; Brandt, Justin; Christensen, Allen H.; Flint, Alan L.; Hevesi, Joseph A.; Jachens, Robert; Kulongoski, Justin T.; Martin, Peter; Sneed, Michelle
2012-01-01
The Big Bear Valley, located in the San Bernardino Mountains of southern California, has increased in population in recent years. Most of the water supply for the area is pumped from the alluvial deposits that form the Big Bear Valley groundwater basin. This study was conducted to better understand the thickness and structure of the groundwater basin in order to estimate the quantity and distribution of natural recharge to Big Bear Valley. A gravity survey was used to estimate the thickness of the alluvial deposits that form the Big Bear Valley groundwater basin. This determined that the alluvial deposits reach a maximum thickness of 1,500 to 2,000 feet beneath the center of Big Bear Lake and the area between Big Bear and Baldwin Lakes, and decrease to less than 500 feet thick beneath the eastern end of Big Bear Lake. Interferometric Synthetic Aperture Radar (InSAR) was used to measure pumping-induced land subsidence and to locate structures, such as faults, that could affect groundwater movement. The measurements indicated small amounts of land deformation (uplift and subsidence) in the area between Big Bear Lake and Baldwin Lake, the area near the city of Big Bear Lake, and the area near Sugarloaf, California. Both the gravity and InSAR measurements indicated the possible presence of subsurface faults in subbasins between Big Bear and Baldwin Lakes, but additional data are required for confirmation. The distribution and quantity of groundwater recharge in the area were evaluated by using a regional water-balance model (Basin Characterization Model, or BCM) and a daily rainfall-runoff model (INFILv3). The BCM calculated spatially distributed potential recharge in the study area of approximately 12,700 acre-feet per year (acre-ft/yr) of potential in-place recharge and 30,800 acre-ft/yr of potential runoff. 
Using the assumption that only 10 percent of the runoff becomes recharge, this approach indicated there is approximately 15,800 acre-ft/yr of total recharge in Big Bear Valley. The INFILv3 model was modified for this study to include a perched zone beneath the root zone to better simulate lateral seepage and recharge in the shallow subsurface in mountainous terrain. The climate input used in the INFILv3 model was developed by using daily climate data from 84 National Climatic Data Center stations and published Parameter Regression on Independent Slopes Model (PRISM) average monthly precipitation maps to match the drier average monthly precipitation measured in the Baldwin Lake drainage basin. This model resulted in a good representation of localized rain-shadow effects and calibrated well to measured lake volumes at Big Bear and Baldwin Lakes. The simulated average annual recharge was about 5,480 acre-ft/yr in the Big Bear study area, with about 2,800 acre-ft/yr in the Big Bear Lake surface-water drainage basin and about 2,680 acre-ft/yr in the Baldwin Lake surface-water drainage basin. One spring and eight wells were sampled and analyzed for chemical and isotopic data in 2005 and 2006 to determine if isotopic techniques could be used to assess the sources and ages of groundwater in the Big Bear Valley. This approach showed that the predominant source of recharge to the Big Bear Valley is winter precipitation falling on the surrounding mountains. The tritium and uncorrected carbon-14 ages of samples collected from wells for this study indicated that the groundwater basin contains water of different ages, ranging from modern to about 17,200 years old. The results of these investigations provide an understanding of the lateral and vertical extent of the groundwater basin, the spatial distribution of groundwater recharge, the processes responsible for the recharge, and the source and age of groundwater in the groundwater basin.
Although the studies do not provide an understanding of the detailed water-bearing properties necessary to determine the groundwater availability of the basin, they do provide a framework for the future development of a groundwater model that would help to improve the understanding of the potential hydrologic effects of water-management alternatives in Big Bear Valley.
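The total-recharge figure quoted in the abstract follows directly from the BCM numbers and the stated 10-percent assumption; a minimal check of that arithmetic, using only values given above:

```python
# Water-balance recharge estimate for Big Bear Valley, using the BCM
# figures quoted in the abstract (all values in acre-feet per year).
in_place_recharge = 12_700    # potential in-place recharge
potential_runoff = 30_800     # potential runoff
runoff_fraction = 0.10        # stated assumption: 10% of runoff becomes recharge

total_recharge = in_place_recharge + runoff_fraction * potential_runoff
# 12,700 + 3,080 = 15,780, consistent with the reported ~15,800 acre-ft/yr
```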
Federal Register 2010, 2011, 2012, 2013, 2014
2013-10-24
... Regulations To Prohibit Public Contact With Big Cats, Bears, and Nonhuman Primates AGENCY: Animal and Plant... into direct or physical contact with big cats, bears, or nonhuman primates of any age, to define the... coming. FOR FURTHER INFORMATION CONTACT: Dr. Barbara Kohn, DVM, Senior Staff Officer, USDA, APHIS, Animal...
Stable isotope and trace element studies of black bear hair, Big Bend ecosystem, Texas and Mexico
Shanks, W.C. Pat; Hellgren, Eric C.; Stricker, Craig A.; Gemery-Hill, Pamela A.; Onorato, David P.
2008-01-01
Hair from black bears (Ursus americanus), collected from four areas in the Big Bend ecosystem, has been analyzed for stable isotopes of carbon, nitrogen, and sulfur to determine major food sources and for trace metals to infer possible effects of environmental contaminants. Results indicate that black bears are largely vegetarian, feeding on desert plants, nuts, and berries. Mercury concentrations in bear hair are below safe level standards…
76 FR 54415 - Proposed Flood Elevation Determinations
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-01
... following flooding sources: Bear Creek (backwater effects from Cumberland River), Big Renox Creek (backwater effects from Cumberland River), Big Whetstone Creek (backwater effects from Cumberland River), Big Willis... River), Big Renox Creek (backwater effects from Cumberland River), Big Whetstone Creek (backwater...
Eolian deposits in the Neoproterozoic Big Bear Group, San Bernardino Mountains, California, USA
NASA Astrophysics Data System (ADS)
Stewart, John H.
2005-12-01
Strata interpreted to be eolian are recognized in the Neoproterozoic Big Bear Group in the San Bernardino Mountains of southern California, USA. The strata consist of medium- to large-scale (30 cm to > 6 m) cross-stratified quartzite considered to be eolian dune deposits and interstratified thinly laminated quartzite that are problematically interpreted as either eolian translatent climbing ripple laminae, or as tidal-flat deposits. High index ripples and adhesion structures considered to be eolian are associated with the thinly laminated and cross-stratified strata. The eolian strata are in a succession that is characterized by flaser bedding, aqueous ripple marks, mudcracks, and interstratified small-scale cross-strata that are suggestive of a tidal environment containing local fluvial deposits. The eolian strata may have formed in a near-shore environment inland of a tidal flat. The Neoproterozoic Big Bear Group is unusual in the western United States and may represent a remnant of strata that were originally more widespread and part of the hypothetical Neoproterozoic supercontinent of Rodinia. The Big Bear Group perhaps is preserved only in blocks that were downdropped along Neoproterozoic extensional faults. The eolian deposits of the Big Bear Group may have been deposited during arid conditions that preceded worldwide glacial events in the late Neoproterozoic. Possibly similar pre-glacial arid events are recognized in northern Mexico, northeast Washington, Australia, and northwest Canada.
Bears, Big and Little. Young Discovery Library Series.
ERIC Educational Resources Information Center
Pfeffer, Pierre
This book is written for children 5 through 10. Part of a series designed to develop their curiosity, fascinate them and educate them, this volume describes: (1) the eight species of bears, including black bear, brown bear, grizzly bear, spectacled bear, sun bear, sloth bear, polar bear, and giant panda; (2) geographical habitats of bears; (3)…
Federal Register 2010, 2011, 2012, 2013, 2014
2012-07-05
... by habitat loss due primarily to residential development and recreational encroachment (Big Wildlife... Great Basin ecosystem (Big Wildlife and NoBearHuntNV.org 2011, p. 13). The petition asserts that loss of... hair) from two American black bear populations: Lake Tahoe Basin, Nevada, and Yosemite National Park...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-09-12
... SECURITIES AND EXCHANGE COMMISSION [File No. 500-1] Big Bear Mining Corp., Four Rivers BioEnergy, Inc., Mainland Resources, Inc., QI Systems Inc., South Texas Oil Co., and Synova Healthcare Group, Inc... concerning the securities of Four Rivers BioEnergy, Inc. because it has not filed any periodic reports since...
Hedgehogs and foxes (and a bear)
NASA Astrophysics Data System (ADS)
Gibb, Bruce
2017-02-01
The chemical universe is big. Really big. You just won't believe how vastly, hugely, mind-bogglingly big it is. Bruce Gibb reminds us that it's somewhat messy too, and so we succeed by recognizing the limits of our knowledge.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-06-27
..., Springerville Muni, RNAV (GPS) RWY 21, Amdt 1 Big Bear City, CA, Big Bear City, RNAV (GPS) RWY 26, Orig-A Marina... Baltimore, MD, Martin State, RNAV (GPS) RWY 15, Amdt 1 Great Falls, MT, Great Falls Intl, GPS RWY 21, Orig-A, CANCELLED Great Falls, MT, Great Falls Intl, ILS OR LOC/DME RWY 3, ILS RWY 3 (SA CAT I), ILS RWY 3 (CAT II...
77 FR 17007 - Kootenai National Forest, Cabinet Ranger District, Montana Pilgrim Timber Sale Project
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-23
... fire use. Big game forage would be enhanced through use of prescribed fire to rejuvenate and increase... open road density in areas managed for big game summer range. Subsequent analyses of potential... for big game species, notably elk, deer, and bears. Generally, these areas are on southerly aspects...
Infrared Observations with the 1.6 Meter New Solar Telescope in Big Bear: Origins of Space Weather
2015-05-21
with the NST came in the Summer of 2009, while the first observations corrected by adaptive optics (AO) came in the Summer of 2010 and first vector...magnetograms (VMGs) in the Summer of 2011. In 2012, a new generation of solar adaptive optics (AO) developed in Big Bear led to hitherto only...upon which the NST has yielded key information. Our concentration on sunspots in the second year of funding arises because of the improved resolution
Ruth, T.E.; Smith, D.W.; Haroldson, M.A.; Buotte, P.C.; Schwartz, C.C.; Quigley, H.B.; Cherry, S.; Tyres, D.; Frey, K.
2003-01-01
The Greater Yellowstone Ecosystem contains the rare combination of an intact guild of native large carnivores, their prey, and differing land management policies (National Park versus National Forest; no hunting versus hunting). Concurrent field studies on large carnivores allowed us to investigate activities of humans and carnivores on Yellowstone National Park's (YNP) northern boundary. Prior to and during the backcountry big-game hunting season, we monitored movements of grizzly bears (Ursus arctos), wolves (Canis lupus), and cougars (Puma concolor) on the northern boundary of YNP. Daily aerial telemetry locations (September 1999), augmented with weekly telemetry locations (August and October 1999), were obtained for 3 grizzly bears, 7 wolves in 2 groups of 1 pack, and 3 cougars in 1 family group. Grizzly bears were more likely located inside the YNP boundary during the pre-hunt period and north of the boundary once hunting began. The cougar family tended to be found outside YNP during the pre-hunt period and moved inside YNP when hunting began. Wolves did not significantly change their movement patterns during the pre-hunt and hunting periods. Qualitative information on elk (Cervus elaphus) indicated they moved into YNP once hunting started, suggesting that cougars followed living prey or responded to hunting activity, grizzly bears focused on dead prey (e.g., gut piles, crippled elk), and wolves may have taken advantage of both. Measures of association (Jacob's Index) were positive within carnivore species but inconclusive among species. Further collaborative research and the use of new technologies such as Global Positioning System (GPS) telemetry collars will advance our ability to understand these species, the carnivore community and its interactions, and human influences on carnivores.
Fiber optic sensor system for detecting movement or position of a rotating wheel bearing
Veeser, Lynn R.; Rodriguez, Patrick J.; Forman, Peter R.; Monahan, Russell E.; Adler, Jonathan M.
1997-01-01
An improved fiber optic sensor system and integrated sensor bearing assembly for detecting movement or position of a rotating wheel bearing having a multi-pole tone ring which produces an alternating magnetic field indicative of movement and position of the rotating member. A magneto-optical material, such as a bismuth garnet iron (B.I.G.) crystal, having discrete magnetic domains is positioned in the vicinity of the tone ring so that the domains align themselves to the magnetic field generated by the tone ring. A single fiber optic cable, preferably single mode fiber, carries light generated by a source of light to the B.I.G. crystal. The light passes through the B.I.G. crystal and is refracted at domain boundaries in the crystal. The intensity of the refracted light is indicative of the amount of alignment of the domains and therefore the strength of the magnetic field. The refracted light is carried by the fiber optic cable to an optic receiver where the intensity is measured and an electrical signal is generated and sent to a controller indicating the frequency of the changes in light intensity and therefore the rotational speed of the rotating wheel bearing.
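The final step the abstract describes, converting the frequency of light-intensity changes into rotational speed, is simple arithmetic. A minimal sketch; the function name and the 48-pole-pair tone ring are illustrative assumptions, not details from the patent:

```python
def wheel_speed_rev_per_s(pulse_frequency_hz, pole_pairs):
    """Convert the pulse frequency measured by the optic receiver into
    wheel rotational speed: each pole pair on the tone ring produces one
    full magnetic-field cycle (one intensity pulse) per revolution."""
    return pulse_frequency_hz / pole_pairs

# Example: 480 intensity pulses per second from a 48-pole-pair tone ring
speed = wheel_speed_rev_per_s(480.0, 48)  # 10 revolutions per second
```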
Corsi, Steven R; Harwell, Glenn R; Geis, Steven W; Bergman, Daniel
2006-11-01
From October 2002 to April 2004, data were collected from Dallas/Fort Worth (DFW) International Airport (TX, U.S.A.) outfalls and receiving waters (Trigg Lake and Big Bear Creek) to document the magnitude and potential effects of aircraft deicer and anti-icer fluid (ADAF) runoff on water quality. Glycol concentrations at outfalls ranged from less than 18 to 23,800 mg/L, whereas concentrations in Big Bear Creek were less because of dilution, dispersion, and degradation, ranging from less than 18 to 230 mg/L. Annual loading results indicate that 10 and 35% of what was applied to aircraft was discharged to Big Bear Creek in 2003 and 2004, respectively. Glycol that entered Trigg Lake was diluted and degraded before reaching the lake outlet. Dissolved oxygen (DO) concentrations at airport outfalls sometimes were low (<2.0 mg/L) but typical of what was measured in an urban reference stream. In comparison, the DO concentration at Trigg Lake monitoring sites was consistently greater than 5.5 mg/L during the monitoring period, probably because of the installation of aerators in the lake by DFW personnel. The DO concentration in Big Bear Creek was very similar at sites upstream and downstream of airport influence (>5.0 mg/L). Results of toxicity tests indicate that effects on Ceriodaphnia dubia, Pimephales promelas, and Selenastrum capricornutum are influenced by type IV ADAF (anti-icer), not just type I ADAF (deicer) as is more commonly assumed.
Health Informatics Scientists' Perception About Big Data Technology.
Minou, John; Routsis, Fotios; Gallos, Parisis; Mantas, John
2017-01-01
The aim of this paper is to present the perceptions of Health Informatics Scientists about Big Data Technology in Healthcare. An empirical study was conducted among 46 scientists to assess their knowledge of Big Data Technology and their perceptions about using this technology in healthcare. Based on the study findings, 86.7% of the scientists had knowledge of Big Data Technology. Furthermore, 59.1% of the scientists believed that Big Data Technology refers to structured data. Additionally, 100% of the population believed that Big Data Technology can be implemented in Healthcare. Finally, the majority did not know of any cases of Big Data Technology use in Greece, while 57.8% of them mentioned that they knew of use cases of Big Data Technology abroad.
Beyond the Bells and Whistles: Technology Skills for a Purpose.
ERIC Educational Resources Information Center
Eisenberg, Michael B.
2001-01-01
Discusses the goal of K-12 education to have students learn to use technology, defines computer literacy, and describes the Big6 process model that helps solve information problems. Highlights include examples of technology in Big6 contexts, Big6 and the Internet, and the Big6 as a conceptual framework for meaningful technology use. (LRW)
Technology for a Purpose: Technology for Information Problem-Solving with the Big6[R].
ERIC Educational Resources Information Center
Eisenberg, Mike B
2003-01-01
Explains the Big6 model of information problem solving as a conceptual framework for learning and teaching information and technology skills. Highlights include information skills; examples of integrating technology in Big6 contexts; and the Big6 and the Internet, including email, listservs, chat, Web browsers, search engines, portals, Web…
Application and Prospect of Big Data in Water Resources
NASA Astrophysics Data System (ADS)
Xi, Danchi; Xu, Xinyi
2017-04-01
Because of developed information technology and affordable data storage, we have entered the era of data explosion. The term "Big Data" and the technology related to it have been created and are commonly applied in many fields. However, academic studies have only recently paid attention to Big Data applications in water resources. As a result, water-resource Big Data technology has not been fully developed. This paper introduces the concept of Big Data and its key technologies, including the Hadoop system and MapReduce. In addition, this paper focuses on the significance of applying big data in water resources and summarizes prior research by others. Most studies in this field only set up a theoretical frame, but we define "Water Big Data" and explain its three-dimensional properties: the time dimension, spatial dimension, and intelligent dimension. Based on HBase, the classification system of Water Big Data is introduced: hydrology data, ecology data, and socio-economic data. Then, after analyzing the challenges in water resources management, a series of solutions using Big Data technologies, such as data mining and web crawlers, are proposed. Finally, the prospect of applying big data in water resources is discussed; it can be predicted that as Big Data technology keeps developing, "3D" (Data-Driven Decision) will be utilized more in water resources management in the future.
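MapReduce, one of the key technologies named in the abstract above, splits a computation into a map phase that emits key-value pairs and a reduce phase that aggregates the values for each key. A minimal in-memory sketch, with hypothetical station discharge readings standing in for hydrology data:

```python
from collections import defaultdict

# Hypothetical daily readings: (station id, discharge in cubic meters per second)
readings = [("A", 3.2), ("B", 1.1), ("A", 2.8), ("B", 0.9), ("A", 4.0)]

# Map phase: emit (key, value) pairs; here each reading maps to itself.
mapped = [(station, flow) for station, flow in readings]

# Shuffle phase: group all emitted values by key.
grouped = defaultdict(list)
for station, flow in mapped:
    grouped[station].append(flow)

# Reduce phase: aggregate each key's values (mean discharge per station).
mean_flow = {s: sum(vals) / len(vals) for s, vals in grouped.items()}
```

In a real Hadoop deployment the map and reduce functions run in parallel across many nodes, and the shuffle happens over the network; the dataflow, however, is exactly this three-step pattern.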
Semantic Web technologies for the big data in life sciences.
Wu, Hongyan; Yamaguchi, Atsuko
2014-08-01
The life sciences field is entering an era of big data with the breakthroughs of science and technology. More and more big data-related projects and activities are being performed in the world. Life sciences data generated by new technologies are continuing to grow in not only size but also variety and complexity, with great speed. To ensure that big data has a major influence in the life sciences, comprehensive data analysis across multiple data sources and even across disciplines is indispensable. The increasing volume of data and the heterogeneous, complex varieties of data are two principal issues mainly discussed in life science informatics. The ever-evolving next-generation Web, characterized as the Semantic Web, is an extension of the current Web, aiming to provide information for not only humans but also computers to semantically process large-scale data. The paper presents a survey of big data in life sciences, big data related projects and Semantic Web technologies. The paper introduces the main Semantic Web technologies and their current situation, and provides a detailed analysis of how Semantic Web technologies address the heterogeneous variety of life sciences big data. The paper helps to understand the role of Semantic Web technologies in the big data era and how they provide a promising solution for the big data in life sciences.
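The Semantic Web model surveyed above represents data as subject-predicate-object triples that can be queried by pattern matching. A toy illustration in plain Python; the identifiers are made up for the example, not drawn from any real ontology:

```python
# Hypothetical (subject, predicate, object) triples in the RDF style
# used by Semantic Web technologies.
triples = [
    ("gene:BRCA1", "encodes", "protein:BRCA1"),
    ("protein:BRCA1", "involvedIn", "process:DNA_repair"),
    ("gene:TP53", "encodes", "protein:p53"),
]

def match(subject=None, predicate=None, obj=None):
    """Return triples matching the pattern; None is a wildcard,
    mirroring a basic SPARQL-style triple pattern."""
    return [t for t in triples
            if (subject is None or t[0] == subject)
            and (predicate is None or t[1] == predicate)
            and (obj is None or t[2] == obj)]

encodes = match(predicate="encodes")  # every gene-to-protein assertion
```

Linking heterogeneous life-science sources then amounts to merging triple sets that share identifiers, which is what makes the model attractive for cross-database integration.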
The New Improved Big6 Workshop Handbook. Professional Growth Series.
ERIC Educational Resources Information Center
Eisenberg, Michael B.; Berkowitz, Robert E.
This handbook is intended to help classroom teachers, teacher-librarians, technology teachers, administrators, parents, community members, and students to learn about the Big6 Skills approach to information and technology skills, to use the Big6 process in their own activities, and to implement a Big6 information and technology skills program. The…
Application and Exploration of Big Data Mining in Clinical Medicine.
Zhang, Yue; Guo, Shu-Li; Han, Li-Na; Li, Tie-Ling
2016-03-20
To review theories and technologies of big data mining and their application in clinical medicine. Literatures published in English or Chinese regarding theories and technologies of big data mining and the concrete applications of data mining technology in clinical medicine were obtained from PubMed and Chinese Hospital Knowledge Database from 1975 to 2015. Original articles regarding big data mining theory/technology and big data mining's application in the medical field were selected. This review characterized the basic theories and technologies of big data mining including fuzzy theory, rough set theory, cloud theory, Dempster-Shafer theory, artificial neural network, genetic algorithm, inductive learning theory, Bayesian network, decision tree, pattern recognition, high-performance computing, and statistical analysis. The application of big data mining in clinical medicine was analyzed in the fields of disease risk assessment, clinical decision support, prediction of disease development, guidance of rational use of drugs, medical management, and evidence-based medicine. Big data mining has the potential to play an important role in clinical medicine.
Big data processing in the cloud - Challenges and platforms
NASA Astrophysics Data System (ADS)
Zhelev, Svetoslav; Rozeva, Anna
2017-12-01
Choosing the appropriate architecture and technologies for a big data project is a difficult task, which requires extensive knowledge in both the problem domain and in the big data landscape. The paper analyzes the main big data architectures and the most widely implemented technologies used for processing and persisting big data. Clouds provide for dynamic resource scaling, which makes them a natural fit for big data applications. Basic cloud computing service models are presented. Two architectures for processing big data are discussed, the Lambda and Kappa architectures. Technologies for big data persistence are presented and analyzed. Stream processing, as the most important and the most difficult aspect to manage, is outlined. The paper highlights the main advantages of the cloud and potential problems.
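The Lambda architecture discussed above pairs a batch layer (complete but stale results) with a speed layer (only the events since the last batch run) and merges the two at query time. A toy sketch with hypothetical event counts; the sensor names and view contents are invented for illustration:

```python
# Batch view: complete counts up to the last batch run (accurate but stale).
batch_view = {"sensor-1": 100, "sensor-2": 40}

# Speed view: counts for events that arrived after the batch run (fresh).
speed_view = {"sensor-1": 3, "sensor-3": 7}

def query(key):
    """Serving layer: merge the batch and speed views at query time."""
    return batch_view.get(key, 0) + speed_view.get(key, 0)
```

The Kappa architecture simplifies this by dropping the batch layer entirely and recomputing from a replayable event log when logic changes.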
NASA Astrophysics Data System (ADS)
Antonello, E.
2009-08-01
Arcturus is the brightest star in Bootes. The ancient Greek name Arktouros means Bear Guard. The star, however, is not close to Ursa Major (Big She-Bear) or Ursa Minor (Little She-Bear), as the name would suggest. This curious discrepancy could be explained by the star's proper motion, assuming the name Bear Guard is a remote cultural heritage. The proper-motion analysis could also give us insight into an ancient myth regarding Ursa Major. Though we cannot scientifically explain such a myth, some interesting suggestions can be obtained about its possible origin, in the context of present knowledge of the importance of the cult of the bear both in Palaeolithic times and among several primitive populations of modern times, as shown by ethnological studies.
Opportunity and Challenges for Migrating Big Data Analytics in Cloud
NASA Astrophysics Data System (ADS)
Amitkumar Manekar, S.; Pradeepini, G., Dr.
2017-08-01
Big Data Analytics is a big term nowadays. As data-generation capabilities grow more demanding and more scalable, data acquisition and storage become crucial issues. Cloud storage is a widely used platform, and the technology will become crucial to executives handling data powered by analytics. The trend toward "big data-as-a-service" is now talked about everywhere. On one hand, cloud-based big data analytics directly tackles ongoing issues of scale, speed, and cost; on the other, researchers are working to solve security and other real-time problems of big data migration to cloud platforms. This article is specifically focused on finding possible ways to migrate big data to the cloud. Technology that supports coherent data migration, and the possibility of doing big data analytics on a cloud platform, are in demand for a new era of growth. This article also gives information about available technologies and techniques for the migration of big data to the cloud.
The Welfare of Bears in Zoos: A Case Study of Poland.
Maślak, Robert; Sergiel, Agnieszka; Bowles, David; Paśko, Łukasz
2016-01-01
The welfare of captive bears became a big issue of concern in Poland when a case of a bear being ill-treated became a high-profile case in the media. This case created a challenge to verify, study, and understand the main problems associated with bear keeping so that zoos could significantly improve the conditions in which they keep bears or ensure they keep bears at the minimum required standards. The results presented here are from 1 of the few countrywide studies of captive bear conditions conducted in all the captive institutions in Poland that keep bears. Thirteen institutions kept bears at the time of the study (2007-2009), including 54 individuals of 5 species. Major welfare problems were identified, and the results have been used to challenge zoos to address the changes required and focus the government's attention on areas that require legislative improvement.
THE BERKELEY DATA ANALYSIS SYSTEM (BDAS): AN OPEN SOURCE PLATFORM FOR BIG DATA ANALYTICS
2017-09-01
Evan Sparks, Oliver Zahn, Michael J. Franklin, David A. Patterson, Saul Perlmutter. Scientific Computing Meets Big Data Technology: An Astronomy ...Processing Astronomy Imagery Using Big Data Technology. IEEE Transactions on Big Data, 2016.
The 2016 Transit of Mercury Observed from Major Solar Telescopes and Satellites
NASA Astrophysics Data System (ADS)
Pasachoff, Jay M.; Schneider, Glenn; Gary, Dale; Chen, Bin; Sterling, Alphonse C.; Reardon, Kevin P.; Dantowitz, Ronald; Kopp, Greg A.
2016-10-01
We report observations from the ground and space of the 9 May 2016 transit of Mercury. We build on our explanation of the black-drop effect in transits of Venus based on spacecraft observations of the 1999 transit of Mercury (Schneider, Pasachoff, and Golub, Icarus 168, 249, 2004). In 2016, we used the 1.6-m New Solar Telescope at the Big Bear Solar Observatory with active optics to observe Mercury's transit at high spatial resolution. We again saw a small black-drop effect as 3rd contact neared, confirming the data that led to our earlier explanation as a confluence of the point-spread function and the extreme solar limb darkening (Pasachoff, Schneider, and Golub, in IAU Colloq. 196, 2004). We again used IBIS on the Dunn Solar Telescope of the Sacramento Peak Observatory, as A. Potter continued his observations, previously made at the 2006 transit of Mercury, at both telescopes of the sodium exosphere of Mercury (Potter, Killen, Reardon, and Bida, Icarus 226, 172, 2013). We imaged the transit with IBIS as well as with two RED Epic IMAX-quality cameras alongside it, one with a narrow passband. We show animations of our high-resolution ground-based observations along with observations from XRT on JAXA's Hinode and from NASA's Solar Dynamics Observatory. Further, we report on the limit of the transit change in the Total Solar Irradiance, continuing our interest from the transit of Venus TSI (Schneider, Pasachoff, and Willson, ApJ 641, 565, 2006; Pasachoff, Schneider, and Willson, AAS 2005), using NASA's SORCE/TIM and the Air Force's TCTE/TIM. See http://transitofvenus.info and http://nicmosis.as.arizona.edu.Acknowledgments: We were glad for the collaboration at Big Bear of Claude Plymate and his colleagues of the staff of the Big Bear Solar Observatory. We also appreciate the collaboration on the transit studies of Robert Lucas (Sydney, Australia) and Evan Zucker (San Diego, California). 
JMP appreciates the sabbatical hospitality of the Division of Geosciences and Planetary Sciences of the California Institute of Technology, and of Prof. Andrew Ingersoll there. The solar observations lead into the 2017 eclipse studies, for which JMP is supported by grants from the NSF AGS and National Geographic CRE.
A water-quality reconnaissance of Big Bear Lake, San Bernardino County, California, 1972-1973
Irwin, George A.; Lemons, Michael
1974-01-01
A water-quality reconnaissance study of the Big Bear Lake area in southern California was made by the U.S. Geological Survey from April 1972 through April 1973. The primary purpose of the study was to measure the concentration and distribution of selected primary nutrients, organic carbon, dissolved oxygen, phytoplankton, and water temperature in the lake. Estimates of the nitrogen, phosphorus, and silica loading to the lake from surface-water tributaries and precipitation were also made. Results of the study indicate that Big Bear Lake is moderately eutrophic, at least in regard to nitrogen, phosphorus, and organic content. Nitrate was found in either trace concentrations or below detectable limits; however, ammonia nitrogen was usually detected in concentrations greater than 0.05 milligrams per liter. Orthophosphate phosphorus was detected in mean concentrations ranging from 0.01 to 0.05 milligrams per liter. Organic nitrogen and phosphorus were also detected in measurable concentrations. Seasonal levels of dissolved oxygen indicated that the nutrients and other controlling factors were optimum for relatively high primary productivity. However, production varied both seasonally and areally in the lake. Primary productivity seemed highest in the eastern and middle parts of the lake. The middle and western parts of the lake exhibited severe oxygen deficits in the deeper water during the warmer summer months of June and July 1972.
BigBWA: approaching the Burrows-Wheeler aligner to Big Data technologies.
Abuín, José M; Pichel, Juan C; Pena, Tomás F; Amigo, Jorge
2015-12-15
BigBWA is a new tool that uses the Big Data technology Hadoop to boost the performance of the Burrows-Wheeler aligner (BWA). Important reductions in the execution times were observed when using this tool. In addition, BigBWA is fault tolerant and it does not require any modification of the original BWA source code. BigBWA is available at the project GitHub repository: https://github.com/citiususc/BigBWA. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
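The core idea the abstract describes, running unmodified BWA on partitioned read data under Hadoop, can be illustrated by the data-partitioning step. The sketch below splits a FASTQ line stream into standalone chunks of complete records; it is illustrative only, not code from the BigBWA repository, and the function names are made up here.

```python
# Sketch of the data-partitioning idea behind a Hadoop-based BWA wrapper:
# split a FASTQ stream into chunks of complete 4-line records so each
# chunk can be aligned independently by an unmodified `bwa` process.
# Illustrative only; not taken from the BigBWA source.

from itertools import islice

def fastq_records(lines):
    """Yield FASTQ records as 4-line tuples (header, seq, plus, qual)."""
    it = iter(lines)
    while True:
        rec = tuple(islice(it, 4))
        if not rec:
            return
        if len(rec) != 4 or not rec[0].startswith("@"):
            raise ValueError("truncated or malformed FASTQ record")
        yield rec

def split_fastq(lines, reads_per_chunk):
    """Group records into chunks; each chunk is a valid standalone FASTQ."""
    chunk = []
    for rec in fastq_records(lines):
        chunk.append(rec)
        if len(chunk) == reads_per_chunk:
            yield chunk
            chunk = []
    if chunk:
        yield chunk
```

In a Hadoop setting, each chunk would become one map task that shells out to `bwa` and emits alignments, which is why no change to the original BWA source is needed.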
Update on coal in Big Horn basin, Montana and Wyoming
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jones, R.W.
1983-08-01
The Big Horn Coal basin is located within the topographic and structural basin of the same name and is defined by the limits of the Upper Cretaceous Mesaverde Formation in northwestern Wyoming and the Eagle Sandstone in south-central Montana. The coal in this basin ranges in rank from high volatile C bituminous (based primarily on resistance to weathering) to subbituminous B coal. In general, the Mesaverde and Eagle coals are highest in heat content, averaging over 10,500 Btu/lb; the Fort Union coals in the Red Lodge-Bear Creek and Grass Creek fields average about 10,200 Btu/lb and are second highest in heating value. The Meeteetse Formation contains coals that average 9,800 Btu/lb, the lowest heating values in the basin. An average heating value for all coal in the basin is slightly less than 10,000 Btu/lb. The average sulfur content of all coals in this basin is less than 1%, with a range of 0.4 to 2.2%. Coal mining in the Big Horn Coal basin began in the late 1880s in the Red Lodge field and has continued to the present. Almost 53 million tons of coal have been mined in the basin; nearly 78% of this production (41 million tons) is from bituminous Fort Union coal beds in the Red Lodge-Bear Creek and Bridger coal fields, Montana. Original in-place resources for the Big Horn Coal basin are given by rank of coal: 1,265.12 million tons of bituminous coal resources have been calculated for the Silvertip field, Wyoming, and the Red Lodge-Bear Creek and Bridger fields, Montana; 563.78 million tons of subbituminous resources have been calculated for the remaining Wyoming coal fields.
Research on Technology Innovation Management in Big Data Environment
NASA Astrophysics Data System (ADS)
Ma, Yanhong
2018-02-01
With the continuous development and progress of the information age, the demand for information keeps growing, and the processing and analysis of information data are moving toward ever larger scales. The growing volume of data places higher demands on processing technology. The explosive growth of information data in today's society has prompted the advent of the era of big data. At present, producing and processing various kinds of information and data carries greater value and significance in people's lives. How to use big data technology to process and analyze information data quickly, and thereby improve the level of big data management, is an important step in promoting the current development of information and data processing technology in our country. To some extent, innovative research on the management methods of information technology in the era of big data can enhance our overall strength and put China in an invincible position in the development of the big data era.
77 FR 74829 - Notice of Public Meeting-Cloud Computing and Big Data Forum and Workshop
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-18
...--Cloud Computing and Big Data Forum and Workshop AGENCY: National Institute of Standards and Technology... Standards and Technology (NIST) announces a Cloud Computing and Big Data Forum and Workshop to be held on... followed by a one-day hands-on workshop. The NIST Cloud Computing and Big Data Forum and Workshop will...
Teaching Information & Technology Skills: The Big6[TM] in Secondary Schools.
ERIC Educational Resources Information Center
Eisenberg, Michael B.; Berkowitz, Robert E.
This companion volume to a previous work focusing on the Big6 Approach in elementary schools provides secondary school classroom teachers, teacher-librarians, and technology teachers with the background and tools necessary to implement an integrated Big6 program. The first part of this book explains the Big6 approach and the rationale behind it.…
Preliminary geologic map of the Big Bear City 7.5' Quadrangle, San Bernardino County, California
Miller, Fred K.; Cossette, Pamela M. (digital preparation)
2004-01-01
This data set maps and describes the geology of the Big Bear City 7.5' quadrangle, San Bernardino County, California. Created using Environmental Systems Research Institute's ARC/INFO software, the data base consists of the following items: (1) a rock-unit coverage and attribute tables (polygon and arc) containing geologic contacts, units, and rock-unit labels as annotation, which are also included in a separate annotation coverage, bbc_anno; (2) a point coverage containing structural point data; and (3) a coverage containing fold axes. In addition, the data set includes the following graphic and text products: (1) a PostScript graphic plot-file containing the geologic map, topography, cultural data, a Correlation of Map Units (CMU) diagram, a Description of Map Units (DMU), an index map, a regional geologic and structure map, and an explanation for point and line symbols; (2) PDF files of the Readme (including the metadata file as an appendix) and a screen graphic of the plot produced by the PostScript plot file. The geologic map describes a geologically complex area on the north side of the San Bernardino Mountains. Bedrock units in the Big Bear City quadrangle are dominated by (1) large Cretaceous granitic bodies ranging in composition from monzogranite to gabbro, (2) metamorphosed sedimentary rocks ranging in age from late Paleozoic to late Proterozoic, and (3) Middle Proterozoic gneiss. These rocks are complexly deformed by normal, reverse, and thrust faults, and in places are tightly folded. The geologic map database contains original U.S. Geological Survey data generated by detailed field observation and by interpretation of aerial photographs. The map data were compiled on base-stable cronaflex copies of the Big Bear City 7.5' topographic map, transferred to a scribe-guide, and subsequently digitized. Lines, points, and polygons were edited at the USGS using standard ARC/INFO commands.
Digitizing and editing artifacts significant enough to display at a scale of 1:24,000 were corrected. Within the database, geologic contacts are represented as lines (arcs), geologic units as polygons, and site-specific data as points. Polygon, arc, and point attribute tables (.pat, .aat, and .pat, respectively) uniquely identify each geologic datum.
Research on Implementing Big Data: Technology, People, & Processes
ERIC Educational Resources Information Center
Rankin, Jenny Grant; Johnson, Margie; Dennis, Randall
2015-01-01
When many people hear the term "big data", they primarily think of a technology tool for the collection and reporting of data of high variety, volume, and velocity. However, the complexity of big data is not only the technology, but the supporting processes, policies, and people supporting it. This paper was written by three experts to…
[Relevance of big data for molecular diagnostics].
Bonin-Andresen, M; Smiljanovic, B; Stuhlmüller, B; Sörensen, T; Grützkau, A; Häupl, T
2018-04-01
Big data analysis raises the expectation that computerized algorithms may extract new knowledge from otherwise unmanageable vast data sets. What are the algorithms behind the big data discussion? In principle, high-throughput technologies in molecular research introduced big data, and the development and application of analysis tools, into the field of rheumatology some 15 years ago. This includes especially omics technologies, such as genomics, transcriptomics and cytomics. Some basic methods of data analysis are provided along with the technology; however, functional analysis and interpretation require adaptation of existing software tools or the development of new ones. For these steps, structuring and evaluation according to the biological context are extremely important, and not only a mathematical problem. This aspect has to be considered much more for molecular big data than for data analyzed in health economics or epidemiology. Molecular data are structured in a first order determined by the applied technology and present quantitative characteristics that follow the principles of their biological nature. These biological dependencies have to be integrated into software solutions, which may require networks of molecular big data from the same or even different technologies in order to achieve cross-technology confirmation. Increasingly extensive recording of molecular processes, including in individual patients, is generating personal big data and requires new management strategies in order to develop data-driven, individualized interpretation concepts. With this perspective in mind, translation of information derived from molecular big data will also require new specifications for education and professional competence.
[Applications of eco-environmental big data: Progress and prospect].
Zhao, Miao Miao; Zhao, Shi Cheng; Zhang, Li Yun; Zhao, Fen; Shao, Rui; Liu, Li Xiang; Zhao, Hai Feng; Xu, Ming
2017-05-18
With the advance of internet and wireless communication technology, the fields of ecology and environment have entered a new digital era, with the amount of data growing explosively and big data technologies attracting more and more attention. The eco-environmental big data is based on airborne, space-based and land-based observations of ecological and environmental factors, and its ultimate goal is to integrate multi-source and multi-scale data for information mining by taking advantage of cloud computing, artificial intelligence, and modeling technologies. In comparison with other fields, the eco-environmental big data has its own characteristics, such as diverse data formats and sources, data collected with various protocols and standards, and serving different clients and organizations with special requirements. Big data technology has been applied worldwide in ecological and environmental fields, including global climate prediction, ecological network observation and modeling, and regional air pollution control. The development of eco-environmental big data in China is facing many problems, such as data sharing issues, outdated monitoring facilities and technologies, and insufficient data mining capacity. Despite all this, big data technology is critical to solving eco-environmental problems, improving prediction and warning accuracy for eco-environmental catastrophes, and boosting scientific research in the field in China. We expect that the eco-environmental big data will contribute significantly to policy making and environmental services and management, and thus to sustainable development and eco-civilization construction in China in the coming decades.
Gu, Fu; Ma, Buqing; Guo, Jianfeng; Summers, Peter A; Hall, Philip
2017-10-01
Management of Waste Electrical and Electronic Equipment (WEEE) is a vital part of solid waste management, yet some difficult issues still require attention. This paper investigates the potential of applying the Internet of Things (IoT) and Big Data as solutions to WEEE management problems. The massive data generated during the production, consumption and disposal of Electrical and Electronic Equipment (EEE) fit the characteristics of Big Data. Using state-of-the-art communication technologies, the IoT derives the WEEE "Big Data" from the life cycle of EEE, and Big Data technologies process the WEEE "Big Data" to support decision making in WEEE management. A framework for implementing the IoT and Big Data technologies is proposed, and its multiple layers are illustrated. Case studies with potential application scenarios of the framework are presented and discussed. As an unprecedented exploration, the combined application of the IoT and Big Data technologies in WEEE management brings a series of opportunities as well as new challenges. This study provides insights and visions for stakeholders in solving WEEE management problems in the context of IoT and Big Data. Copyright © 2017 Elsevier Ltd. All rights reserved.
A Study of the Internet of Things and RFID Technology: Big Data in Navy Medicine
2017-12-01
NAVAL POSTGRADUATE SCHOOL, MONTEREY, CALIFORNIA. MBA professional report, December 2017: A Study of the Internet of Things and RFID Technology: Big Data in Navy Medicine. Gill S. Trainor, Lieutenant. Distribution is unlimited.
Research on information security in big data era
NASA Astrophysics Data System (ADS)
Zhou, Linqi; Gu, Weihong; Huang, Cheng; Huang, Aijun; Bai, Yongbin
2018-05-01
Big data is becoming another hotspot in the field of information technology after cloud computing and the Internet of Things. However, existing information security methods can no longer meet the information security requirements of the big data era. This paper analyzes the challenges to data security brought by big data and their causes, discusses the development trend of network attacks against the background of big data, and puts forward our own views on the development of security defenses in technology, strategy and products.
Analysis of the frontier technology of agricultural IoT and its predication research
NASA Astrophysics Data System (ADS)
Han, Shuqing; Zhang, Jianhua; Zhu, Mengshuai; Wu, Jianzhai; Shen, Chen; Kong, Fantao
2017-09-01
Agricultural IoT (Internet of Things) is developing rapidly. Nanotechnology, biotechnology and optoelectronic technology have been successfully integrated into agricultural sensor technology, and big data, cloud computing and artificial intelligence technology have also been successfully used in the IoT. This paper examines the integration of agricultural sensor technology with nanotechnology, biotechnology and optoelectronic technology, and the application of big data, cloud computing and artificial intelligence technology in the agricultural IoT. The advantages and development of these integrations are discussed, and the application of big data, cloud computing and artificial intelligence technology in the IoT and their development trends are analysed.
What's so different about big data?. A primer for clinicians trained to think epidemiologically.
Iwashyna, Theodore J; Liu, Vincent
2014-09-01
The Big Data movement in computer science has brought dramatic changes in what counts as data, how those data are analyzed, and what can be done with those data. Although increasingly pervasive in the business world, it has only recently begun to influence clinical research and practice. As Big Data draws from different intellectual traditions than clinical epidemiology, the ideas may be less familiar to practicing clinicians. There is an increasing role of Big Data in health care, and it has tremendous potential. This Demystifying Data Seminar identifies four main strands in Big Data relevant to health care. The first is the inclusion of many new kinds of data elements into clinical research and operations, in a volume not previously routinely used. Second, Big Data asks different kinds of questions of data and emphasizes the usefulness of analyses that are explicitly associational but not causal. Third, Big Data brings new analytic approaches to bear on these questions. And fourth, Big Data embodies a new set of aspirations for a breaking down of distinctions between research data and operational data and their merging into a continuously learning health system.
How to Use TCM Informatics to Study Traditional Chinese Medicine in Big Data Age.
Shi, Cheng; Gong, Qing-Yue; Zhou, Jinhai
2017-01-01
This paper introduces the characteristics and complexity of traditional Chinese medicine (TCM) data, considers that modern big data processing technology has brought new opportunities for the research of TCM, and gives some ideas and methods to apply big data technology in TCM.
On Study of Application of Big Data and Cloud Computing Technology in Smart Campus
NASA Astrophysics Data System (ADS)
Tang, Zijiao
2017-12-01
We live in an era of networks and information, producing and facing vast amounts of data every day. Traditional databases cannot adequately store, process and analyze such volumes of data, and big data technology emerged to meet this need. The development and operation of big data in turn rest on cloud computing, which provides sufficient space and resources to process and analyze data with big data technology. The proposal of smart campus construction aims at advancing informatization in colleges and universities. It is therefore worth combining big data technology and cloud computing technology in the construction of the smart campus, so that the campus database system and campus management system are integrated rather than isolated, and so that mass data can be integrated, stored, processed and analyzed in service of smart campus construction.
First Results of the Near Real-Time Imaging Reconstruction System at Big Bear Solar Observatory
NASA Astrophysics Data System (ADS)
Yang, G.; Denker, C.; Wang, H.
2003-05-01
The Near Real-Time Imaging Reconstruction system (RTIR) at Big Bear Solar Observatory (BBSO) is designed to obtain high-spatial-resolution solar images at a cadence of 1 minute by utilizing the power of parallel processing. With this system, we can compute near diffraction-limited images without saving the huge amounts of data involved in the speckle masking reconstruction algorithm. It enables us to monitor active regions and respond quickly to solar activity. In this poster we present the first results from our new 32-CPU Beowulf cluster system. The images are 1024 x 1024 pixels and the field of view (FOV) is 80'' x 80''. Our target is an active region with a complex magnetic configuration. We focus on pores and small spots in the active region with the goal of better understanding the formation of penumbral structure. In addition, we expect to study the evolution of active regions during solar flares.
Directions of flow of the water-bearing stratum in Friuli (NE Italy)
NASA Astrophysics Data System (ADS)
Cucchi, F.; Affatato, A.; Andrian, L.; Devoto, S.; Mereu, A.; Oberti, S.; Piano, C.; Rondi, V.; Zini, L.
2003-04-01
Flow directions of the water-bearing stratum were measured with a thermal flowmeter in the northern Friuli Plain. The instrument consists of a heater, a compass and several temperature sensors, and is connected to an external computer. It measures the induced thermal currents and identifies the direction and intensity of the flow. The thermal flowmeter can be used in wells of small diameter and at great depths. The campaign of about one hundred measurements confirms the general correspondence between the flow directions obtained from the water table and those measured with the flowmeter in permeable bodies with primary permeability. Flow directions differing from the general picture were noticed in the conglomerate bodies, because of secondary permeability. Direction changes were also noticed owing to the heterogeneity of the sediments constituting the aquifer at both large and small scales.
1980-10-01
Keywords (by block number): air bearings, gas bearings, air lubrication, gas lubrication, rotor dynamics, gas turbines, turbomachinery, foil bearings, compliant...coverage of the subject at this time. Therefore, as a part of the Rotor-Bearing Dynamics Technology Design Guide update, this document is prepared...of the inertia and flexure properties of the rotor together with the dynamic characteristics of the bearing(s). However, an examination of the
Big data in wildlife research: remote web-based monitoring of hibernating black bears.
Laske, Timothy G; Garshelis, David L; Iaizzo, Paul A
2014-12-11
Numerous innovations for the management and collection of "big data" have arisen in the field of medicine, including implantable computers and sensors, wireless data transmission, and web-based repositories for collecting and organizing information. Recently, human clinical devices have been deployed in captive and free-ranging wildlife to aid in the characterization of both normal physiology and the interaction of animals with their environment, including reactions to humans. Although these devices have had a significant impact on the types and quantities of information that can be collected, their utility has been limited by internal memory capacities, the efforts required to extract and analyze information, and by the necessity to handle the animals in order to retrieve stored data. We surgically implanted miniaturized cardiac monitors (1.2 cc, Reveal LINQ™, Medtronic Inc.), a newly developed human clinical system, into hibernating wild American black bears (N = 6). These devices include wireless capabilities, which enabled frequent transmissions of detailed physiological data from bears in their remote den sites to a web-based data storage and management system. Solar and battery powered telemetry stations transmitted detailed physiological data over the cellular network during the winter months. The system provided the transfer of large quantities of data in near-real time. Observations included changes in heart rhythms associated with birthing and caring for cubs, and in all bears, long periods without heart beats (up to 16 seconds) occurred during each respiratory cycle. For the first time, detailed physiological data were successfully transferred from an animal in the wild to a web-based data collection and management system, overcoming previous limitations on the quantities of data that could be transferred. The system provides an opportunity to detect unusual events as they are occurring, enabling investigation of the animal and site shortly afterwards. 
Although the current study was limited to bears in winter dens, we anticipate that future systems will transmit data from implantable monitors to wearable transmitters, allowing for big data transfer on non-stationary animals.
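The reported detection of long inter-beat pauses (up to 16 seconds per respiratory cycle) suggests a simple screening step over transmitted beat timestamps. The sketch below is a hypothetical analysis of such a series, not the study's actual pipeline; the function name and threshold are assumptions.

```python
# Sketch: flag unusually long gaps between heart beats in a series of
# beat timestamps (in seconds), as one might when screening transmitted
# cardiac data for hibernation-related pauses. Hypothetical analysis,
# not the pipeline used in the study.

def find_pauses(beat_times, threshold_s=10.0):
    """Return (start_time, duration) for every inter-beat gap >= threshold_s."""
    pauses = []
    for prev, cur in zip(beat_times, beat_times[1:]):
        gap = cur - prev
        if gap >= threshold_s:
            pauses.append((prev, gap))
    return pauses
```

Running such a check as data arrive is one way a web-based system could surface unusual events in near-real time.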
Quality Attribute-Guided Evaluation of NoSQL Databases: A Case Study
2015-01-16
evaluations of NoSQL databases specifically, and big data systems in general, that have become apparent during our study. Keywords—NoSQL, distributed...technology, namely that of big data, software systems [1]. At the heart of big data systems are a collection of database technologies that are more...born organizations such as Google and Amazon [3][4], along with those of numerous other big data innovators, have created a variety of open source and
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-05
... standards for dogs and cats, guinea pigs and hamsters, rabbits, nonhuman primates, marine mammals, and... contact with animals, and specifically requires that dangerous animals such as lions, tigers, wolves...
The Big Science Questions About Mercury's Ice-Bearing Polar Deposits After MESSENGER
NASA Astrophysics Data System (ADS)
Chabot, N. L.; Lawrence, D. J.
2018-05-01
Mercury’s polar deposits provide many well-characterized locations that are known to have large expanses of exposed water ice and/or other volatile materials — presenting unique opportunities to address fundamental science questions.
The big data processing platform for intelligent agriculture
NASA Astrophysics Data System (ADS)
Huang, Jintao; Zhang, Lichen
2017-08-01
Big data technology is another popular technology after the Internet of Things and cloud computing. Big data is widely used in many fields, such as social platforms, e-commerce and financial analysis. Intelligent agriculture generates large amounts of complexly structured data in the course of its operation, and fully mining the value of these data would be very meaningful for the development of agriculture. This paper proposes an intelligent data processing platform based on Storm and Cassandra to realize the storage and management of big data for intelligent agriculture.
A Scalable, Open Source Platform for Data Processing, Archiving and Dissemination
2016-01-01
Object Oriented Data Technology (OODT) big data toolkit developed by NASA and the Work-flow INstance Generation and Selection (WINGS) scientific work...to several challenging big data problems and demonstrated the utility of OODT-WINGS in addressing them. Specific demonstrated analyses address i...source software, Apache, Object Oriented Data Technology, OODT, semantic work-flows, WINGS, big data, work-flow management
1980-10-01
AFAPL-TR-78-6, Part VIII (U): Rotor-Bearing Dynamics Technology Design Guide, Part VIII, A Computerized Retrieval System for Fluid Film Bearings. SHAKER...Protection," Task 304806, "Aerospace Lubrication," Work Unit 30480685, "Rotor-Bearing Dynamics Design." The work reported herein was performed during the...the previous issue of the Rotor-Bearing Dynamics Technology Design Guide; one volume dealt with the calculation of performance parameters and pertur
A Demonstration of Big Data Technology for Data Intensive Earth Science (Invited)
NASA Astrophysics Data System (ADS)
Kuo, K.; Clune, T.; Ramachandran, R.; Rushing, J.; Fekete, G.; Lin, A.; Doan, K.; Oloso, A. O.; Duffy, D.
2013-12-01
Big Data technologies exhibit great potential to change the way we conduct scientific investigations, especially analysis of voluminous and diverse data sets. Obviously, not all Big Data technologies are applicable to all aspects of scientific data analysis. Our NASA Earth Science Technology Office (ESTO) Advanced Information Systems Technology (AIST) project, Automated Event Service (AES), pioneers the exploration of Big Data technologies for data intensive Earth science. Since Earth science data are largely stored and manipulated in the form of multidimensional arrays, the project first evaluates array performance of several candidate Big Data technologies, including MapReduce (Hadoop), SciDB, and a custom-built Polaris system, which have one important feature in common: shared-nothing architecture. The evaluation finds SciDB to be the most promising. In this presentation, we demonstrate SciDB using a couple of use cases, each operating on a distinct data set in the regular latitude-longitude grid. The first use case is the discovery and identification of blizzards using NASA's Modern-Era Retrospective analysis for Research and Applications (MERRA) data sets. The other finds diurnal signals in the same 8-year period using SSMI data from three different instruments with different equator crossing times by correlating their retrieved parameters. In addition, the AES project is also developing a collaborative component to enable the sharing of event queries and results. Preliminary capabilities will be presented as well.
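The blizzard use case amounts to an array "event query": flag grid cells where several fields jointly cross thresholds. A NumPy sketch of that logic follows; the thresholds and field names are illustrative, not the AES project's, and in SciDB the same predicate would run as a filter over a distributed array.

```python
import numpy as np

# Sketch of an "event query" over gridded fields: flag cells where
# blizzard-like conditions co-occur. Thresholds and fields are
# illustrative only, not those used by the AES project.

def blizzard_mask(temp_k, wind_ms, snowfall_mmhr):
    """Boolean grid: sub-freezing, high wind, and significant snowfall."""
    return (temp_k < 273.15) & (wind_ms > 15.0) & (snowfall_mmhr > 1.0)
```

Applying `np.argwhere` to the mask yields the lat/lon indices of candidate events, which is the kind of result an event service could store and share.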
Toward a manifesto for the 'public understanding of big data'.
Michael, Mike; Lupton, Deborah
2016-01-01
In this article, we sketch a 'manifesto' for the 'public understanding of big data'. On the one hand, this entails such public understanding of science and public engagement with science and technology-tinged questions as follows: How, when and where are people exposed to, or do they engage with, big data? Who are regarded as big data's trustworthy sources, or credible commentators and critics? What are the mechanisms by which big data systems are opened to public scrutiny? On the other hand, big data generate many challenges for public understanding of science and public engagement with science and technology: How do we address publics that are simultaneously the informant, the informed and the information of big data? What counts as understanding of, or engagement with, big data, when big data themselves are multiplying, fluid and recursive? As part of our manifesto, we propose a range of empirical, conceptual and methodological exhortations. We also provide Appendix 1 that outlines three novel methods for addressing some of the issues raised in the article. © The Author(s) 2015.
A Sparsity-Promoted Decomposition for Compressed Fault Diagnosis of Roller Bearings
Wang, Huaqing; Ke, Yanliang; Song, Liuyang; Tang, Gang; Chen, Peng
2016-01-01
The traditional approaches for condition monitoring of roller bearings are almost always applied under Shannon sampling theorem conditions, leading to a big-data problem. The compressed sensing (CS) theory provides a new solution to the big-data problem. However, the vibration signals are insufficiently sparse, and it is difficult to achieve sparsity using conventional techniques, which impedes the application of CS theory. Therefore, it is of great significance to promote sparsity when applying the CS theory to fault diagnosis of roller bearings. To increase the sparsity of vibration signals, a sparsity-promoted method, the tunable Q-factor wavelet transform, is used to decompose the analyzed signals into transient impact components and high-oscillation components. The former become sparser than the raw signals, with noise eliminated, whereas the latter retain the noise. Thus, the decomposed transient impact components replace the original signals for analysis. The CS theory is applied to extract the fault features without complete reconstruction, meaning that reconstruction can stop once the components at the frequencies of interest are detected, so fault diagnosis is achieved during the reconstruction procedure. The application cases prove that the CS theory, assisted by the tunable Q-factor wavelet transform, can successfully extract the fault features from the compressed samples. PMID:27657063
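The key CS step, recovering a sparse component from far fewer samples than the Shannon rate requires, can be illustrated with generic orthogonal matching pursuit. This is a textbook sketch under assumed sizes, not the paper's TQWT-based method:

```python
import numpy as np

rng = np.random.default_rng(0)

def omp(Phi, y, k):
    """Orthogonal matching pursuit: greedily pick up to k atoms of Phi
    that best explain y, then least-squares fit on that support."""
    n = Phi.shape[1]
    support = []
    residual = y.copy()
    for _ in range(k):
        # atom most correlated with the current residual
        j = int(np.argmax(np.abs(Phi.T @ residual)))
        if j not in support:
            support.append(j)
        coef, *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
        residual = y - Phi[:, support] @ coef
    x = np.zeros(n)
    x[support] = coef
    return x, residual

# Toy "transient impact" component: 3 spikes in n=64 samples.
n, m, k = 64, 32, 3
x_true = np.zeros(n)
x_true[[5, 20, 47]] = [1.5, -2.0, 0.8]

Phi = rng.standard_normal((m, n)) / np.sqrt(m)  # random measurement matrix
y = Phi @ x_true                                # compressed samples

x_rec, residual = omp(Phi, y, k)
```

The recovered vector is at most k-sparse by construction; the sparser the transient component, the fewer compressed samples are needed, which is exactly why the TQWT pre-decomposition matters.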
Collaborative Interactive Visualization Exploratory Concept
2015-06-01
the FIAC concepts. It consists of various DRDC-RDDC-2015-N004 intelligence analysis web services built on top of big data technologies exploited...sits on the UDS where validated common knowledge is stored. Based on the Lumify software2, this important component exploits big data technologies such...interfaces. Above this database resides the Big Data Manager, responsible for transparent data transmission between the UDS and the rest of the S3
10 Aspects of the Big Five in the Personality Inventory for DSM-5
DeYoung, Colin G.; Carey, Bridget E.; Krueger, Robert F.; Ross, Scott R.
2015-01-01
DSM-5 includes a dimensional model of personality pathology, operationalized in the Personality Inventory for DSM-5 (PID-5), with 25 facets grouped into five higher-order factors resembling the Big Five personality dimensions. The present study tested how well these 25 facets could be integrated with the 10-factor structure of traits within the Big Five that is operationalized by the Big Five Aspect Scales (BFAS). In two healthy adult samples, 10-factor solutions largely confirmed our hypothesis that each of the 10 BFAS scales would be the highest loading BFAS scale on one and only one factor. Varying numbers of PID-5 scales were additional markers of each factor, and the overall factor structure in the first sample was well replicated in the second. Our results allow Cybernetic Big Five Theory (CB5T) to be brought to bear on manifestations of personality disorder, because CB5T offers mechanistic explanations of the 10 factors measured by the BFAS. Future research, therefore, may begin to test hypotheses derived from CB5T regarding the mechanisms that are dysfunctional in specific personality disorders. PMID:27032017
Big data, big knowledge: big data for personalized healthcare.
Viceconti, Marco; Hunter, Peter; Hose, Rod
2015-07-01
The idea that the purely phenomenological knowledge that we can extract by analyzing large amounts of data can be useful in healthcare seems to contradict the desire of VPH researchers to build detailed mechanistic models for individual patients. But in practice no model is ever entirely phenomenological or entirely mechanistic. We propose in this position paper that big data analytics can be successfully combined with VPH technologies to produce robust and effective in silico medicine solutions. In order to do this, big data technologies must be further developed to cope with some specific requirements that emerge from this application. Such requirements are: working with sensitive data; analytics of complex and heterogeneous data spaces, including nontextual information; distributed data management under security and performance constraints; specialized analytics to integrate bioinformatics and systems biology information with clinical observations at tissue, organ and organism scales; and specialized analytics to define the "physiological envelope" during the daily life of each patient. These domain-specific requirements suggest a need for targeted funding, in which big data technologies for in silico medicine become the research priority.
Technology for Mining the Big Data of MOOCs
ERIC Educational Resources Information Center
O'Reilly, Una-May; Veeramachaneni, Kalyan
2014-01-01
Because MOOCs bring big data to the forefront, they confront learning science with technology challenges. We describe an agenda for developing technology that enables MOOC analytics. Such an agenda needs to efficiently address the detailed, low level, high volume nature of MOOC data. It also needs to help exploit the data's capacity to reveal, in…
Understanding long-term silver release from surface modified porous titanium implants.
Shivaram, Anish; Bose, Susmita; Bandyopadhyay, Amit
2017-08-01
Prevention of orthopedic device related infection (ODRI) using antibiotics has met with limited success and is still a big concern post-surgery. As an alternative, silver is being used as an antibiotic treatment to prevent surgical infections because of its well-established antimicrobial properties. However, in most cases silver is used in particulate form with wound dressings or with short-term devices such as catheters, but not with load-bearing implants. We hypothesize that silver strongly adherent to load-bearing implants can offer a longer-term solution to infection in vivo. Keeping that in mind, the focus of this study was to understand the long-term release of silver ions, over a period of at least 6 months, from silver-coated surface-modified porous titanium implants. Implants were fabricated using a LENS™ system, a powder-based additive manufacturing technique, with at least 25% volume porosity, with and without TiO2 nanotubes, in phosphate buffered saline (pH 7.4) to see whether the total release of silver ions is within the toxic limit for human cells. Considering that infection sites may reduce the local pH, silver release was also studied in acetate buffer (pH 5.0) for a period of 4 weeks. In addition, the osseointegrative properties as well as the cytotoxicity of porous titanium implants were assessed in vivo for a period of 12 weeks using a rat distal femur model. In vivo results indicate that porous titanium implants with silver coating show comparable, if not better, biocompatibility and bonding at the bone-implant interface, negating any concerns about silver toxicity to normal cells. The current research is based on our recently patented technology, but focuses on understanding longer-term silver release to mitigate infection-related problems in load-bearing implants that can arise even several months after surgery.
Copyright © 2017 Acta Materialia Inc. Published by Elsevier Ltd. All rights reserved.
ERIC Educational Resources Information Center
Eisenberg, Michael B.; Berkowitz, Robert E.
This book about using the Big6 information problem solving process model in elementary schools is organized into two parts. Providing an overview of the Big6 approach, Part 1 includes the following chapters: "Introduction: The Need," including the information problem, the Big6 and other process models, and teaching/learning the Big6;…
["Big data" - large data, a lot of knowledge?].
Hothorn, Torsten
2015-01-28
For several years now, the term Big Data has described technologies for extracting knowledge from data. Applications of Big Data and their consequences are also increasingly discussed in the mass media. Because medicine is an empirical science, we discuss the meaning of Big Data and its potential for future medical research.
An Overview of Magnetic Bearing Technology for Gas Turbine Engines
NASA Technical Reports Server (NTRS)
Clark, Daniel J.; Jansen, Mark J.; Montague, Gerald T.
2004-01-01
The idea of the magnetic bearing and its use in exotic applications has been conceptualized for many years, over a century, in fact. Patented, passive systems using permanent magnets date back over 150 years. More recently, scientists of the 1930s began investigating active systems using electromagnets for high-speed ultracentrifuges. However, passive magnetic bearings are physically unstable, and active systems only provide proper stiffness and damping through sophisticated controllers and algorithms. This is precisely why, until the last decade, magnetic bearings did not become a practical alternative to rolling element bearings. Today, magnetic bearing technology has become viable because of advances in micro-processing controllers that allow for confident and robust active control. Further advances in the following areas have put magnetic bearings at the forefront of advanced, lubrication-free support systems: rotor and stator materials and designs that maximize flux, minimize energy losses, and minimize stress limitations; wire materials and coatings for high-temperature operation; high-speed micro-processing for advanced controller designs and extremely robust capabilities; back-up bearing technology that provides a viable touchdown surface; and precision sensor technology. This paper will discuss a specific joint program for the advancement of gas turbine engines and how it underscores the viability of magnetic bearings, a brief comparison between magnetic bearings and other bearing technologies in both their advantages and limitations, and an examination of foreseeable solutions to historically perceived limitations of magnetic bearings.
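The point that active magnetic bearings obtain stiffness and damping only through feedback can be illustrated with a linearized one-axis suspension under PD control; the mass, gains, and negative-stiffness coefficient below are illustrative, not values from any engine program:

```python
# Linearized 1-DOF magnetic suspension: m*x'' = k_x*x + k_i*u, where the
# open-loop negative stiffness k_x > 0 makes the passive system unstable.
# A PD controller u = -(Kp*x + Kd*v) restores stiffness and damping.
m, k_x, k_i = 1.0, 100.0, 50.0   # kg, N/m, N/A (illustrative values)
Kp, Kd = 10.0, 2.0               # controller gains (illustrative)

x, v = 1.0e-3, 0.0               # initial rotor offset of 1 mm
dt = 1.0e-3
for _ in range(500):             # simulate 0.5 s with explicit Euler
    u = -(Kp * x + Kd * v)       # PD control current
    a = (k_x * x + k_i * u) / m  # net acceleration under feedback
    x += v * dt
    v += a * dt
# With feedback the offset decays; with u = 0 it would grow exponentially.
```

The closed-loop stiffness k_i*Kp must exceed the negative stiffness k_x, and Kd supplies damping, which is exactly what passive magnet arrangements cannot provide.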
Panahiazar, Maryam; Taslimitehrani, Vahid; Jadhav, Ashutosh; Pathak, Jyotishman
2014-10-01
In healthcare, big data tools and technologies have the potential to create significant value by improving outcomes while lowering costs for each individual patient. Diagnostic images, genetic test results and biometric information are increasingly generated and stored in electronic health records, presenting us with data that is by nature high in volume, variety and velocity, thereby necessitating novel ways to store, manage and process big data. This presents an urgent need to develop new, scalable and expandable big data infrastructure and analytical methods that can enable healthcare providers to access knowledge for the individual patient, yielding better decisions and outcomes. In this paper, we briefly discuss the nature of big data and the role of the semantic web and data analysis for generating "smart data" that offer actionable information supporting better decisions for personalized medicine. In our view, the biggest challenge is to create a system that makes big data robust and smart for healthcare providers and patients, one that can lead to more effective clinical decision-making, improved health outcomes, and ultimately, managing healthcare costs. We highlight some of the challenges in using big data and propose the need for a semantic data-driven environment to address them. We illustrate our vision with practical use cases, and discuss a path for empowering personalized medicine using big data and semantic web technology.
A study on specialist or special disease clinics based on big data.
Fang, Zhuyuan; Fan, Xiaowei; Chen, Gong
2014-09-01
Correlation analysis and processing of massive medical information can be implemented through big data technology to find the relevance of different factors across the life cycle of a disease and to provide a basis for scientific research and clinical practice. This paper explores the concept of constructing a big medical data platform and introduces clinical model construction. Medical data can be collected and consolidated by distributed computing technology. Through analysis techniques such as artificial neural networks and grey models, a medical model can be built. Big data platforms, such as Hadoop, can be used to construct early prediction and intervention models as well as clinical decision-making models for specialist and special disease clinics. This establishes a new model for common clinical research in specialist and special disease clinics.
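Of the modeling techniques named, the grey model GM(1,1) is compact enough to sketch. The series below is hypothetical, standing in for real clinical counts:

```python
import numpy as np

def gm11(x0):
    """Fit a GM(1,1) grey model to a positive series and return a
    one-step-ahead forecast."""
    x0 = np.asarray(x0, dtype=float)
    x1 = np.cumsum(x0)                         # accumulated series
    z1 = 0.5 * (x1[1:] + x1[:-1])              # mean-generated background
    B = np.column_stack([-z1, np.ones_like(z1)])
    a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]
    k = np.arange(len(x0) + 1)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a
    x0_hat = np.diff(x1_hat)                   # back to the original scale
    return x0_hat[-1]                          # forecast for the next step

# Hypothetical monthly clinic visit counts growing ~10% per month.
series = [100.0, 110.0, 121.0, 133.1, 146.41]
forecast = gm11(series)
```

GM(1,1) is popular for short, small-sample series exactly because it needs only a handful of points, which suits early-stage clinical monitoring data.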
Machine learning for Big Data analytics in plants.
Ma, Chuang; Zhang, Hao Helen; Wang, Xiangfeng
2014-12-01
Rapid advances in high-throughput genomic technology have enabled biology to enter the era of 'Big Data' (large datasets). The plant science community not only needs to build its own Big-Data-compatible parallel computing and data management infrastructures, but also to seek novel analytical paradigms to extract information from the overwhelming amounts of data. Machine learning offers promising computational and analytical solutions for the integrative analysis of large, heterogeneous and unstructured datasets on the Big-Data scale, and is gradually gaining popularity in biology. This review introduces the basic concepts and procedures of machine-learning applications and envisages how machine learning could interface with Big Data technology to facilitate basic research and biotechnology in the plant sciences. Copyright © 2014 Elsevier Ltd. All rights reserved.
A Tour of Big Data, Open Source Data Management Technologies from the Apache Software Foundation
NASA Astrophysics Data System (ADS)
Mattmann, C. A.
2012-12-01
The Apache Software Foundation, a non-profit foundation charged with the dissemination of open source software for the public good, provides a suite of data management technologies for distributed archiving, data ingestion, data dissemination, processing, triage and a host of other functionalities that are becoming critical in the Big Data regime. Apache is the world's largest open source software organization, boasting over 3000 developers from around the world, all contributing to some of the most pervasive technologies in use today, from the HTTPD web server that powers a majority of Internet web sites to the Hadoop technology that is now projected to be over a $1B industry. Apache data management technologies are emerging as de facto off-the-shelf components for searching, distributing, processing and archiving key science data sets, from the geophysical, space and planetary domains all the way to biomedicine. In this talk, I will give a virtual tour of the Apache Software Foundation, its meritocracy and governance structure, and also its key big data technologies that organizations can take advantage of today to save cost, schedule, and resources in implementing their Big Data needs. I'll illustrate the Apache technologies in the context of several national priority projects, including the U.S. National Climate Assessment (NCA) and the international Square Kilometre Array (SKA) project, which are stretching the boundaries of volume, velocity, complexity, and other key Big Data dimensions.
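Hadoop's programming model can be illustrated with a single-process map/shuffle/reduce word count; the function names are illustrative, not Hadoop's actual Java API:

```python
from collections import defaultdict
from itertools import chain

# A minimal in-process sketch of the MapReduce model that Hadoop
# implements at scale across a shared-nothing cluster.

def map_phase(record):
    # emit (word, 1) pairs, as a Hadoop mapper would
    return [(word.lower(), 1) for word in record.split()]

def shuffle(pairs):
    # group values by key, as the framework's shuffle/sort does
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # aggregate each key's values, as a reducer would
    return {key: sum(values) for key, values in groups.items()}

records = ["big data big archives", "data ingestion"]
pairs = list(chain.from_iterable(map_phase(r) for r in records))
counts = reduce_phase(shuffle(pairs))
print(counts["big"], counts["data"])  # 2 2
```

In real Hadoop the three phases run on different machines over HDFS blocks; the contract between the phases is the same.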
Business Performance Measurements in Asset Management with the Support of Big Data Technologies
NASA Astrophysics Data System (ADS)
Campos, Jaime; Sharma, Pankaj; Jantunen, Erkki; Baglee, David; Fumagalli, Luca
2017-09-01
The paper reviews performance measurement in the domain of interest. Important data in asset management are further discussed. The importance and characteristics of today's ICT capabilities are also covered. The role of new concepts such as big data and data mining analytical technologies in managing performance measurements in asset management is discussed in detail. The authors consequently suggest the use of a modified Balanced Scorecard methodology highlighting both quantitative and qualitative aspects, which is crucial for optimal use of the big data approach and technologies.
Clinical research of traditional Chinese medicine in big data era.
Zhang, Junhua; Zhang, Boli
2014-09-01
With the advent of the big data era, our thinking, technology and methodology are being transformed. Data-intensive scientific discovery based on big data, named "The Fourth Paradigm," has become a new paradigm of scientific research. Along with the development and application of Internet information technology in the field of healthcare, individual health records, clinical data of diagnosis and treatment, and genomic data have accumulated dramatically, generating big data in the medical field for clinical research and assessment. With the support of big data, the defects and weaknesses of the conventional sampling-based methodology of clinical evaluation may be overcome. Our research target shifts from "causality inference" to "correlation analysis." This not only facilitates the evaluation of individualized treatment, disease prediction, prevention and prognosis, but is also suitable for the practice of preventive healthcare and symptom pattern differentiation for treatment in traditional Chinese medicine (TCM), and for the post-marketing evaluation of Chinese patent medicines. To conduct clinical studies involving big data in the TCM domain, top-level design is needed and should be carried out in an orderly fashion. Fundamental construction and innovation studies should be strengthened in the areas of data platform creation, data analysis technology, and the fostering and training of big-data professionals.
Tissue bioengineering and artificial organs.
Llames, Sara; García, Eva; Otero Hernández, Jesús; Meana, Alvaro
2012-01-01
The scarcity of organs and tissues for transplant and the need for immunosuppressive drugs to avoid rejection are two reasons that justify organ and tissue production in the laboratory. Tissue engineering (TE) could make it possible to regenerate a whole organ from a fragment, or even to produce several organs from one organ donor for grafting purposes. TE is based on: (1) the ex vivo expansion of cells, (2) the seeding of these expanded cells in three-dimensional structures that mimic physiological conditions and, (3) grafting the prototype. In order to graft big structures it is necessary that the organ or tissue produced ex vivo bears a vascular tree to ensure the nutrition of its deep layers. At present, no technology has been developed to provide this vascular tree to TE-derived products. Thus, these tissues must be thin enough to acquire nutrients during the first days by diffusion from surrounding tissues. This fact constitutes the greatest current limitation of technologies for organ development in the laboratory. In this chapter, all these problems and their possible solutions are discussed. The present status of TE techniques in the regeneration of different organ systems is also reviewed.
Development of Structural Energy Storage for Aeronautics Applications
NASA Technical Reports Server (NTRS)
Santiago-Dejesus, Diana; Loyselle, Patricia L.; Demattia, Brianne; Bednarcyk, Brett; Olson, Erik; Smith, Russell; Hare, David
2017-01-01
The National Aeronautics and Space Administration (NASA) has identified Multifunctional Structures for High Efficiency Lightweight Load-bearing Storage (M-SHELLS) as critical to the development of hybrid gas-electric propulsion for commercial aeronautical transport in the N+3 timeframe. The established goals include reducing emissions by 80 percent and fuel consumption by 60 percent from today's state of the art. The advancement will enable technology for NASA Aeronautics Research Mission Directorate (ARMD) Strategic Thrust 3, to pioneer big leaps in efficiency and environmental performance for ultra-efficient commercial transports, as well as Strategic Thrust 4, to pioneer low-carbon propulsion technology in the transition to that scheme. The M-SHELLS concept addresses the highest risk of hybrid gas-electric propulsion with its primary objective: to save structural and energy-storage system weight for future commercial hybrid electric propulsion aircraft by melding the load-carrying structure with energy storage in a single material. NASA's multifunctional approach also combines supercapacitor and battery chemistries in a synergistic energy storage arrangement while supporting good mechanical properties. The arrangement provides an advantageous combination of specific power, energy, and strength.
BigData as a Driver for Capacity Building in Astrophysics
NASA Astrophysics Data System (ADS)
Shastri, Prajval
2015-08-01
Exciting public interest in astrophysics acquires new significance in the era of Big Data. Since Big Data involves advanced technologies of both software and hardware, astrophysics with Big Data has the potential to inspire young minds with diverse inclinations - i.e., not just those attracted to physics but also those pursuing engineering careers. Digital technologies have become steadily cheaper, which can enable considerable expansion of the Big Data user pool, especially to communities that may not yet be in the astrophysics mainstream but have high potential because of access to these technologies. For success, however, capacity building at the early stages becomes key. The development of on-line pedagogical resources in astrophysics, astrostatistics, data-mining and data visualisation that are designed around the big facilities of the future can be an important effort that drives such capacity building, especially if facilitated by the IAU.
Code of Federal Regulations, 2010 CFR
2010-10-01
.... Migratory Game Bird Hunting. We allow hunting of goose, duck, coot, and snipe on that portion of the refuge... § 32.2(k)). B. Upland Game Hunting. [Reserved] C. Big Game Hunting. [Reserved] D. Sport Fishing. We.... Highway 101. Bear Valley National Wildlife Refuge A. Migratory Game Bird Hunting. [Reserved] B. Upland...
Bringing the Tools of Big Science to Bear on Local Environmental Challenges
ERIC Educational Resources Information Center
Bronson, Scott; Jones, Keith W.; Brown, Maria
2013-01-01
We describe an interactive collaborative environmental education project that makes advanced laboratory facilities at Brookhaven National Laboratory accessible for one-year or multi-year science projects for the high school level. Cyber-enabled Environmental Science (CEES) utilizes web conferencing software to bring multi-disciplinary,…
ERIC Educational Resources Information Center
Phillips, Loraine; Roach, David; Williamson, Celia
2014-01-01
In Texas, educators working to coordinate the efforts of fifty community colleges, thirty-eight universities, and six university systems are bringing the resources of the Association of American Colleges and Universities (AAC&U) Liberal Education and America's Promise (LEAP) initiative to bear in order to ensure that the state's nearly 1.5…
[Medical big data and precision medicine: prospects of epidemiology].
Song, J; Hu, Y H
2016-08-10
Since the development of high-throughput technology, electronic medical record systems and big data technology, the value of medical data has attracted more attention. On the other hand, the Precision Medicine Initiative opens up new prospects for medical big data. As a tool-related discipline, epidemiology focuses on exploiting the resources of existing big data and promoting the integration of translational research and knowledge to completely unlock the "black box" of the exposure-disease continuum. It also tries to accelerate the realization of the ultimate goal of precision medicine. The overall purpose, however, is to translate the evidence from scientific research into improvements in the health of the people.
Big Data: An Opportunity for Collaboration with Computer Scientists on Data-Driven Science
NASA Astrophysics Data System (ADS)
Baru, C.
2014-12-01
Big data technologies are evolving rapidly, driven by the need to manage ever-increasing amounts of historical data; process relentless streams of human- and machine-generated data; and integrate data of heterogeneous structure from extremely heterogeneous sources of information. Big data is inherently an application-driven problem. Developing the right technologies requires an understanding of the application domain. An intriguing aspect of this phenomenon, though, is that the availability of the data itself enables new applications not previously conceived of! In this talk, we will discuss how the big data phenomenon creates an imperative for collaboration among domain scientists (in this case, geoscientists) and computer scientists. Domain scientists provide the application requirements as well as insights about the data involved, while computer scientists help assess whether problems can be solved with currently available technologies or require adaptation of existing technologies and/or development of new technologies. The synergy can create vibrant collaborations potentially leading to new science insights as well as development of new data technologies and systems. The area of interface between the geosciences and computer science, also referred to as geoinformatics, is, we believe, a fertile area for interdisciplinary research.
NASA Astrophysics Data System (ADS)
Hato, M.; Inamori, T.; Matsuoka, T.; Shimizu, S.
2003-04-01
Occurrence of methane hydrates in the Nankai Trough, located off the south-eastern coast of Japan, was confirmed by the exploratory test well drilling conducted by Japan's Ministry of International Trade and Industry in 1999. Confirmation of methane hydrate has had a big impact on Japan's future energy strategy, and scientific and technological interest was generated by the information from the coring and logging results at the well. Following the above results, the Japan National Oil Corporation (JNOC) launched a national project, named MH21, to establish the technology of methane hydrate exploration and related technologies such as production and development. As one of the research projects for evaluating the total amount of methane hydrate, Amplitude versus Offset (AVO) analysis was applied to the seismic data acquired in the Nankai Trough area. The main purpose of the AVO application is to evaluate the validity of delineating methane hydrate-bearing zones. Since methane hydrate is generally thought to be accompanied by free gas just below the methane hydrate-bearing zones, AVO offers the possibility of detecting the presence of free gas. The free gas is thought to be located just below the base of the methane hydrate stability zone, which is characterized by the Bottom Simulating Reflectors (BSRs) on the seismic section. In this sense, AVO technology, which was developed as a gas delineation tool, can be utilized for methane hydrate exploration. The result of AVO analysis clearly shows a gas-related anomaly below the BSRs. The appearance of the AVO anomaly shows wide variety. Some of the anomalies might not correspond to free gas; however, some of them may indicate free gas. We are now developing a methodology to clearly discriminate free-gas zones from non-gas zones by integrating various types of seismic methods such as seismic inversion and seismic attribute analysis.
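A free-gas layer beneath the hydrate stability zone typically produces a reflection whose amplitude varies with offset. The standard two-term Shuey approximation of the Zoeppritz equations, R(θ) ≈ R0 + G sin²θ, captures this behavior; the rock properties below are illustrative values for a hydrate-bearing layer over a gas-charged layer, not measurements from the Nankai Trough data:

```python
import numpy as np

# Two-term Shuey (1985) approximation: R(theta) ~ R0 + G * sin^2(theta).
def shuey_two_term(vp1, vs1, rho1, vp2, vs2, rho2, theta_deg):
    vp, vs, rho = (vp1 + vp2) / 2, (vs1 + vs2) / 2, (rho1 + rho2) / 2
    dvp, dvs, drho = vp2 - vp1, vs2 - vs1, rho2 - rho1
    r0 = 0.5 * (dvp / vp + drho / rho)            # normal-incidence reflectivity
    g = 0.5 * dvp / vp - 2 * (vs / vp) ** 2 * (drho / rho + 2 * dvs / vs)  # gradient
    theta = np.radians(theta_deg)
    return r0 + g * np.sin(theta) ** 2, r0, g

# Illustrative hydrate layer (fast) over free-gas layer (slow Vp, low density):
# Vp (m/s), Vs (m/s), density (g/cc) are assumed, not from the survey.
r30, r0, g = shuey_two_term(2100, 900, 2.0, 1500, 850, 1.85, 30.0)
```

A bright negative intercept with a negative gradient (amplitude growing more negative with offset) is the classic gas-related AVO signature that such analysis looks for beneath BSRs.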
Enhancement of K - 12 Astronomy Education Through Multicultural Outreach
NASA Astrophysics Data System (ADS)
Yanamandra-Fisher, P. A.
1997-12-01
History bears out the fact that various cultures developed their own unique interpretations of the stars and the universe. Children are first introduced to this cultural lore in their pre-school years by their primary teachers --- the parents. In today's technological world, with social migration and assimilation of differing ethnic peoples into a common society, parents often neglect or ignore this valuable contribution to enhancing the child's interest in astronomy at an early age. This important contribution can be re-awakened by applying a multicultural approach to introductory astronomy/solar system science in the primary grades by teachers, parents and scientists. Such an integrated approach unifies a society and instructs the child by identifying its cultural and scientific heritage. Some common examples are the interpretations of the Big Dipper, the Zodiac and the planets. These and other examples will be provided along with teaching aids.
NASA Astrophysics Data System (ADS)
Coughlin, J.; Mital, R.; Nittur, S.; SanNicolas, B.; Wolf, C.; Jusufi, R.
2016-09-01
Operational analytics when combined with Big Data technologies and predictive techniques have been shown to be valuable in detecting mission critical sensor anomalies that might be missed by conventional analytical techniques. Our approach helps analysts and leaders make informed and rapid decisions by analyzing large volumes of complex data in near real-time and presenting it in a manner that facilitates decision making. It provides cost savings by being able to alert and predict when sensor degradations pass a critical threshold and impact mission operations. Operational analytics, which uses Big Data tools and technologies, can process very large data sets containing a variety of data types to uncover hidden patterns, unknown correlations, and other relevant information. When combined with predictive techniques, it provides a mechanism to monitor and visualize these data sets and provide insight into degradations encountered in large sensor systems such as the space surveillance network. In this study, data from a notional sensor is simulated and we use big data technologies, predictive algorithms and operational analytics to process the data and predict sensor degradations. This study uses data products that would commonly be analyzed at a site. This study builds on a big data architecture that has previously been proven valuable in detecting anomalies. This paper outlines our methodology of implementing an operational analytic solution through data discovery, learning and training of data modeling and predictive techniques, and deployment. Through this methodology, we implement a functional architecture focused on exploring available big data sets and determine practical analytic, visualization, and predictive technologies.
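The alerting step described above, monitoring a sensor health metric and flagging when it passes a critical threshold, can be sketched minimally; the window size, threshold, and health values are assumptions for illustration, not values from the study:

```python
import numpy as np

# A minimal sketch of threshold-based degradation alerting on a
# rolling mean; real operational analytics would do this over
# streaming data with predictive models, not a fixed list.

def degradation_alerts(readings, window=5, threshold=0.75):
    """Flag time steps where the rolling mean of a health metric
    (1.0 = nominal) drops below a critical threshold."""
    readings = np.asarray(readings, dtype=float)
    kernel = np.ones(window) / window
    rolling = np.convolve(readings, kernel, mode="valid")
    # index i of `rolling` ends at time step i + window - 1
    return [i + window - 1 for i, m in enumerate(rolling) if m < threshold]

# Simulated sensor health: nominal for 10 steps, then degrading.
health = [1.0] * 10 + [0.9, 0.8, 0.7, 0.6, 0.5]
alerts = degradation_alerts(health)
```

Smoothing before thresholding is what keeps a single noisy reading from raising a false alert while a sustained degradation still trips the threshold.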
The Big6 Collection: The Best of the Big6 Newsletter.
ERIC Educational Resources Information Center
Eisenberg, Michael B.; Berkowitz, Robert E.
The Big6 is a complete approach to implementing meaningful learning and teaching of information and technology skills, essential for 21st century living. Including in-depth articles, practical tips, and explanations, this book offers a varied range of material about students and teachers, the Big6, and curriculum. The book is divided into 10 main…
Technical Development Path for Gas Foil Bearings
NASA Technical Reports Server (NTRS)
Dellacorte, Christopher
2016-01-01
Foil gas bearings are in widespread commercial use in air cycle machines, turbocompressors and microturbine generators and are emerging in more challenging applications such as turbochargers, auxiliary power units and propulsion gas turbines. Though not well known, foil bearing technology is well over fifty years old. Recent technological developments indicate that their full potential has yet to be realized. This paper investigates the key technological developments that have characterized foil bearing advances. It is expected that a better understanding of foil gas bearing development path will aid in future development and progress towards more advanced applications.
Exploring the Impact of Digital Technologies on Professional Responsibilities and Education
ERIC Educational Resources Information Center
Fenwick, Tara; Edwards, Richard
2016-01-01
Digital technologies in combination with "big" data and predictive analytics are having a significant impact upon professional practices at individual, organisational, national and international levels. The interplay of code, algorithms and big data are increasingly pervasive in the governing, leadership and practices of different…
Ten aspects of the Big Five in the Personality Inventory for DSM-5.
DeYoung, Colin G; Carey, Bridget E; Krueger, Robert F; Ross, Scott R
2016-04-01
The Diagnostic and Statistical Manual of Mental Disorders (5th ed.; DSM-5) includes a dimensional model of personality pathology, operationalized in the Personality Inventory for DSM-5 (PID-5), with 25 facets grouped into 5 higher order factors resembling the Big Five personality dimensions. The present study tested how well these 25 facets could be integrated with the 10-factor structure of traits within the Big Five that is operationalized by the Big Five Aspect Scales (BFAS). In 2 healthy adult samples, 10-factor solutions largely confirmed our hypothesis that each of the 10 BFAS would be the highest loading BFAS on 1 and only 1 factor. Varying numbers of PID-5 scales were additional markers of each factor, and the overall factor structure in the first sample was well replicated in the second. Our results allow Cybernetic Big Five Theory (CB5T) to be brought to bear on manifestations of personality disorder, because CB5T offers mechanistic explanations of the 10 factors measured by the BFAS. Future research, therefore, may begin to test hypotheses derived from CB5T regarding the mechanisms that are dysfunctional in specific personality disorders. (c) 2016 APA, all rights reserved.
Implementing the Big6: Context, Context, Context.
ERIC Educational Resources Information Center
Eisenberg, Mike
1998-01-01
Emphasizes the importance of context within the curriculum when implementing the Big6 information and technology skills. Curriculum mapping is described as a means of gathering and displaying information about curriculum units which can then be analyzed to select units and assignments best suited to integrated information and technology skills…
Using Browser Notebooks to Analyse Big Atmospheric Data-sets in the Cloud
NASA Astrophysics Data System (ADS)
Robinson, N.; Tomlinson, J.; Arribas, A.; Prudden, R.
2016-12-01
We present an account of our experience building an ecosystem for the analysis of big atmospheric data-sets. Using modern technologies, we have developed a prototype platform which is scalable and capable of analysing very large atmospheric datasets. We tested different big-data ecosystems, such as Hadoop MapReduce, Spark and Dask, in order to find the one best suited for analysis of multidimensional binary data such as NetCDF. We make extensive use of infrastructure-as-code and containerisation to provide a platform which is reusable, and which can scale to accommodate changes in demand. We make this platform readily accessible using browser-based notebooks. As a result, analysts with minimal technology experience can, in tens of lines of Python, make interactive data-visualisation web pages which can analyse very large amounts of data using cutting-edge big-data technology.
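The chunked-computation pattern that tools such as Dask apply to large NetCDF arrays can be sketched in plain Python. This is an illustration of the pattern only, not the authors' platform: each chunk is reduced to a small partial result, and the partials are combined, so the full dataset never has to fit in memory at once.

```python
def chunked_mean(values, chunk_size=4):
    """Mean computed chunk-by-chunk: reduce each chunk to a
    (sum, count) pair, then combine the partials -- the pattern
    Dask generalizes to multidimensional arrays."""
    partials = []
    for start in range(0, len(values), chunk_size):
        chunk = values[start:start + chunk_size]
        partials.append((sum(chunk), len(chunk)))
    total = sum(s for s, _ in partials)
    count = sum(n for _, n in partials)
    return total / count

print(chunked_mean(list(range(10))))  # 4.5
```

In a real deployment each chunk would be a slice of a NetCDF variable processed by a separate worker; the combine step is what lets the computation scale out.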
Fang, Hua; Zhang, Zhaoyang; Wang, Chanpaul Jin; Daneshmand, Mahmoud; Wang, Chonggang; Wang, Honggang
2015-01-01
Big data create values for business and research, but pose significant challenges in terms of networking, storage, management, analytics and ethics. Multidisciplinary collaborations from engineers, computer scientists, statisticians and social scientists are needed to tackle, discover and understand big data. This survey presents an overview of big data initiatives, technologies and research in industries and academia, and discusses challenges and potential solutions. PMID:26504265
ERIC Educational Resources Information Center
Brainard, Jeffrey
2007-01-01
Plants that bear less familiar names, such as switch grass, "Miscanthus," and kenaf, are not much to look at, having weathered Iowa's winter snows. But Iowa State researchers see these crops as seeds of change in alternative fuels. Rows of experimental crops line the test plots at Iowa State University's research farm. Although corn is…
How Should the Financial Crisis Change How We Teach Economics?
ERIC Educational Resources Information Center
Shiller, Robert J.
2010-01-01
Student dissatisfaction with teaching of economics--particularly with macroeconomics--during the current financial crisis mirrors dissatisfaction that was expressed during the last big crisis, the Great Depression. Then and now, a good number of students have felt that their lectures bear little relation to the economic crisis raging outside the…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Casey, Daniel; Malta, Patrick
1990-06-01
Project goals are to rehabilitate 1120 acres of big game (elk and mule deer, Odocoileus hemionus) winter range on the Hungry Horse and Spotted Bear Districts of Flathead National Forest lands adjacent to Hungry Horse Reservoir. This project represents the initial phase of implementation toward the mitigation goal. A minimum of 547 acres of Trust-funded enhancements are called for in this plan. The remainder are part of the typical Forest Service management activities for the project area. Monitor and evaluate the effects of project implementation on the big game forage base and elk and mule deer populations in the project area. Monitor enhancement success to determine effective acreage to be credited against the mitigation goal. Additional enhancement acreage will be selected elsewhere in the Flathead Forest or other lands adjacent to the reservoir based on progress toward the mitigation goal as determined through monitoring. The Wildlife Mitigation Trust Fund Advisory Committee will serve to guide decisions regarding future enhancement efforts. 7 refs.
Current applications of big data in obstetric anesthesiology.
Klumpner, Thomas T; Bauer, Melissa E; Kheterpal, Sachin
2017-06-01
This narrative review aims to highlight several recently published 'big data' studies pertinent to the field of obstetric anesthesiology. Big data has been used to study rare outcomes, to identify trends within the healthcare system, to identify variations in practice patterns, and to highlight potential inequalities in obstetric anesthesia care. Big data studies have helped define the risk of rare complications of obstetric anesthesia, such as the risk of neuraxial hematoma in thrombocytopenic parturients. Also, large national databases have been used to better understand trends in anesthesia-related adverse events during cesarean delivery as well as outline potential racial/ethnic disparities in obstetric anesthesia care. Finally, real-time analysis of patient data across a number of disparate health information systems through the use of sophisticated clinical decision support and surveillance systems is one promising application of big data technology on the labor and delivery unit. 'Big data' research has important implications for obstetric anesthesia care and warrants continued study.
2014-10-02
Drawing accurate and meaningful conclusions from such a large amount of data is a growing problem, and the term "Big Data" describes this phenomenon. The technology research firm International Data Corporation (IDC) recently performed a study on digital data growth. (Cites Bradicich, T. & Orci, S. (2012), "Moore's Law of Big Data," National Instruments Instrumentation News, December 2012.)
Socioeconomic Factors Affecting Local Support for Black Bear Recovery Strategies
NASA Astrophysics Data System (ADS)
Morzillo, Anita T.; Mertig, Angela G.; Hollister, Jeffrey W.; Garner, Nathan; Liu, Jianguo
2010-06-01
There is global interest in recovering locally extirpated carnivore species. Successful efforts to recover Louisiana black bear in Louisiana have prompted interest in recovery throughout the species’ historical range. We evaluated support for three potential black bear recovery strategies prior to public release of a black bear conservation and management plan for eastern Texas, United States. Data were collected from 1,006 residents living in proximity to potential recovery locations, particularly Big Thicket National Preserve. In addition to traditional logistic regression analysis, we used conditional probability analysis to statistically and visually evaluate probabilities of public support for potential black bear recovery strategies based on socioeconomic characteristics. Allowing black bears to repopulate the region on their own (i.e., without active reintroduction) was the recovery strategy with the greatest probability of acceptance. Recovery strategy acceptance was influenced by many socioeconomic factors. Older and long-time local residents were most likely to want to exclude black bears from the area. Concern about the problems that black bears may cause was the only variable significantly related to support or non-support across all strategies. Lack of personal knowledge about black bears was the most frequent reason for uncertainty about preferred strategy. In order to reduce local uncertainty about possible recovery strategies, we suggest that wildlife managers focus outreach efforts on providing local residents with general information about black bears, as well as information pertinent to minimizing the potential for human-black bear conflict.
[The technological innovation strategy for quality control of Chinese medicine based on Big Data].
Li, Zhen-hao; Qian, Zhong-zhi; Cheng, Yi-yu
2015-09-01
The evolution of quality control concepts for medical products within the global context and the development of quality control technology for Chinese medicine are briefly described. Aimed at the bottlenecks in the regulation and quality control of Chinese medicine, the use of Big Data technology to address the significant challenges in the Chinese medicine industry is proposed. For quality standard refinement and the internationalization of Chinese medicine, a technological innovation strategy encompassing its methodology and the R&D direction of the subsequent core technology is also presented.
Beyond Moore's Law: Harnessing spatial-digital disruptive technologies for Digital Earth
NASA Astrophysics Data System (ADS)
Foresman, Timothy W.
2016-11-01
Moore's law will reach its plateau by 2020. Big data, however, will continue to increase as the Internet of Things and social media converge into the new era of ‘huge data’. Disruptive technologies, including big data and cloud computing, are forces impacting business and government communities. Our collective future is suggested to align with the Digital Earth (DE) vision. The benefits of technological advances will be manifested in business performance improvements based on capitalizing on the locational attributes of corporate and government assets - the foundation of big data. Better governance and better business represent a key foundation for sustainability and therefore should be explicit DE guiding principles.
Earth Science Data Analysis in the Era of Big Data
NASA Technical Reports Server (NTRS)
Kuo, K.-S.; Clune, T. L.; Ramachandran, R.
2014-01-01
Anyone with even a cursory interest in information technology cannot help but recognize that "Big Data" is one of the most fashionable catchphrases of late. From accurate voice and facial recognition, language translation, and airfare prediction and comparison, to monitoring the real-time spread of flu, Big Data techniques have been applied to many seemingly intractable problems with spectacular successes. They appear to be a rewarding way to approach many currently unsolved problems. Few fields of research can claim a longer history with problems involving voluminous data than Earth science. The problems we are facing today with our Earth's future are more complex and carry potentially graver consequences than the examples given above. How has our climate changed? Besides natural variations, what is causing these changes? What are the processes involved and through what mechanisms are these connected? How will they impact life as we know it? In attempts to answer these questions, we have resorted to observations and numerical simulations with ever-finer resolutions, which continue to feed the "data deluge." Plausibly, many Earth scientists are wondering: How will Big Data technologies benefit Earth science research? As an example from the global water cycle, one subdomain among many in Earth science, how would these technologies accelerate the analysis of decades of global precipitation to ascertain the changes in its characteristics, to validate these changes in predictive climate models, and to infer the implications of these changes to ecosystems, economies, and public health? Earth science researchers need a viable way to harness the power of Big Data technologies to analyze large volumes and varieties of data with velocity and veracity.
Beyond providing speedy data analysis capabilities, Big Data technologies can also play a crucial, albeit indirect, role in boosting scientific productivity by facilitating effective collaboration within an analysis environment. To illustrate the effects of combining a Big Data technology with an effective means of collaboration, we relate the (fictitious) experience of an early-career Earth science researcher a few years beyond the present, interlaced and contrasted with reminiscences of its recent past (i.e., the present).
An ancient eye test--using the stars.
Bohigian, George M
2008-01-01
Vision testing in ancient times was as important as it is today. The predominant vision testing in some cultures was the recognition and identification of constellations and celestial bodies of the night sky. A common ancient naked-eye test used a double star of the Big Dipper in the constellation Ursa Major, the Big Bear. The second star from the end of the handle of the Big Dipper is an optical double star. The ability to perceive the separation of these two stars, Mizar and Alcor, was considered a sign of good vision; the test is presently known as the Arab Eye Test. This article is the first report of the correlation of this ancient eye test with the 20/20 line of the current Snellen visual acuity test. This article describes the astronomy, origin, history, and practicality of this test and how it correlates with the present-day Snellen visual acuity test.
Big Ideas at the Center for Innovation in Education at Thomas College
ERIC Educational Resources Information Center
Prawat, Ted
2016-01-01
Schools and teachers are looking for innovative ways to teach the "big ideas" emerging in the core curricula, especially in STEAM fields (science, technology, engineering, arts and math). As a result, learning environments that support digital learning and educational technology on various platforms and devices are taking on…
High-Temperature (1000 F) Magnetic Thrust Bearing Test Rig Completed and Operational
NASA Technical Reports Server (NTRS)
Montague, Gerald T.
2005-01-01
Large axial loads are induced on the rolling element bearings of a gas turbine. To extend bearing life, designers use pneumatic balance pistons to reduce the axial load on the bearings. A magnetic thrust bearing could replace the balance pistons to further reduce the axial load. To investigate this option, the U.S. Army Research Laboratory, the NASA Glenn Research Center, and Texas A&M University designed and fabricated a 7-in.- diameter magnetic thrust bearing to operate at 1000 F and 30,000 rpm, with a 1000-lb load capacity. This research was funded through a NASA Space Technology Transfer Act with Allison Advance Development Company under the Ultra-Efficient Engine Technology (UEET) Intelligent Propulsion Systems Foundation Technology project.
Yin, Long-Lin; Song, Bin; Guan, Ying; Li, Ying-Chun; Chen, Guang-Wen; Zhao, Li-Ming; Lai, Li
2014-09-01
To investigate the MRI features and associated histological and pathological changes of hilar and extrahepatic big bile duct cholangiocarcinoma with different morphological sub-types, and their value in differentiating between nodular cholangiocarcinoma (NCC) and intraductal growing cholangiocarcinoma (IDCC). Imaging data of 152 patients with pathologically confirmed hilar and extrahepatic big bile duct cholangiocarcinoma were reviewed, comprising 86 periductal infiltrating cholangiocarcinoma (PDCC), 55 NCC, and 11 IDCC. Imaging features of the three morphological sub-types were compared. Each subtype demonstrated unique imaging features. Significant differences (P < 0.05) were found between NCC and IDCC in tumor shape, dynamic enhancement pattern, enhancement degree during the equilibrium phase, multiplicity or singleness of the tumor, changes in the wall and lumen of the bile duct at the tumor-bearing segment, dilatation of the bile duct upstream or downstream of the tumor, and invasion of adjacent organs. Imaging features reveal the tumor growth patterns of hilar and extrahepatic big bile duct cholangiocarcinoma. Multi-sequence MRI examination can accurately depict these imaging features for differential diagnosis.
Supporting diagnosis and treatment in medical care based on Big Data processing.
Lupşe, Oana-Sorina; Crişan-Vida, Mihaela; Stoicu-Tivadar, Lăcrămioara; Bernard, Elena
2014-01-01
With information and data in all domains growing every day, it is difficult to manage and extract useful knowledge for specific situations. This paper presents an integrated system architecture to support the activity in Ob-Gyn departments, with further developments in using new technology to manage Big Data processing - using Google BigQuery - in the medical domain. The data collected and processed with Google BigQuery come from different sources: two Obstetrics & Gynaecology Departments, the TreatSuggest application - an application for suggesting treatments - and a home foetal surveillance system. Data is uploaded to Google BigQuery from Bega Hospital Timişoara, Romania. The analysed data is useful for medical staff, researchers and statisticians from the public health domain. The current work describes the technological architecture and its processing possibilities, which in the future will be validated against quality criteria to lead to a better decision process in diagnosis and public health.
Global Oscillation Network Group
NASA Astrophysics Data System (ADS)
Murdin, P.
2000-11-01
The Global Oscillation Network Group (GONG) is an international, community-based project, operated by the NATIONAL SOLAR OBSERVATORY for the US National Science Foundation, to conduct a detailed study of the internal structure and dynamics of the Sun over an 11 year solar cycle using helioseismology. 10 242 velocity images are obtained by a six-station network located at Big Bear Solar Observatory…
ERIC Educational Resources Information Center
Kugelmass, Judy W.
2000-01-01
Describes the use of autobiographical storytelling, personal myths, and visual imagery in preparing elementary and special educators for activist roles in creating effective, inclusive schools. Graduate students were presented with a social-constructivist perspective toward the content and process of schooling. Examples of materials produced by…
Latino Workers Hitting a Blue-Collar Ceiling. New Journalism on Latino Children
ERIC Educational Resources Information Center
Fuller, Bruce; McElmurry, Sara
2011-01-01
Chicago has a dynamic history of embracing change, evolving from an agricultural and commercial hub to the steel powerhouse that would undergird America's industrial revolution. The "City of Big Shoulders" now bears a sizeable burden, one that again requires it to embrace change. The metro area must shift to an economy built on knowledge…
LLMapReduce: Multi-Lingual Map-Reduce for Supercomputing Environments
2015-11-20
Lexington, MA, U.S.A. Abstract: The map-reduce parallel programming model has become extremely popular in the big data community. Popularized by Google [36] and Apache Hadoop [37], map-reduce has become a staple technology of the ever-growing big data community. LLMapReduce brings map-reduce to big data users running on a supercomputer, and dramatically simplifies map-reduce programming by providing a simple parallel programming model.
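The map-reduce model referenced in this record can be reduced to a few lines. The sketch below illustrates the generic programming model, not LLMapReduce's actual API: a mapper emits key-value pairs, the pairs are grouped by key, and a reducer folds each group.

```python
from collections import defaultdict

def map_reduce(records, mapper, reducer):
    """Minimal map-reduce: map each record to (key, value) pairs,
    group the pairs by key, then reduce each group."""
    groups = defaultdict(list)
    for record in records:
        for key, value in mapper(record):
            groups[key].append(value)
    return {key: reducer(values) for key, values in groups.items()}

# Classic word count over two input "records".
lines = ["big data", "big compute"]
counts = map_reduce(lines,
                    mapper=lambda line: [(w, 1) for w in line.split()],
                    reducer=sum)
print(counts)  # {'big': 2, 'data': 1, 'compute': 1}
```

Frameworks such as Hadoop distribute the map and reduce phases across nodes and shuffle the grouped pairs over the network, but the mapper/reducer contract is the same as in this single-process version.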
NASA Technical Reports Server (NTRS)
1985-01-01
The service life of the Space Shuttle Main Engine (SSME) turbomachinery bearings was a predominant factor in engine durability and maintenance problems. Recent data indicated that bearing life is about one order of magnitude lower than the goal of seven and one-half hours, particularly for the bearings in the High Pressure Oxidizer Turbopump (HPOTP). Bearing technology, primarily cryogenic turbomachinery bearing technology, is expanded by exploring the life and performance effects of design changes, design concept changes, materials changes, manufacturing technique changes, and lubrication system changes. Each variation is assessed against the current bearing design in full-scale cryogenic tests.
The Challenge of Handling Big Data Sets in the Sensor Web
NASA Astrophysics Data System (ADS)
Autermann, Christian; Stasch, Christoph; Jirka, Simon
2016-04-01
More and more Sensor Web components are deployed in different domains such as hydrology, oceanography or air quality in order to make observation data accessible via the Web. However, besides variability of data formats and protocols in environmental applications, the fast growing volume of data with high temporal and spatial resolution is imposing new challenges for Sensor Web technologies when sharing observation data and metadata about sensors. Variability, volume and velocity are the core issues that are addressed by Big Data concepts and technologies. Most solutions in the geospatial sector focus on remote sensing and raster data, whereas big in-situ observation data sets relying on vector features require novel approaches. Hence, in order to deal with big data sets in infrastructures for observational data, the following questions need to be answered: 1. How can big heterogeneous spatio-temporal datasets be organized, managed, and provided to Sensor Web applications? 2. How can views on big data sets and derived information products be made accessible in the Sensor Web? 3. How can big observation data sets be processed efficiently? We illustrate these challenges with examples from the marine domain and outline how we address these challenges. We therefore show how big data approaches from mainstream IT can be re-used and applied to Sensor Web application scenarios.
Big Data Application in Biomedical Research and Health Care: A Literature Review.
Luo, Jake; Wu, Min; Gopukumar, Deepika; Zhao, Yiqing
2016-01-01
Big data technologies are increasingly used for biomedical and health-care informatics research. Large amounts of biological and clinical data have been generated and collected at an unprecedented speed and scale. For example, the new generation of sequencing technologies enables the processing of billions of DNA sequence data per day, and the application of electronic health records (EHRs) is documenting large amounts of patient data. The cost of acquiring and analyzing biomedical data is expected to decrease dramatically with the help of technology upgrades, such as the emergence of new sequencing machines, the development of novel hardware and software for parallel computing, and the extensive expansion of EHRs. Big data applications present new opportunities to discover new knowledge and create novel methods to improve the quality of health care. The application of big data in health care is a fast-growing field, with many new discoveries and methodologies published in the last five years. In this paper, we review and discuss big data application in four major biomedical subdisciplines: (1) bioinformatics, (2) clinical informatics, (3) imaging informatics, and (4) public health informatics. Specifically, in bioinformatics, high-throughput experiments facilitate the research of new genome-wide association studies of diseases, and with clinical informatics, the clinical field benefits from the vast amount of collected patient data for making intelligent decisions. Imaging informatics is now more rapidly integrated with cloud platforms to share medical image data and workflows, and public health informatics leverages big data techniques for predicting and monitoring infectious disease outbreaks, such as Ebola. In this paper, we review the recent progress and breakthroughs of big data applications in these health-care domains and summarize the challenges, gaps, and opportunities to improve and advance big data applications in health care.
PMID:26843812
Big Data Technologies: New Opportunities for Diabetes Management.
Bellazzi, Riccardo; Dagliati, Arianna; Sacchi, Lucia; Segagni, Daniele
2015-04-24
The so-called big data revolution provides substantial opportunities for diabetes management. At least 3 important directions are currently of great interest. First, the integration of different sources of information, from primary and secondary care to administrative information, may allow depicting a novel view of patients' care processes and of single patients' behaviors, taking into account the multifaceted nature of chronic care. Second, the availability of novel diabetes technologies, able to gather large amounts of real-time data, requires the implementation of distributed platforms for data analysis and decision support. Finally, the inclusion of geographical and environmental information into such complex IT systems may further increase the capability of interpreting the gathered data and extracting new knowledge from them. This article reviews the main concepts and definitions related to big data, presents some efforts in health care, and discusses the potential role of big data in diabetes care. Finally, as an example, it describes the research efforts carried out in the MOSAIC project, funded by the European Commission. © 2015 Diabetes Technology Society.
Making big sense from big data in toxicology by read-across.
Hartung, Thomas
2016-01-01
Modern information technologies have made big data available in safety sciences, i.e., extremely large data sets that may be analyzed only computationally to reveal patterns, trends and associations. This happens by (1) compilation of large sets of existing data, e.g., as a result of the European REACH regulation, (2) the use of omics technologies and (3) systematic robotized testing in a high-throughput manner. All three approaches and some other high-content technologies leave us with big data--the challenge is now to make big sense of these data. Read-across, i.e., the local similarity-based intrapolation of properties, is gaining momentum with increasing data availability and consensus on how to process and report it. It is predominantly applied to in vivo test data as a gap-filling approach, but can similarly complement other incomplete datasets. Big data are first of all repositories for finding similar substances and ensure that the available data is fully exploited. High-content and high-throughput approaches similarly require focusing on clusters, in this case formed by underlying mechanisms such as pathways of toxicity. The closely connected properties, i.e., structural and biological similarity, create the confidence needed for predictions of toxic properties. Here, a new web-based tool under development called REACH-across, which aims to support and automate structure-based read-across, is presented among others.
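Read-across as described here, the local similarity-based intrapolation of properties, is essentially a nearest-neighbor estimate. The toy sketch below uses hypothetical descriptor vectors and toxicity values purely for illustration; it is not the REACH-across tool.

```python
def read_across(query, neighbors, k=2):
    """Predict a property for `query` by averaging the k most similar
    known substances, where similarity is the (negative) Euclidean
    distance between descriptor vectors."""
    def distance(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    # Rank known substances by closeness to the query substance.
    ranked = sorted(neighbors, key=lambda nv: distance(query, nv[0]))
    nearest = ranked[:k]
    return sum(value for _, value in nearest) / k

# Hypothetical substances: (descriptor vector, measured toxicity value).
known = [((0.1, 0.2), 1.0), ((0.2, 0.1), 1.2), ((0.9, 0.8), 5.0)]
print(round(read_across((0.15, 0.15), known), 3))  # 1.1
```

Real read-across combines structural and biological similarity and documents the justification for each analogue; the gap-filling logic, however, is this local-neighborhood average.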
Lighting innovations in concept cars
NASA Astrophysics Data System (ADS)
Berlitz, Stephan; Huhn, Wolfgang
2005-02-01
Concept cars have their own styling process. Because of the strong media interest they attract, they offer a big opportunity to bring the newest technology, together with styling ideas, to different trade fairs. The LED technology in the concept cars Audi Pikes Peak, Nuvolari and Le Mans will be explained. A further outlook on the Audi LED strategy, starting with the LED daytime running lamp, will be given. The close collaboration between stylists and technical engineers resulted in these concept cars and in further technical innovations based on LED technologies.
[Overall digitalization: leading innovation of endodontics in big data era].
Ling, J Q
2016-04-09
In the big data era, digital technologies bring great challenges and opportunities to modern stomatology. The applications of digital technologies, such as cone-beam CT (CBCT), computer-aided design (CAD) and computer-aided manufacturing (CAM), 3D printing, and digital approaches for education, provide new concepts and patterns for the treatment and study of endodontic diseases. This review provides an overview of the application and prospects of commonly used digital technologies in the development of endodontics.
Review of ultra-high density optical storage technologies for big data center
NASA Astrophysics Data System (ADS)
Hao, Ruan; Liu, Jie
2016-10-01
In big data centers, optical storage technologies have many advantages, such as energy saving and long lifetime. However, how to improve the storage density of optical storage is still a huge challenge. Multilayer optical storage technology may be a good candidate for big data centers in the years to come. Because the number of layers is primarily limited by the transmission of each layer, the largest capacities of the multilayer disc are around 1 TB/disc and 10 TB/cartridge. Holographic data storage (HDS) is a volumetric approach, but its storage capacity is also strictly limited by the diffractive nature of light. For a holographic disc with a total thickness of 1.5 mm, the potential capacities are not more than 4 TB/disc and 40 TB/cartridge. In recent years, the development of super-resolution optical storage technology has attracted more attention. Super-resolution photoinduction-inhibition nanolithography (SPIN) technology, with 9 nm feature size and 52 nm two-line resolution, was reported 3 years ago. However, turning this exciting principle into a real storage system is a huge challenge. It can be expected that, in the future, capacities of 10 TB/disc and 100 TB/cartridge can be achieved. More importantly, by breaking the diffraction limit of light, SPIN technology will open the door to steadily improving optical storage capacity to meet the needs of developing big data centers.
Introducing Big Data Concepts in an Introductory Technology Course
ERIC Educational Resources Information Center
Frydenberg, Mark
2015-01-01
From their presence on social media sites to in-house application data files, the amount of data that companies, governments, individuals, and sensors generate is overwhelming. The growth of Big Data in both consumer and enterprise activities has caused educators to consider options for including Big Data in the Information Systems curriculum.…
New Data, Old Tensions: Big Data, Personalized Learning, and the Challenges of Progressive Education
ERIC Educational Resources Information Center
Dishon, Gideon
2017-01-01
Personalized learning has become the most notable application of big data in primary and secondary schools in the United States. The combination of big data and adaptive technological platforms is heralded as a revolution that could transform education, overcoming the outdated classroom model, and realizing the progressive vision of…
Will Big Data Mean the End of Privacy?
ERIC Educational Resources Information Center
Pence, Harry E.
2015-01-01
Big Data is currently a hot topic in the field of technology, and many campuses are considering the addition of this topic into their undergraduate courses. Big Data tools are not just playing an increasingly important role in many commercial enterprises; they are also combining with new digital devices to dramatically change privacy. This article…
ERIC Educational Resources Information Center
Wang, Yinying
2017-01-01
Despite abundant data and increasing data availability brought by technological advances, there have been very few education policy studies that have capitalized on big data, which is characterized by large volume, wide variety, and high velocity. Drawing on the recent progress of using big data in public policy and computational social science…
NASA Astrophysics Data System (ADS)
Montanes Rodriguez, P.; Palle, E.; Goode, P.; Koonin, S.; Hickey, J.; Qiu, J.; Yurchysyn, V.
The Earthshine project was run by the California Institute of Technology (Caltech) between 1993 and 1995. Since 1998, it has been a collaborative effort between Caltech and Big Bear Solar Observatory (BBSO)/New Jersey Institute of Technology (NJIT). Our primary goal is the precise determination of a global and absolutely calibrated Earth's albedo and its synoptic, seasonal, and annual variability, as well as the measurement and investigation of the resolved reflected spectrum of the integrated Earth in the infrared region. The absorption in the infrared, mainly due to rotational and vibrational transitions of molecules, shows the absorption bands of various telluric and solar components, allowing analysis of the Earth's spectrum as it would be observed from outer space. In this paper we present preliminary results of spectroscopic observations made at Palomar Observatory with the 60-inch telescope's echelle spectrograph. They targeted the visible and near-infrared region of the electromagnetic spectrum and were performed in the spectral range (< 1 μm) of the oxygen A, oxygen B, water, and hydrogen-alpha (Hα) bands. The first three are typically terrestrial molecular bands. The fourth, Hα, is a solar line, used mainly for spectral calibration.
'Big data' in pharmaceutical science: challenges and opportunities.
Dossetter, Al G; Ecker, Gerhard; Laverty, Hugh; Overington, John
2014-05-01
Future Medicinal Chemistry invited a selection of experts to express their views on the current impact of big data in drug discovery and design, as well as speculate on future developments in the field. The topics discussed include the challenges of implementing big data technologies, maintaining the quality and privacy of data sets, and how the industry will need to adapt to welcome the big data era. Their enlightening responses provide a snapshot of the many and varied contributions being made by big data to the advancement of pharmaceutical science.
Traffic information computing platform for big data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Duan, Zongtao, E-mail: ztduan@chd.edu.cn; Li, Ying; Zheng, Xibin
The big data environment creates the data conditions for improving the quality of traffic information services. The goal of this article is to construct a traffic information computing platform for the big data environment. Through in-depth analysis of the connotation and technological characteristics of big data and traffic information services, a distributed traffic atomic information computing platform architecture is proposed. Under the big data environment, this type of traffic atomic information computing architecture helps guarantee safe and efficient traffic operation, and more intelligent and personalized traffic information services can be offered to traffic information users.
ERIC Educational Resources Information Center
Philip, Thomas M.; Schuler-Brown, Sarah; Way, Winmar
2013-01-01
As Big Data becomes increasingly important in policy-making, research, marketing, and commercial applications, we argue that literacy in this domain is critical for engaged democratic participation and that peer-generated data from mobile technologies offer rich possibilities for students to learn about this new genre of data. Through the lens of…
Bearing, gearing, and lubrication technology
NASA Technical Reports Server (NTRS)
Anderson, W. J.
1978-01-01
Results of selected NASA research programs on rolling-element and fluid-film bearings, gears, and elastohydrodynamic lubrication are reported. Advances in rolling-element bearing material technology, which have resulted in a significant improvement in fatigue life, and which make possible new applications for rolling bearings, are discussed. Research on whirl-resistant, fluid-film bearings, suitable for very high-speed applications, is discussed. An improved method for predicting gear pitting life is reported. An improved formula for calculating the thickness of elastohydrodynamic films (the existence of which help to define the operating regime of concentrated contact mechanisms such as bearings, gears, and cams) is described.
NASA Astrophysics Data System (ADS)
Bai, Bingdong; Chen, Jing; Wang, Mei; Yao, Jingjing
2017-06-01
In the context of the big data age, energy conservation and emission reduction in transportation is a natural big data industry. Planning, management, decision-making, and other aspects of transportation energy conservation and emission reduction should be supported by the analysis and forecasting of large amounts of data. With the development of information technologies such as the smart city and the sensor road, Internet-of-things data-collection technology is gradually becoming widespread, 3G/4G network transmission technology is developing rapidly, and large volumes of transportation energy-conservation and emission-reduction data are accumulating through different channels. The government should not only make good use of big data to solve the problems of energy conservation and emission reduction in transportation, but also explore and use the hidden value behind these large amounts of data. Based on an analysis of the basic characteristics and application technologies of transportation energy-conservation and emission-reduction data, this paper carries out applied research in the transportation energy-conservation and emission-reduction industry, so as to provide a theoretical basis and reference value for low-carbon management.
Transmission Bearing Damage Detection Using Decision Fusion Analysis
NASA Technical Reports Server (NTRS)
Dempsey, Paula J.; Lewicki, David G.; Decker, Harry J.
2004-01-01
A diagnostic tool was developed for detecting fatigue damage to rolling-element bearings in an OH-58 main rotor transmission. Two different monitoring technologies, oil debris analysis and vibration, were integrated using data fusion into a health monitoring system for detecting bearing surface fatigue pitting damage. This integrated system showed improved detection and decision-making capabilities compared to the individual monitoring technologies. The diagnostic tool was evaluated by collecting vibration and oil debris data from tests performed in the NASA Glenn 500 hp Helicopter Transmission Test Stand; data were collected during experiments in this rig when two unanticipated bearing failures occurred. Results show that combining the vibration and oil debris measurement technologies improves the detection of pitting damage on spiral bevel gear duplex ball bearings and spiral bevel pinion triplex ball bearings in a main rotor transmission.
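The fusion scheme described above can be illustrated with a minimal sketch. This is not the authors' actual implementation; the normalization limits, the simple max-rule, and the classification thresholds below are all hypothetical:

```python
def normalize(value, healthy, failed):
    """Map a raw indicator onto [0, 1], where 0 = healthy and 1 = failed."""
    x = (value - healthy) / (failed - healthy)
    return min(max(x, 0.0), 1.0)

def fused_health(oil_debris_mg, vib_rms_g,
                 oil_limits=(0.0, 100.0), vib_limits=(0.5, 5.0)):
    """Fuse two normalized indicators; the conservative max-rule flags
    damage whenever either channel does."""
    oil = normalize(oil_debris_mg, *oil_limits)
    vib = normalize(vib_rms_g, *vib_limits)
    return max(oil, vib)

def classify(health_index):
    """Map the fused indicator to a maintenance decision."""
    if health_index < 0.3:
        return "OK"
    if health_index < 0.7:
        return "inspect"
    return "damage"
```

With these assumed limits, 80 mg of accumulated debris dominates a benign vibration level and pushes the fused indicator into the "damage" band, even though vibration alone would not, which is the kind of cross-check that made the integrated system more reliable than either sensor in isolation.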
Out of the Lab...Into the Real World
NASA Technical Reports Server (NTRS)
1999-01-01
Big Sky Laser Technologies of Bozeman, MT is a developer of compact, ruggedized commercial and developmental laser systems, including small medical lasers and lidars used in NASA tracking applications. Company engineers developing new laser products determined that NASA technology for detection and control of prelasing in a Q-switched laser would be of value to them in at least two of their product lines. Big Sky Laser's CFR-800 unit is based on NASA technology for which they obtained a non-exclusive patent license.
Demonstration, Testing and Qualification of a High Temperature, High Speed Magnetic Thrust Bearing
NASA Technical Reports Server (NTRS)
DeWitt, Kenneth
2005-01-01
The gas turbine industry has a continued interest in improving engine performance and reducing net operating and maintenance costs. These goals are being realized because of advancements in aeroelasticity, materials, and computational tools such as CFD and engine simulations. These advancements aid in increasing engine thrust-to-weight ratios, specific fuel consumption, pressure ratios, and overall reliability through higher speed, higher temperature, and more efficient engine operation. Currently, rolling element bearings and squeeze film dampers are used to support rotors in gas turbine engines. Present ball bearing configurations are limited in speed (<2 million DN) and temperature (<500 F) and require both cooling air and an elaborate lubrication system. Also, ball bearings require extensive preventative maintenance in order to assure their safe operation. Since these bearings are at their operational limits, new technologies must be found in order to take advantage of other advances. Magnetic bearings are well suited to operate at extreme temperatures and higher rotational speeds and are a promising solution to the problems that conventional rolling element bearings present. Magnetic bearing technology is being developed worldwide and is considered an enabling technology for new engine designs. Using magnetic bearings, turbine and compressor spools can be radically redesigned to be significantly larger and stiffer with better damping and higher rotational speeds. These advances, a direct result of magnetic bearing technology, will allow significant increases in engine power and efficiency. Also, magnetic bearings allow for real-time, in-situ health monitoring of the system, lower maintenance costs, and less down time.
ERIC Educational Resources Information Center
Congress of the U.S., Washington, DC. Senate Committee on Small Business.
This is a congressional hearing to acquire testimony and information about women in business or about unusual problems that have been found. Testimony includes statements from individuals representing Big Bear Shopper, Inc.; United States Business and Professional Women (BPW/USA); Rural Small Business Programs, Lane Community College;…
The Machine at the End of the Universe
ERIC Educational Resources Information Center
Monastersky, Richard
2008-01-01
Switzerland is the land of Big Ideas, where even the streets have Nobel prizes. At the European particle physics lab known as CERN, the roads through campus bear the names of Einstein, Curie, Bohr, and Heisenberg. Working amid those tributes to giants of the past century, physicists from around the world are trying to make history of their own and…
Big data and biomedical informatics: a challenging opportunity.
Bellazzi, R
2014-05-22
Big data are receiving increasing attention in biomedicine and healthcare. It is therefore important to understand why big data are assuming a crucial role for the biomedical informatics community. The capability of handling big data is becoming an enabler for carrying out unprecedented research studies and for implementing new models of healthcare delivery. It is first necessary to understand the four elements that constitute big data, namely Volume, Variety, Velocity, and Veracity, and their meaning in practice. It is then necessary to understand where big data are present and where they can be beneficially collected. Research fields such as translational bioinformatics need to rely on big data technologies to withstand the shock wave of data that is generated every day. Other areas, ranging from epidemiology to clinical care, can benefit from exploiting the large amounts of data that are nowadays available, from personal monitoring to primary care. However, building big data-enabled systems carries relevant implications for the reproducibility of research studies and the management of privacy and data access, and proper actions should be taken to deal with these issues. An interesting consequence of the big data scenario is the availability of new software, methods, and tools, such as map-reduce, cloud computing, and concept-drift machine learning algorithms, which will not only contribute to big data research but may also be beneficial in many biomedical informatics applications. The way forward with the big data opportunity will require properly applied engineering principles to design studies and applications, to avoid preconceptions or over-enthusiasm, to fully exploit the available technologies, and to improve data processing and data management regulations.
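Map-reduce, mentioned above as one of the new tools, can be illustrated with a minimal in-memory sketch: a toy word count over toy documents, showing the map, shuffle, and reduce phases without any particular framework's API:

```python
from collections import defaultdict
from itertools import chain

def map_phase(documents):
    """Mapper: emit (word, 1) pairs from each document."""
    return chain.from_iterable(
        ((word, 1) for word in doc.lower().split()) for doc in documents
    )

def shuffle(pairs):
    """Group intermediate pairs by key, as the framework would between phases."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Reducer: sum the counts emitted for each word."""
    return {word: sum(counts) for word, counts in groups.items()}

docs = ["big data in medicine", "big data in healthcare"]
counts = reduce_phase(shuffle(map_phase(docs)))
```

The appeal in biomedicine is that each phase is embarrassingly parallel: mappers and reducers can run independently on shards of a dataset far too large for one machine.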
"Your feet's too big": an inquiry into psychological and symbolic meanings of the foot.
Zerbe, K J
1985-01-01
The foot is a highly cathected appendage that is commonly singled out as the brunt of humorous or derisive remarks, as if it embodies repugnance and disgust. Attitudes toward the foot are overdetermined, bearing the imprint of man's early linguistic patterns and individual dynamics. This article suggests that feet are symbolic because they bear the feelings derived from earlier separations, good and bad object representations, collective memories, and genital representations. The foot's role as symbol of both the male and female genitals, repository of badness, symbol of passivity, initiator of movement, and site of self-mutilation have been briefly reviewed. As Fats Waller rhapsodizes that the "feet's too big," he finds a convenient way to displace his symbiotic and erotic anxieties vis-à-vis women. Similarly, patients who come for psychiatric treatment and psychotherapy frequently make references to their feet or use them in specific ways. An understanding of this type of communication can often provide insight into individual dynamics and enhance treatment. The weight placed on these communications depends, of course, on the vicissitudes of the previous therapeutic work as well as on the particular problems of the patient.
[Big data in medicine and healthcare].
Rüping, Stefan
2015-08-01
Healthcare is one of the business fields with the highest Big Data potential. According to the prevailing definition, Big Data refers to data that is too large and heterogeneous, and changes too quickly, to be stored, processed, and transformed into value by previous technologies. Technological trends drive Big Data: business processes are increasingly executed electronically, consumers produce more and more data themselves (e.g., in social networks), and digitalization keeps increasing. Currently, several new trends toward new data sources and innovative data analysis are appearing in medicine and healthcare. From the research perspective, omics research is one clear Big Data topic. In practice, electronic health records, free open data, and the "quantified self" offer new perspectives for data analytics. Regarding analytics, significant advances have been made in information extraction from text data, which unlocks a lot of clinical documentation for analytics purposes. At the same time, medicine and healthcare are lagging behind in the adoption of Big Data approaches. This can be traced to particular problems regarding data complexity and to organizational, legal, and ethical challenges. The growing uptake of Big Data in general, and first best-practice examples in medicine and healthcare in particular, indicate that innovative solutions are coming. This paper gives an overview of the potential of Big Data in medicine and healthcare.
Optimization design of turbo-expander gas bearing for a 500W helium refrigerator
NASA Astrophysics Data System (ADS)
Li, S. S.; Fu, B.; Zhang, Q. Y.
2017-12-01
The turbo-expander is the core machinery of the helium refrigerator, and the bearing, as its supporting element, is the core technology shaping turbo-expander design. Careful design and performance study of the gas bearing are essential to ensure the stability of the turbo-expander. In this paper, numerical simulation is used to analyze the performance of the gas bearing for a 500 W helium refrigerator turbine, and an optimized design of the gas bearing is completed. The optimization results also guide the manufacturing process. Finally, turbine experiments verify that the gas bearing performs well and ensures stable operation of the turbine.
Bearings: Technology and needs
NASA Technical Reports Server (NTRS)
Anderson, W. J.
1982-01-01
A brief status report on bearing technology and present and near-term future problems that warrant research support is presented. For rolling element bearings a material with improved fracture toughness, life data in the low Lambda region, a comprehensive failure theory verified by life data and incorporated into dynamic analyses, and an improved corrosion resistant alloy are perceived as important needs. For hydrodynamic bearings better definition of cavitation boundaries and pressure distributions for squeeze film dampers, and geometry optimization for minimum power loss in turbulent film bearings are needed. For gas film bearings, foil bearing geometries that form more nearly optimum film shapes for maximum load capacity, and more effective surface protective coatings for high temperature operation are needed.
The Ethics of Big Data and Nursing Science.
Milton, Constance L
2017-10-01
Big data is a scientific, social, and technological trend referring to the process and size of datasets available for analysis. Ethical implications arise as healthcare disciplines, including nursing, struggle over questions of informed consent, privacy, ownership of data, and its possible use in epistemology. The author offers straight-thinking possibilities for the use of big data in nursing science.
ERIC Educational Resources Information Center
Santana Arroyo, Sonia
2013-01-01
Health professionals frequently do not possess the necessary information-seeking abilities to conduct an effective search in databases and Internet sources. Reference librarians may teach health professionals these information and technology skills through the Big6 information literacy model (Big6). This article aims to address this issue. It also…
Federal Register 2010, 2011, 2012, 2013, 2014
2011-02-11
... proposed projects to be submitted to the Office of Management and Budget (OMB) for review and approval... Information Technology (CBIIT) launched the enterprise phase of the caBIG® initiative in early 2007... resources available through the caBIG® Enterprise Support Network (ESN), including the caBIG®...
de Brevern, Alexandre G.; Meyniel, Jean-Philippe; Fairhead, Cécile; Neuvéglise, Cécile; Malpertuy, Alain
2015-01-01
Sequencing the human genome began in 1994, and 10 years of work were necessary to provide a nearly complete sequence. Nowadays, NGS technologies allow sequencing of a whole human genome in a few days. This deluge of data challenges scientists in many ways, as they are faced with data management issues and analysis and visualization drawbacks due to the limitations of current bioinformatics tools. In this paper, we describe how the NGS Big Data revolution changes the way of managing and analysing data. We present how biologists are confronted with an abundance of methods, tools, and data formats. To overcome these problems, we focus on Big Data information technology innovations from the web and business intelligence. We underline the interest of NoSQL databases, which are much more efficient than relational databases. Since Big Data leads to a loss of interactivity with data during analysis, due to high processing times, we describe solutions from business intelligence that allow one to regain interactivity whatever the volume of data. We illustrate this point with a focus on the Amadea platform. Finally, we discuss the visualization challenges posed by Big Data and present the latest innovations in JavaScript graphic libraries. PMID:26125026
'Big data', Hadoop and cloud computing in genomics.
O'Driscoll, Aisling; Daugelaite, Jurate; Sleator, Roy D
2013-10-01
Since the completion of the Human Genome project at the turn of the Century, there has been an unprecedented proliferation of genomic sequence data. A consequence of this is that the medical discoveries of the future will largely depend on our ability to process and analyse large genomic data sets, which continue to expand as the cost of sequencing decreases. Herein, we provide an overview of cloud computing and big data technologies, and discuss how such expertise can be used to deal with biology's big data sets. In particular, big data technologies such as the Apache Hadoop project, which provides distributed and parallelised data processing and analysis of petabyte (PB) scale data sets will be discussed, together with an overview of the current usage of Hadoop within the bioinformatics community. Copyright © 2013 Elsevier Inc. All rights reserved.
Getting to our Big Data Future: A 'How To' Guide for Producers and Users
NASA Astrophysics Data System (ADS)
Diggs, S. C.
2015-12-01
The informatics community is currently making the promise of a 'Big Data' future a reality, but the array of terms, technologies, and personnel can be daunting. If your world is FORTRAN and FTP, how can you exploit Python, semantically linked data, and data center APIs? How can the scientist in the field take part in this cyber-revolution and go beyond data websites for discovery, access, and use of data? The geoscience community needs a resource that explains what Big Data technologies are, how we got here, why we are here, why you should get involved, and who is out there to help.
Rolling-element bearings in China: From ancient times to the 20th century
NASA Astrophysics Data System (ADS)
Sun, Lie; Li, Ang
2016-03-01
The development of rolling-element bearings in China has spanned a long period. Based on several typical and important cases, the present article reconstructs the history of rolling-element bearings in China by dividing it into four stages according to the various characteristics of the bearings. The first stage represents the origin of rolling bearings in China, which remains controversial because of several suspected races and cages that were likely components of bearings more than a millennium ago. At the second stage, a type of simple roller bearing was used for astronomical instruments no later than the 13th century, based on clear philological and physical evidence; a similar bearing was also applied to an abridged armilla in the 17th century. Another type of spherical thrust bearing with rolling elements, a key component of the traditional Chinese windmill, could support a rotating shaft inclined at an angle. At the third stage, the Chinese began studying and using the so-called European-style bearing in the 17th century. Moreover, over the last 100 years, a modern rolling bearing industry was gradually established in China, particularly through technology transfer from the Soviet Union in the 1950s. At the fourth stage, the Chinese government initiated the relatively rapid development of bearing technology: it launched the "bearing movement" from the 1950s to the 1960s to establish the modern bearing industry and to promote rolling bearings as a replacement for traditional sliding bearings, and a number of large professional factories and institutions have continually introduced advanced technology and equipment. At present, these companies and institutions play a significant role in the international bearing industry.
Manufacture of a 1.7m prototype of the GMT primary mirror segments
NASA Astrophysics Data System (ADS)
Martin, H. M.; Burge, J. H.; Miller, S. M.; Smith, B. K.; Zehnder, R.; Zhao, C.
2006-06-01
We have nearly completed the manufacture of a 1.7 m off-axis mirror as part of the technology development for the Giant Magellan Telescope. The mirror is an off-axis section of a 5.3 m f/0.73 parent paraboloid, making it roughly a 1:5 model of the outer 8.4 m GMT segment. The 1.7 m mirror will be the primary mirror of the New Solar Telescope at Big Bear Solar Observatory. It has a 2.7 mm peak-to-valley departure from the best-fit sphere, presenting a serious challenge in terms of both polishing and measurement. The mirror was polished with a stressed lap, which bends actively to match the local curvature at each point on the mirror surface, and works for asymmetric mirrors as well as symmetric aspheres. It was measured using a hybrid reflective-diffractive null corrector to compensate for the mirror's asphericity. Both techniques will be applied in scaled-up versions to the GMT segments.
"Big data" and "open data": What kind of access should researchers enjoy?
Chatellier, Gilles; Varlet, Vincent; Blachier-Poisson, Corinne
2016-02-01
The healthcare sector is currently facing a new paradigm, the explosion of "big data". Coupled with advances in computer technology, the field of "big data" appears promising, allowing us to better understand the natural history of diseases, to follow-up new technologies (devices, drugs) implementation and to participate in precision medicine, etc. Data sources are multiple (medical and administrative data, electronic medical records, data from rapidly developing technologies such as DNA sequencing, connected devices, etc.) and heterogeneous while their use requires complex methods for accurate analysis. Moreover, faced with this new paradigm, we must determine who could (or should) have access to which data, how to combine collective interest and protection of personal data and how to finance in the long-term both operating costs and databases interrogation. This article analyses the opportunities and challenges related to the use of open and/or "big data", from the viewpoint of pharmacologists and representatives of the pharmaceutical and medical device industry. Copyright © 2016 Société française de pharmacologie et de thérapeutique. Published by Elsevier Masson SAS. All rights reserved.
Man and polar bear in Svalbard: a solvable ecological conflict?
Risholt, T; Persen, E; Solem, O I
1998-01-01
The objective of this study is twofold. First, it is to assess the nature and magnitude of the polar bear-human conflict with respect to injuries to man and bear. Second, a major concern has been to minimize injurious interactions in order to safeguard the people who live and work in the Arctic, and, at the same time, secure the future of the polar bear in one of the last relatively unspoiled habitats on earth for big carnivores. From 1971 to 1995, approximately 80 bears were involved in serious bear-human interactions. Of these, 77 bears were killed and 3 escaped after having injured people. During the same period, 10 people were injured, 4 of them fatally, in 7 separate interactions, each involving a single bear. None of the victims carried an appropriate firearm. The circumstances leading up to the confrontations give strong reasons for supposing that the majority of the attacks were predatory in nature. Seven of the injured, including the four who were killed, sustained bites to the head and neck. Correct use of firearms could probably have prevented all the fatalities. However, the keeping and use of firearms caused two accidental deaths in the same period. We conclude that alertness, the absence of attractants (food, garbage), and appropriate bear repellents to secure field camps are important items in preventing conflicts and should always be available. However, as a last but indispensable resort, a firearm (rifle or shotgun) carried by an experienced user is the only safe precaution for avoiding injuries in polar bear country. Killing a bear on the rare occasions when humans are in danger presents no threat to the bear population. With regard to physical injury to people, the problem is a minor one. Bears have a dual impact on everyday life in the Svalbard settlements. While there is some anxiety related to the presence of bears, the polar bear is a source of breathtaking adventure highly valued by both residents and visitors.
Biosecurity in the age of Big Data: a conversation with the FBI
Kozminski, Keith G.
2015-01-01
New scientific frontiers and emerging technologies within the life sciences pose many global challenges to society. Big Data is a premier example, especially with respect to individual, national, and international security. Here a Special Agent of the Federal Bureau of Investigation discusses the security implications of Big Data and the need for security in the life sciences. PMID:26543195
Dynamic Analytics-Driven Assessment of Vulnerabilities and Exploitation
2016-07-15
… integration with big data technologies such as Hadoop, nor does it natively support exporting of events to external relational databases. OSSIM supports … power of big data analytics to determine correlations and temporal causality among vulnerabilities and cyber events. The vulnerability dependencies … via SCAPE (formerly known as LLCySA [6]). This is illustrated as a big data cyber analytic system architecture in …
NASA Technical Reports Server (NTRS)
Bruckner, Robert J.
2010-01-01
Over the past several years the term oil-free turbomachinery has been used to describe a rotor support system for high speed turbomachinery that does not require oil for lubrication, damping, or cooling. The foundation technology for oil-free turbomachinery is the compliant foil bearing. This technology can replace the conventional rolling element bearings found in current engines. Two major benefits are realized with this technology. The primary benefit is the elimination of the oil lubrication system, accessory gearbox, tower shaft, and one turbine frame. These components account for 8 to 13 percent of the turbofan engine weight. The second benefit that compliant foil bearings offer to turbofan engines is the capability to operate at higher rotational speeds and shaft diameters. While traditional rolling element bearings have diminished life, reliability, and load capacity with increasing speeds, the foil bearing has a load capacity proportional to speed. The traditional applications for foil bearings have been in small, lightweight machines. However, recent advancements in the design and manufacturing of foil bearings have increased their potential size. An analysis, grounded in experimentally proven operation, is performed to assess the scalability of the modern foil bearing. This analysis was coupled to the requirements of civilian turbofan engines. The application of the foil bearing to larger, high bypass ratio engines nominally at the 120 kN (approximately 25,000 lb) thrust class has been examined. The application of this advanced technology to this system was found to reduce mission fuel burn by 3.05 percent.
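The statement that foil bearing load capacity scales in proportion to speed can be illustrated with the published DellaCorte-Valco "rule of thumb" for foil journal bearings. The coefficient and dimensions below are hypothetical example values chosen for illustration, not figures from the analysis described above:

```python
# Illustrative estimate of foil journal bearing load capacity using the
# DellaCorte-Valco rule of thumb:
#   W = D_coef * (L * D) * (D * omega)
# where D_coef is an empirically derived load capacity coefficient in
# lbf / (in^3 * krpm) (roughly 0.1-1.0 depending on bearing generation),
# L and D are bearing length and diameter in inches, and omega is shaft
# speed in thousands of rpm. Note that W grows linearly with speed.

def foil_bearing_load_capacity(d_coef, length_in, diameter_in, speed_krpm):
    """Estimated maximum steady-state load (lbf) of a foil journal bearing."""
    return d_coef * (length_in * diameter_in) * (diameter_in * speed_krpm)

# A hypothetical third-generation bearing (d_coef ~ 1.0) in a small machine:
load = foil_bearing_load_capacity(d_coef=1.0, length_in=2.0,
                                  diameter_in=2.0, speed_krpm=30.0)
print(f"Estimated load capacity: {load:.0f} lbf")  # 240 lbf
```

Doubling the speed in this sketch doubles the estimated load capacity, which is the scaling behavior the abstract contrasts with rolling-element bearings.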
A study of pricing and trading model of Blockchain & Big data-based Energy-Internet electricity
NASA Astrophysics Data System (ADS)
Fan, Tao; He, Qingsu; Nie, Erbao; Chen, Shaozhen
2018-01-01
The development of the Energy-Internet is currently hampered by a series of issues, such as the conflicts among high capital requirements, low cost, and high efficiency; the widening gap between capital demand and supply; and the lagging trading and valuation mechanisms, any of which could hinder the Energy-Internet's evolution. However, with the development of blockchain and big-data technology, it is possible to work out solutions to these issues. Based on the current situation of the Energy-Internet and its requirements for future progress, this paper demonstrates the validity of employing blockchain technology to solve the problems encountered by the Energy-Internet during its development. It proposes applying blockchain and big-data technologies to pricing and trading energy products through the Energy-Internet, and to accomplishing the transformation of cyber-based energy and power from physical products into financial assets.
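The core blockchain property such trading schemes rely on, namely that each block commits to its predecessor so any altered trade invalidates all later blocks, can be sketched in a few lines. This is a hypothetical single-process illustration, not the paper's platform; real energy-trading systems add consensus, signatures, and networking:

```python
# Minimal hash-chained ledger of (hypothetical) energy trades.
import hashlib
import json

def block_hash(trade, prev_hash):
    """Deterministic hash of a block's contents and its link to the past."""
    body = json.dumps({"trade": trade, "prev": prev_hash}, sort_keys=True)
    return hashlib.sha256(body.encode()).hexdigest()

def append_block(chain, trade):
    """Append a block committing to the recomputed hash of the previous one."""
    if chain:
        prev = block_hash(chain[-1]["trade"], chain[-1]["prev"])
    else:
        prev = "0" * 64  # genesis marker
    chain.append({"trade": trade, "prev": prev})

def verify(chain):
    """Recompute every link; False if any earlier block was tampered with."""
    return all(chain[i]["prev"] == block_hash(chain[i - 1]["trade"],
                                              chain[i - 1]["prev"])
               for i in range(1, len(chain)))

chain = []
append_block(chain, {"kwh": 5, "price": 0.12})
append_block(chain, {"kwh": 3, "price": 0.11})
append_block(chain, {"kwh": 7, "price": 0.13})
print(verify(chain))            # True
chain[1]["trade"]["kwh"] = 500  # tamper with a recorded trade
print(verify(chain))            # False
```

Because each `prev` field is recomputed from the predecessor's contents during verification, changing any recorded trade breaks every subsequent link.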
Oil-Free Turbomachinery Research Enhanced by Thrust Bearing Test Capability
NASA Technical Reports Server (NTRS)
Bauman, Steven W.
2003-01-01
NASA Glenn Research Center's Oil-Free Turbomachinery research team is developing aircraft turbine engines that will not require an oil lubrication system. Oil systems are required today to lubricate the rolling-element bearings used by the turbine and fan shafts. For the Oil-Free Turbomachinery concept, researchers combined the most advanced foil (air) bearings from industry with NASA-developed high-temperature solid lubricant technology. In 1999, the world's first Oil-Free turbocharger was demonstrated using these technologies. Now we are working with industry to demonstrate Oil-Free turbomachinery technology in a small business jet engine, the EJ-22 produced by Williams International and developed during Glenn's recently concluded General Aviation Propulsion (GAP) program. Eliminating the oil system in this engine will make it simpler, lighter (by approximately 15 percent), more reliable, and less costly to purchase and maintain. Propulsion gas turbines will place high demands on foil air bearings, especially the thrust bearings. Until now, the Oil-Free Turbomachinery research team only had the capability to test radial (journal) bearings. This research has resulted in major improvements in bearing performance, but journal bearings are cylindrical and can only support radial shaft loads. To counteract axial thrust loads, thrust foil bearings, which are disk shaped, are required. Since relatively little research has been conducted on thrust foil air bearings, their performance lags behind that of journal bearings.
Forget the hype or reality. Big data presents new opportunities in Earth Science.
NASA Astrophysics Data System (ADS)
Lee, T. J.
2015-12-01
Earth science is arguably one of the most mature science disciplines, constantly acquiring, curating, and utilizing a large volume of highly diverse data. We were dealing with big data before the term existed. For example, while developing the EOS program in the 1980s, the EOS Data and Information System (EOSDIS) was built to manage the vast amount of data acquired by the EOS fleet of satellites. EOSDIS has remained a shining example of a modern science data system over the past two decades. With the explosion of the internet, the rise of social media, and the proliferation of sensors everywhere, the big data era has brought new challenges. First, Google developed its search algorithm and a distributed data management system. The open source communities quickly followed, developing the Hadoop file system to facilitate MapReduce workloads. The internet continues to generate tens of petabytes of data every day, and there is a significant shortage of algorithms and skilled people to mine them. In response, the federal government created big data programs that fund research, development, and training to tackle these new challenges. Meanwhile, compared with the internet data explosion, the Earth science big data problem looks quite small. Nevertheless, the big data era presents an opportunity for Earth science to evolve. We have learned about MapReduce algorithms, in-memory data mining, machine learning, graph analysis, and semantic web technologies. How do we apply these new technologies to our discipline and bring the hype down to Earth? In this talk, I discuss how we might apply some of the big data technologies to our discipline and solve many of our challenging problems. More importantly, I propose a new Earth science data system architecture to enable new types of scientific inquiry.
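The MapReduce model mentioned above can be sketched in miniature. This is a single-process illustration of the map, shuffle, and reduce phases using word count, the canonical example, rather than a distributed Hadoop job:

```python
# A minimal MapReduce-style word count: map emits (key, value) pairs,
# shuffle groups values by key, and reduce aggregates each group.
from collections import defaultdict

def map_phase(documents):
    """Map: emit a (word, 1) pair for every word occurrence."""
    for doc in documents:
        for word in doc.split():
            yield (word.lower(), 1)

def shuffle_phase(pairs):
    """Shuffle: group all emitted values by their key."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Reduce: sum the counts collected for each word."""
    return {word: sum(counts) for word, counts in groups.items()}

docs = ["big data big challenges", "big opportunities"]
counts = reduce_phase(shuffle_phase(map_phase(docs)))
print(counts["big"])  # 3
```

In Hadoop the map and reduce functions run on many nodes and the shuffle happens over the network, but the dataflow is the same as in this sketch.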
Cultural Resources Survey of the Caernarvon Diversion Site, Mississippi Delta Region, Louisiana.
1987-07-08
… grass, cypress trees, and scrub oak). Immediately to the south of the pedestrian survey area is Big Mar (Figure 2). Banklines examined during the survey … American black bear (Euarctos americanus) inhabit these areas. The floral overstory supports many varieties of hardwood trees including live oak … banded water snake (Natrix fasciata), alligator (Alligator mississippiensis), and the red swamp crawfish (Procambarus clarkii). Floral species
Observations of vector magnetic fields with a magneto-optic filter
NASA Technical Reports Server (NTRS)
Cacciani, Alessandro; Varsik, John; Zirin, Harold
1990-01-01
The use of the magneto-optic filter to observe solar magnetic fields in the potassium line at 7699 Å is described. The filter has been used in the Big Bear videomagnetograph since October 23. It gives high sensitivity and dynamic range for longitudinal magnetic fields and enables measurement of transverse magnetic fields using the sigma component. Examples of the observations are presented.
Big physics quartet win government backing
NASA Astrophysics Data System (ADS)
Banks, Michael
2014-09-01
Four major physics-based projects are among 10 to have been selected by Japan’s Ministry of Education, Culture, Sports, Science and Technology for funding in the coming decade as part of its “roadmap” of big-science projects.
Mason, Paul H
2017-12-01
The availability of diverse sources of data related to health and illness from various types of modern communication technology presents the possibility of augmenting medical knowledge, clinical care, and the patient experience. New forms of data collection and analysis will undoubtedly transform epidemiology, public health, and clinical practice, but what ethical considerations come into play? With a view to analysing the ethical and regulatory dimensions of burgeoning forms of biomedical big data, Brent Daniel Mittelstadt and Luciano Floridi have brought together thirty scholars in an edited volume that forms part of Springer's Law, Governance and Technology book series in a collection titled The Ethics of Biomedical Big Data. With eighteen chapters partitioned into six carefully devised sections, this volume engages with core theoretical, ethical, and regulatory challenges posed by biomedical big data.
Automated Predictive Big Data Analytics Using Ontology Based Semantics.
Nural, Mustafa V; Cotterell, Michael E; Peng, Hao; Xie, Rui; Ma, Ping; Miller, John A
2015-10-01
Predictive analytics in the big data era is taking on an increasingly important role. Issues related to the choice of modeling technique, estimation procedure (or algorithm), and efficient execution can present significant challenges. For example, selection of appropriate and optimal models for big data analytics often requires careful investigation and considerable expertise, which might not always be readily available. In this paper, we propose to use semantic technology to assist data analysts and data scientists in selecting appropriate modeling techniques and building specific models, as well as in documenting the rationale for the techniques and models selected. To formally describe the modeling techniques, models, and results, we developed the Analytics Ontology, which supports inferencing for semi-automated model selection. The SCALATION framework, which currently supports over thirty modeling techniques for predictive big data analytics, is used as a testbed for evaluating the use of semantic technology.
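The idea of inferring a modeling technique from formally described dataset properties can be sketched as a small rule-based selector. The rules, property names, and technique names below are illustrative assumptions, not the paper's Analytics Ontology or the SCALATION framework's API:

```python
# Hypothetical rule-based model selection: dataset properties are matched
# against declarative rules, each pairing a predicate with a suggested
# technique. An ontology reasoner generalizes this with class hierarchies
# and inference; the first-match loop below captures only the basic idea.

RULES = [
    # (predicate over dataset properties, suggested modeling technique)
    (lambda p: p["response"] == "binary",
     "LogisticRegression"),
    (lambda p: p["response"] == "continuous" and p.get("linear", False),
     "MultipleLinearRegression"),
    (lambda p: p["response"] == "continuous" and not p.get("linear", False),
     "RandomForestRegression"),
]

def select_model(properties):
    """Return the first technique whose rule matches the dataset properties."""
    for predicate, technique in RULES:
        if predicate(properties):
            return technique
    return "NoRecommendation"

print(select_model({"response": "binary"}))  # LogisticRegression
```

A real ontology-backed system would also return the rationale for the selection, e.g. which axioms fired, which this toy selector omits.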
Big data in food safety: An overview.
Marvin, Hans J P; Janssen, Esmée M; Bouzembrak, Yamine; Hendriksen, Peter J M; Staats, Martijn
2017-07-24
Technology is now being developed that can handle vast amounts of structured and unstructured data from diverse sources and origins. These technologies are often referred to as big data, and they open new areas of research and applications that will have an increasing impact on all sectors of our society. In this paper we assess to what extent big data is being applied in the food safety domain and identify several promising trends. In several parts of the world, governments encourage the publication on the internet of all data generated in publicly funded research projects. This policy opens new opportunities for stakeholders dealing with food safety to address issues that were not tractable before. The application of mobile phones as detection devices for food safety and the use of social media as an early warning of food safety problems are a few examples of the new developments made possible by big data.
NASA Astrophysics Data System (ADS)
Jiang, Li; Shi, Tielin; Xuan, Jianping
2012-05-01
Generally, the vibration signals of faulty bearings are non-stationary and highly nonlinear under complicated operating conditions. Thus, it is a big challenge to extract optimal features that improve classification while simultaneously reducing feature dimension. Kernel Marginal Fisher Analysis (KMFA) is a novel supervised manifold learning algorithm for feature extraction and dimensionality reduction. In order to avoid the small-sample-size problem in KMFA, we propose regularized KMFA (RKMFA). A simple and efficient intelligent fault diagnosis method based on RKMFA is put forward and applied to fault recognition of rolling bearings. So as to directly excavate nonlinear features from the original high-dimensional vibration signals, RKMFA constructs two graphs describing intra-class compactness and inter-class separability by combining a traditional manifold learning algorithm with the Fisher criterion. The optimal low-dimensional features are thereby obtained for better classification and finally fed into the simplest K-nearest-neighbor (KNN) classifier to recognize different fault categories of bearings. The experimental results demonstrate that the proposed approach improves fault classification performance and outperforms other conventional approaches.
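The final classification step the abstract describes, feeding low-dimensional features to a KNN classifier, can be sketched in pure Python. The 2-D feature vectors and fault labels below are invented for illustration; a real pipeline would produce the features with RKMFA from vibration signals:

```python
# Minimal K-nearest-neighbor classifier: label a query point by majority
# vote among its k nearest training points (Euclidean distance).
import math
from collections import Counter

def knn_classify(train, query, k=3):
    """train: list of (feature_vector, label) pairs.
    Returns the majority label among the k nearest neighbors of query."""
    nearest = sorted(train, key=lambda item: math.dist(item[0], query))[:k]
    labels = [label for _, label in nearest]
    return Counter(labels).most_common(1)[0][0]

# Hypothetical 2-D features after dimensionality reduction:
train = [
    ((0.10, 0.20), "normal"), ((0.20, 0.10), "normal"),
    ((0.15, 0.15), "normal"),
    ((0.90, 0.80), "inner-race fault"), ((0.80, 0.90), "inner-race fault"),
]
print(knn_classify(train, (0.85, 0.85)))  # inner-race fault
```

KNN is attractive as the final stage precisely because, once RKMFA has separated the classes in a low-dimensional space, even this simplest of classifiers suffices.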
Quality of Big Data in Healthcare
Sukumar, Sreenivas R.; Ramachandran, Natarajan; Ferrell, Regina Kay
2015-01-01
The current trend in Big Data Analytics and in particular Health information technology is towards building sophisticated models, methods and tools for business, operational and clinical intelligence, but the critical issue of data quality required for these models is not getting the attention it deserves. The objective of the paper is to highlight the issues of data quality in the context of Big Data Healthcare Analytics.
Advancements Toward Oil-Free Rotorcraft Propulsion
NASA Technical Reports Server (NTRS)
Howard, Samuel A.; Bruckner, Robert J.; Radil, Kevin C.
2010-01-01
NASA and the Army have been working for over a decade to advance the state-of-the-art (SOA) in Oil-Free Turbomachinery with an eye toward reduced emissions and maintenance, and increased performance and efficiency among other benefits. Oil-Free Turbomachinery is enabled by oil-free gas foil bearing technology and relatively new high-temperature tribological coatings. Rotorcraft propulsion is a likely candidate to apply oil-free bearing technology because the engine size class matches current SOA for foil bearings and because foil bearings offer the opportunity for higher speeds and temperatures and lower weight, all critical issues for rotorcraft engines. This paper describes an effort to demonstrate gas foil journal bearing use in the hot section of a full-scale helicopter engine core. A production engine hot-core location is selected as the candidate foil bearing application. Rotordynamic feasibility, bearing sizing, and load capability are assessed. The results of the program will help guide future analysis and design in this area by documenting the steps required and the process utilized for successful application of oil-free technology to a full-scale engine.
Big data: survey, technologies, opportunities, and challenges.
Khan, Nawsher; Yaqoob, Ibrar; Hashem, Ibrahim Abaker Targio; Inayat, Zakira; Ali, Waleed Kamaleldin Mahmoud; Alam, Muhammad; Shiraz, Muhammad; Gani, Abdullah
2014-01-01
Big Data has gained much attention from academia and the IT industry. In the digital and computing world, information is generated and collected at a rate that rapidly exceeds the boundary range. Currently, over 2 billion people worldwide are connected to the Internet, and over 5 billion individuals own mobile phones. By 2020, 50 billion devices are expected to be connected to the Internet. At this point, predicted data production will be 44 times greater than that in 2009. As information is transferred and shared at light speed on optic fiber and wireless networks, the volume of data and the speed of market growth increase. However, the fast growth rate of such large data generates numerous challenges, such as the rapid growth of data, transfer speed, diverse data, and security. Nonetheless, Big Data is still in its infancy, and the domain has not been reviewed in general. Hence, this study comprehensively surveys and classifies the various attributes of Big Data, including its nature, definitions, rapid growth rate, volume, management, analysis, and security. This study also proposes a data life cycle that uses the technologies and terminologies of Big Data. Future research directions in this field are determined based on opportunities and several open issues in the Big Data domain. These research directions facilitate the exploration of the domain and the development of optimal techniques to address Big Data.
Big data - smart health strategies. Findings from the yearbook 2014 special theme.
Koutkias, V; Thiessard, F
2014-08-15
To select the best papers published in 2013 in the field of big data and smart health strategies, and to summarize outstanding research efforts. A systematic search was performed using two major bibliographic databases for relevant journal papers. The references obtained were reviewed in a two-stage process, starting with a blinded review performed by the two section editors, followed by a peer review process operated by external reviewers recognized as experts in the field. The complete review process selected four best papers, illustrating various aspects of the special theme, among them: (a) using large volumes of unstructured data and, specifically, clinical notes from Electronic Health Records (EHRs) for pharmacovigilance; (b) knowledge discovery via querying large volumes of complex (both structured and unstructured) biological data using big data technologies and relevant tools; (c) methodologies for applying cloud computing and big data technologies in the field of genomics; and (d) system architectures enabling high-performance access to and processing of large datasets extracted from EHRs. The potential of big data in biomedicine has been pinpointed in various viewpoint papers and editorials. The review of the current scientific literature illustrated a variety of interesting methods and applications in the field, but the promises still exceed the current outcomes. As we get closer to a solid foundation with respect to a common understanding of relevant concepts and technical aspects, and to the use of standardized technologies and tools, we can anticipate reaching the potential that big data offers for personalized medicine and smart health strategies in the near future.
Telecom Big Data for Urban Transport Analysis - a Case Study of Split-Dalmatia County in Croatia
NASA Astrophysics Data System (ADS)
Baučić, M.; Jajac, N.; Bućan, M.
2017-09-01
Today, big data has become widely available, and new technologies are being developed for big data storage architectures and big data analytics. An ongoing challenge is how to incorporate big data into GIS applications supporting various domains. The International Transport Forum explains how the arrival of big data and real-time data, together with new data processing algorithms, leads to new insights and operational improvements in transport. Based on telecom customer data, the Study of Tourist Movement and Traffic in Split-Dalmatia County in Croatia was carried out as part of the "IPA Adriatic CBC//N.0086/INTERMODAL" project. This paper briefly explains the big data used in the study and the results obtained. Furthermore, it investigates the main considerations when using telecom customer big data: data privacy and data quality. The paper concludes with GIS visualisation and proposes further uses of the big data employed in the study.
Oil-Free Turbomachinery Technologies for Long-Life, Maintenance-Free Power Generation Applications
NASA Technical Reports Server (NTRS)
DellaCorte, Christopher
2013-01-01
Turbines have long been used to convert thermal energy to shaft work for power generation. Conventional turbines rely upon oil-lubricated rotor supports (bearings, seals, etc.) to achieve low wear, high efficiency, and reliability. Emerging Oil-Free technologies such as gas foil bearings and magnetic bearings offer a path to reduced weight and complexity and truly maintenance-free systems. Oil-Free gas turbines, using gaseous and liquid fuels, are commercially available in power outputs to at least 250 kWe and are gaining acceptance for remote power generation where maintenance is a challenge. Closed Brayton Cycle (CBC) turbines are an approach to power generation that is well suited for long-life space missions. In these systems, a recirculating gas is heated by a nuclear, solar, or other heat source and then fed into a high-speed turbine that drives an electrical generator. For closed cycle systems such as these, the working fluid also passes through the bearing compartments, thus serving as a lubricant and bearing coolant. Compliant surface foil gas bearings are well suited for the rotor support systems of these advanced turbines. Foil bearings develop a thin hydrodynamic gas film that separates the rotating shaft from the bearing, preventing wear. During start-up and shut-down, when speeds are low, rubbing occurs. Solid lubricants are used to reduce starting torque and minimize wear. Other emerging technologies such as magnetic bearings can also contribute to robust and reliable Oil-Free turbomachinery. In this presentation, Oil-Free technologies for advanced rotor support systems will be reviewed, as will the integration and development processes recommended for implementation.
NASA unveils its big astrophysics dreams
NASA Astrophysics Data System (ADS)
Gwynne, Peter
2014-02-01
A task force appointed by the astrophysics subcommittee of NASA's advisory council has published a report examining the technologies needed to answer three big questions: Are we alone? How did we get here? And how does the universe work?
Remaining Technical Challenges and Future Plans for Oil-Free Turbomachinery
NASA Technical Reports Server (NTRS)
DellaCorte, Christopher; Bruckner, Robert J.
2010-01-01
The application of Oil-Free technologies (foil gas bearings, solid lubricants, and advanced analysis and predictive modeling tools) to advanced turbomachinery has been underway for several decades. During that time, full commercialization has occurred in aircraft air cycle machines, turbocompressors, cryocoolers, and ever-larger microturbines. Emerging products in the automotive sector (turbochargers and superchargers) indicate that high volume serial production of foil bearings is imminent. Demonstration of foil bearings in APUs and at select locations in propulsion gas turbines illustrates that such technology also has a place in these future systems. Foil bearing designs, predictive tools, and advanced solid lubricants have been reported that can satisfy anticipated requirements, but a major question remains regarding the scalability of foil bearings to ever-larger sizes to support heavier rotors. In this paper, the technological history, primary physics, engineering practicalities, and existing experimental and experiential database for scaling foil bearings are reviewed, and the major remaining technical challenges are identified.
Aerospace applications of magnetic bearings
NASA Technical Reports Server (NTRS)
Downer, James; Goldie, James; Gondhalekar, Vijay; Hockney, Richard
1994-01-01
Magnetic bearings have traditionally been considered for use in aerospace applications only where performance advantages have been the primary, if not only, consideration. Conventional wisdom has been that magnetic bearings have certain performance advantages which must be traded off against increased weight, volume, electric power consumption, and system complexity. These perceptions have hampered the use of magnetic bearings in many aerospace applications because weight, volume, and power are almost always primary considerations. This paper will review progress on several active aerospace magnetic bearings programs at SatCon Technology Corporation. The magnetic bearing programs at SatCon cover a broad spectrum of applications including: a magnetically-suspended spacecraft integrated power and attitude control system (IPACS), a magnetically-suspended momentum wheel, magnetic bearings for the gas generator rotor of a turboshaft engine, a vibration-attenuating magnetic bearing system for an airborne telescope, and magnetic bearings for the compressor of a space-rated heat pump system. The emphasis of these programs is to develop magnetic bearing technologies to the point where magnetic bearings can be truly useful, reliable, and well tested components for the aerospace community.
Quality Attribute-Guided Evaluation of NoSQL Databases: An Experience Report
2014-10-18
… detailed technical evaluations of NoSQL databases specifically, and big data systems in general, that have become apparent during our study … big data software systems [Agarwal 2011]. Internet-born organizations such as Google and Amazon are at the cutting edge of this revolution … [Chang 2008], along with those of numerous other big data innovators, have made a variety of open source and commercial data management technologies
John T. Kliejunas; William J. Otrosina; James R. Allison
2005-01-01
Six annosus (Heterobasidion annosum) root disease centers in a proposed campground on the north shore of Big Bear Lake in southern California were treated in 1989. Trees, stumps, and roots were removed in six disease centers, and in two cases, soil trenching was used to stop the progress of the disease. A total of 154 trees and 26 stumps were removed...
F. T. Bonner
1971-01-01
Extended storage of heavy, high-moisture seeds is a problem the world over. Conifers and most hardwoods bear small seeds that can be stored for considerable periods at low moisture contents and low temperatures. But some of the big hardwood seeds can at best be kept overwinter; they are damaged or killed if dried to the levels commonly utilized for long-term storage....
Gas Foil Bearing Technology Advancements for Closed Brayton Cycle Turbines
NASA Technical Reports Server (NTRS)
Howard, Samuel A.; Bruckner, Robert J.; DellaCorte, Christopher; Radil, Kevin C.
2007-01-01
Closed Brayton Cycle (CBC) turbine systems are under consideration for future space electric power generation. CBC turbines convert thermal energy from a nuclear reactor, or other heat source, to electrical power using a closed-loop cycle. The operating fluid in the closed-loop is commonly a high pressure inert gas mixture that cannot tolerate contamination. One source of potential contamination in a system such as this is the lubricant used in the turbomachine bearings. Gas Foil Bearings (GFB) represent a bearing technology that eliminates the possibility of contamination by using the working fluid as the lubricant. Thus, foil bearings are well suited to application in space power CBC turbine systems. NASA Glenn Research Center is actively researching GFB technology for use in these CBC power turbines. A power loss model has been developed, and the effects of a very high ambient pressure, start-up torque, and misalignment, have been observed and are reported here.
Big Data in Science and Healthcare: A Review of Recent Literature and Perspectives
Miron-Shatz, T.; Lau, A. Y. S.; Paton, C.
2014-01-01
Objectives: As technology continues to evolve and rise in various industries, such as healthcare, science, education, and gaming, a sophisticated concept known as Big Data is surfacing. The concept of analytics aims to understand data. We set out to portray and discuss perspectives on the evolving use of Big Data in science and healthcare, and to examine some of the opportunities and challenges. Methods: A literature review was conducted to highlight the implications associated with the use of Big Data in scientific research and healthcare innovations, both on a large and small scale. Results: Scientists and health-care providers may learn from one another when it comes to understanding the value of Big Data and analytics. Small data, derived by patients and consumers, also requires analytics to become actionable. Connectivism provides a framework for the use of Big Data and analytics in the areas of science and healthcare. This theory assists individuals to recognize and synthesize how human connections are driving the increase in data. Despite the volume and velocity of Big Data, it is truly about technology connecting humans and assisting them to construct knowledge in new ways. Concluding Thoughts: The concept of Big Data and associated analytics are to be taken seriously when approaching the use of vast volumes of both structured and unstructured data in science and health-care. Future exploration of issues surrounding data privacy, confidentiality, and education is needed. A greater focus on data from social media, the quantified-self movement, and the application of analytics to "small data" would also be useful. PMID:25123717
Integrative methods for analyzing big data in precision medicine.
Gligorijević, Vladimir; Malod-Dognin, Noël; Pržulj, Nataša
2016-03-01
We provide an overview of recent developments in big data analyses in the context of precision medicine and health informatics. With advances in the technologies capturing molecular and medical data, we have entered the era of "Big Data" in biology and medicine. These data offer many opportunities to advance precision medicine. We outline key challenges in precision medicine and present recent advances in data integration-based methods to uncover personalized information from big data produced by various omics studies. We survey recent integrative methods for disease subtyping, biomarker discovery, and drug repurposing, and list the tools that are available to domain scientists. Given the ever-growing nature of these big data, we highlight key issues that big data integration methods will face. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Ethics and Epistemology in Big Data Research.
Lipworth, Wendy; Mason, Paul H; Kerridge, Ian; Ioannidis, John P A
2017-12-01
Biomedical innovation and translation are increasingly emphasizing research using "big data." The hope is that big data methods will both speed up research and make its results more applicable to "real-world" patients and health services. While big data research has been embraced by scientists, politicians, industry, and the public, numerous ethical, organizational, and technical/methodological concerns have also been raised. With respect to technical and methodological concerns, there is a view that these will be resolved through sophisticated information technologies, predictive algorithms, and data analysis techniques. While such advances will likely go some way towards resolving technical and methodological issues, we believe that the epistemological issues raised by big data research have important ethical implications and raise questions about the very possibility of big data research achieving its goals.
NASA Technical Reports Server (NTRS)
1991-01-01
The Ultraprobe 2000, manufactured by UE Systems, Inc., Elmsford, NY, is a hand-held ultrasonic system that detects indications of bearing failure by analyzing changes in amplitude. It employs the technology of a prototype ultrasonic bearing-failure monitoring system developed by Mechanical Technology, Inc., Latham, New York and Marshall Space Flight Center (which was based on research into Skylab's gyroscope bearings). Bearings on the verge of failure send ultrasonic signals indicating their deterioration; the Ultraprobe changes these to audible signals. The operator hears the signals and gages their intensity with a meter in the unit.
Oil-Free Turbomachinery Team Passed Milestone on Path to the First Oil-Free Turbine Aircraft Engine
NASA Technical Reports Server (NTRS)
Bream, Bruce L.
2002-01-01
The Oil-Free Turbine Engine Technology Project team successfully demonstrated a foil-air bearing designed for the core rotor shaft of a turbine engine. The bearings were subjected to test conditions representative of the engine core environment through a combination of high speeds, sustained loads, and elevated temperatures. The operational test envelope was defined during conceptual design studies completed earlier this year by bearing manufacturer Mohawk Innovative Technologies and the turbine engine company Williams International. The prototype journal foil-air bearings were tested at the NASA Glenn Research Center. Glenn is working with Williams and Mohawk to create a revolution in turbomachinery by developing the world's first Oil-Free turbine aircraft engine. NASA's General Aviation Propulsion project and Williams International recently developed the FJX-2 turbofan engine that is being commercialized as the EJ-22. This core bearing milestone is a first step toward a future version of the EJ-22 that will take advantage of recent advances in foil-air bearings by eliminating the need for oil lubrication systems and rolling element bearings. Oil-Free technology can reduce engine weight by 15 percent and let engines operate at very high speeds, yielding power density improvements of 20 percent, and reducing engine maintenance costs. In addition, with NASA coating technology, engines can operate at temperatures up to 1200 F. Although the project is still a couple of years from a full engine test of the bearings, this milestone shows that the bearing design exceeds the expected environment, thus providing confidence that an Oil-Free turbine aircraft engine will be attained. The Oil-Free Turbomachinery Project is supported through the Aeropropulsion Base Research Program.
NASA Astrophysics Data System (ADS)
Mattmann, C. A.
2013-12-01
A wave of open-source big data analytic infrastructure is currently shaping government, the private sector, and academia. Projects are consuming, adapting, and contributing back to various ecosystems of software: the Apache Hadoop project and its ecosystem of related efforts, including Hive, HBase, Pig, Oozie, Ambari, Knox, Tez, and Yarn, to name a few; the Berkeley AMPLab stack, which includes Spark, Shark, Mesos, Tachyon, BlinkDB, MLBase, and other emerging efforts; MapR and its related stack of technologies; and offerings from commercial companies building products around these tools, e.g., the Hortonworks Data Platform (HDP), Cloudera's CDH project, etc. Though the technologies all offer different capabilities, including low-latency/in-memory support versus record-oriented file I/O, high availability, and support for the MapReduce programming paradigm or other dataflow/workflow constructs, there is a common thread that binds these products: they are all released under an open-source license (e.g., Apache 2, MIT, BSD, GPL/LGPL); all thrive in various ecosystems, such as Apache or the Berkeley AMPLab; all are developed collaboratively; and all provide plug-in architecture models and methodologies that allow others to contribute and participate via various community models. This talk will cover the open-source and governance aspects of the aforementioned big data ecosystems and point out the differences, subtleties, and implications of those differences. The discussion will proceed by example, using several national deployments and big data initiatives stemming from the Administration, including DARPA's XDATA program, NASA's CMAC program, and NSF's EarthCube and geosciences BigData projects. Lessons learned from these efforts in terms of the open-source aspects of these technologies will help guide the AGU community in their use, deployment, and understanding.
Creating value in health care through big data: opportunities and policy implications.
Roski, Joachim; Bo-Linn, George W; Andrews, Timothy A
2014-07-01
Big data has the potential to create significant value in health care by improving outcomes while lowering costs. Big data's defining features include the ability to handle massive data volume and variety at high velocity. New, flexible, and easily expandable information technology (IT) infrastructure, including so-called data lakes and cloud data storage and management solutions, make big-data analytics possible. However, most health IT systems still rely on data warehouse structures. Without the right IT infrastructure, analytic tools, visualization approaches, work flows, and interfaces, the insights provided by big data are likely to be limited. Big data's success in creating value in the health care sector may require changes in current policies to balance the potential societal benefits of big-data approaches and the protection of patients' confidentiality. Other policy implications of using big data are that many current practices and policies related to data use, access, sharing, privacy, and stewardship need to be revised. Project HOPE—The People-to-People Health Foundation, Inc.
Big Data in Public Health: Terminology, Machine Learning, and Privacy.
Mooney, Stephen J; Pejaver, Vikas
2018-04-01
The digital world is generating data at a staggering and still increasing rate. While these "big data" have unlocked novel opportunities to understand public health, they hold still greater potential for research and practice. This review explores several key issues that have arisen around big data. First, we propose a taxonomy of sources of big data to clarify terminology and identify threads common across some subtypes of big data. Next, we consider common public health research and practice uses for big data, including surveillance, hypothesis-generating research, and causal inference, while exploring the role that machine learning may play in each use. We then consider the ethical implications of the big data revolution with particular emphasis on maintaining appropriate care for privacy in a world in which technology is rapidly changing social norms regarding the need for (and even the meaning of) privacy. Finally, we make suggestions regarding structuring teams and training to succeed in working with big data in research and practice.
Big Data Management in US Hospitals: Benefits and Barriers.
Schaeffer, Chad; Booton, Lawrence; Halleck, Jamey; Studeny, Jana; Coustasse, Alberto
Big data has been considered an effective tool for reducing health care costs by eliminating adverse events and reducing readmissions to hospitals. The purposes of this study were to examine the emergence of big data in the US health care industry, to evaluate a hospital's ability to effectively use complex information, and to predict the potential benefits that hospitals might realize if they are successful in using big data. The findings of the research suggest that there were a number of benefits expected by hospitals when using big data analytics, including cost savings and business intelligence. By using big data, many hospitals have recognized that there have been challenges, including lack of experience and the cost of developing the analytics. Many hospitals will need to invest in acquiring adequate personnel with experience in big data analytics and data integration. The findings of this study suggest that the adoption, implementation, and utilization of big data technology will have a profound positive effect among health care providers.
Big data; sensor networks and remotely-sensed data for mapping; feature extraction from lidar
NASA Astrophysics Data System (ADS)
Tlhabano, Lorato
2018-05-01
Unmanned aerial vehicles (UAVs) can be used for mapping in the close-range domain, combining aerial and terrestrial photogrammetry, and the emergence of affordable platforms to carry these technologies has opened up new opportunities for mapping and modeling cadastral boundaries. At present, mainly low-cost UAVs fitted with sensors are used in mapping projects with low budgets; the amount of data produced by the UAVs can be enormous, hence the need for big data techniques and concepts. The past couple of years have witnessed the dramatic rise of low-cost UAVs fitted with high-tech lidar sensors, and these systems have now reached a level of practical reliability and professionalism that allows their use as mapping platforms. UAV-based mapping provides not only the accuracy required by cadastral laws and policies and by the requirements for feature extraction from the data sets and maps produced; UAVs are also competitive with other measurement technologies in terms of economic aspects. In the following, an overview is given of how the various technologies of UAVs, big data concepts, and lidar sensors can work together to revolutionize cadastral mapping, particularly in Africa, with Botswana used as a test case to investigate these technologies. These technologies can be combined to efficiently provide cadastral mapping in difficult-to-reach areas and over large areas of land, similar to the Land Administration Procedures, Capacity and Systems (LAPCAS) exercise recently undertaken by the Botswana government; we show how the use of UAVs fitted with lidar sensors and utilizing big data concepts could have not only reduced costs and time for the government but also provided more detailed cadastral maps.
NASA Astrophysics Data System (ADS)
Jiang, Li; Xuan, Jianping; Shi, Tielin
2013-12-01
Generally, the vibration signals of faulty machinery are non-stationary and nonlinear under complicated operating conditions. Therefore, it is a big challenge for machinery fault diagnosis to extract optimal features for improving classification accuracy. This paper proposes semi-supervised kernel Marginal Fisher analysis (SSKMFA) for feature extraction, which can discover the intrinsic manifold structure of dataset, and simultaneously consider the intra-class compactness and the inter-class separability. Based on SSKMFA, a novel approach to fault diagnosis is put forward and applied to fault recognition of rolling bearings. SSKMFA directly extracts the low-dimensional characteristics from the raw high-dimensional vibration signals, by exploiting the inherent manifold structure of both labeled and unlabeled samples. Subsequently, the optimal low-dimensional features are fed into the simplest K-nearest neighbor (KNN) classifier to recognize different fault categories and severities of bearings. The experimental results demonstrate that the proposed approach improves the fault recognition performance and outperforms the other four feature extraction methods.
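The final stage of the pipeline above (low-dimensional features fed to a K-nearest-neighbor vote) can be sketched in a few lines. The SSKMFA projection itself is not reproduced here; the toy features and fault labels below are invented stand-ins for its low-dimensional output:

```python
import math
from collections import Counter

def knn_predict(train, labels, query, k=3):
    """Classify `query` by majority vote among its k nearest
    training samples (Euclidean distance in feature space)."""
    dists = sorted((math.dist(x, query), y) for x, y in zip(train, labels))
    votes = Counter(y for _, y in dists[:k])
    return votes.most_common(1)[0][0]

# Hypothetical 2-D features standing in for SSKMFA output:
features = [(0.1, 0.2), (0.2, 0.1), (0.9, 1.0), (1.0, 0.9)]
states = ["healthy", "healthy", "outer-race fault", "outer-race fault"]
print(knn_predict(features, states, (0.95, 0.95)))  # "outer-race fault"
```

In the paper the same vote is run over labels covering both fault categories and severities; only the feature extraction in front of the classifier changes.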
Full Dynamic Reactions in the Basic Shaft Bearings of Big Band Saw Machines
NASA Astrophysics Data System (ADS)
Marinov, Boycho
2013-03-01
Band saw machines are a class of woodworking machines for longitudinal, transversal, and curvilinear cutting of wood. These machines cut the wood with a band-saw blade carried on two feeding wheels. These wheels are usually very large and are manufactured with inaccuracies: the centre of mass of the disc is displaced from the axis of rotation by a distance e (the eccentricity), and the axis of the disc makes an angle with the axis of rotation. In this paper, the dynamic reactions in the bearings of the basic shaft, which drives the band saw machine, are analyzed. These reactions are caused by the external loading and by the kinematic and mass characteristics of the rotating disc. Expressions for the full dynamic reactions are obtained. These expressions allow the parameters of the machine to be chosen such that the loading on the shaft and the bearings is minimal.
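The rotating-unbalance loading described above can be illustrated with the textbook relation F = m·e·ω². This is a generic sketch, not the paper's model; the wheel mass, eccentricity, and speed below are hypothetical values chosen only for scale:

```python
import math

def unbalance_force(mass_kg: float, eccentricity_m: float, rpm: float) -> float:
    """Centrifugal force (N) on the bearings from a rotor whose centre
    of mass is offset by `eccentricity_m` from the axis of rotation."""
    omega = 2.0 * math.pi * rpm / 60.0  # angular speed, rad/s
    return mass_kg * eccentricity_m * omega ** 2

# A hypothetical 120 kg band-saw wheel with 0.5 mm eccentricity at 600 rpm:
force = unbalance_force(120.0, 0.0005, 600.0)  # roughly 237 N
```

The quadratic dependence on ω is why even sub-millimetre eccentricities matter at operating speed.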
Epidemiology in wonderland: Big Data and precision medicine.
Saracci, Rodolfo
2018-03-01
Big Data and precision medicine, two major contemporary challenges for epidemiology, are critically examined from two different angles. In Part 1, Big Data collected for research purposes (Big research Data) and Big Data used for research although collected for other primary purposes (Big secondary Data) are discussed in the light of the fundamental common requirement of data validity, which prevails over "bigness". Precision medicine is treated by developing the key point that high relative risks are as a rule required to make a variable, or combination of variables, suitable for predicting disease occurrence, outcome, or response to treatment; the commercial proliferation of allegedly predictive tests of unknown or poor validity is commented upon. Part 2 proposes a "wise epidemiology" approach to: (a) choosing, in a context imprinted by Big Data and precision medicine, epidemiological research projects actually relevant to population health; (b) training epidemiologists; (c) investigating the impact of the influx of Big Data and computerized medicine on clinical practices and the doctor-patient relation; and (d) clarifying whether today "health" may be redefined, as some maintain, in purely technological terms.
Bellazzi, Riccardo; Dagliati, Arianna; Sacchi, Lucia; Segagni, Daniele
2015-01-01
The so-called big data revolution provides substantial opportunities to diabetes management. At least 3 important directions are currently of great interest. First, the integration of different sources of information, from primary and secondary care to administrative information, may allow depicting a novel view of patient’s care processes and of single patient’s behaviors, taking into account the multifaceted nature of chronic care. Second, the availability of novel diabetes technologies, able to gather large amounts of real-time data, requires the implementation of distributed platforms for data analysis and decision support. Finally, the inclusion of geographical and environmental information into such complex IT systems may further increase the capability of interpreting the data gathered and extract new knowledge from them. This article reviews the main concepts and definitions related to big data, it presents some efforts in health care, and discusses the potential role of big data in diabetes care. Finally, as an example, it describes the research efforts carried on in the MOSAIC project, funded by the European Commission. PMID:25910540
Tapered Roller Bearing Damage Detection Using Decision Fusion Analysis
NASA Technical Reports Server (NTRS)
Dempsey, Paula J.; Kreider, Gary; Fichter, Thomas
2006-01-01
A diagnostic tool was developed for detecting fatigue damage of tapered roller bearings. Tapered roller bearings are used in helicopter transmissions and have potential for use in high-bypass advanced gas turbine aircraft engines. The tool was evaluated experimentally by collecting oil debris data from failure progression tests conducted with health monitoring hardware. Failure progression tests were performed with tapered roller bearings under simulated engine load conditions, on one healthy bearing and three pre-damaged bearings. During each test, data from an on-line, in-line, inductance-type oil debris sensor and three accelerometers were monitored and recorded for the occurrence of bearing failure. The bearing was removed and inspected periodically for damage progression throughout testing. Using data fusion techniques, two different monitoring technologies, oil debris analysis and vibration, were integrated into a health monitoring system for detecting bearing surface fatigue pitting damage. The data fusion diagnostic tool was evaluated during bearing failure progression tests under simulated engine load conditions. This integrated system showed improved detection of fatigue damage and health assessment of the tapered roller bearings compared with using the individual health monitoring technologies.
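As a rough illustration of the decision-fusion idea, two normalized health indicators can be combined into a single damage score. This is not the paper's actual algorithm; the weighting scheme, weights, and threshold below are arbitrary assumptions:

```python
def fuse_health_indicators(oil_debris_level: float, vibration_level: float,
                           w_oil: float = 0.6, w_vib: float = 0.4,
                           threshold: float = 0.5):
    """Fuse two damage indicators, each normalised to 0..1, into a
    single health score plus a binary damage call. Weights and the
    decision threshold are illustrative, not values from the paper."""
    score = w_oil * oil_debris_level + w_vib * vibration_level
    return score, score >= threshold

# High oil-debris reading, moderate vibration reading:
score, damaged = fuse_health_indicators(0.8, 0.3)
```

The benefit claimed in the abstract is exactly this kind of complementarity: either sensor alone can miss damage that the combined score flags.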
2009-10-01
DARPA) Legged Squad Support System (LS3) Program. DARPA's LS3 Program is an effort to develop a walking platform, preferably a quadruped, which...top-scoring UGVs are track- or wheel-based; only the BigDog is a leg-based system. This presented BigDog with certain advantages (particularly...Technologies, Inc.'s (DTI) first location in Ranlo, North Carolina) - is a system capable of wheeled or tracked locomotion and was recently
Analyzing big data with the hybrid interval regression methods.
Huang, Chia-Hui; Yang, Keng-Chieh; Kao, Han-Ying
2014-01-01
Big data is a new trend at present, having significant impacts on information technologies. In big data applications, one of the most pressing issues is dealing with large-scale data sets that often require computation resources provided by public cloud services. How to analyze big data efficiently becomes a big challenge. In this paper, we combine interval regression with the smooth support vector machine (SSVM) to analyze big data. The SSVM was recently proposed as an alternative to the standard SVM and has been shown to be more efficient than the traditional SVM in processing large-scale data. In addition, a soft margin method is proposed to adjust the excursion of the separation margin and to remain effective in the gray zone, where the distribution of the data becomes hard to describe and the separation margin between classes hard to determine.
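One ingredient of the SSVM referenced above is its smooth approximation of the plus function max(0, x), following Lee and Mangasarian's formulation p(x, β) = x + (1/β)·log(1 + e^(−βx)). The sketch below shows only this smoothing step, not the paper's full interval-regression method:

```python
import math

def smooth_plus(x: float, beta: float = 5.0) -> float:
    """Smooth, differentiable approximation of max(0, x) used by the
    smooth SVM: p(x, beta) = x + (1/beta) * log(1 + exp(-beta * x))."""
    return x + math.log1p(math.exp(-beta * x)) / beta

# The approximation tightens as beta grows:
print(smooth_plus(2.0, beta=50.0))   # close to 2.0
print(smooth_plus(-2.0, beta=50.0))  # close to 0.0
```

Because p is smooth everywhere, the SVM objective becomes amenable to fast Newton-type solvers, which is the source of the efficiency claim in the abstract.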
Tangential Field Changes in the Great Flare of 1990 May 24.
Cameron; Sammis
1999-11-01
We examine the great (solar) flare of 1990 May 24 that occurred in active region NOAA 6063. The Big Bear Solar Observatory videomagnetograph Stokes V and I images show a change in the longitudinal field before and after the flare. Since the flare occurred near the limb, the change reflects a rearrangement of the tangential components of the magnetic field. These observations lack the 180 degrees ambiguity that characterizes vector magnetograms.
Magnetic bearing turbomachinery case histories and applications for space related equipment
NASA Technical Reports Server (NTRS)
Weise, David A.
1993-01-01
The concept of magnetic levitation is not a new one and can easily be traced back to the 1800s. It is only recently, however, that the congruous technologies of electronic control systems, power electronics, and magnetic materials have begun to merge to make the magnetic suspension device a viable product. A brief overview of active magnetic bearing technology is provided. Case histories of various turbomachinery in North America presently operating on magnetic bearings are reviewed. Finally, projections are made as to the space-related machinery that may benefit from incorporating magnetic bearings into the equipment design.
Big Data’s Role in Precision Public Health
Dolley, Shawn
2018-01-01
Precision public health is an emerging practice to more granularly predict and understand public health risks and customize treatments for more specific and homogeneous subpopulations, often using new data, technologies, and methods. Big data is one element that has consistently helped to achieve these goals, through its ability to deliver to practitioners a volume and variety of structured or unstructured data not previously possible. Big data has enabled more widespread and specific research and trials of stratifying and segmenting populations at risk for a variety of health problems. Examples of success using big data are surveyed in surveillance and signal detection, predicting future risk, targeted interventions, and understanding disease. Using novel big data or big data approaches has risks that remain to be resolved. The continued growth in volume and variety of available data, decreased costs of data capture, and emerging computational methods mean big data success will likely be a required pillar of precision public health into the future. This review article aims to identify the precision public health use cases where big data has added value, identify classes of value that big data may bring, and outline the risks inherent in using big data in precision public health efforts. PMID:29594091
Cincinnati Big Area Additive Manufacturing (BAAM)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Duty, Chad E.; Love, Lonnie J.
Oak Ridge National Laboratory (ORNL) worked with Cincinnati Incorporated (CI) to demonstrate Big Area Additive Manufacturing (BAAM), which increases the speed of the additive manufacturing (AM) process by over 1,000X, increases the size of parts by over 10X, and shows a cost reduction of over 100X. ORNL worked with CI to transition the BAAM technology from a proof-of-principle (TRL 2-3) demonstration to a prototype product stage (TRL 7-8).
ERIC Educational Resources Information Center
Gordon, Dan
2011-01-01
When it comes to implementing innovative classroom technology programs, urban school districts face significant challenges stemming from their big-city status. These range from large bureaucracies, to scalability, to how to meet the needs of a more diverse group of students. Because of their size, urban districts tend to have greater distance…
A multi points ultrasonic detection method for material flow of belt conveyor
NASA Astrophysics Data System (ADS)
Zhang, Li; He, Rongjun
2018-03-01
Single-point ultrasonic ranging produces large detection errors when used for material-flow detection on a belt conveyor carrying coal that is unevenly distributed or large. A material-flow detection method for belt conveyors is therefore designed based on multi-point ultrasonic ranging. The method calculates the approximate cross-sectional area of the material by ranging multiple points on the surfaces of the material and the belt, and then obtains the material flow from the running speed of the belt conveyor. Test results show that the method has a smaller detection error than single-point ultrasonic ranging under conditions of large, unevenly distributed coal.
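The area-from-multiple-ranges idea can be sketched as follows, assuming evenly spaced sensors across the belt and trapezoidal integration of the material height profile; the paper's exact sensor layout and geometry are not specified here, so all numbers are illustrative:

```python
def material_flow(empty_belt_mm, loaded_mm, sensor_spacing_m, belt_speed_mps):
    """Estimate volumetric flow (m^3/s) on a belt conveyor.
    `empty_belt_mm`: sensor-to-belt distances with no load;
    `loaded_mm`: distances to the material surface under load.
    Material heights are integrated across the belt with the
    trapezoidal rule, then multiplied by belt speed."""
    heights = [(e - l) / 1000.0 for e, l in zip(empty_belt_mm, loaded_mm)]  # m
    area = sum((heights[i] + heights[i + 1]) / 2.0 * sensor_spacing_m
               for i in range(len(heights) - 1))  # cross-section, m^2
    return area * belt_speed_mps

# Four sensors 0.2 m apart; a 0.1 m pile in the middle of the belt at 2 m/s:
flow = material_flow([500, 500, 500, 500], [500, 400, 400, 500], 0.2, 2.0)
```

With more ranging points the piecewise-linear profile tracks an uneven pile more closely, which is exactly the error reduction over single-point ranging claimed in the abstract.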
Rolling Bearing Steels - A Technical and Historical Perspective
NASA Technical Reports Server (NTRS)
Zaretsky, Erwin V.
2012-01-01
Starting about 1920, it became easier to track the growth of bearing materials technology. Until 1955, with few exceptions, comparatively little progress was made in this area. AISI 52100 and some carburizing grades (AISI 4320, AISI 9310) were adequate for most applications. The catalyst to quantum advances in high-performance rolling-element bearing steels was the advent of the aircraft gas turbine engine. With improved bearing manufacturing and steel processing together with advanced lubrication technology, the potential improvements in bearing life can be as much as 80 times that attainable in the late 1950s or as much as 400 times that attainable in 1940. This paper summarizes the chemical, metallurgical and physical aspects of bearing steels and their effect on rolling bearing life and reliability. The single most important variable that has significantly increased bearing life and reliability is vacuum processing of bearing steel. Differences between through hardened, case carburized and corrosion resistant steels are discussed. The interrelation of alloy elements and carbides and their effect on bearing life are presented. An equation relating bearing life, steel hardness and temperature is given. Life factors for various steels are suggested and discussed. A relation between compressive residual stress and bearing life is presented. The effects of retained austenite and grain size are discussed.
NASA Technical Reports Server (NTRS)
Aber, Gregory S. (Inventor)
2000-01-01
An apparatus is provided for a blood pump bearing system within a pump housing to support long-term high-speed rotation of a rotor with an impeller blade having a plurality of individual magnets disposed thereon to provide a small radial air gap between the magnets and a stator of less than 0.025 inches. The bearing system may be mounted within a flow straightener, diffuser, or other pump element to support the shaft of a pump rotor. The bearing system includes a zirconia shaft having a radiused end. The radiused end has a first radius selected to be about three times greater than the radius of the zirconia shaft. The radiused end of the zirconia shaft engages a flat sapphire endstone. Due to the relative hardness of these materials, a flat is quickly produced during break-in on the zirconia radiused end of precisely the size necessary to support thrust loads, whereupon wear substantially ceases. Due to the selection of the first radius, the change in shaft end-play during pump break-in is limited to a total desired end-play of less than about 0.010 inches. Radial loads are supported by an olive hole ring jewel that makes near line contact around the circumference of the Ir shaft to support high-speed rotation with little friction. The width of the olive hole ring jewel is small to allow heat to conduct through it, thereby preventing heat build-up in the bearing. A void defined by the bearing elements may fill with blood that then coagulates within the void. The coagulated blood then conforms to the shape of the bearing surfaces.
Spatially explicit population estimates for black bears based on cluster sampling
Humm, J.; McCown, J. Walter; Scheick, B.K.; Clark, Joseph D.
2017-01-01
We estimated abundance and density of the 5 major black bear (Ursus americanus) subpopulations (i.e., Eglin, Apalachicola, Osceola, Ocala-St. Johns, Big Cypress) in Florida, USA with spatially explicit capture-mark-recapture (SCR) by extracting DNA from hair samples collected at barbed-wire hair sampling sites. We employed a clustered sampling configuration with sampling sites arranged in 3 × 3 clusters spaced 2 km apart within each cluster and cluster centers spaced 16 km apart (center to center). We surveyed all 5 subpopulations encompassing 38,960 km2 during 2014 and 2015. Several landscape variables, most associated with forest cover, helped refine density estimates for the 5 subpopulations we sampled. Detection probabilities were affected by site-specific behavioral responses coupled with individual capture heterogeneity associated with sex. Model-averaged bear population estimates ranged from 120 (95% CI = 59–276) bears or a mean 0.025 bears/km2 (95% CI = 0.011–0.44) for the Eglin subpopulation to 1,198 bears (95% CI = 949–1,537) or 0.127 bears/km2 (95% CI = 0.101–0.163) for the Ocala-St. Johns subpopulation. The total population estimate for our 5 study areas was 3,916 bears (95% CI = 2,914–5,451). The clustered sampling method coupled with information on land cover was efficient and allowed us to estimate abundance across extensive areas that would not have been possible otherwise. Clustered sampling combined with spatially explicit capture-recapture methods has the potential to provide rigorous population estimates for a wide array of species that are extensive and heterogeneous in their distribution.
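The clustered sampling configuration described above (3 × 3 sites spaced 2 km apart within each cluster, cluster centers 16 km apart) can be laid out programmatically. This is a minimal sketch: the number of clusters and the coordinate origin are arbitrary illustrations, not the actual Florida study layout.

```python
def cluster_sites(n_clusters_x, n_clusters_y,
                  center_spacing_km=16.0, site_spacing_km=2.0):
    """Generate (x, y) positions (km) for hair sampling sites arranged in
    3x3 clusters, following the clustered SCR design described above."""
    sites = []
    for cx in range(n_clusters_x):
        for cy in range(n_clusters_y):
            center = (cx * center_spacing_km, cy * center_spacing_km)
            # 3x3 grid of sites centered on the cluster center.
            for dx in (-1, 0, 1):
                for dy in (-1, 0, 1):
                    sites.append((center[0] + dx * site_spacing_km,
                                  center[1] + dy * site_spacing_km))
    return sites


# Illustrative 2 x 2 arrangement of clusters: 4 clusters x 9 sites each.
sites = cluster_sites(2, 2)
print(len(sites))
```

Concentrating sites within clusters while spreading clusters widely is what lets this design capture both fine-scale detection information and broad spatial coverage at manageable cost.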
An Oil-Free Thrust Foil Bearing Facility Design, Calibration, and Operation
NASA Technical Reports Server (NTRS)
Bauman, Steve
2005-01-01
New testing capabilities are needed in order to foster thrust foil air bearing technology development and aid its transition into future Oil-Free gas turbines. This paper describes a new test apparatus capable of testing thrust foil air bearings up to 100 mm in diameter at speeds to 80,000 rpm and temperatures to 650 C (1200 F). Measured parameters include bearing torque, load capacity, and bearing temperatures. This data will be used for design performance evaluations and for validation of foil bearing models. Preliminary test results demonstrate that the rig is capable of testing thrust foil air bearings under a wide range of conditions which are anticipated in future Oil-Free gas turbines. Torque as a function of speed and temperature corroborates results expected from rudimentary performance models. A number of bearings were intentionally failed with no resultant damage whatsoever to the test rig. Several test conditions (specific speeds and loads) revealed undesirable axial shaft vibrations which have been attributed to the magnetic bearing control system and are under study. Based upon these preliminary results, this test rig will be a valuable tool for thrust foil bearing research, parametric studies and technology development.
NASA Technical Reports Server (NTRS)
DellaCorte, Christopher; Valco, Mark J.
2002-01-01
The Oil-Free Turbomachinery team at the NASA Glenn Research Center has unlocked one of the mysteries surrounding foil air bearing performance. Foil air bearings are self-acting hydrodynamic bearings that use ambient air, or any fluid, as their lubricant. In operation, the motion of the shaft's surface drags fluid into the bearing by viscous action, creating a pressurized lubricant film. This lubricating film separates the stationary foil bearing surface from the moving shaft and supports load. Foil bearings have been around for decades and are widely employed in the air cycle machines used for cabin pressurization and cooling aboard commercial jetliners. The Oil-Free Turbomachinery team is fostering the maturation of this technology for integration into advanced Oil-Free aircraft engines. Elimination of the engine oil system can significantly reduce weight and cost and could enable revolutionary new engine designs. Foil bearings, however, have complex elastic support structures (spring packs) that make the prediction of bearing performance, such as load capacity, difficult if not impossible. Researchers at Glenn recently found a link between foil bearing design and load capacity performance. The results have led to a simple rule-of-thumb that relates a bearing's size, speed, and design to its load capacity. Early simple designs (Generation I) had simple elastic (spring) support elements, and performance was limited. More advanced bearings (Generation III) with elastic supports, in which the stiffness is varied locally to optimize gas film pressures, exhibit load capacities that are more than double those of the best previous designs. This is shown graphically in the figure. These more advanced bearings have enabled industry to introduce commercial Oil-Free gas-turbine-based electrical generators and are allowing the aeropropulsion industry to incorporate the technology into aircraft engines. 
The rule-of-thumb enables engine and bearing designers to easily size and select bearing technology for a new application and determine the level of complexity required in the bearings. This new understanding enables industry to assess the feasibility of new engine designs and provides critical guidance toward the future development of Oil-Free turbomachinery propulsion systems.
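The abstract above does not state the rule-of-thumb itself. One published form from the NASA foil bearing literature expresses maximum steady load as W = 𝒟 (L·D)(D·Ω), where L and D are bearing length and diameter (in), Ω is shaft speed (krpm), and 𝒟 is a load capacity coefficient that grows with bearing generation. The sketch below assumes that form; the coefficient values are rough illustrations chosen only so that the Generation III result more than doubles the Generation I result, consistent with the summary above.

```python
def foil_bearing_load_capacity(length_in, diameter_in, speed_krpm, load_coeff):
    """Rule-of-thumb estimate of maximum steady load (lbf) for a foil air
    journal bearing: W = D_coeff * (L * D) * (D * omega), with L and D the
    bearing length and diameter (in), omega the shaft speed (krpm), and
    D_coeff the load capacity coefficient in lbf/(in^3 * krpm)."""
    return load_coeff * (length_in * diameter_in) * (diameter_in * speed_krpm)


# Illustrative coefficients only: simple Generation I elastic supports sit
# near the low end; Generation III locally tuned supports near the high end.
GEN_I_COEFF = 0.3
GEN_III_COEFF = 1.0

# Same bearing size (1.0 in x 1.5 in) and speed (30 krpm), two generations.
w1 = foil_bearing_load_capacity(1.0, 1.5, 30.0, GEN_I_COEFF)
w3 = foil_bearing_load_capacity(1.0, 1.5, 30.0, GEN_III_COEFF)
print(w1, w3)
```

A designer can invert the same expression to size a bearing: given a required load and shaft speed, solve for the L·D² product and then decide whether a simple or advanced support structure is needed.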
Technology and Pedagogy: Using Big Data to Enhance Student Learning
ERIC Educational Resources Information Center
Brinton, Christopher Greg
2016-01-01
The "big data revolution" has penetrated many fields, from network monitoring to online retail. Education and learning are quickly becoming part of it, too, because today, course delivery platforms can collect unprecedented amounts of behavioral data about students as they interact with learning content online. This data includes, for…
Technology Enhanced Analytics (TEA) in Higher Education
ERIC Educational Resources Information Center
Daniel, Ben Kei; Butson, Russell
2013-01-01
This paper examines the role of Big Data Analytics in addressing contemporary challenges associated with current changes in institutions of higher education. The paper first explores the potential of Big Data Analytics to support instructors, students and policy analysts to make better evidence based decisions. Secondly, the paper presents an…
Challenges of Big Data in Educational Assessment
ERIC Educational Resources Information Center
Gibson, David C.; Webb, Mary; Ifenthaler, Dirk
2015-01-01
This paper briefly discusses four measurement challenges of data science or "big data" in educational assessments that are enabled by technology: 1. Dealing with change over time via time-based data. 2. How a digital performance space's relationships interact with learner actions, communications and products. 3. How layers of…
ERIC Educational Resources Information Center
Eisenberg, Mike
1998-01-01
Presents strategies for relating the Big6 information problem-solving process to sports to gain students' attention, sustain it, and make instruction relevant to their interests. Lectures by coaches, computer-based sports games, sports information sources, the use of technology in sports, and judging sports events are discussed. (LRW)
ERIC Educational Resources Information Center
Cords, Sarah Statz
2009-01-01
In 2008, as business increasingly became the news, business publishing showed a growing awareness of the big picture, putting out fewer personality-driven titles. In this list, one will see that business histories embrace recent technological innovation as much as longevity. The best management/HR books look at the big picture through pushing…
Teleconferencing Technology Facilitates Collaboration. Spotlight Feature
ERIC Educational Resources Information Center
Dopke-Wilson, MariRae
2006-01-01
Big, comprehensive projects involving multiple teachers, components, and electronic media can daunt the most ambitious educator. But for Library Media Specialist Bonnie French, big projects are no problem! A pioneer SOS database contributor, Bonnie can be aptly dubbed the "queen of collaboration." In this article, the author discusses how Bonnie…
High School Teen Mentoring Handbook
ERIC Educational Resources Information Center
Alberta Advanced Education and Technology, 2009
2009-01-01
Big Brothers Big Sisters Edmonton & Area, in partnership with Alberta Advanced Education and Technology, is providing the High School Teen Mentoring Program, a school-based mentoring program in which mentor-mentee matches meet for one hour per week to engage in relationship-building activities at an elementary school. This initiative aims to…
Small Public Libraries Can Serve Big. ERIC Digest.
ERIC Educational Resources Information Center
Parry, Norm
Small public libraries can deliver service like big libraries, without sacrificing hometown warmth and charm. By borrowing strategies used by successful small businesses in the private sector, defining goals and exploiting low cost technologies, small public libraries can serve customer wants as well as much larger institutions. Responding to just…
Leveraging Big-Data for Business Process Analytics
ERIC Educational Resources Information Center
Vera-Baquero, Alejandro; Colomo Palacios, Ricardo; Stantchev, Vladimir; Molloy, Owen
2015-01-01
Purpose: This paper aims to present a solution that enables organizations to monitor and analyse the performance of their business processes by means of Big Data technology. Business process improvement can drastically influence the profit of corporations and help them remain viable. However, the use of traditional Business Intelligence…
Kalid, Naser; Zaidan, A A; Zaidan, B B; Salman, Omar H; Hashim, M; Muzammil, H
2017-12-29
The growing worldwide population has increased the need for technologies, computerised software algorithms and smart devices that can monitor and assist patients anytime and anywhere and thus enable them to lead independent lives. The real-time remote monitoring of patients is an important issue in telemedicine. In the provision of healthcare services, patient prioritisation poses a significant challenge because of the complex decision-making process it involves when patients are considered 'big data'. To our knowledge, no study has highlighted the link between 'big data' characteristics and real-time remote healthcare monitoring in the patient prioritisation process, as well as the inherent challenges involved. Thus, we present comprehensive insights into the elements of big data characteristics according to the six 'Vs': volume, velocity, variety, veracity, value and variability. Each of these elements is presented and connected to a related part in the study of the connection between patient prioritisation and real-time remote healthcare monitoring systems. Then, we determine the weak points and recommend solutions as potential future work. This study makes the following contributions. (1) The link between big data characteristics and real-time remote healthcare monitoring in the patient prioritisation process is described. (2) The open issues and challenges for big data used in the patient prioritisation process are emphasised. (3) As a recommended solution, decision making using multiple criteria, such as vital signs and chief complaints, is utilised to prioritise the big data of patients with chronic diseases on the basis of the most urgent cases.
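The recommended multi-criteria prioritisation can be sketched as a weighted scoring over vital signs and chief complaints. The criteria, normal ranges, weights, and severity values below are illustrative assumptions for the sketch, not the paper's model.

```python
def urgency_score(patient, weights):
    """Score a remotely monitored patient by weighted criteria (higher = more urgent)."""
    score = 0.0
    # Vital-sign criteria: add the criterion's weight when the reading is
    # outside an illustrative normal range.
    if not 60 <= patient["heart_rate"] <= 100:
        score += weights["heart_rate"]
    if patient["spo2"] < 94:
        score += weights["spo2"]
    # Chief complaint mapped to an illustrative severity in [0, 1].
    complaint_severity = {"chest pain": 1.0, "dizziness": 0.5, "none": 0.0}
    score += weights["complaint"] * complaint_severity[patient["complaint"]]
    return score


def prioritise(patients, weights):
    """Order patients most-urgent-first for remote monitoring triage."""
    return sorted(patients, key=lambda p: urgency_score(p, weights), reverse=True)


weights = {"heart_rate": 1.0, "spo2": 2.0, "complaint": 3.0}
patients = [
    {"id": "A", "heart_rate": 72, "spo2": 98, "complaint": "none"},
    {"id": "B", "heart_rate": 120, "spo2": 91, "complaint": "chest pain"},
    {"id": "C", "heart_rate": 88, "spo2": 95, "complaint": "dizziness"},
]
print([p["id"] for p in prioritise(patients, weights)])
```

In a real system each scoring pass would run over a continuous stream of readings (the velocity dimension of big data), so the scoring function must stay cheap enough to evaluate per update.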
Research on Durability of Big Recycled Aggregate Self-Compacting Concrete Beam
NASA Astrophysics Data System (ADS)
Gao, Shuai; Liu, Xuliang; Li, Jing; Li, Juan; Wang, Chang; Zheng, Jinkai
2018-03-01
Deflection and crack width are the most important durability indexes, playing a pivotal role in the popularization and application of Big Recycled Aggregate Self-Compacting Concrete technology. In this research, a comparative study of a Big Recycled Aggregate Self-Compacting Concrete beam and an ordinary concrete beam was conducted by measuring deflection and crack width. The results show that both kinds of concrete beams have almost equal mid-span deflection values and differ only slightly in maximum crack width. This indicates that the Big Recycled Aggregate Self-Compacting Concrete beam will be a good substitute for the ordinary concrete beam in some less critical structural projects.
Small, high-speed bearing technology for cryogenic turbo-pumps
NASA Technical Reports Server (NTRS)
Winn, L. W.; Eusepi, M. W.; Smalley, A. J.
1974-01-01
The design of 20-mm bore ball bearings is described for cryogenic turbo-machinery applications, operating up to speeds of 120,000 rpm. A special section is included on the design of hybrid bearings. Each hybrid bearing is composed of a ball bearing in series with a conventional pressurized fluid-film journal bearing. Full details are presented on the design of a test vehicle which possesses the capability of testing the above named bearings within the given speed range under externally applied radial and axial loads.
Screening the psychological laboratory: Hugo Münsterberg, psychotechnics, and the cinema, 1892-1916.
Blatter, Jeremy
2015-03-01
According to Hugo Münsterberg, the direct application of experimental psychology to the practical problems of education, law, industry, and art belonged by definition to the domain of psychotechnics. Whether in the form of pedagogical prescription, interrogation technique, hiring practice, or aesthetic principle, the psychotechnical method implied bringing the psychological laboratory to bear on everyday life. There were, however, significant pitfalls to leaving behind the putative purity of the early psychological laboratory in pursuit of technological utility. In the Vocation Bureau, for example, psychological instruments were often deemed too intimidating for a public unfamiliar with the inner workings of experimental science. Similarly, when psychotechnical means were employed by big business in screening job candidates, ethical red flags were raised about this new alliance between science and capital. This tension was particularly evident in Münsterberg's collaboration with the Paramount Pictures Corporation in 1916. In translating psychological tests into short experimental films, Münsterberg not only envisioned a new mass medium for the dissemination of psychotechnics, but a means by which to initiate the masses into the culture of experimental psychology.
Privacy Challenges of Genomic Big Data.
Shen, Hong; Ma, Jian
2017-01-01
With the rapid advancement of high-throughput DNA sequencing technologies, genomics has become a big data discipline where large-scale genetic information of human individuals can be obtained efficiently at low cost. However, such a massive amount of personal genomic data creates tremendous challenges for privacy, especially given the emergence of the direct-to-consumer (DTC) industry that provides genetic testing services. Here we review the recent development in genomic big data and its implications for privacy. We also discuss the current dilemmas and future challenges of genomic privacy.
2017-01-01
China is setting up a hierarchical medical system to solve the problems of biased resource allocation and high patient flows to large hospitals. The development of big data and mobile Internet technology provides a new perspective for the establishment of hierarchical medical system. This viewpoint discusses the challenges with the hierarchical medical system in China and how big data and mobile Internet can be used to mitigate these challenges. PMID:28790024
A New High-Speed Oil-Free Turbine Engine Rotordynamic Simulator Test Rig
NASA Technical Reports Server (NTRS)
Howard, Samuel A.
2007-01-01
A new test rig has been developed for simulating high-speed turbomachinery rotor systems using Oil-Free foil air bearing technology. Foil air bearings have been used in turbomachinery, primarily air cycle machines, for the past four decades to eliminate the need for oil lubrication. The goal of applying this bearing technology to other classes of turbomachinery has prompted the fabrication of this test rig. The facility gives bearing designers the capability to test potential bearing designs with shafts that simulate the rotating components of a target machine without the high cost of building "make-and-break" hardware. The data collected from this rig can be used to make design changes to the shaft and bearings in subsequent design iterations. This paper describes the new test rig and demonstrates its capabilities through the initial run with a simulated shaft system.
The Structural Consequences of Big Data-Driven Education.
Zeide, Elana
2017-06-01
Educators and commenters who evaluate big data-driven learning environments focus on specific questions: whether automated education platforms improve learning outcomes, invade student privacy, and promote equality. This article puts aside separate unresolved, and perhaps unresolvable, issues regarding the concrete effects of specific technologies. It instead examines how big data-driven tools alter the structure of schools' pedagogical decision-making, and, in doing so, change fundamental aspects of America's education enterprise. Technological mediation and data-driven decision-making have a particularly significant impact in learning environments because the education process primarily consists of dynamic information exchange. In this overview, I highlight three significant structural shifts that accompany school reliance on data-driven instructional platforms that perform core school functions: teaching, assessment, and credentialing. First, virtual learning environments create information technology infrastructures featuring constant data collection, continuous algorithmic assessment, and possibly infinite record retention. This undermines the traditional intellectual privacy and safety of classrooms. Second, these systems displace pedagogical decision-making from educators serving public interests to private, often for-profit, technology providers. They constrain teachers' academic autonomy, obscure student evaluation, and reduce parents' and students' ability to participate or challenge education decision-making. Third, big data-driven tools define what "counts" as education by mapping the concepts, creating the content, determining the metrics, and setting desired learning outcomes of instruction. These shifts cede important decision-making to private entities without public scrutiny or pedagogical examination. In contrast to the public and heated debates that accompany textbook choices, schools often adopt education technologies ad hoc.
Given education's crucial impact on individual and collective success, educators and policymakers must consider the implications of data-driven education proactively and explicitly.
Study on key technologies of optimization of big data for thermal power plant performance
NASA Astrophysics Data System (ADS)
Mao, Mingyang; Xiao, Hong
2018-06-01
Thermal power generation accounts for 70% of China's power generation and for about 40% of comparable pollutant emissions. Optimizing thermal power efficiency requires monitoring and understanding the whole process of coal combustion and pollutant migration, while power system performance data are growing explosively. The purpose is to study the integration of numerical simulation with big data technology and to develop a thermal power plant efficiency data optimization platform and a nitrogen oxide emission reduction system, providing reliable technical support for improving efficiency, saving energy, and reducing emissions. The method is big data technology represented by multi-source heterogeneous data integration, distributed storage of large data, and high-performance real-time and offline computing, which can greatly enhance a thermal power plant's capacity for energy consumption analysis and intelligent decision-making. Data mining algorithms are then used to establish a mathematical model of boiler combustion and to mine power plant boiler efficiency data, combined with numerical simulation technology to find the laws of boiler combustion and pollutant generation and the influence of combustion parameters on them. The result is optimized boiler combustion parameters, which can achieve energy saving.
High Resolution Flare Observations with the 1.6 m Telescope at Big Bear Solar Observatory
NASA Astrophysics Data System (ADS)
Wang, H.
2017-12-01
This talk presents some exciting new results from the 1.6 m Goode Solar Telescope (GST, formerly named NST) at Big Bear Solar Observatory (BBSO). I will report: (1) Flare ribbons and post-flare loops are observed at scales of around 100 to 200 km. (2) The sudden flare-induced rotation of a sunspot. It is clearly observed that the rotation is non-uniform over the sunspot: as the flare ribbon sweeps across, its different portions accelerate at different times corresponding to peaks of flare hard X-ray emission. The rotation may be driven by the surface Lorentz-force change due to the back reaction of coronal magnetic restructuring and is accompanied by a downward Poynting flux. (3) We found clear evidence that electrons streaming down during a flare can induce an extra transient transverse magnetic field that causes apparent rotation only at the propagating ribbon front. Sometimes these are associated with so-called negative flares in He I 10830 and D3 lines. (4) We found evidence that episodes of precursor brightenings are initiated at a small-scale magnetic channel (a form of opposite-polarity fluxes) with multiple polarity inversions and enhanced magnetic fluxes and currents, lying near the footpoints of sheared magnetic loops. The low-atmospheric origin of these precursor emissions is corroborated by microwave spectra.
Small and big quality in health care.
Lillrank, Paul Martin
2015-01-01
The purpose of this paper is to clarify healthcare quality's ontological and epistemological foundations and to examine how these lead to different measurements and technologies. The approach is conceptual analysis. Small quality denotes conformance to ex ante requirements. Big quality includes product and service design, based on customer requirements and expectations. Healthcare quality can be divided into three areas: clinical decision making, patient safety, and patient experience, each with distinct measurement and improvement technologies. The conceptual model is expected to bring clarity to constructing specific definitions, measures, objectives and technologies for improving healthcare. This paper claims that before healthcare quality can be defined, measured and integrated into systems, it needs to be clearly separated into ontologically and epistemologically different parts.
Oil-Free Shaft Support System Rotordynamics: Past, Present, and Future Challenges and Opportunities
NASA Technical Reports Server (NTRS)
DellaCorte, Christopher
2011-01-01
Recent breakthroughs in Oil-Free technologies have enabled new high-speed rotor systems and turbomachinery. Such technologies can include compliant-surface gas bearings, magnetic bearings, and advanced solid lubricants and tribo-materials. This presentation briefly reviews critical technology developments and the current state-of-the-art, emerging Oil-Free rotor systems and discusses obstacles preventing more widespread use. Key examples of "best practices" for deploying Oil-Free technologies will be presented and remaining major technical questions surrounding Oil-Free technologies will be brought forward.
Who Owns Educational Theory? Big Data, Algorithms and the Expert Power of Education Data Science
ERIC Educational Resources Information Center
Williamson, Ben
2017-01-01
"Education data science" is an emerging methodological field which possesses the algorithm-driven technologies required to generate insights and knowledge from educational big data. This article consists of an analysis of the Lytics Lab, Stanford University's laboratory for research and development in learning analytics, and the Center…
After the Big Bang: What's Next in Design Education? Time to Relax?
ERIC Educational Resources Information Center
Fleischmann, Katja
2015-01-01
The article "Big Bang technology: What's next in design education, radical innovation or incremental change?" (Fleischmann, 2013) appeared in the "Journal of Learning Design" Volume 6, Issue 3 in 2013. Two years on, Associate Professor Fleischmann reflects upon her original article within this article. Although it has only been…
The Big Bang: UK Young Scientists' and Engineers' Fair 2010
ERIC Educational Resources Information Center
Allison, Simon
2010-01-01
The Big Bang: UK Young Scientists' and Engineers' Fair is an annual three-day event designed to promote science, technology, engineering and maths (STEM) careers to young people aged 7-19 through experiential learning. It is supported by stakeholders from business and industry, government and the community, and brings together people from various…
Big 6 Tips: Teaching Information Problem Solving. #1 Task Definition: What Needs To Be Done.
ERIC Educational Resources Information Center
Eisenberg, Michael
1997-01-01
Explains task definition which is the first stage in the Big 6, an approach to information and technology skills instruction. Highlights include defining the problem; identifying the information requirements of the problem; transferability from curriculum-based problems to everyday tasks; and task definition logs kept by students. (LRW)
Relevance of eHealth standards for big data interoperability in radiology and beyond.
Marcheschi, Paolo
2017-06-01
The aim of this paper is to report on the implementation of radiology and related information technology standards to feed big data repositories and so to be able to create a solid substrate on which to operate with analysis software. Digital Imaging and Communications in Medicine (DICOM) and Health Level 7 (HL7) are the major standards for radiology and medical information technology. They define formats and protocols to transmit medical images, signals, and patient data inside and outside hospital facilities. These standards can be implemented, but big data expectations are stimulating a new approach that simplifies data collection and interoperability and seeks to reduce the time to full implementation inside health organizations. Virtual Medical Record, DICOM Structured Reporting and HL7 Fast Healthcare Interoperability Resources (FHIR) are changing the way medical data are shared among organizations, and they will be the keys to big data interoperability. Until we find simple and comprehensive methods to store and disseminate detailed information on a patient's health, we will not be able to get optimum results from the analysis of those data.
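As a concrete illustration of the interoperability formats mentioned in this abstract, an HL7 FHIR resource is plain JSON exchanged over REST, which is what makes it easy to feed into big data repositories. The sketch below builds a minimal FHIR Patient resource; the id, name, and birth date values are invented for illustration.

```python
import json

# A minimal FHIR Patient resource: a JSON object whose "resourceType"
# field identifies the resource kind. All values here are invented.
patient = {
    "resourceType": "Patient",
    "id": "example",
    "name": [{"family": "Rossi", "given": ["Paolo"]}],
    "birthDate": "1970-01-01",
}

# Serialize as it would travel over a FHIR REST interface...
payload = json.dumps(patient)
print(payload)

# ...and parse it back on the receiving side (e.g. a data lake ingester).
restored = json.loads(payload)
```

Because every resource is self-describing JSON, an ingestion pipeline can route records by `resourceType` without bespoke parsers per data source, which is the interoperability gain the abstract points to.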
Research Progresses and Suggestions of Manufacturing Technologies of Engine Bearing Bushes
NASA Astrophysics Data System (ADS)
Cao, J.; Yin, Z. W.; Li, H. L.; Y Gao, G.
2017-12-01
The bearing bush is a key part of a diesel engine, and its performance directly influences the life of the whole machine. Several manufacturing technologies for bearing bushes, such as centrifugal casting, sintering, electroplating and magnetron sputtering, are reviewed, and their bond strength, porosity, production efficiency, layer thickness, friction coefficient and corresponding materials are analyzed and compared. Results show that the porosity and oxidation of sintering and centrifugal casting are higher than those of the other two methods; however, their production efficiency and coating thickness are better than those of electroplating and magnetron sputtering. Based on these comparisons and discussions, improvements to all the manufacturing technologies are suggested, and supersonic cold spraying is recommended. With the development of low-friction materials, cold spraying technology is shown to be the best choice for the future.
International Collaboration Patterns and Effecting Factors of Emerging Technologies
Bai, Xu; Liu, Yun
2016-01-01
With the globalization of the world economy, international innovation collaboration has taken place all over the world. This study selects three emerging technologies (3D printing, big data and carbon nanotubes and graphene technology) among 20 countries as the research objects, using three patent-based indicators and network relationship analysis to reflect international collaboration patterns. Then we integrate empirical analyses using panel data to show the factors affecting the degree of international collaboration. The results indicate that while 3D printing technology is associated with a “balanced collaboration” mode, big data technology is more accurately described by a radial pattern, centered on the United States, and carbon nanotubes and graphene technology exhibits “small-world” characteristics in this respect. It also shows that the factors GDP per capita (GPC), R&D expenditure (RDE) and the export of global trade value (ETV) negatively affect the level of international collaboration. It could be useful for China and other developing countries to make international scientific and technological collaboration strategies and policies in the future. PMID:27911926
NASA Technical Reports Server (NTRS)
Anderson, W. J.
1980-01-01
The considered investigations deal with some of the more important present day and future bearing requirements, and design methodologies available for coping with them. Solutions to many forthcoming bearing problems lie in the utilization of the most advanced materials, design methods, and lubrication techniques. Attention is given to materials for rolling element bearings, numerical analysis techniques and design methodology for rolling element bearing load support systems, lubrication of rolling element bearings, journal bearing design for high speed turbomachinery, design and energy losses in the case of turbulent flow bearings, and fluid film bearing response to dynamic loading.
[Big Data: the great opportunities and challenges to microbiome and other biomedical research].
Xu, Zhenjiang
2015-02-01
With the development of high-throughput technologies, biomedical data have been increasing exponentially. This brings enormous opportunities, and challenges, for biomedical researchers seeking to use big data effectively. Big data differ from traditional data in many ways, often described as the 3Vs: volume, variety and velocity. From the perspective of biomedical research, I introduce here the characteristics of big data, such as their messiness, reusability and openness. Focusing on meta-analysis in microbiome research, I discuss prospective principles for data collection, the challenges of privacy protection in data management, and scalable tools for data analysis, with examples from real life.
GREENE, CASEY S.; TAN, JIE; UNG, MATTHEW; MOORE, JASON H.; CHENG, CHAO
2017-01-01
Recent technological advances allow for high-throughput profiling of biological systems in a cost-efficient manner. The low cost of data generation is leading us into the “big data” era. The availability of big data provides unprecedented opportunities but also raises new challenges for data mining and analysis. In this review, we introduce key concepts in the analysis of big data, including “machine learning” algorithms, both “unsupervised” and “supervised”, with examples of each. We note packages for the R programming language that are available to perform machine learning analyses. In addition to programming-based solutions, we review webservers that allow users with limited or no programming background to perform these analyses on large data compendia. PMID:27908398
Big Science and the Large Hadron Collider
NASA Astrophysics Data System (ADS)
Giudice, Gian Francesco
2012-03-01
The Large Hadron Collider (LHC), the particle accelerator operating at CERN, is probably the most complex and ambitious scientific project ever accomplished by humanity. The sheer size of the enterprise, in terms of financial and human resources, naturally raises the question whether society should support such costly basic-research programs. I address this question by first reviewing the process that led to the emergence of Big Science and the role of large projects in the development of science and technology. I then compare the methodologies of Small and Big Science, emphasizing their mutual linkage. Finally, after examining the cost of Big Science projects, I highlight several general aspects of their beneficial implications for society.
Greene, Casey S; Tan, Jie; Ung, Matthew; Moore, Jason H; Cheng, Chao
2014-12-01
Recent technological advances allow for high-throughput profiling of biological systems in a cost-efficient manner. The low cost of data generation is leading us into the "big data" era. The availability of big data provides unprecedented opportunities but also raises new challenges for data mining and analysis. In this review, we introduce key concepts in the analysis of big data, including "machine learning" algorithms, both "unsupervised" and "supervised", with examples of each. We note packages for the R programming language that are available to perform machine learning analyses. In addition to programming-based solutions, we review webservers that allow users with limited or no programming background to perform these analyses on large data compendia. © 2014 Wiley Periodicals, Inc.
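The supervised/unsupervised distinction the review introduces can be illustrated in a few lines. This is a minimal sketch with invented one-dimensional data and deliberately naive algorithms (1-nearest-neighbour and a 1-D two-means), not the R packages or webservers the review actually surveys.

```python
# Supervised learning: training points carry known labels.
def nearest_neighbor_predict(train, query):
    """Predict the label of `query` from the closest labeled 1-D point."""
    x, label = min(train, key=lambda p: abs(p[0] - query))
    return label

# Unsupervised learning: no labels, structure is inferred from the data.
def two_means_1d(values, iters=10):
    """Split unlabeled 1-D values into two clusters by iterated means."""
    c1, c2 = min(values), max(values)
    for _ in range(iters):
        g1 = [v for v in values if abs(v - c1) <= abs(v - c2)]
        g2 = [v for v in values if abs(v - c1) > abs(v - c2)]
        c1, c2 = sum(g1) / len(g1), sum(g2) / len(g2)
    return sorted(g1), sorted(g2)

train = [(0.2, "low"), (0.3, "low"), (1.8, "high"), (2.1, "high")]
pred = nearest_neighbor_predict(train, 1.9)
clusters = two_means_1d([0.2, 0.3, 1.8, 2.1])
```

The same data drive both: the supervised model uses the labels, the unsupervised one recovers the two groups without them.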
NASA Astrophysics Data System (ADS)
Waggoner, L. A.; Capalbo, S. M.; Talbott, J.
2007-05-01
Within the Big Sky region, including Montana, Idaho, South Dakota, Wyoming and the Pacific Northwest, industry is developing new coal-fired power plants using the abundant coal and other fossil-based resources. Of crucial importance to future development programs are robust carbon mitigation plans that include a technical and economic assessment of regional carbon sequestration opportunities. The objective of the Big Sky Carbon Sequestration Partnership (BSCSP) is to promote the development of the regional framework and infrastructure required to validate and deploy carbon sequestration technologies. Initial work compiled sources and potential sinks for carbon dioxide (CO2) in the Big Sky region and developed the online Carbon Atlas. Current efforts couple geologic and terrestrial field validation tests with market assessments, economic analysis, and regulatory and public outreach. The primary geological efforts are the demonstration of carbon storage in mafic/basalt formations, a geology not yet well characterized but with significant long-term storage potential in the region and other parts of the world, and in the Madison Formation, a large carbonate aquifer in Wyoming and Montana. Terrestrial sequestration relies on management practices and technologies to remove atmospheric CO2 and store it in trees, plants, and soil. This indirect sequestration method can be implemented today and is on the front line of voluntary, market-based approaches to reduce CO2 emissions. Details of pilot projects are presented, including new technologies, the challenges and successes of the projects, and the potential for commercial-scale deployment.
Healthcare and the Roles of the Medical Profession in the Big Data Era
YAMAMOTO, Yuji
2016-01-01
The accumulation of large amounts of healthcare information is in progress, and society is about to enter the Health Big Data era by linking such data. Medical professionals’ daily tasks in clinical practice have become more complicated due to information overload, accelerated technological development, and the expansion of conceptual frameworks for medical care. Further, their responsibilities are more challenging and their workload is consistently increasing. As medical professionals enter the Health Big Data era, we need to reevaluate the fundamental significance and role of medicine and investigate ways to utilize this available information and technology. For example, a data analysis on diabetes patients has already shed light on the status of accessibility to physicians and the treatment response rate. In time, large amounts of health data will help find solutions, including new effective treatments that could not be discovered by conventional means. Despite the vastness of accumulated data and analyses, their interpretation is necessarily conducted by attending physicians who communicate these findings to patients face to face; this task cannot be replaced by technology. As medical professionals, we must take the initiative to evaluate the framework of medicine in the Health Big Data era, study the ideal approach for clinical practitioners within this framework, and spread awareness to the public about our framework and approach while implementing them. PMID:28299246
-Omic and Electronic Health Record Big Data Analytics for Precision Medicine.
Wu, Po-Yen; Cheng, Chih-Wen; Kaddi, Chanchala D; Venugopalan, Janani; Hoffman, Ryan; Wang, May D
2017-02-01
Rapid advances in high-throughput technologies and wide adoption of electronic health records (EHRs) have led to fast accumulation of -omic and EHR data. These voluminous, complex data contain abundant information for precision medicine, and big data analytics can extract such knowledge to improve the quality of healthcare. In this paper, we present -omic and EHR data characteristics, associated challenges, and data analytics including data preprocessing, mining, and modeling. To demonstrate how big data analytics enables precision medicine, we provide two case studies: identifying disease biomarkers from multi-omic data and incorporating -omic information into the EHR. Big data analytics can address the challenges of -omic and EHR data and support the paradigm shift toward precision medicine; by making sense of these data to improve healthcare outcomes, it promises long-lasting societal impact.
Recent Development in Big Data Analytics for Business Operations and Risk Management.
Choi, Tsan-Ming; Chan, Hing Kai; Yue, Xiaohang
2017-01-01
"Big data" is an emerging topic and has attracted the attention of many researchers and practitioners in industrial systems engineering and cybernetics. Big data analytics would definitely lead to valuable knowledge for many organizations. Business operations and risk management can be a beneficiary as there are many data collection channels in the related industrial systems (e.g., wireless sensor networks, Internet-based systems, etc.). Big data research, however, is still in its infancy. Its focus is rather unclear and related studies are not well amalgamated. This paper aims to present the challenges and opportunities of big data analytics in this unique application domain. Technological development and advances for industrial-based business systems, reliability and security of industrial systems, and their operational risk management are examined. Important areas for future research are also discussed and revealed.
Cell Phones ≠ Self and Other Problems with Big Data Detection and Containment during Epidemics.
Erikson, Susan L
2018-03-08
Evidence from Sierra Leone reveals the significant limitations of big data in disease detection and containment efforts. Early in the 2014-2016 Ebola epidemic in West Africa, media heralded HealthMap's ability to detect the outbreak from newsfeeds. Later, big data-specifically, call detail record data collected from millions of cell phones-was hyped as useful for stopping the disease by tracking contagious people. It did not work. In this article, I trace the causes of big data's containment failures. During epidemics, big data experiments can have opportunity costs: namely, forestalling urgent response. Finally, what counts as data during epidemics must include that coming from anthropological technologies because they are so useful for detection and containment. © 2018 The Authors Medical Anthropology Quarterly published by Wiley Periodicals, Inc. on behalf of American Anthropological Association.
Ulloa, Emilio C; Hammett, Julia F; O'Neal, Danielle N; Lydston, Emily E; Leon Aramburo, Leslie F
2016-12-01
Intimate partner violence (IPV) is a major public health concern. Thus, it is vital to identify factors, such as individuals' personality traits, that may place men and women at risk for experiencing IPV. This study used data from Wave 4 of the National Longitudinal Study of Adolescent Health (N = 7,187) to examine the association between the Big Five personality traits and IPV perpetration and victimization among men and women. High openness, extraversion, and neuroticism emerged as the three most important risk factors associated with IPV. Although risk factors were found to be relatively similar for IPV perpetration and IPV victimization, some gender differences emerged, showing that extraversion was connected to IPV for women but not for men. The present findings may carry important implications for researchers and practitioners working with individuals and couples affected by IPV.
NASA Astrophysics Data System (ADS)
Robinson, Niall; Tomlinson, Jacob; Prudden, Rachel; Hilson, Alex; Arribas, Alberto
2017-04-01
The Met Office Informatics Lab is a small multidisciplinary team which sits between science, technology and design. Our mission is simply "to make Met Office data useful" - a deliberately broad objective. Our prototypes often trial cutting edge technologies, and so far have included projects such as virtual reality data visualisation in the web browser, bots and natural language interfaces, and artificially intelligent weather warnings. In this talk we focus on our latest project, Jade, a big data analysis platform in the cloud. It is a powerful, flexible and simple to use implementation which makes extensive use of technologies such as Jupyter, Dask, containerisation, Infrastructure as Code, and auto-scaling. Crucially, Jade is flexible enough to be used for a diverse set of applications: it can present weather forecast information to meteorologists and allow climate scientists to analyse big data sets, but it is also effective for analysing non-geospatial data. As well as making data useful, the Informatics Lab also trials new working practises. In this presentation, we will talk about our experience of making a group like the Lab successful.
[Traditional Chinese Medicine data management policy in big data environment].
Liang, Yang; Ding, Chang-Song; Huang, Xin-di; Deng, Le
2018-02-01
As the traditional data management model cannot effectively manage the massive data of traditional Chinese medicine (TCM), owing to the uncertainty of data object attributes and the diversity and abstraction of data representations, a management strategy for TCM data based on big data technology is proposed. Grounded in the true characteristics of TCM data, this strategy addresses the uncertainty of data object attributes and the non-uniformity of data representation by exploiting the schema-free ("modeless") storage of objects in big data technology. A hybrid indexing mode is used to resolve the conflicts brought by different storage modes during indexing, and queries over massive data are processed efficiently through parallel MapReduce. The theoretical analysis provides the management framework and its key technologies, and performance was tested on Hadoop using several common traditional Chinese medicines and prescriptions from a practical TCM data source. Results show that this strategy can effectively solve the storage problem of TCM information, with good query efficiency, completeness and robustness. Copyright© by the Chinese Pharmaceutical Association.
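The two ingredients the strategy combines, schema-free records and MapReduce-style query processing, can be sketched in a few lines. The prescription names, herbs and fields below are hypothetical illustrations, and plain Python dictionaries stand in for the big-data store.

```python
from collections import Counter
from functools import reduce

# Schema-free records: each prescription may carry different attributes,
# mirroring the "modeless" storage the strategy relies on.
records = [
    {"name": "A", "herbs": ["ginseng", "licorice"], "dynasty": "Han"},
    {"name": "B", "herbs": ["licorice", "ginger"]},
    {"name": "C", "herbs": ["ginseng"], "indication": "fatigue"},
]

def map_phase(record):
    """Map: emit per-record herb counts; tolerant of missing attributes."""
    return Counter(record.get("herbs", []))

def reduce_phase(partial_counts):
    """Reduce: merge the partial counts from all mappers."""
    return reduce(lambda a, b: a + b, partial_counts, Counter())

# In a real deployment the map and reduce phases would run in parallel
# across Hadoop nodes; here they run sequentially for illustration.
herb_freq = reduce_phase(map_phase(r) for r in records)
```

Because every mapper only reads the attributes a record actually has, heterogeneous records flow through the same query without a fixed schema.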
Big Data in HEP: A comprehensive use case study
NASA Astrophysics Data System (ADS)
Gutsche, Oliver; Cremonesi, Matteo; Elmer, Peter; Jayatilaka, Bo; Kowalkowski, Jim; Pivarski, Jim; Sehrish, Saba; Mantilla Suárez, Cristina; Svyatkovskiy, Alexey; Tran, Nhan
2017-10-01
Experimental Particle Physics has been at the forefront of analyzing the world's largest datasets for decades. The HEP community was the first to develop suitable software and computing tools for this task. In recent times, new toolkits and systems, collectively called Big Data technologies, have emerged to support the analysis of petabyte and exabyte datasets in industry. While the principles of data analysis in HEP have not changed (filtering and transforming experiment-specific data formats), these new technologies use different approaches and promise a fresh look at the analysis of very large datasets, potentially reducing the time-to-physics with increased interactivity. In this talk, we present an active LHC Run 2 analysis, searching for dark matter with the CMS detector, as a testbed for Big Data technologies. We directly compare the traditional NTuple-based analysis with an equivalent analysis using Apache Spark on the Hadoop ecosystem and beyond. In both cases, we start the analysis with the official experiment data formats and produce publication-quality physics plots. We discuss the advantages and disadvantages of each approach and give an outlook on further studies needed.
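The abstract frames HEP analysis as filtering and transforming event data into physics plots. A stdlib stand-in for the dataframe-style selection an Apache Spark job would express; the events, variable names and cut values are invented for illustration, not taken from the CMS analysis.

```python
# Toy events: missing transverse energy (MET) and jet transverse momenta.
events = [
    {"met": 250.0, "n_jets": 3, "jet_pt": [110.0, 60.0, 40.0]},
    {"met": 80.0,  "n_jets": 2, "jet_pt": [45.0, 30.0]},
    {"met": 310.0, "n_jets": 4, "jet_pt": [140.0, 90.0, 50.0, 35.0]},
]

def select(events, met_cut=200.0, lead_jet_cut=100.0):
    """Filter: keep events passing a MET cut and a leading-jet pT cut."""
    return [e for e in events
            if e["met"] > met_cut and max(e["jet_pt"]) > lead_jet_cut]

def to_histogram(events, edges=(200.0, 300.0, 400.0)):
    """Transform: bin the selected events' MET for a physics plot."""
    bins = [0] * (len(edges) - 1)
    for e in events:
        for i in range(len(edges) - 1):
            if edges[i] <= e["met"] < edges[i + 1]:
                bins[i] += 1
    return bins

passed = select(events)
hist = to_histogram(passed)
```

In Spark the same filter/transform pair would be expressed over a distributed DataFrame, which is what lets the identical logic scale to petabyte inputs.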
EnviroAtlas - Big Game Hunting Recreation Demand by 12-Digit HUC in the Conterminous United States
This EnviroAtlas dataset includes the total number of recreational days per year demanded by people ages 18 and over for big game hunting by location in the contiguous United States. Big game includes deer, elk, bear, and wild turkey. These values are based on 2010 population distribution, 2011 U.S. Fish and Wildlife Service (FWS) Fish, Hunting, and Wildlife-Associated Recreation (FHWAR) survey data, and 2011 U.S. Department of Agriculture (USDA) Forest Service National Visitor Use Monitoring program data, and have been summarized by 12-digit hydrologic unit code (HUC). This dataset was produced by the US EPA to support research and online mapping activities related to the EnviroAtlas. EnviroAtlas (https://www.epa.gov/enviroatlas) allows the user to interact with a web-based, easy-to-use, mapping application to view and analyze multiple ecosystem services for the contiguous United States. The dataset is available as downloadable data (https://edg.epa.gov/data/Public/ORD/EnviroAtlas) or as an EnviroAtlas map service. Additional descriptive information about each attribute in this dataset can be found in its associated EnviroAtlas Fact Sheet (https://www.epa.gov/enviroatlas/enviroatlas-fact-sheets).
Acoustic detection of railcar roller bearing defects. Phase I, Laboratory test.
DOT National Transportation Integrated Search
2003-06-01
A series of tests were performed at the Bearing Test Facility at the Transportation Technology Center, Inc. (TTCI) in Pueblo, Colorado, to gather acoustic and acceleration emissions for a number of roller bearing defect types designated by the rail i...
Single-cell Transcriptome Study as Big Data
Yu, Pingjian; Lin, Wei
2016-01-01
The rapid growth of single-cell RNA-seq studies (scRNA-seq) demands efficient data storage, processing, and analysis. Big-data technology provides a framework that facilitates the comprehensive discovery of biological signals from inter-institutional scRNA-seq datasets. The strategies to solve the stochastic and heterogeneous single-cell transcriptome signal are discussed in this article. After extensively reviewing the available big-data applications of next-generation sequencing (NGS)-based studies, we propose a workflow that accounts for the unique characteristics of scRNA-seq data and primary objectives of single-cell studies. PMID:26876720
Military Aviation Fluids and Lubricants Workshop 2006 (Postprint)
2006-06-01
Blended in Oil at 1-3 Wt. % Reacts Readily With Current Bearing Steels (M50, etc.) Does Not React Easily With Stainless Bearing Steels Other... Additives for Advanced Bearing Steel, Lois Gschwender, AFRL 1530-1550 New and Innovative Gas Turbine Engine Oil Additive Technology, Rich Sapienza/Bill... Selected corrosion-prone, 52100 steel tapered bearings (Timken Bearing Co.) and used F-16 pump pistons in jar storage - Submerged parts - Two water
[Big data and their perspectives in radiation therapy].
Guihard, Sébastien; Thariat, Juliette; Clavier, Jean-Baptiste
2017-02-01
The concept of big data denotes a change of scale in the use of data and in the aggregation of data into large databases, enabled by improved computer technology. One of the current challenges in building big data for radiation therapy is the capture of routine care items that so far remain dark data, i.e. data not yet collected, together with the fusion of databases collecting different types of information (dose-volume histograms and toxicity data, for example). Processes and infrastructures devoted to big data collection should not negatively impact the doctor-patient relationship, the overall process of care or the quality of the data collected. The use of big data requires a collective effort by physicians, physicists, software manufacturers and health authorities to create, organize and exploit big data in radiotherapy and, beyond that, in oncology. Big data entail a new culture and an infrastructure that is appropriate both legally and ethically. Processes and issues are discussed in this article. Copyright © 2016 Société Française du Cancer. Published by Elsevier Masson SAS. All rights reserved.
de Montjoye, Yves-Alexandre; Pentland, Alex Sandy
2016-03-18
Sánchez et al.'s textbook k-anonymization example does not prove, or even suggest, that location and other big-data data sets can be anonymized and of general use. The synthetic data set that they "successfully anonymize" bears no resemblance to modern high-dimensional data sets on which their methods fail. Moving forward, deidentification should not be considered a useful basis for policy. Copyright © 2016, American Association for the Advancement of Science.
ERIC Educational Resources Information Center
Cheng, Liang; Zhang, Wen; Wang, Jiechen; Li, Manchun; Zhong, Lishan
2014-01-01
Geographic information science (GIS) features a wide range of disciplines and has broad applicability. Challenges associated with rapidly developing GIS technology and the currently limited teaching and practice materials hinder universities from cultivating highly skilled GIS graduates. Based on the idea of "small core, big network," a…
Using Big Data to Predict Student Dropouts: Technology Affordances for Research
ERIC Educational Resources Information Center
Niemi, David; Gitin, Elena
2012-01-01
An underlying theme of this paper is that it can be easier and more efficient to conduct valid and effective research studies in online environments than in traditional classrooms. Taking advantage of the "big data" available in an online university, we conducted a study in which a massive online database was used to predict student…
ERIC Educational Resources Information Center
Walker, Rod
1998-01-01
Within diverse outdoor educational activities, a core experience of connection with the earth balances self, others, and nature with elements of ritual. Most effective when experiential, integrated, and technologically simple, the core experience's educative power lies in awakening awareness of interconnectedness between human and nonhuman life.…
Integration of magnetic bearings in the design of advanced gas turbine engines
NASA Technical Reports Server (NTRS)
Storace, Albert F.; Sood, Devendra K.; Lyons, James P.; Preston, Mark A.
1994-01-01
Active magnetic bearings provide revolutionary advantages for gas turbine engine rotor support. These advantages include tremendously improved vibration and stability characteristics, reduced power loss, improved reliability, fault-tolerance, and greatly extended bearing service life. The marriage of these advantages with innovative structural network design and advanced materials utilization will permit major increases in thrust to weight performance and structural efficiency for future gas turbine engines. However, obtaining the maximum payoff requires two key ingredients. The first key ingredient is the use of modern magnetic bearing technologies such as innovative digital control techniques, high-density power electronics, high-density magnetic actuators, fault-tolerant system architecture, and electronic (sensorless) position estimation. This paper describes these technologies. The second key ingredient is to go beyond the simple replacement of rolling element bearings with magnetic bearings by incorporating magnetic bearings as an integral part of the overall engine design. This is analogous to the proper approach to designing with composites, whereby the designer tailors the geometry and load carrying function of the structural system or component for the composite instead of simply substituting composites in a design originally intended for metal material. This paper describes methodologies for the design integration of magnetic bearings in gas turbine engines.
Spectral Regression Based Fault Feature Extraction for Bearing Accelerometer Sensor Signals
Xia, Zhanguo; Xia, Shixiong; Wan, Ling; Cai, Shiyu
2012-01-01
Bearings are not only the most important elements but also a common source of failures in rotating machinery. Bearing fault prognosis technology has been receiving more and more attention recently, in particular because it plays an increasingly important role in avoiding accidents. In this context, fault feature extraction (FFE) from bearing accelerometer sensor signals is essential to highlight representative features of bearing conditions for machinery fault diagnosis and prognosis. This paper proposes a spectral regression (SR)-based approach for fault feature extraction from original features spanning the time, frequency and time-frequency domains of bearing accelerometer sensor signals. SR is a regression framework for efficient regularized subspace learning and feature extraction; it uses the least-squares method to obtain the best projection direction rather than computing the density matrix of features, which also gives it an advantage in dimensionality reduction. The effectiveness of the SR-based method is validated experimentally on vibration signal data acquired from bearings. The experimental results indicate that SR can reduce the computation cost while preserving more structural information about different bearing faults and severities, and the proposed feature extraction scheme is shown to have an advantage over similar approaches. PMID:23202017
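Some of the time-domain features the paper pools before applying spectral regression can be sketched directly. The signal here is synthetic, and SR itself (the regularized least-squares subspace projection) is not reproduced; the sketch only shows why impulse-sensitive statistics separate faulty from healthy bearings.

```python
import math

def time_domain_features(signal):
    """Common time-domain condition indicators for a vibration signal."""
    n = len(signal)
    mean = sum(signal) / n
    rms = math.sqrt(sum(x * x for x in signal) / n)
    std = math.sqrt(sum((x - mean) ** 2 for x in signal) / n)
    kurtosis = (sum((x - mean) ** 4 for x in signal) / n) / std ** 4
    crest = max(abs(x) for x in signal) / rms
    return {"rms": rms, "kurtosis": kurtosis, "crest_factor": crest}

# A healthy bearing gives a near-sinusoidal signal; a localized defect
# adds periodic impulses that raise kurtosis and crest factor.
healthy = [math.sin(2 * math.pi * t / 32) for t in range(256)]
faulty = [x + (3.0 if t % 64 == 0 else 0.0) for t, x in enumerate(healthy)]

f_h = time_domain_features(healthy)
f_f = time_domain_features(faulty)
```

Feature vectors like these, concatenated with frequency and time-frequency features, form the high-dimensional input that SR then projects to a low-dimensional subspace.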
Making big data useful for health care: a summary of the inaugural mit critical data conference.
Badawi, Omar; Brennan, Thomas; Celi, Leo Anthony; Feng, Mengling; Ghassemi, Marzyeh; Ippolito, Andrea; Johnson, Alistair; Mark, Roger G; Mayaud, Louis; Moody, George; Moses, Christopher; Naumann, Tristan; Pimentel, Marco; Pollard, Tom J; Santos, Mauro; Stone, David J; Zimolzak, Andrew
2014-08-22
With growing concerns that big data will only augment the problem of unreliable research, the Laboratory of Computational Physiology at the Massachusetts Institute of Technology organized the Critical Data Conference in January 2014. Thought leaders from academia, government, and industry across disciplines-including clinical medicine, computer science, public health, informatics, biomedical research, health technology, statistics, and epidemiology-gathered and discussed the pitfalls and challenges of big data in health care. The key message from the conference is that the value of large amounts of data hinges on the ability of researchers to share data, methodologies, and findings in an open setting. If empirical value is to be gained from the analysis of retrospective data, groups must continuously work together on similar problems to create more effective peer review. This will lead to improvement in methodology and quality, with each iteration of analysis resulting in more reliability.
Reducing Racial Disparities in Breast Cancer Care: The Role of 'Big Data'.
Reeder-Hayes, Katherine E; Troester, Melissa A; Meyer, Anne-Marie
2017-10-15
Advances in a wide array of scientific technologies have brought data of unprecedented volume and complexity into the oncology research space. These novel big data resources are applied across a variety of contexts-from health services research using data from insurance claims, cancer registries, and electronic health records, to deeper and broader genomic characterizations of disease. Several forms of big data show promise for improving our understanding of racial disparities in breast cancer, and for powering more intelligent and far-reaching interventions to close the racial gap in breast cancer survival. In this article we introduce several major types of big data used in breast cancer disparities research, highlight important findings to date, and discuss how big data may transform breast cancer disparities research in ways that lead to meaningful, lifesaving changes in breast cancer screening and treatment. We also discuss key challenges that may hinder progress in using big data for cancer disparities research and quality improvement.
Volume and Value of Big Healthcare Data.
Dinov, Ivo D
Modern scientific inquiries require significant data-driven evidence and trans-disciplinary expertise to extract valuable information and gain actionable knowledge about natural processes. Effective evidence-based decisions require collection, processing and interpretation of vast amounts of complex data. The Moore's and Kryder's laws of exponential increase of computational power and information storage, respectively, dictate the need for rapid trans-disciplinary advances, technological innovation and effective mechanisms for managing and interrogating Big Healthcare Data. In this article, we review important aspects of Big Data analytics and discuss important questions like: What are the challenges and opportunities associated with this biomedical, social, and healthcare data avalanche? Are there innovative statistical computing strategies to represent, model, analyze and interpret Big heterogeneous data? We present the foundation of a new compressive big data analytics (CBDA) framework for representation, modeling and inference of large, complex and heterogeneous datasets. Finally, we consider specific directions likely to impact the process of extracting information from Big healthcare data, translating that information to knowledge, and deriving appropriate actions.
Big data and visual analytics in anaesthesia and health care.
Simpao, A F; Ahumada, L M; Rehman, M A
2015-09-01
Advances in computer technology, patient monitoring systems, and electronic health record systems have enabled rapid accumulation of patient data in electronic form (i.e. big data). Organizations such as the Anesthesia Quality Institute and Multicenter Perioperative Outcomes Group have spearheaded large-scale efforts to collect anaesthesia big data for outcomes research and quality improvement. Analytics--the systematic use of data combined with quantitative and qualitative analysis to make decisions--can be applied to big data for quality and performance improvements, such as predictive risk assessment, clinical decision support, and resource management. Visual analytics is the science of analytical reasoning facilitated by interactive visual interfaces, and it can facilitate performance of cognitive activities involving big data. Ongoing integration of big data and analytics within anaesthesia and health care will increase demand for anaesthesia professionals who are well versed in both the medical and the information sciences. © The Author 2015. Published by Oxford University Press on behalf of the British Journal of Anaesthesia. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Big Data and the Future of Radiology Informatics.
Kansagra, Akash P; Yu, John-Paul J; Chatterjee, Arindam R; Lenchik, Leon; Chow, Daniel S; Prater, Adam B; Yeh, Jean; Doshi, Ankur M; Hawkins, C Matthew; Heilbrun, Marta E; Smith, Stacy E; Oselkin, Martin; Gupta, Pushpender; Ali, Sayed
2016-01-01
Rapid growth in the amount of data that is electronically recorded as part of routine clinical operations has generated great interest in the use of Big Data methodologies to address clinical and research questions. These methods can efficiently analyze and deliver insights from high-volume, high-variety, and high-growth rate datasets generated across the continuum of care, thereby forgoing the time, cost, and effort of more focused and controlled hypothesis-driven research. By virtue of an existing robust information technology infrastructure and years of archived digital data, radiology departments are particularly well positioned to take advantage of emerging Big Data techniques. In this review, we describe four areas in which Big Data is poised to have an immediate impact on radiology practice, research, and operations. In addition, we provide an overview of the Big Data adoption cycle and describe how academic radiology departments can promote Big Data development. Copyright © 2016 The Association of University Radiologists. Published by Elsevier Inc. All rights reserved.
Rein, Robert; Memmert, Daniel
2016-01-01
Until recently, tactical analyses in elite soccer were based on observational data using variables which discard most contextual information. Analyses of team tactics, however, require detailed data from various sources, including technical skill, individual physiological performance, and team formations, among others, to represent the complex processes underlying team tactical behavior. Accordingly, little is known about how these different factors influence team tactical behavior in elite soccer. In part, this has also been due to the lack of available data. Increasingly, however, detailed game logs obtained through next-generation tracking technologies, in addition to physiological training data collected through novel miniature sensor technologies, have become available for research. This leads, however, to the opposite problem, where the sheer amount of data becomes an obstacle in itself, as methodological guidelines as well as theoretical models of tactical decision making in team sports are lacking. The present paper discusses how big data and modern machine learning technologies may help to address these issues and aid in developing a theoretical model for tactical decision making in team sports. As experience from medical applications shows, significant organizational obstacles regarding data governance and access to technologies must be overcome first. The present work discusses these issues with respect to tactical analyses in elite soccer and proposes a technological stack which aims to introduce big data technologies into elite soccer research. The proposed approach could also serve as a guideline for other sports science domains, as increasing data size is becoming a widespread phenomenon.
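Position-tracking data of the kind described above supports simple collective descriptors of team behavior. The sketch below computes two common ones, the team centroid and a dispersion ("stretch") index; the coordinates are invented placeholder values for a single frame, not data from any study cited here:

```python
import math

# Hypothetical one-frame tracking sample: (x, y) pitch coordinates in
# metres for ten outfield players; values are invented for illustration.
positions = [(12.0, 30.0), (20.0, 25.0), (18.0, 40.0), (30.0, 33.0),
             (28.0, 20.0), (35.0, 45.0), (40.0, 30.0), (45.0, 38.0),
             (50.0, 28.0), (42.0, 22.0)]

def centroid(points):
    """Mean team position -- a basic descriptor of collective behavior."""
    n = len(points)
    return (sum(x for x, _ in points) / n, sum(y for _, y in points) / n)

def stretch_index(points):
    """Mean distance of players from the centroid (team dispersion)."""
    cx, cy = centroid(points)
    return sum(math.hypot(x - cx, y - cy) for x, y in points) / len(points)

cx, cy = centroid(positions)
print(round(cx, 1), round(cy, 1), round(stretch_index(positions), 1))
```

Computed frame by frame over a match, such descriptors become the time series that machine learning models of tactical behavior typically consume.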
Gas Foil Bearings for Space Propulsion Nuclear Electric Power Generation
NASA Technical Reports Server (NTRS)
Howard, Samuel A.; DellaCorte, Christopher
2006-01-01
The choice of power conversion technology is critical in directing the design of a space vehicle for the future NASA mission to Mars. One candidate design consists of a foil bearing supported turbo alternator driven by a helium-xenon gas mixture heated by a nuclear reactor. The system is a closed-loop, meaning there is a constant volume of process fluid that is sealed from the environment. Therefore, foil bearings are proposed due to their ability to use the process gas as a lubricant. As such, the rotor dynamics of a foil bearing supported rotor is an important factor in the eventual design. The current work describes a rotor dynamic analysis to assess the viability of such a system. A brief technology background, assumptions, analyses, and conclusions are discussed in this report. The results indicate that a foil bearing supported turbo alternator is possible, although more work will be needed to gain knowledge about foil bearing behavior in helium-xenon gas.
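As a hedged illustration of the kind of first-cut calculation that precedes a full rotordynamic analysis like the one described, the sketch below estimates the cylindrical (bounce) critical speed of a rigid, symmetric rotor on two identical bearings; the mass and stiffness values are invented placeholders, not figures from the NASA study:

```python
import math

# Illustrative textbook estimate only. A rigid symmetric rotor on two
# identical springs has a cylindrical (bounce) mode at w = sqrt(2k/m).
rotor_mass = 4.0           # kg (invented placeholder)
bearing_stiffness = 2.0e6  # N/m per bearing (invented placeholder)

omega = math.sqrt(2 * bearing_stiffness / rotor_mass)  # rad/s
rpm = omega * 60 / (2 * math.pi)
print(round(rpm))  # critical speed in rev/min
```

A real foil-bearing analysis must go well beyond this, since foil bearing stiffness and damping vary with speed and with the properties of the helium-xenon process gas.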
Özdemir, Vural; Dove, Edward S; Gürsoy, Ulvi K; Şardaş, Semra; Yıldırım, Arif; Yılmaz, Şenay Görücü; Ömer Barlas, I; Güngör, Kıvanç; Mete, Alper; Srivastava, Sanjeeva
2017-01-01
No field in science and medicine today remains untouched by Big Data, and psychiatry is no exception. Proteomics is a Big Data technology and a next-generation biomarker, supporting novel system diagnostics and therapeutics in psychiatry. Proteomics technology is, in fact, much older than genomics and dates to the 1970s, well before the launch of the international Human Genome Project. While the genome has long been framed as the master or "elite" executive molecule in cell biology, the proteome by contrast is humble. Yet the proteome is critical for life--it ensures the daily functioning of cells and whole organisms. In short, proteins are the blue-collar workers of biology, the down-to-earth molecules that we cannot live without. Since 2010, proteomics has found renewed meaning and international attention with the launch of the Human Proteome Project and the growing interest in Big Data technologies such as proteomics. This article presents an interdisciplinary technology foresight analysis and conceptualizes the terms "environtome" and "social proteome". We define "environtome" as the entire complement of elements external to the human host, from microbiome, ambient temperature and weather conditions to government innovation policies, stock market dynamics, human values, political power and social norms that collectively shape the human host spatially and temporally. The "social proteome" is the subset of the environtome that influences the transition of proteomics technology to innovative applications in society. The social proteome encompasses, for example, new reimbursement schemes and business innovation models for proteomics diagnostics that depart from the "once-in-a-lifetime" genotypic tests and the anticipated hype attendant to context- and time-sensitive proteomics tests.
Building on the "nesting principle" for governance of complex systems as discussed by Elinor Ostrom, we propose here a 3-tiered organizational architecture for Big Data science such as proteomics. The proposed nested governance structure is comprised of (a) scientists, (b) ethicists, and (c) scholars in the nascent field of "ethics-of-ethics", and aims to cultivate a robust social proteome for personalized medicine. Ostrom often noted that such nested governance designs offer assurance that political power embedded in innovation processes is distributed evenly and is not concentrated disproportionately in a single overbearing stakeholder or person. We agree with this assessment and conclude by underscoring the synergistic value of social and biological proteomes to realize the full potentials of proteomics science for personalized medicine in psychiatry in the present era of Big Data.
Factors Influencing EFL Novice Teachers' Adoption of Technologies in Classroom Practice
ERIC Educational Resources Information Center
Dinh, Huong Thi Bao
2009-01-01
A primary research conducted with 12 Vietnamese teachers of English using questionnaires and semi-structured interviews has revealed that big investment into technological infrastructure and the top-down approach of implementing technological change in English teaching are not a guarantee for the adoption of technology by English teachers in their…
Costa, Fabricio F
2014-04-01
The increasing availability and growth rate of biomedical information, also known as 'big data', provides an opportunity for future personalized medicine programs that will significantly improve patient care. Recent advances in information technology (IT) applied to biomedicine are changing the landscape of privacy and personal information, with patients getting more control of their health information. Conceivably, big data analytics is already impacting health decisions and patient care; however, specific challenges need to be addressed to integrate current discoveries into medical practice. In this article, I will discuss the major breakthroughs achieved in combining omics and clinical health data in terms of their application to personalized medicine. I will also review the challenges associated with using big data in biomedicine and translational science. Copyright © 2013 Elsevier Ltd. All rights reserved.
Big Data in the Industry - Overview of Selected Issues
NASA Astrophysics Data System (ADS)
Gierej, Sylwia
2017-12-01
This article reviews selected issues related to the use of Big Data in industry. The aim is to define the potential scope and forms of using large data sets in manufacturing companies. By systematically reviewing scientific and professional literature, selected issues related to the use of mass data analytics in production were analyzed. A definition of Big Data was presented, detailing its main attributes. The importance of mass data processing technology in the development of the Industry 4.0 concept is highlighted. Subsequently, attention is paid to issues such as production process optimization, decision making, and mass production individualisation, and the potential of large volumes of data in each is indicated. As a result, conclusions are drawn regarding the potential of using Big Data in industry.
-Omic and Electronic Health Records Big Data Analytics for Precision Medicine
Wu, Po-Yen; Cheng, Chih-Wen; Kaddi, Chanchala D.; Venugopalan, Janani; Hoffman, Ryan; Wang, May D.
2017-01-01
Objective: Rapid advances in high-throughput technologies and wide adoption of electronic health records (EHRs) have led to fast accumulation of -omic and EHR data. These voluminous complex data contain abundant information for precision medicine, and big data analytics can extract such knowledge to improve the quality of health care. Methods: In this article, we present -omic and EHR data characteristics, associated challenges, and data analytics including data pre-processing, mining, and modeling. Results: To demonstrate how big data analytics enables precision medicine, we provide two case studies, including identifying disease biomarkers from multi-omic data and incorporating -omic information into EHRs. Conclusion: Big data analytics is able to address -omic and EHR data challenges for a paradigm shift towards precision medicine. Significance: Big data analytics makes sense of -omic and EHR data to improve healthcare outcomes, with a long-lasting societal impact. PMID:27740470
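The core integration step behind work like this, joining -omic features to EHR features patient by patient, can be sketched in a few lines; the patient IDs, feature names, and values below are invented for illustration and do not come from the article:

```python
# Illustrative only: toy patient IDs, feature names, and values are
# assumptions, not data from the reviewed studies.
omics = {
    "P001": {"gene_BRCA1_expr": 2.4, "gene_TP53_expr": 1.1},
    "P002": {"gene_BRCA1_expr": 0.7, "gene_TP53_expr": 3.2},
}
ehr = {
    "P001": {"age": 54, "hba1c": 6.1},
    "P002": {"age": 61, "hba1c": 7.8},
    "P003": {"age": 47, "hba1c": 5.4},  # no omics profile available
}

def integrate(omics, ehr):
    """Inner-join -omic and EHR features on patient ID, producing one
    flat feature vector per patient for downstream modeling."""
    merged = {}
    for pid in omics.keys() & ehr.keys():
        merged[pid] = {**ehr[pid], **omics[pid]}
    return merged

features = integrate(omics, ehr)
print(sorted(features))  # only patients present in both sources survive
```

In practice this join is where most of the difficulty lives: record linkage, missing modalities (patient P003 above), and feature normalization all have to be resolved before any modeling.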
HARNESSING BIG DATA FOR PRECISION MEDICINE: INFRASTRUCTURES AND APPLICATIONS.
Yu, Kun-Hsing; Hart, Steven N; Goldfeder, Rachel; Zhang, Qiangfeng Cliff; Parker, Stephen C J; Snyder, Michael
2017-01-01
Precision medicine is a health management approach that accounts for individual differences in genetic backgrounds and environmental exposures. With the recent advancements in high-throughput omics profiling technologies, collections of large study cohorts, and the developments of data mining algorithms, big data in biomedicine is expected to provide novel insights into health and disease states, which can be translated into personalized disease prevention and treatment plans. However, petabytes of biomedical data generated by multiple measurement modalities pose a significant challenge for data analysis, integration, storage, and result interpretation. In addition, patient privacy preservation, coordination between participating medical centers and data analysis working groups, as well as discrepancies in data sharing policies remain important topics of discussion. In this workshop, we invite experts in omics integration, biobank research, and data management to share their perspectives on leveraging big data to enable precision medicine. Workshop website: http://tinyurl.com/PSB17BigData; HashTag: #PSB17BigData.
The Role of Tribology in the Development of an Oil-Free Turbocharger
NASA Technical Reports Server (NTRS)
Dellacorte, Christopher
1997-01-01
Gas-turbine-based aeropropulsion engines are technologically mature. Thus, as with any mature technology, revolutionary approaches will be needed to achieve the significant performance gains that will keep the U.S. propulsion manufacturers well ahead of foreign competition. One such approach is the development of oil-free turbomachinery utilizing advanced foil air bearings, seals, and solid lubricants. By eliminating oil-lubricated bearings and seals and supporting an engine rotor on an air film, significant improvements can be realized. For example, the entire oil system including pipes, lines, filters, cooler, and tanks could be removed, thereby saving considerable weight. Since air has no thermal decomposition temperature, engine systems could operate without excessive cooling. Also, since air bearings have no diameter-rpm fatigue limits (D-N limits), engines could be designed to operate at much higher speeds and higher density, which would result in a smaller aeropropulsion package. Because of recent advances in compliant foil air bearings and high temperature solid lubricants, these technologies can be applied to oil-free turbomachinery. In an effort to develop these technologies and to demonstrate a project along the path to an oil-free gas turbine engine, NASA has undertaken the development of an oil-free turbocharger for a heavy duty diesel engine. This turbomachine can reach 120,000 rpm at a bearing temperature of 540 C (1000 F) and, in comparison to oil-lubricated bearings, can increase efficiency by 10 to 15 percent because of reduced friction. In addition, because there are no oil lubricants, there are no seal-leakage-induced emissions.
Bigger data for big data: from Twitter to brain-computer interfaces.
Roesch, Etienne B; Stahl, Frederic; Gaber, Mohamed Medhat
2014-02-01
We are sympathetic with Bentley et al.'s attempt to encompass the wisdom of crowds in a generative model, but posit that a successful attempt at using big data will include more sensitive measurements, more varied sources of information, and will also build from the indirect information available through technology, from ancillary technical features to data from brain-computer interfaces.
This report documents the activities performed and the results obtained from the first six months of the arsenic removal treatment technology demonstration project at the Big Sauk Lake Mobile Home Park (BSLMHP) in Sauk Centre, MN. The objectives of the project are to evaluate the...
Acute Kidney Injury and Big Data.
Sutherland, Scott M; Goldstein, Stuart L; Bagshaw, Sean M
2018-01-01
The recognition of a standardized, consensus definition for acute kidney injury (AKI) has been an important milestone in critical care nephrology, which has facilitated innovation in prevention, quality of care, and outcomes research among the growing population of hospitalized patients susceptible to AKI. Concomitantly, there have been substantial advances in "big data" technologies in medicine, including electronic health records (EHR), data registries and repositories, and data management and analytic methodologies. EHRs are increasingly being adopted, clinical informatics is constantly being refined, and the field of EHR-enabled care improvement and research has grown exponentially. While these fields have matured independently, integrating the two has the potential to redefine and integrate AKI-related care and research. AKI is an ideal condition to exploit big data health care innovation for several reasons: AKI is common, increasingly encountered in hospitalized settings, imposes meaningful risk for adverse events and poor outcomes, has incremental cost implications, and has been plagued by suboptimal quality of care. In this concise review, we discuss the potential applications of big data technologies, particularly modern EHR platforms and health data repositories, to transform our capacity for AKI prediction, detection, and care quality. © 2018 S. Karger AG, Basel.
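The standardized AKI definition the review refers to is the KDIGO consensus: a serum creatinine rise of at least 0.3 mg/dL within 48 hours, or to at least 1.5 times baseline within 7 days. A minimal sketch of EHR-driven detection follows; the patient values are invented, and a production implementation would also need baseline estimation, the 7-day window for the relative criterion, and urine-output criteria:

```python
from datetime import datetime, timedelta

def detect_aki(measurements, baseline):
    """Flag AKI per the KDIGO serum creatinine criteria (simplified).

    measurements: time-ordered list of (datetime, creatinine_mg_dl).
    baseline: reference serum creatinine in mg/dL.
    Returns the first timestamp meeting either criterion, or None.
    """
    for i, (t, scr) in enumerate(measurements):
        # Criterion 1: rise to >= 1.5x baseline (7-day window omitted here)
        if scr >= 1.5 * baseline:
            return t
        # Criterion 2: absolute rise >= 0.3 mg/dL within a 48-hour window
        for t_prev, scr_prev in measurements[:i]:
            if t - t_prev <= timedelta(hours=48) and scr - scr_prev >= 0.3:
                return t
    return None

obs = [
    (datetime(2024, 1, 1, 8), 0.9),
    (datetime(2024, 1, 2, 8), 1.1),
    (datetime(2024, 1, 2, 20), 1.3),  # +0.4 mg/dL within 48 h -> AKI
]
print(detect_aki(obs, baseline=0.9))
```

Running logic of this sort continuously over an EHR feed is exactly the kind of big-data-enabled AKI surveillance the review envisions.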
[Big data, medical language and biomedical terminology systems].
Schulz, Stefan; López-García, Pablo
2015-08-01
A variety of rich terminology systems, such as thesauri, classifications, nomenclatures and ontologies support information and knowledge processing in health care and biomedical research. Nevertheless, human language, manifested as individually written texts, persists as the primary carrier of information, in the description of disease courses or treatment episodes in electronic medical records, and in the description of biomedical research in scientific publications. In the context of the discussion about big data in biomedicine, we hypothesize that the abstraction of the individuality of natural language utterances into structured and semantically normalized information facilitates the use of statistical data analytics to distil new knowledge out of textual data from biomedical research and clinical routine. Computerized human language technologies are constantly evolving and are increasingly ready to annotate narratives with codes from biomedical terminology. However, this depends heavily on linguistic and terminological resources. The creation and maintenance of such resources is labor-intensive. Nevertheless, it is sensible to assume that big data methods can be used to support this process. Examples include the learning of hierarchical relationships, the grouping of synonymous terms into concepts and the disambiguation of homonyms. Although clear evidence is still lacking, the combination of natural language technologies, semantic resources, and big data analytics is promising.
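The abstraction step described above, mapping free-text narrative onto terminology codes, can be illustrated with a toy dictionary-based annotator. The lexicon entries and concept codes below are invented placeholders; real systems add normalization, disambiguation, and far larger terminological resources:

```python
# Toy lexicon mapping phrases to concept codes. The codes shown are
# illustrative placeholders, not real terminology identifiers.
LEXICON = {
    "myocardial infarction": "C0027051",
    "heart attack": "C0027051",  # synonym grouped to the same concept
    "aspirin": "C0004057",
}

def annotate(text):
    """Greedy longest-match annotation of a narrative with concept codes.

    Returns (matched phrase, code, start token index) triples.
    """
    tokens = text.lower().split()
    spans = []
    i = 0
    while i < len(tokens):
        match = None
        # Try the longest candidate phrase first (up to 3 tokens here)
        for n in range(min(3, len(tokens) - i), 0, -1):
            phrase = " ".join(tokens[i:i + n])
            if phrase in LEXICON:
                match = (phrase, LEXICON[phrase], i)
                i += n
                break
        if match:
            spans.append(match)
        else:
            i += 1
    return spans

print(annotate("Patient given aspirin after a heart attack"))
```

Note how the synonym grouping in the lexicon already performs one of the big-data-assisted tasks the abstract mentions: collapsing synonymous terms onto a single concept.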
Update on Thales flexure bearing coolers and drive electronics
NASA Astrophysics Data System (ADS)
Willems, D.; Benschop, T.; v. d. Groep, W.; Mullié, J.; v. d. Weijden, H.; Tops, M.
2009-05-01
Thales Cryogenics has a long background in delivering cryogenic coolers with an MTTF far above 20,000 hrs for military, civil and space programs. Developments in these markets required continuous updates of the flexure bearing cooler portfolio for new and emerging applications. The cooling requirements of new applications influence not only the size of the compressor, the cold finger and the cooling technology used, but also the integration and control of the cooler in the application. Thales Cryogenics developed compact Cooler Drive Electronics (CDE) based on DSP technology that can drive linear flexure bearing coolers with extreme temperature stability and provide additional diagnostics inside the CDE. This CDE has wide applicability and can be modified to specific customer requirements. In the presentation, the latest developments in flexure bearing cooler technology will be presented for both Stirling and Pulse Tube coolers. The relation between the most important recent detector requirements and the solutions available at the cryocooler level will also be presented.
Conceptual Design and Feasibility of Foil Bearings for Rotorcraft Engines: Hot Core Bearings
NASA Technical Reports Server (NTRS)
Howard, Samuel A.
2007-01-01
Recent developments in gas foil bearing technology have led to numerous advanced high-speed rotating system concepts, many of which have become either commercial products or experimental test articles. Examples include oil-free microturbines, motors, generators and turbochargers. The driving forces for integrating gas foil bearings into these high-speed systems are the benefits promised by removing the oil lubrication system. Elimination of the oil system leads to reduced emissions, increased reliability, and decreased maintenance costs. Another benefit is reduced power plant weight. For rotorcraft applications, this would be a major advantage, as every pound removed from the propulsion system results in a payload benefit. Implementing gas foil bearings throughout a rotorcraft gas turbine engine is an important long-term goal that requires overcoming numerous technological hurdles. Adequate thrust bearing load capacity and potentially large gearbox-applied radial loads are among them. However, by replacing the turbine end, or hot section, rolling element bearing with a gas foil bearing, many of the above benefits can be realized. To this end, engine manufacturers are beginning to explore the possibilities of hot section gas foil bearings in propulsion engines. This overview presents a logical follow-on activity by analyzing a conceptual rotorcraft engine to determine the feasibility of a foil bearing supported core. Using a combination of rotordynamic analyses and a load capacity model, it is shown to be reasonable to consider a gas foil bearing core section. In addition, system level foil bearing testing capabilities at NASA Glenn Research Center are presented along with analysis work being conducted under NRA Cooperative Agreements.
The Ethics of Big Data: Current and Foreseeable Issues in Biomedical Contexts.
Mittelstadt, Brent Daniel; Floridi, Luciano
2016-04-01
The capacity to collect and analyse data is growing exponentially. Referred to as 'Big Data', this scientific, social and technological trend has helped create destabilising amounts of information, which can challenge accepted social and ethical norms. Big Data remains a fuzzy idea, emerging across social, scientific, and business contexts sometimes seemingly related only by the gigantic size of the datasets being considered. As is often the case with the cutting edge of scientific and technological progress, understanding of the ethical implications of Big Data lags behind. In order to bridge such a gap, this article systematically and comprehensively analyses academic literature concerning the ethical implications of Big Data, providing a watershed for future ethical investigations and regulations. Particular attention is paid to biomedical Big Data due to the inherent sensitivity of medical information. By means of a meta-analysis of the literature, a thematic narrative is provided to guide ethicists, data scientists, regulators and other stakeholders through what is already known or hypothesised about the ethical risks of this emerging and innovative phenomenon. Five key areas of concern are identified: (1) informed consent, (2) privacy (including anonymisation and data protection), (3) ownership, (4) epistemology and objectivity, and (5) 'Big Data Divides' created between those who have or lack the necessary resources to analyse increasingly large datasets. Critical gaps in the treatment of these themes are identified with suggestions for future research. Six additional areas of concern are then suggested which, although related have not yet attracted extensive debate in the existing literature. 
It is argued that they will require much closer scrutiny in the immediate future: (6) the dangers of ignoring group-level ethical harms; (7) the importance of epistemology in assessing the ethics of Big Data; (8) the changing nature of fiduciary relationships that become increasingly data saturated; (9) the need to distinguish between 'academic' and 'commercial' Big Data practices in terms of potential harm to data subjects; (10) future problems with ownership of intellectual property generated from analysis of aggregated datasets; and (11) the difficulty of providing meaningful access rights to individual data subjects that lack necessary resources. Considered together, these eleven themes provide a thorough critical framework to guide ethical assessment and governance of emerging Big Data practices.
Connecting crustal seismicity and earthquake-driven stress evolution in Southern California
Pollitz, Fred; Cattania, Camilla
2017-01-01
Tectonic stress in the crust evolves during a seismic cycle, with slow stress accumulation over interseismic periods, episodic stress steps at the time of earthquakes, and transient stress readjustment during a postseismic period that may last months to years. Static stress transfer to surrounding faults has been well documented to alter regional seismicity rates over both short and long time scales. While static stress transfer is instantaneous and long lived, postseismic stress transfer driven by viscoelastic relaxation of the ductile lower crust and mantle leads to additional, slowly varying stress perturbations. Both processes may be tested by comparing a decade-long record of regional seismicity to predicted time-dependent seismicity rates based on a stress evolution model that includes viscoelastic stress transfer. Here we explore crustal stress evolution arising from the seismic cycle in Southern California from 1981 to 2014 using five M≥6.5 source earthquakes: the M7.3 1992 Landers, M6.5 1992 Big Bear, M6.7 1994 Northridge, M7.1 1999 Hector Mine, and M7.2 2010 El Mayor-Cucapah earthquakes. We relate the stress readjustment in the surrounding crust generated by each earthquake to regional seismicity using rate-and-state friction theory. Using a log likelihood approach, we quantify the seismicity-triggering potential of both static and viscoelastic stress transfer, finding that both processes have systematically shaped the spatial pattern of Southern California seismicity since 1992.
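The rate-and-state link between a stress change and seismicity rate invoked above is commonly implemented with Dieterich's (1994) formulation. A standard form (not necessarily the exact variant used in this study) for the rate following a coseismic Coulomb stress step $\Delta\mathrm{CFS}$ is:

```latex
R(t) = \frac{r}{1 + \left(e^{-\Delta\mathrm{CFS}/(A\sigma)} - 1\right) e^{-t/t_a}},
\qquad t_a = \frac{A\sigma}{\dot{\tau}_r},
```

where $r$ is the background seismicity rate, $A\sigma$ is a rate-and-state constitutive parameter, $\dot{\tau}_r$ is the background stressing rate, and $t_a$ is the characteristic aftershock decay time. A positive stress step raises $R$ above $r$ with relaxation back to background over $t_a$; slowly varying viscoelastic stress transfer enters through the time-dependent stressing history rather than as a single step.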
Akperova, G A
2014-11-01
The purpose of this study was to evaluate the efficiency of the RDBH method and Big Dye™ Terminator technology for the accurate diagnosis of β-thalassemia and the allelic polymorphism of the β-globin cluster. A complete hematology analysis (Hb, MCH, MCV, MCHC, RBC, Hct, HbA2, HbF, serum iron, serum ferritin) was performed for four children (males, 6-10 years old) and their parents. Molecular analysis included the Reverse Dot-Blot Hybridization StripAssay (RDBH) and DNA sequencing on an ABI PRISM Big Dye™ Terminator. Hematologic and molecular parameters were contradictory. RDBH analysis established homozygosity for β0-thalassemia (β0IVS2.1[G>A] and β0codon 8[-AA]) in three boys with mild clinical manifestation, heterozygosity of their parents for these mutations, and the absence of β-globin mutations in the parents and in a boy who receives monthly transfusions. DNA sequencing with Big Dye™ Terminator technology showed polymorphism at positions -551 and -521 of the Cap5'-region (-650-250): (AT)7(T)7 and (AT)8(T)5. Application of this integrated clinical-molecular approach is an ideal method for accurate diagnosis, identification of asymptomatic carriers, and reduction of the risk of complications from β-thalassemia; moreover, screening of the γG-gene and the level of fetal hemoglobin in early childhood will help manage the β-thalassemia clinical course and prevent severe consequences of the disease.
Psycho-informatics: Big Data shaping modern psychometrics.
Markowetz, Alexander; Błaszkiewicz, Konrad; Montag, Christian; Switala, Christina; Schlaepfer, Thomas E
2014-04-01
For the first time in history, it is possible to study human behavior on a great scale and in fine detail simultaneously. Online services and ubiquitous computational devices, such as smartphones and modern cars, record our everyday activity. The resulting Big Data offers unprecedented opportunities for tracking and analyzing behavior. This paper hypothesizes the applicability and impact of Big Data technologies in the context of psychometrics, both for research and clinical applications. It first outlines the state of the art, including the severe shortcomings with respect to quality and quantity of the resulting data. It then presents a technological vision, comprised of (i) numerous data sources such as mobile devices and sensors, (ii) a central data store, and (iii) an analytical platform, employing techniques from data mining and machine learning. To further illustrate the dramatic benefits of the proposed methodologies, the paper then outlines two current projects, logging and analyzing smartphone usage. One such study attempts to quantify the severity of major depression dynamically; the other investigates (mobile) Internet Addiction. Finally, the paper addresses some of the ethical issues inherent to Big Data technologies. In summary, the proposed approach is about to induce the single biggest methodological shift since the beginning of psychology or psychiatry. The resulting range of applications will dramatically shape the daily routines of researchers and medical practitioners alike. Indeed, transferring techniques from computer science to psychiatry and psychology is about to establish Psycho-Informatics, an entire research direction of its own. Copyright © 2013 Elsevier Ltd. All rights reserved.
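The smartphone-logging projects described above reduce raw event streams to simple longitudinal features. The sketch below aggregates a hypothetical usage log into daily totals; the event records, app names, and durations are invented for illustration:

```python
from datetime import datetime

# Hypothetical event log: (timestamp, app, session_seconds). Field names
# and values are invented for illustration, not data from either project.
events = [
    (datetime(2024, 3, 1, 9, 15), "messenger", 120),
    (datetime(2024, 3, 1, 13, 40), "browser", 300),
    (datetime(2024, 3, 2, 22, 5), "messenger", 600),
]

def daily_usage_seconds(events):
    """Sum session durations per calendar day -- one simple longitudinal
    feature a psycho-informatics pipeline might track over weeks."""
    totals = {}
    for ts, _app, seconds in events:
        day = ts.date().isoformat()
        totals[day] = totals.get(day, 0) + seconds
    return totals

print(daily_usage_seconds(events))
```

Time series built this way (total usage, late-night usage, app-switching frequency, and similar) are the raw material for the dynamic severity estimates the paper proposes.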
Orbit transfer vehicle engine technology program. Task B-6 high speed turbopump bearings
NASA Technical Reports Server (NTRS)
1992-01-01
Bearing types were evaluated for use on the Orbit Transfer Vehicle (OTV) high pressure fuel pump. The high speed, high load, and long bearing life requirements dictated selection of hydrostatic bearings as the logical candidate for this engine. Design and fabrication of a bearing tester to evaluate these cryogenic hydrostatic bearings was then conducted. Detailed analysis, evaluation of bearing materials, and design of the hydrostatic bearings were completed, resulting in fabrication of Carbon P5N and Kentanium hydrostatic bearings. Rotordynamic analyses determined the exact bearing geometry chosen. Instrumentation was evaluated and data acquisition methods were determined for monitoring shaft motion at speeds in excess of 200,000 RPM in a cryogenic atmosphere. Fabrication of all hardware was completed, but assembly and testing were conducted outside of this contract.
Some Big Questions about Design in Educational Technology
ERIC Educational Resources Information Center
Gibbons, Andrew S.
2016-01-01
This article asks five questions that lead us to the foundations of design practice in educational technology. Design processes structure time, space, place, activity, role, goal, and resource. For educational technology to advance in its understanding of design practice, it must question whether we have clear conceptions of how abstract…
ERIC Educational Resources Information Center
Ekman, Richard
2005-01-01
Almost everyone on campus today grasps the benefits of easy availability of information technology, but for college presidents, the expectations for information technology have been high from the early days. The grail in futurist dreams has been a machine that "thinks," using a very big base of information to sift evidence, make judgments, and…
Disruptive Innovation in Air Measurement Technology: Reality or Hype?
This presentation is a big picture overview on the changing state of air measurement technology in the world, with a focus on the introduction of low-cost sensors into the market place. The presentation discusses how these new technologies may be a case study in disruptive innov...
Small Technology--Big Impact. Practical Options for Development
ERIC Educational Resources Information Center
Academy for Educational Development, 2009
2009-01-01
Technology has dramatically changed the world--now almost anyone can "move" at Internet-speed; people who were marginalized are able to find information on acquiring micro-loans to start businesses, and villages previously unconnected to the telecommunications grid now have affordable cell phone access. As technology becomes easier to…
NASA Astrophysics Data System (ADS)
Latinović, T. S.; Preradović, D. M.; Barz, C. R.; Latinović, M. T.; Petrica, P. P.; Pop-Vadean, A.
2016-08-01
The amount of data at the global level has grown exponentially. Along with this phenomenon has come the need for new units of measure, such as the exabyte, zettabyte, and yottabyte, to express the amount of data. The growth of data has created a situation in which the classic systems for the collection, storage, processing, and visualization of data are losing the battle against the volume, speed, and variety of data that is generated continuously. Much of this data is created by the Internet of Things, IoT (cameras, satellites, cars, GPS navigation, etc.). It is our challenge to come up with new technologies and tools for the management and exploitation of these large amounts of data. Big Data has been a hot topic in IT circles in recent years. However, Big Data is also recognized in the business world, and increasingly in public administration. This paper proposes an ontology of big data analytics and examines how to enhance business intelligence through big data analytics as a service by presenting a big data analytics service-oriented architecture. This paper also discusses the interrelationship between business intelligence and big data analytics. The proposed approach might facilitate the research and development of business analytics, big data analytics, and business intelligence, as well as intelligent agents.
A better sequence-read simulator program for metagenomics.
Johnson, Stephen; Trost, Brett; Long, Jeffrey R; Pittet, Vanessa; Kusalik, Anthony
2014-01-01
There are many programs available for generating simulated whole-genome shotgun sequence reads. The data generated by many of these programs follow predefined models, which limits their use to the authors' original intentions. For example, many models assume that read lengths follow a uniform or normal distribution. Other programs generate models from actual sequencing data, but are limited to reads from single-genome studies. To our knowledge, there are no programs that allow a user to generate simulated data following non-parametric read-length distributions and quality profiles based on empirically-derived information from metagenomics sequencing data. We present BEAR (Better Emulation for Artificial Reads), a program that uses a machine-learning approach to generate reads with lengths and quality values that closely match empirically-derived distributions. BEAR can emulate reads from various sequencing platforms, including Illumina, 454, and Ion Torrent. BEAR requires minimal user input, as it automatically determines appropriate parameter settings from user-supplied data. BEAR also uses a unique method for deriving run-specific error rates, and extracts useful statistics from the metagenomic data itself, such as quality-error models. Many existing simulators are specific to a particular sequencing technology; however, BEAR is not restricted in this way. Because of its flexibility, BEAR is particularly useful for emulating the behaviour of technologies like Ion Torrent, for which no dedicated sequencing simulators are currently available. BEAR is also the first metagenomic sequencing simulator program that automates the process of generating abundances, which can be an arduous task. BEAR is useful for evaluating data processing tools in genomics. It has many advantages over existing comparable software, such as generating more realistic reads and being independent of sequencing technology, and has features particularly useful for metagenomics work.
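The core idea the BEAR abstract describes, drawing simulated read lengths from an empirically derived, non-parametric distribution rather than a uniform or normal model, can be sketched in a few lines. This is a minimal stdlib-only illustration of that sampling step, not BEAR's actual implementation; the toy read lengths are hypothetical.

```python
import random
from collections import Counter

def empirical_length_sampler(observed_lengths, seed=None):
    """Build a sampler that draws read lengths from the empirical
    (non-parametric) distribution of an observed sequencing run."""
    counts = Counter(observed_lengths)
    lengths = list(counts)
    weights = [counts[length] for length in lengths]
    rng = random.Random(seed)

    def sample(n):
        # Weighted sampling with replacement reproduces the
        # observed length distribution in expectation.
        return rng.choices(lengths, weights=weights, k=n)

    return sample

# Toy observed read lengths (hypothetical, not from any real run):
observed = [98, 101, 101, 230, 230, 230, 410]
sample = empirical_length_sampler(observed, seed=42)
simulated = sample(1000)
# Every simulated length comes from the observed support:
assert set(simulated) <= set(observed)
```

A full simulator would pair this with per-position quality profiles and run-specific error models learned from the same input data, as the abstract notes BEAR does.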
Medical big data: promise and challenges.
Lee, Choong Ho; Yoon, Hyung-Jin
2017-03-01
The concept of big data, commonly characterized by volume, variety, velocity, and veracity, goes far beyond the data type and includes aspects of data analysis, such as hypothesis-generating rather than hypothesis-testing approaches. Big data focuses on the temporal stability of an association rather than on causal relationships, and underlying probability distribution assumptions are frequently not required. Medical big data as material to be analyzed has various features that are not only distinct from the big data of other disciplines, but also distinct from traditional clinical epidemiology. Big data technology has many areas of application in healthcare, such as predictive modeling and clinical decision support, disease or safety surveillance, public health, and research. Big data analytics frequently exploits analytic methods developed in data mining, including classification, clustering, and regression. Medical big data analyses are complicated by many technical issues, such as missing values, the curse of dimensionality, and bias control, and share the inherent limitations of observational studies, namely the inability to test causality resulting from residual confounding and reverse causation. Recently, propensity score analysis and instrumental variable analysis have been introduced to overcome these limitations, and they have accomplished a great deal. Many challenges, such as the absence of evidence of practical benefits of big data, methodological issues including legal and ethical issues, and clinical integration and utility issues, must be overcome to realize the promise of medical big data as the fuel of a continuous learning healthcare system that will improve patient outcomes and reduce waste in areas including nephrology.
Big Geo Data Management: AN Exploration with Social Media and Telecommunications Open Data
NASA Astrophysics Data System (ADS)
Arias Munoz, C.; Brovelli, M. A.; Corti, S.; Zamboni, G.
2016-06-01
The term Big Data has recently been used to define big, highly varied, complex data sets, which are created and updated at high speed and require faster processing, namely, a reduced time to filter and analyse relevant data. These data are also increasingly becoming Open Data (data that can be freely distributed), made public by governments, agencies, private enterprises, and others. There are at least two issues that can obstruct the availability and use of Open Big Datasets. Firstly, the gathering and geoprocessing of these datasets are very computationally intensive; hence, it is necessary to integrate high-performance solutions, preferably internet based, to achieve the goals. Secondly, the problems of heterogeneity and inconsistency in geospatial data are well known and affect the data integration process, but are particularly problematic for Big Geo Data. Therefore, Big Geo Data integration will be one of the most challenging issues to solve. With these applications, we demonstrate that it is possible to provide processed Big Geo Data to common users, using open geospatial standards and technologies. NoSQL databases like MongoDB and frameworks like RASDAMAN could offer different functionalities that facilitate working with larger volumes and more heterogeneous geospatial data sources.
Preliminary Analysis for an Optimized Oil-Free Rotorcraft Engine Concept
NASA Technical Reports Server (NTRS)
Howard, Samuel A.; Bruckner, Robert J.; DellaCorte, Christopher; Radil, Kevin C.
2008-01-01
Recent developments in gas foil bearing technology have led to numerous advanced high-speed rotating system concepts, many of which have become either commercial products or experimental test articles. Examples include Oil-Free microturbines, motors, generators and turbochargers. The driving forces for integrating gas foil bearings into these high-speed systems are the benefits promised by removing the oil lubrication system. Elimination of the oil system leads to reduced emissions, increased reliability, and decreased maintenance costs. Another benefit is reduced power plant weight. For rotorcraft applications, this would be a major advantage, as every pound removed from the propulsion system results in a payload benefit. Implementing foil gas bearings throughout a rotorcraft gas turbine engine is an important long-term goal that requires overcoming numerous technological hurdles. Adequate thrust bearing load capacity and potentially large gearbox applied radial loads are among them. However, by replacing the turbine end, or hot section, rolling element bearing with a gas foil bearing many of the above benefits can be realized. To this end, engine manufacturers are beginning to explore the possibilities of hot section gas foil bearings in propulsion engines. This paper presents a logical follow-on activity by analyzing a conceptual rotorcraft engine to determine the feasibility of a foil bearing supported core. Using a combination of rotordynamic analyses and a load capacity model, it is shown to be reasonable to consider a gas foil bearing core section.
Social customer relationship management: taking advantage of Web 2.0 and Big Data technologies.
Orenga-Roglá, Sergio; Chalmeta, Ricardo
2016-01-01
The emergence of Web 2.0 and Big Data technologies has allowed a new customer relationship strategy based on interactivity and collaboration called Social Customer Relationship Management (Social CRM) to be created. This enhances customer engagement and satisfaction. The implementation of Social CRM is a complex task that involves different organisational, human and technological aspects. However, there is a lack of methodologies to assist companies in these processes. This paper shows a novel methodology that helps companies to implement Social CRM, taking into account different aspects such as social customer strategy, the Social CRM performance measurement system, the Social CRM business processes, or the Social CRM computer system. The methodology was applied to one company in order to validate and refine it.
Song, Malin; Wang, Shuhong
2017-01-01
This study examined the stimulative effects of Chinese enterprises' participation in the global value chain (GVC) on the progress of their green technologies. Using difference-in-difference panel models with big data of Chinese enterprises, we measured influencing factors such as enterprise participation degree, enterprise scale, corporate ownership, and research and development (R&D) investment. The results revealed that participation in the GVC can considerably improve the green technology levels in all enterprises, except state-owned ones. However, the older an enterprise, the higher the sluggishness is likely to be in its R&D activities; this is particularly true for state-owned enterprises. The findings provide insights into the strategy of actively addressing Chinese enterprises' predicament of being restricted to the lower end of the GVC.
Extravehicular Space Suit Bearing Technology Development Research
NASA Astrophysics Data System (ADS)
Pang, Yan; Liu, Xiangyang; Guanghui, Xie
2017-03-01
The pressure bearing has long played an important role in the EVA (extravehicular activity) suit as a main mobility component. The EVA suit bearing has unique traits in its material, dustproof design, seal, interface, lubrication, load, and performance. This paper describes the characteristics and development of the pressure bearing in terms of construction design elements, load and failure modes, and performance and testing, from the point of view of structural design. The status and effect of the EVA suit pressure bearing are introduced in the paper. This analysis method can provide a useful reference for China's EVA suit pressure bearing design and development.
Third International Symposium on Magnetic Suspension Technology. Part 2
NASA Technical Reports Server (NTRS)
Groom, Nelson J. (Editor); Britcher, Colin P. (Editor)
1996-01-01
In order to examine the state of technology of all areas of magnetic suspension and to review recent developments in sensors, controls, superconducting magnet technology, and design/implementation practices, the Third International Symposium on Magnetic Suspension Technology was held at the Holiday Inn Capital Plaza in Tallahassee, Florida on 13-15 Dec. 1995. The symposium included 19 sessions in which a total of 55 papers were presented. The technical sessions covered the areas of bearings, superconductivity, vibration isolation, maglev, controls, space applications, general applications, bearing/actuator design, modeling, precision applications, electromagnetic launch and hypersonic maglev, applications of superconductivity, and sensors.
Second International Symposium on Magnetic Suspension Technology, part 1
NASA Technical Reports Server (NTRS)
Groom, Nelson J. (Editor); Britcher, Colin P. (Editor)
1994-01-01
In order to examine the state of technology of all areas of magnetic suspension and to review related recent developments in sensors and controls approaches, superconducting magnet technology, and design/implementation practices, the Second International Symposium on Magnetic Suspension Technology was held. The symposium included 18 technical sessions in which 44 papers were presented. The technical sessions covered the areas of bearings, bearing modeling, controls, vibration isolation, micromachines, superconductivity, wind tunnel magnetic suspension systems, magnetically levitated trains (MAGLEV), rotating machinery and energy storage, and applications. A list of attendees appears at the end of the document.
Third International Symposium on Magnetic Suspension Technology
NASA Technical Reports Server (NTRS)
Groom, Nelson J. (Editor); Britcher, Colin P. (Editor)
1996-01-01
In order to examine the state of technology of all areas of magnetic suspension and to review recent developments in sensors, controls, superconducting magnet technology, and design/implementation practices, the Third International Symposium on Magnetic Suspension Technology was held at the Holiday Inn Capital Plaza in Tallahassee, Florida on 13-15 Dec. 1995. The symposium included 19 sessions in which a total of 55 papers were presented. The technical sessions covered the areas of bearings, superconductivity, vibration isolation, maglev, controls, space applications, general applications, bearing/actuator design, modeling, precision applications, electromagnetic launch and hypersonic maglev, applications of superconductivity, and sensors.
Social Media, Big Data, and Mental Health: Current Advances and Ethical Implications.
Conway, Mike; O'Connor, Daniel
2016-06-01
Mental health (including substance abuse) is the fifth greatest contributor to the global burden of disease, with an economic cost estimated to be US $2.5 trillion in 2010, and expected to double by 2030. Developing information systems to support and strengthen population-level mental health monitoring forms a core part of the World Health Organization's Comprehensive Action Plan 2013-2020. In this paper, we review recent work that utilizes social media "big data" in conjunction with associated technologies like natural language processing and machine learning to address pressing problems in population-level mental health surveillance and research, focusing both on technological advances and core ethical challenges.
Feltus, Frank A; Breen, Joseph R; Deng, Juan; Izard, Ryan S; Konger, Christopher A; Ligon, Walter B; Preuss, Don; Wang, Kuang-Ching
2015-01-01
In the last decade, high-throughput DNA sequencing has become a disruptive technology and pushed the life sciences into a distributed ecosystem of sequence data producers and consumers. Given the power of genomics and declining sequencing costs, biology is an emerging "Big Data" discipline that will soon enter the exabyte data range when all subdisciplines are combined. These datasets must be transferred across commercial and research networks in creative ways since sending data without thought can have serious consequences on data processing time frames. Thus, it is imperative that biologists, bioinformaticians, and information technology engineers recalibrate data processing paradigms to fit this emerging reality. This review attempts to provide a snapshot of Big Data transfer across networks, which is often overlooked by many biologists. Specifically, we discuss four key areas: 1) data transfer networks, protocols, and applications; 2) data transfer security including encryption, access, firewalls, and the Science DMZ; 3) data flow control with software-defined networking; and 4) data storage, staging, archiving and access. A primary intention of this article is to orient the biologist in key aspects of the data transfer process in order to frame their genomics-oriented needs to enterprise IT professionals.
Big Data in HEP: A comprehensive use case study
Gutsche, Oliver; Cremonesi, Matteo; Elmer, Peter; ...
2017-11-23
Experimental Particle Physics has been at the forefront of analyzing the world's largest datasets for decades. The HEP community was the first to develop suitable software and computing tools for this task. In recent times, new toolkits and systems, collectively called Big Data technologies, have emerged to support the analysis of Petabyte and Exabyte datasets in industry. While the principles of data analysis in HEP have not changed (filtering and transforming experiment-specific data formats), these new technologies use different approaches and promise a fresh look at the analysis of very large datasets, potentially reducing the time-to-physics with increased interactivity. In this talk, we present an active LHC Run 2 analysis, searching for dark matter with the CMS detector, as a testbed for Big Data technologies. We directly compare the traditional NTuple-based analysis with an equivalent analysis using Apache Spark on the Hadoop ecosystem and beyond. In both cases, we start the analysis with the official experiment data formats and produce publication physics plots. Lastly, we discuss the advantages and disadvantages of each approach and give an outlook on further studies needed.
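The "filtering and transforming" pattern this abstract identifies as the unchanged core of HEP analysis can be illustrated in miniature. The sketch below uses plain Python over toy event records rather than Apache Spark, so it is self-contained; the field names (`met`, `njets`) and cut values are hypothetical assumptions, not taken from the CMS analysis described.

```python
# Toy events mimicking flattened NTuple rows: missing transverse
# energy (met, GeV) and jet multiplicity (njets). Values are made up.
events = [
    {"met": 250.0, "njets": 3},
    {"met": 80.0,  "njets": 1},
    {"met": 310.0, "njets": 4},
]

def select(events, met_cut=200.0, min_jets=2):
    """Filter step: keep candidate events passing hypothetical cuts."""
    return [e for e in events if e["met"] > met_cut and e["njets"] >= min_jets]

def transform(events):
    """Transform step: derive a quantity one might histogram."""
    return [e["met"] / e["njets"] for e in events]

passed = select(events)
per_jet_met = transform(passed)
assert len(passed) == 2
```

In a Spark-based version of the same analysis, `select` and `transform` would become DataFrame `filter` and column expressions evaluated lazily across a cluster; the logical structure of the analysis stays the same, which is the point of the comparison the paper makes.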
DOE Office of Scientific and Technical Information (OSTI.GOV)
Herrera, G.L.
1999-11-01
Harnessing the wind is not a new concept to Texans. But it is a concept that has evolved over the years from one of pumping water to fill stock tanks for watering livestock to one of providing electricity for the people of Texas. This evolution has occurred due to improved micro-siting techniques that help identify robust wind resource sites and to wind turbine technology that improves wind capture and energy conversion efficiencies. Over the last seven to ten years, this siting technology and wind turbine technology have significantly reduced the bus-bar cost associated with wind generation. On December 2, 1998, at a public dedication of the Big Spring Wind Project, the first of 42 Vestas V47 wind turbines was released for commercial operation. Since that date, an additional fifteen V47 turbines have been placed into service. It is expected that the Big Spring Wind Project will be complete and released for full operation prior to the summer peak-load season of 1999. As of the writing of this paper (January 1999), the Vestas V47 turbines have performed as expected, with excellent availability and, based on the foregoing resource analysis, better than expected output.
The QuEST for multi-sensor big data ISR situation understanding
NASA Astrophysics Data System (ADS)
Rogers, Steven; Culbertson, Jared; Oxley, Mark; Clouse, H. Scott; Abayowa, Bernard; Patrick, James; Blasch, Erik; Trumpfheller, John
2016-05-01
The challenges for providing war fighters with the best possible actionable information from diverse sensing modalities using advances in big-data and machine learning are addressed in this paper. We start by presenting intelligence, surveillance, and reconnaissance (ISR) related big-data challenges associated with the Third Offset Strategy. Current approaches to big-data are shown to be limited with respect to reasoning/understanding. We present a discussion of what meaning making and understanding require. We posit that for human-machine collaborative solutions to address the requirements for the strategy a new approach, Qualia Exploitation of Sensor Technology (QuEST), will be required. The requirements for developing a QuEST theory of knowledge are discussed and finally, an engineering approach for achieving situation understanding is presented.
This report documents the activities performed and the results obtained from the one-year arsenic removal treatment technology demonstration project at the Big Sauk Lake Mobile Home Park (BSLMHP) in Sauk Centre, MN. The objectives of the project are to evaluate (1) the effective...
Huang, Ying; Zhang, Yi; Youtie, Jan; Porter, Alan L.; Wang, Xuefeng
2016-01-01
How do funding agencies ramp-up their capabilities to support research in a rapidly emerging area? This paper addresses this question through a comparison of research proposals awarded by the US National Science Foundation (NSF) and the National Natural Science Foundation of China (NSFC) in the field of Big Data. Big data is characterized by its size and difficulties in capturing, curating, managing and processing it in reasonable periods of time. Although Big Data has its legacy in longstanding information technology research, the field grew very rapidly over a short period. We find that the extent of interdisciplinarity is a key aspect in how these funding agencies address the rise of Big Data. Our results show that both agencies have been able to marshal funding to support Big Data research in multiple areas, but the NSF relies to a greater extent on multi-program funding from different fields. We discuss how these interdisciplinary approaches reflect the research hot-spots and innovation pathways in these two countries. PMID:27219466
Macedonia, Christian R; Johnson, Clark T; Rajapakse, Indika
2017-02-01
Technical advances in science have had broad implications in reproductive and women's health care. Recent innovations in population-level data collection and storage have made available an unprecedented amount of data for analysis while computational technology has evolved to permit processing of data previously thought too dense to study. "Big data" is a term used to describe data that are a combination of dramatically greater volume, complexity, and scale. The number of variables in typical big data research can readily be in the thousands, challenging the limits of traditional research methodologies. Regardless of what it is called, advanced data methods, predictive analytics, or big data, this unprecedented revolution in scientific exploration has the potential to dramatically assist research in obstetrics and gynecology broadly across subject matter. Before implementation of big data research methodologies, however, potential researchers and reviewers should be aware of strengths, strategies, study design methods, and potential pitfalls. Examination of big data research examples contained in this article provides insight into the potential and the limitations of this data science revolution and practical pathways for its useful implementation.
Quality of Big Data in health care.
Sukumar, Sreenivas R; Natarajan, Ramachandran; Ferrell, Regina K
2015-01-01
The current trend in Big Data analytics, and in particular health information technology, is toward building sophisticated models, methods, and tools for business, operational, and clinical intelligence. However, the critical issue of the data quality required for these models is not getting the attention it deserves. The purpose of this paper is to highlight the issues of data quality in the context of Big Data health care analytics. The insights presented in this paper are the results of analytics work that was done in different organizations on a variety of health data sets. The data sets include Medicare and Medicaid claims, provider enrollment data sets from both public and private sources, and electronic health records from regional health centers accessed through partnerships with health care claims processing entities under health privacy protected guidelines. Assessment of data quality in health care has to consider: first, the entire lifecycle of health data; second, problems arising from errors and inaccuracies in the data itself; third, the source(s) and the pedigree of the data; and fourth, how the underlying purpose of data collection impacts the analytic processing and the knowledge expected to be derived. Automation in the form of data handling, storage, entry, and processing technologies is to be viewed as a double-edged sword. At one level, automation can be a good solution, while at another level it can create a different set of data quality issues. Implementation of health care analytics with Big Data is enabled by a road map that addresses the organizational and technological aspects of data quality assurance. The value derived from the use of analytics should be the primary determinant of data quality. Based on this premise, health care enterprises embracing Big Data should have a road map for a systematic approach to data quality.
Health care data quality problems can be so specific that organizations might have to build their own custom software or data quality rule engines. Today, data quality issues are diagnosed and addressed in a piecemeal fashion. The authors recommend a data lifecycle approach and provide a road map that is more appropriate to the dimensions of Big Data and fits different stages in the analytical workflow.
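The custom "data quality rule engine" the authors mention can be as simple as a table of named predicates run against each record. Below is a minimal stdlib-only sketch of that idea; the claim fields (`dob`, `service_date`, `amount`) and rules are illustrative assumptions, not taken from the paper's data sets.

```python
from datetime import date

# Hypothetical claim records; field names are illustrative only.
claims = [
    {"id": "C1", "dob": date(1950, 5, 1), "service_date": date(2014, 3, 2), "amount": 120.0},
    {"id": "C2", "dob": None,             "service_date": date(2014, 3, 9), "amount": -5.0},
]

# Each rule is a named predicate returning True when the record passes.
rules = {
    "dob_present":       lambda c: c["dob"] is not None,
    "amount_positive":   lambda c: c["amount"] > 0,
    "service_after_dob": lambda c: c["dob"] is None or c["service_date"] > c["dob"],
}

def audit(records, rules):
    """Run every rule against every record; collect failures per record id."""
    report = {}
    for rec in records:
        failed = [name for name, rule in rules.items() if not rule(rec)]
        if failed:
            report[rec["id"]] = failed
    return report

issues = audit(claims, rules)
assert issues == {"C2": ["dob_present", "amount_positive"]}
```

Because the rules are data rather than code paths, new checks can be added per data source or per lifecycle stage, which is what makes the approach compatible with the lifecycle road map the authors recommend.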
Design and Analysis of Embedded I&C for a Fully Submerged Magnetically Suspended Impeller Pump
Melin, Alexander M.; Kisner, Roger A.
2018-04-03
Improving nuclear reactor power system designs and fuel-processing technologies for safer and more efficient operation requires the development of new component designs. In particular, many of the advanced reactor designs such as the molten salt reactors and high-temperature gas-cooled reactors have operating environments beyond the capability of most currently available commercial components. To address this gap, new cross-cutting technologies need to be developed that will enable design, fabrication, and reliable operation of new classes of reactor components. The Advanced Sensor Initiative of the Nuclear Energy Enabling Technologies initiative is investigating advanced sensor and control designs that are capable of operating in these extreme environments. Under this initiative, Oak Ridge National Laboratory (ORNL) has been developing embedded instrumentation and control (I&C) for extreme environments. To develop, test, and validate these new sensing and control techniques, ORNL is building a pump test bed that utilizes submerged magnetic bearings to levitate the shaft. The eventual goal is to apply these techniques to a high-temperature (700°C) canned rotor pump that utilizes active magnetic bearings to eliminate the need for mechanical bearings and seals. The technologies will benefit the Next Generation Power Plant, Advanced Reactor Concepts, and Small Modular Reactor programs. In this paper, we will detail the design and analysis of the embedded I&C test bed with submerged magnetic bearings, focusing on the interplay between the different major systems. Then we will analyze the forces on the shaft and their role in the magnetic bearing design. Next, we will develop the radial and thrust bearing geometries needed to meet the operational requirements of the test bed. In conclusion, we will present some initial system identification results to validate the theoretical models of the test bed dynamics.
The research and application of the power big data
NASA Astrophysics Data System (ADS)
Zhang, Suxiang; Zhang, Dong; Zhang, Yaping; Cao, Jinping; Xu, Huiming
2017-01-01
Facing the growing environmental crisis, improving energy efficiency is an important problem, and power big data is the main supporting tool for realizing demand-side management and response. With the promotion of smart power consumption, distributed clean energy, electric vehicles, and related technologies are seeing wide application; meanwhile, with the continuous development of Internet of Things technology, more applications access endpoints in the grid, so that a large number of electric terminal devices and new energy sources connect to the smart grid and produce massive heterogeneous, multi-state electricity data. These data are the power grid enterprise's precious wealth, as power big data. How to transform them into valuable knowledge and effective operation is an important problem, and it requires interoperation within the smart grid. In this paper, we survey the various applications of power big data and integrate cloud computing and big data technology, including online monitoring of electricity consumption, short-term power load forecasting, and energy-efficiency analysis. Based on Hadoop, HBase, Hive, and related tools, we implement ETL and OLAP functions; we also adopt a parallel computing framework to implement the power load forecasting algorithms and propose a parallel locally weighted linear regression model. We further study an energy efficiency rating model to comprehensively evaluate the energy consumption levels of electricity users, which allows users to understand their real-time energy consumption and adjust their electricity behavior to reduce consumption, providing a basis for their decision-making. Taking an intelligent industrial park as an example, this paper implements electricity management. In the future, power big data will therefore provide decision-support tools for energy conservation and emissions reduction.
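The abstract above mentions a parallel locally weighted linear regression model for short-term load forecasting without giving details. As a rough illustration of the underlying technique only, here is a minimal serial sketch of locally weighted linear regression on invented temperature/load data (the function name, toy data, and Gaussian kernel bandwidth are assumptions, not the paper's actual model):

```python
import numpy as np

def lwlr_predict(x_query, X, y, tau=5.0):
    """Locally weighted linear regression: fit a line around x_query,
    weighting nearby training samples more via a Gaussian kernel of width tau."""
    Xb = np.column_stack([np.ones(len(X)), X])          # add bias column
    w = np.exp(-((X - x_query) ** 2) / (2 * tau ** 2))  # per-sample weights
    W = np.diag(w)
    theta = np.linalg.solve(Xb.T @ W @ Xb, Xb.T @ W @ y)  # weighted normal eqs
    return float(np.array([1.0, x_query]) @ theta)

# toy hourly data: load (kW) roughly linear in temperature (°C)
temps = np.array([10.0, 15.0, 20.0, 25.0, 30.0])
load = np.array([50.0, 55.0, 60.0, 66.0, 73.0])
print(round(lwlr_predict(22.0, temps, load), 1))
```

Each query point is fitted independently, which is what makes the method straightforward to parallelize, e.g. by mapping `lwlr_predict` over query points with a parallel framework.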
Özdemir, Vural; Hekim, Nezih
2018-01-01
Driverless cars with artificial intelligence (AI) and automated supermarkets run by collaborative robots (cobots) working without human supervision have sparked off new debates: what will be the impacts of extreme automation, turbocharged by the Internet of Things (IoT), AI, and the Industry 4.0, on Big Data and omics implementation science? The IoT builds on (1) broadband wireless internet connectivity, (2) miniaturized sensors embedded in animate and inanimate objects ranging from the house cat to the milk carton in your smart fridge, and (3) AI and cobots making sense of Big Data collected by sensors. Industry 4.0 is a high-tech strategy for manufacturing automation that employs the IoT, thus creating the Smart Factory. Extreme automation until "everything is connected to everything else" poses, however, vulnerabilities that have been little considered to date. First, highly integrated systems are vulnerable to systemic risks such as total network collapse in the event of failure of one of its parts, for example, by hacking or Internet viruses that can fully invade integrated systems. Second, extreme connectivity creates new social and political power structures. If left unchecked, they might lead to authoritarian governance by one person in total control of network power, directly or through her/his connected surrogates. We propose Industry 5.0 that can democratize knowledge coproduction from Big Data, building on the new concept of symmetrical innovation. Industry 5.0 utilizes IoT, but differs from predecessor automation systems by having three-dimensional (3D) symmetry in innovation ecosystem design: (1) a built-in safe exit strategy in case of demise of hyperconnected entrenched digital knowledge networks. 
Importantly, such safe exits are orthogonal, in that they allow "digital detox" by employing pathways unrelated to and unaffected by automated networks, for example, electronic patient records versus material/article trails on vital medical information; (2) equal emphasis on both acceleration and deceleration of innovation if diminishing returns become apparent; and (3) next-generation social science and humanities (SSH) research for global governance of emerging technologies: "Post-ELSI Technology Evaluation Research" (PETER). Importantly, PETER considers the technology opportunity costs, ethics, ethics-of-ethics, framings (epistemology), independence, and reflexivity of SSH research in technology policymaking. Industry 5.0 is poised to harness extreme automation and Big Data with safety, innovative technology policy, and responsible implementation science, enabled by 3D symmetry in innovation ecosystem design.
m-Health 2.0: New perspectives on mobile health, machine learning and big data analytics.
Istepanian, Robert S H; Al-Anzi, Turki
2018-06-08
Mobile health (m-Health) has been repeatedly called the biggest technological breakthrough of our modern times. Similarly, the concept of big data in the context of healthcare is considered one of the transformative drivers for intelligent healthcare delivery systems. In recent years, big data has become increasingly synonymous with mobile health; however, key challenges of 'Big Data and mobile health' remain largely untackled. This is becoming particularly important with the continued deluge of structured and unstructured data sets generated on a daily basis by the proliferation of mobile health applications within different healthcare systems and products globally. The aim of this paper is twofold. First, we present the relevant big data issues from the mobile health (m-Health) perspective. In particular, we discuss these issues from the technological areas and building blocks (communications, sensors, and computing) of mobile health and the newly defined m-Health 2.0 concept. The second objective is to present the relevant rapprochement issues of big m-Health data analytics with m-Health. Further, we present the current and future roles of machine and deep learning within the current smartphone-centric m-Health model. The critical balance between these two important areas will depend on how different stakeholders, from patients, clinicians, healthcare providers, medical and m-health market businesses, and regulators, perceive these developments. These new perspectives are essential for better understanding the fine balance between new insights into how intelligent and connected future mobile health systems will be and the inherent risks and clinical complexities associated with the big data sets and analytical tools used in these systems.
These topics will be the subject of extensive work and investigation in the foreseeable future in the areas of data analytics and computational and artificial intelligence methods applied to mobile health.
2004-04-15
Technology derived by NASA for monitoring control gyros in the Skylab program is directly applicable to the problem of fault detection in railroad wheel bearings. Marshall Space Flight Center scientists have developed a detection concept based on the fact that bearing defects excite resonant frequencies of the rolling elements of the bearing as they impact the defect. By detecting a resonant frequency and subsequently analyzing the character of this signal, bearing defects may be detected and identified as to source.
Cryogenic Magnetic Bearing Test Facility (CMBTF)
NASA Technical Reports Server (NTRS)
1992-01-01
The Cryogenic Magnetic Bearing Test Facility (CMBTF) was designed and built to evaluate compact, lightweight magnetic bearings for use in the SSME's (Space Shuttle Main Engine) liquid oxygen and liquid hydrogen turbopumps. State-of-the-art and tradeoff studies indicated that a hybrid permanent-magnet-bias homopolar magnetic bearing design would be smaller, lighter, and much more efficient than conventional industrial bearings. A test bearing of this type was designed for the test rig for use at both room temperature and cryogenic temperature (-320 °F). The bearing was fabricated from state-of-the-art materials and incorporated into the CMBTF. Testing at room temperature was accomplished at Avcon's facility. These preliminary tests indicated that this magnetic bearing is a feasible alternative to older bearing technologies. Analyses showed that the hybrid magnetic bearing is one-third the weight of, considerably smaller than, and uses less power than previous generations of magnetic bearings.
Review of FY 2001 Development Work for Vitrification of Sodium Bearing Waste
DOE Office of Scientific and Technical Information (OSTI.GOV)
Taylor, Dean Dalton; Barnes, Charles Marshall
2002-09-01
Treatment of sodium-bearing waste (SBW) at the Idaho Nuclear Technology and Engineering Center (INTEC) within the Idaho National Engineering and Environmental Laboratory is mandated by the Settlement Agreement between the Department of Energy and the State of Idaho. This report discusses significant findings from vitrification technology development during 2001 and their impacts on the design basis for SBW vitrification.
ERIC Educational Resources Information Center
Budinski, Natalija; Milinkovic, Dragica
2017-01-01
The availability of technology has a big impact on education, and that is the main reason for discussing the use of technologies in mathematical education in our paper. The availability of technology influences how mathematical contents could be presented to students. We present the benefits of learning mathematical concepts through real life…
NASA Technical Reports Server (NTRS)
Kuiper, T. B. H.; Pasachoff, J. M.
1973-01-01
Comparison of observations of type III impulsive radio bursts made at the Clark Lake Radio Observatory with high-spatial-resolution cinematographic observations taken at the Big Bear Solar Observatory. Use of the log-periodic radio interferometer makes it possible to localize the radio emission uniquely. This study concentrates on the particularly active region close to the limb on May 22, 1970. Sixteen of the 17 groups were associated with some H alpha activity, 11 of them with the start of such activity.
NASA Astrophysics Data System (ADS)
Title, A. M.; Tarbell, T. D.; Topka, K. P.; Shine, R. A.; Simon, G. W.; Zirin, H.; SOUP Team
The SOUP flow fields have been compared with carefully aligned magnetograms taken at the BBSO before, during, and after the SOUP images. The magnetic field is observed to exist in locations where either the flow is convergent or on the boundaries of the outflow from a flow cell center. Streamlines calculated from the flow field agree very well with the observed motions of the magnetic field in the BBSO magnetogram movies.
Earthquakes, November-December 1992
Person, W.J.
1993-01-01
There were two major earthquakes (7.0≤M<8.0) during the last two months of the year: a magnitude 7.5 earthquake on December 12 in the Flores region, Indonesia, and a magnitude 7.0 earthquake on December 20 in the Banda Sea. Earthquakes caused fatalities in China and Indonesia. The greatest number of deaths (2,500) for the year occurred in Indonesia. In Switzerland, six people were killed by an accidental explosion recorded by seismographs. In the United States, a magnitude 5.3 earthquake caused slight damage at Big Bear in southern California.
Balsa Tower Walls Brave "Big Buster"
ERIC Educational Resources Information Center
Granlund, George
2008-01-01
Like many technology teachers, the author, a technology education teacher at Arthur Hill High School in Saginaw, Michigan, tries to stretch his budget by "milking" each student activity for maximum benefit. In the technology department, they use balsa wood towers to teach the basics of structural engineering. To get the most from their materials,…
Mooresville Middle School Snags Web Site of the Month
ERIC Educational Resources Information Center
Tech Directions, 2008
2008-01-01
This has been a big year for Kim Kulawik, a technology education teacher at Mooresville (NC) Middle School: He has been named a 2008 North Carolina Technology Student Association Advisor of the Year, a 2008 International Technology Association Teacher of the Year--and now he and his students are receiving the December 2008 "Tech…
Assessing Acceptance toward Wiki Technology in the Context of Higher Education
ERIC Educational Resources Information Center
Altanopoulou, Panagiota; Tselios, Nikolaos
2017-01-01
This study investigated undergraduate students' intention to use wiki technology. An extension of the Technology Acceptance Model (TAM) has been used by taking into account not only students' wiki perceived utility and usability, but also Big Five personality characteristics and two other variables, social norms, and facilitating conditions, as…
Big Bang Technology: What's Next in Design Education, Radical Innovation or Incremental Change?
ERIC Educational Resources Information Center
Fleischmann, Katja
2013-01-01
Since the introduction of digital media, design education has been challenged by the ongoing advancement of technology. Technological change has created unprecedented possibilities for designers to engage in the broadening realm of interactive digital media. The increasing sophistication of interactivity has brought a complexity which needs to be…
Amanov, Auezhan; Pyoun, Young-Shik; Cho, In-Shik; Lee, Chang-Soon; Park, In-Gyu
2011-01-01
One of the primary remedies for tribological problems is surface modification. The reduction of the friction between the ball and the raceway of bearings is a very important goal in the development of bearing technology. Low friction has a positive effect in terms of extending fatigue life, avoiding temperature rise, and preventing premature failure of bearings. Therefore, this research investigated the effects of micro-tracks and micro-dimples on the tribological characteristics at the contact point between the ball and the raceway of thrust ball bearings (TBBs). The ultrasonic nanocrystal surface modification (UNSM) technology was applied at different intervals (feed rates) to the TBB raceway surface to create micro-tracks and micro-dimples. The friction coefficient after UNSM at 50 µm intervals showed marked sensitivity, with a significant reduction of 30%. In this study, the results showed that more micro-dimples yield a lower friction coefficient.
NASA Technical Reports Server (NTRS)
Dowson, D.; Hamrock, B. J.
1981-01-01
The familiar precision rolling-element bearings of the twentieth century are products of exacting technology and sophisticated science. Their very effectiveness and basic simplicity of form may discourage further interest in their history and development. Yet the full story covers a large portion of recorded history and surprising evidence of an early recognition of the advantages of rolling motion over sliding action and progress toward the development of rolling-element bearings. The development of rolling-element bearings is followed from the earliest civilizations to the end of the eighteenth century. The influence of general technological developments, particularly those concerned with the movement of large building blocks, road transportation, instruments, water-raising equipment, and windmills are discussed, together with the emergence of studies of the nature of rolling friction and the impact of economic factors. By 1800 the essential features of ball and rolling-element bearings had emerged and it only remained for precision manufacture and mass production to confirm the value of these fascinating machine elements.
3D Printing and Biofabrication for Load Bearing Tissue Engineering.
Jeong, Claire G; Atala, Anthony
2015-01-01
Cell-based direct biofabrication and 3D bioprinting are becoming a dominant technological platform and have been suggested as a new paradigm for twenty-first century tissue engineering. These techniques may be our next step in surpassing the hurdles and limitations of conventional scaffold-based tissue engineering, and may unlock the industrial potential of tissue-engineered products, especially for load-bearing tissues. Here we present a topically focused review of the fundamental concepts, state of the art, and perspectives of this new technology and field of biofabrication and 3D bioprinting, specifically focused on tissue engineering of load-bearing tissues such as bone, cartilage, osteochondral, and dental tissues.
Big Data in Plant Science: Resources and Data Mining Tools for Plant Genomics and Proteomics.
Popescu, George V; Noutsos, Christos; Popescu, Sorina C
2016-01-01
In modern plant biology, progress is increasingly defined by scientists' ability to gather and analyze data sets of high volume and complexity, otherwise known as "big data". Arguably, the largest increase in the volume of plant data sets over the last decade is a consequence of the application of next-generation sequencing and mass-spectrometry technologies to the study of experimental model and crop plants. The increase in the quantity and complexity of biological data brings challenges, mostly associated with data acquisition, processing, and sharing within the scientific community. Nonetheless, big data in plant science create unique opportunities for advancing our understanding of complex biological processes with unprecedented accuracy, and establish a base for plant systems biology. In this chapter, we summarize the major drivers of big data in plant science and big data initiatives in the life sciences, with a focus on the scope and impact of iPlant, a representative cyberinfrastructure platform for plant science.
Big-BOE: Fusing Spanish Official Gazette with Big Data Technology.
Basanta-Val, Pablo; Sánchez-Fernández, Luis
2018-06-01
The proliferation of new data sources, stemming from the adoption of open-data schemes, in combination with increasing computing capacity, has led to a new type of analytics that processes Internet of Things data with low-cost engines, using parallel computing to speed up data processing. In this context, the article presents an initiative, called BIG-Boletín Oficial del Estado (BOE), designed to process the Spanish official government gazette (BOE) with state-of-the-art processing engines, to reduce computation time and to offer additional speed-up for big data analysts. The goal of including a big data infrastructure is to be able to process different BOE documents in parallel with specific analytics, searching for several issues across different documents. The application's processing infrastructure is described from an architectural perspective and a performance perspective, showing evidence of how this type of infrastructure improves the performance of different types of simple analytics as several machines cooperate.
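The abstract describes running specific analytics over many BOE documents in parallel. As a hedged sketch of that idea only (the document identifiers, texts, and search terms below are invented stand-ins), a per-document term-count analytic can be fanned out across a worker pool:

```python
from concurrent.futures import ThreadPoolExecutor

def count_matches(doc_id, text, terms):
    """Analytic applied to one document: count occurrences of each term."""
    low = text.lower()
    return doc_id, {t: low.count(t) for t in terms}

# invented stand-ins for gazette documents
docs = {
    "BOE-A-1": "Public contract awarded for road works.",
    "BOE-A-2": "Subsidy for public research and public health.",
}
terms = ["public", "subsidy"]

# fan the analytic out over documents; each document is processed independently
with ThreadPoolExecutor(max_workers=4) as ex:
    futures = [ex.submit(count_matches, d, t, terms) for d, t in docs.items()]
    results = dict(f.result() for f in futures)

print(results)
```

A production engine such as Spark or Hadoop would replace the thread pool, but the per-document map step has the same shape: one independent analytic call per document, merged at the end.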
Intelligent Control of Micro Grid: A Big Data-Based Control Center
NASA Astrophysics Data System (ADS)
Liu, Lu; Wang, Yanping; Liu, Li; Wang, Zhiseng
2018-01-01
In this paper, a structure of a micro grid system with a big data-based control center is introduced. Energy data from distributed generation, storage, and load are analyzed by the control center; from the results, new trends are predicted and applied as feedback to optimize the control. Therefore, each step in the micro grid can be adjusted and organized as a form of comprehensive management. A framework of real-time data collection, data processing, and data analysis is proposed employing big data technology. Consequently, integrated distributed generation and an optimized energy storage and transmission process can be implemented in the micro grid system.
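The collect-analyze-predict-feedback loop described above can be sketched in miniature (all names and numbers below are invented for illustration; the paper's actual prediction and control methods are not specified in the abstract):

```python
def predict_next(load_history, window=3):
    """Naive trend prediction: average of the last `window` load samples."""
    recent = load_history[-window:]
    return sum(recent) / len(recent)

def adjust_storage(predicted_load, generation):
    """Feedback step: charge storage on surplus, discharge on deficit."""
    surplus = generation - predicted_load
    return ("charge", surplus) if surplus >= 0 else ("discharge", -surplus)

load_history = [40.0, 42.0, 47.0]  # kW samples collected from smart meters
prediction = predict_next(load_history)          # analyze + predict
action = adjust_storage(prediction, generation=50.0)  # feedback to control
print(prediction, action)
```

A real control center would replace the moving average with a forecasting model and run this loop continuously over streaming data, but the collect/predict/feedback structure is the same.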
Kuziemsky, C. E.; Monkman, H.; Petersen, C.; Weber, J.; Borycki, E. M.; Adams, S.; Collins, S.
2014-01-01
Summary Objectives While big data offers enormous potential for improving healthcare delivery, many of the existing claims concerning big data in healthcare are based on anecdotal reports and theoretical vision papers, rather than scientific evidence based on empirical research. Historically, the implementation of health information technology has resulted in unintended consequences at the individual, organizational and social levels, but these unintended consequences of collecting data have remained unaddressed in the literature on big data. The objective of this paper is to provide insights into big data from the perspective of people, social and organizational considerations. Method We draw upon the concept of persona to define the digital persona as the intersection of data, tasks and context for different user groups. We then describe how the digital persona can serve as a framework for understanding sociotechnical considerations of big data implementation. We then discuss the digital persona in the context of micro, meso and macro user groups across the 3 Vs of big data. Results We provide insights into the potential benefits and challenges of applying big data approaches to healthcare as well as how to position these approaches to achieve health system objectives such as patient safety or patient-engaged care delivery. We also provide a framework for defining the digital persona at a micro, meso and macro level to help understand the user contexts of big data solutions. Conclusion While big data provides great potential for improving healthcare delivery, it is essential that we consider the individual, social and organizational contexts of data use when implementing big data solutions. PMID:25123726
The Positive Effects of Technology on Teaching and Student Learning
ERIC Educational Resources Information Center
Costley, Kevin C.
2014-01-01
Technology is such a big part of the world in which we live. Many of the jobs that did not require technology use in years past do require it today. Many more homes have computers than in years past, and increasing numbers of people know how to use them. Technology is being used by children and adults on a daily basis by way of…
The epidemiology of bearing surface usage in total hip arthroplasty in the United States.
Bozic, Kevin J; Kurtz, Steven; Lau, Edmund; Ong, Kevin; Chiu, Vanessa; Vail, Thomas P; Rubash, Harry E; Berry, Daniel J
2009-07-01
Hard-on-hard bearings offer the potential to improve the survivorship of total hip arthroplasty implants. However, the specific indications for the use of these advanced technologies remain controversial. The purpose of this study was to characterize the epidemiology of bearing surface utilization in total hip arthroplasty in the United States with respect to patient, hospital, geographic, and payer characteristics. The Nationwide Inpatient Sample database was used to analyze bearing type and demographic characteristics associated with 112,095 primary total hip arthroplasties performed in the United States between October 1, 2005, and December 31, 2006. The prevalence of each type of total hip arthroplasty bearing was calculated for population subgroups as a function of age, sex, census region, payer class, and hospital type. The most commonly reported bearing was metal-on-polyethylene (51%) followed by metal-on-metal (35%) and ceramic-on-ceramic (14%). Metal-on-polyethylene bearings were most commonly reported in female Medicare patients who were sixty-five to seventy-four years old, while metal-on-metal and ceramic-on-ceramic bearings were most commonly reported in privately insured male patients who were less than sixty-five years old. Thirty-three percent of patients over sixty-five years old had a hard-on-hard bearing reported. There was substantial regional variation in bearing usage; the highest prevalence of metal-on-polyethylene bearings was reported in the Northeast and at nonteaching hospitals, and the highest prevalence of metal-on-metal bearings was reported in the South and at teaching hospitals. The usage of total hip arthroplasty bearings varies considerably by patient characteristics, hospital type, and geographic location throughout the United States. Despite uncertain advantages in older patients, hard-on-hard bearings are commonly used in patients over the age of sixty-five years. 
Further study is necessary to define the appropriate indications for these advanced technologies in total hip arthroplasty.
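The study above computes the prevalence of each bearing type within population subgroups. As an illustration of that kind of subgroup prevalence computation (the records and values below are invented stand-ins, not Nationwide Inpatient Sample data):

```python
from collections import Counter

# invented stand-ins for arthroplasty discharge records
records = [
    {"bearing": "metal-on-polyethylene", "region": "Northeast"},
    {"bearing": "metal-on-metal", "region": "South"},
    {"bearing": "metal-on-metal", "region": "South"},
    {"bearing": "ceramic-on-ceramic", "region": "Northeast"},
]

def prevalence_by(records, group_key, value_key):
    """Share of each value (e.g. bearing type) within each subgroup."""
    totals, counts = Counter(), Counter()
    for r in records:
        totals[r[group_key]] += 1
        counts[(r[group_key], r[value_key])] += 1
    return {k: counts[k] / totals[k[0]] for k in counts}

prev = prevalence_by(records, "region", "bearing")
print(prev)
```

The same grouping can be repeated for age, sex, payer class, and hospital type to reproduce the subgroup breakdowns the abstract describes.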
The Need for a Definition of Big Data for Nursing Science: A Case Study of Disaster Preparedness.
Wong, Ho Ting; Chiang, Vico Chung Lim; Choi, Kup Sze; Loke, Alice Yuen
2016-10-17
The rapid development of technology has made enormous volumes of data available and accessible anytime and anywhere around the world. Data scientists call this change a data era and have introduced the term "Big Data", which has drawn the attention of nursing scholars. Nevertheless, the concept of Big Data is quite fuzzy and there is no agreement on its definition among researchers of different disciplines. Without a clear consensus on this issue, nursing scholars who are relatively new to the concept may consider Big Data to be merely a dataset of a bigger size. Having a suitable definition for nurse researchers in their context of research and practice is essential for the advancement of nursing research. In view of the need for a better understanding of what Big Data is, the aim in this paper is to explore and discuss the concept. Furthermore, an example of a Big Data research study on disaster nursing preparedness involving six million patient records is used for discussion. The example demonstrates that a Big Data analysis can be conducted from many more perspectives than would be possible in traditional sampling, and is superior to traditional sampling. Experience gained from the process of using Big Data in this study will shed light on future opportunities for conducting evidence-based nursing research to achieve competence in disaster nursing.
Increasing the Mobility of Dismounted Marines
2009-10-01
…actually been the inspiration for military UGV development programs, including the Defense Advanced Research Projects Agency's (DARPA) Legged Squad Support… or wheel-based; only the BigDog is a leg-based system. This presented BigDog with certain advantages (particularly involving its ability to traverse… 1,000 pounds) Website: http://www.dtiweb.net/index.html General Discussion: Ranlo. The Ranlo, named after Defense Technologies, Inc.'s (DTI…
Meteor Observations as Big Data Citizen Science
NASA Astrophysics Data System (ADS)
Gritsevich, M.; Vinkovic, D.; Schwarz, G.; Nina, A.; Koschny, D.; Lyytinen, E.
2016-12-01
Meteor science is an excellent example of a citizen science project, where progress in the field has been largely driven by amateur observations. Over the last couple of decades, technological advancements in observational techniques have yielded drastic improvements in the quality, quantity and diversity of meteor data, while even more ambitious instruments are about to become operational. This empowers meteor science to broaden its experimental and theoretical horizons and pursue more ambitious scientific goals. We review some of the developments that push meteor science into the Big Data era, which requires more complex methodological approaches through interdisciplinary collaborations with other branches of physics and computer science. We argue that meteor science should become an integral part of large surveys in astronomy, aeronomy and space physics, and tackle the complexity of the micro-physics of meteor plasma and its interaction with the atmosphere. The recent increase in interest in meteor science triggered by the Chelyabinsk fireball helps in building the case for technologically and logistically more ambitious meteor projects. This requires developing new methodological approaches in meteor research, with Big Data science and close collaboration between citizen science, geoscience and astronomy as critical elements. We discuss possibilities for improvements and promote an opportunity for collaboration in meteor science within the currently established BigSkyEarth http://bigskyearth.eu/ network.
A. Palmgren Revisited: A Basis for Bearing Life Prediction
NASA Technical Reports Server (NTRS)
Zaretsky, Erwin V.
1997-01-01
Bearing technology, as well as the bearing industry, began to develop with the invention of the bicycle in the 1850s. At the same time, high-quality steel was made possible by the Bessemer process. In 1881, H. Hertz published his contact stress analysis. By 1902, R. Stribeck had published his work, based on Hertz theory, to calculate the maximum load of a radially loaded ball bearing. By 1920, all of the rolling bearing types used today were being manufactured. AISI 52100 bearing steel became the material of choice for these bearings. Beginning in 1918, engineers directed their attention to predicting the lives of these bearings. In 1924, A. Palmgren published a paper outlining his approach to bearing life prediction. This paper was the basis for the Lundberg-Palmgren life theory published in 1947. A critical review of the 1924 Palmgren paper is presented here together with a discussion of its effect on bearing life prediction.
Second International Symposium on Magnetic Suspension Technology, part 2
NASA Technical Reports Server (NTRS)
Groom, Nelson J. (Editor); Britcher, Colin P. (Editor)
1994-01-01
In order to examine the state of technology of all areas of magnetic suspension and to review related recent developments in sensors and controls approaches, superconducting magnet technology, and design/implementation practices, the 2nd International Symposium on Magnetic Suspension Technology was held at the Westin Hotel in Seattle, WA, on 11-13 Aug. 1993. The symposium included 18 technical sessions in which 44 papers were presented. The technical sessions covered the areas of bearings, bearing modelling, controls, vibration isolation, micromachines, superconductivity, wind tunnel magnetic suspension systems, magnetically levitated trains (MAGLEV), rotating machinery and energy storage, and applications. A list of attendees appears at the end of the document.
Unlocking the Power of Big Data at the National Institutes of Health.
Coakley, Meghan F; Leerkes, Maarten R; Barnett, Jason; Gabrielian, Andrei E; Noble, Karlynn; Weber, M Nick; Huyen, Yentram
2013-09-01
The era of "big data" presents immense opportunities for scientific discovery and technological progress, with the potential to have enormous impact on research and development in the public sector. In order to capitalize on these benefits, there are significant challenges to overcome in data analytics. The National Institute of Allergy and Infectious Diseases held a symposium entitled "Data Science: Unlocking the Power of Big Data" to create a forum for big data experts to present and share some of the creative and innovative methods for gleaning valuable knowledge from an overwhelming flood of biological data. A significant investment in infrastructure and tool development, along with more and better-trained data scientists, may facilitate methods for data assimilation and machine learning, and help overcome obstacles such as data security, data cleaning, and data integration.
A New Foil Air Bearing Test Rig for Use to 700 C and 70,000 rpm
NASA Technical Reports Server (NTRS)
DellaCorte, Chris
1997-01-01
A new test rig has been developed for evaluating foil air bearings at high temperatures and speeds. These bearings are self-acting hydrodynamic air bearings which have been successfully applied to a variety of turbomachinery operating up to 650 C. This unique test rig is capable of measuring bearing torque during start-up, shut-down and high-speed operation. Load capacity and general performance characteristics, such as durability, can be measured at temperatures to 700 C and speeds to 70,000 rpm. This paper describes the new test rig and demonstrates its capabilities through the preliminary characterization of several bearings. The bearing performance data from this facility can be used to develop advanced turbomachinery incorporating high-temperature oil-free air bearing technology.
A Systems Approach to the Solid Lubrication of Foil Air Bearings for Oil-Free Turbomachinery
NASA Technical Reports Server (NTRS)
DellaCorte, Christopher; Zaldana, Antonio R.; Radil, Kevin C.
2002-01-01
Foil air bearings are self-acting hydrodynamic bearings which rely upon solid lubricants to reduce friction and minimize wear during the sliding which occurs at start-up and shut-down, when surface speeds are too low to allow the formation of a hydrodynamic air film. This solid lubrication is typically accomplished by coating the non-moving foil surface with a thin, soft polymeric film. The following paper introduces a systems approach in which the solid lubrication is provided by a combination of self-lubricating shaft coatings coupled with various wear-resistant and lubricating foil coatings. The use of multiple materials, each providing a different function, is modeled after oil-lubricated hydrodynamic sleeve bearing technology, which utilizes various coatings and surface treatments in conjunction with oil lubricants to achieve optimum performance. In this study, room-temperature load capacity tests are performed on journal foil air bearings operating at 14,000 rpm. Different shaft and foil coating technologies, such as plasma-sprayed composites, ceramic, polymer and inorganic lubricant coatings, are evaluated as foil bearing lubricants. The results indicate that bearing performance is improved through the individual use of the lubricants and treatments tested. Further, combining several solid lubricants together yielded synergistically better results than any material alone.
Murphy, Sean M.; Cox, John J.; Clark, Joseph D.; Augustine, Benjamin J.; Hast, John T.; Gibbs, Dan; Strunk, Michael; Dobey, Steven
2015-01-01
Animal reintroductions are important tools of wildlife management to restore species to their historical range, and they can also create unique opportunities to study population dynamics and genetics from founder events. We used non-invasive hair sampling in a systematic, closed-population capture-mark-recapture (CMR) study design at the Big South Fork (BSF) area in Kentucky during 2010 and Tennessee during 2012 to estimate the demographic and genetic characteristics of the black bear (Ursus americanus) population that resulted from a reintroduced founding population of 18 bears in 1998. We estimated 38 (95% CI: 31–66) and 190 (95% CI: 170–219) bears on the Kentucky and Tennessee study areas, respectively. Based on the Tennessee abundance estimate alone, the mean annual growth rate was 18.3% (95% CI: 17.4–19.5%) from 1998 to 2012. We also compared the genetic characteristics of bears sampled during 2010–2012 to bears in the population during 2000–2002, 2–4 years following reintroduction, and to the source population. We found that the level of genetic diversity since reintroduction as indicated by expected heterozygosity (HE) remained relatively constant (HE(source, 2004) = 0.763, HE(BSF, 2000–2002) = 0.729, HE(BSF, 2010–2012) = 0.712) and the effective number of breeders (NB) remained low but had increased since reintroduction in the absence of sufficient immigration (NB(BSF, 2000–2002) = 12, NB(BSF, 2010–2012) = 35). This bear population appears to be genetically isolated, but contrary to our expectations, we did not find evidence of genetic diversity loss or other deleterious genetic effects typically observed from small founder groups. We attribute that to high initial genetic diversity in the founder group combined with overlapping generations and rapid population growth. Although the population remains relatively small, the reintroduction using a small founder group appears to be demographically and genetically sustainable.
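The reported 18.3% mean annual growth rate can be reproduced from the numbers in the abstract (18 founders in 1998, an estimated 190 bears on the Tennessee study area in 2012) as a geometric mean annual growth rate. A quick sketch; the function name is ours, not the authors':

```python
def mean_annual_growth_rate(n_start, n_end, years):
    """Geometric (compound) mean annual growth rate between two abundance estimates."""
    return (n_end / n_start) ** (1.0 / years) - 1.0

# 18 reintroduced bears in 1998; ~190 bears estimated on the Tennessee
# study area in 2012, 14 years later.
rate = mean_annual_growth_rate(18, 190, 2012 - 1998)
print(f"{rate:.1%}")  # prints 18.3%
```

This matches the point estimate in the abstract; the published 95% CI (17.4-19.5%) reflects the uncertainty in the 2012 abundance estimate, which this point calculation does not capture.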
[Contribution and challenges of Big Data in oncology].
Saintigny, Pierre; Foy, Jean-Philippe; Ferrari, Anthony; Cassier, Philippe; Viari, Alain; Puisieux, Alain
2017-03-01
Since the first draft of the human genome sequence was published in 2001, the cost of sequencing has dramatically decreased. The development of new technologies such as next-generation sequencing has led to a comprehensive characterization of a large number of tumors of various types, as well as to significant advances in precision medicine. Despite the valuable information this technological revolution has made it possible to produce, the vast amount of data generated has resulted in the emergence of new challenges for the biomedical community, such as data storage, processing and mining. Here, we describe the contribution and challenges of Big Data in oncology. Copyright © 2016 Société Française du Cancer. Published by Elsevier Masson SAS. All rights reserved.
Social Media, Big Data, and Mental Health: Current Advances and Ethical Implications
Conway, Mike; O’Connor, Daniel
2016-01-01
Mental health (including substance abuse) is the fifth greatest contributor to the global burden of disease, with an economic cost estimated to be US $2.5 trillion in 2010, and expected to double by 2030. Developing information systems to support and strengthen population-level mental health monitoring forms a core part of the World Health Organization’s Comprehensive Action Plan 2013–2020. In this paper, we review recent work that utilizes social media “big data” in conjunction with associated technologies like natural language processing and machine learning to address pressing problems in population-level mental health surveillance and research, focusing both on technological advances and core ethical challenges. PMID:27042689
Performance of Simple Gas Foil Thrust Bearings in Air
NASA Technical Reports Server (NTRS)
Bruckner, Robert J.
2012-01-01
Foil bearings are self-acting hydrodynamic devices used to support high-speed rotating machinery. The advantages that they offer to process-fluid-lubricated machines include: high rotational speed capability, no auxiliary lubrication system, non-contacting high-speed operation, and improved damping as compared to rigid hydrodynamic bearings. NASA has had a sporadic research program in this technology for almost six decades. Advances in the technology and understanding of foil journal bearings have enabled several new commercial products in recent years. These products include oil-free turbochargers for both heavy trucks and automobiles, high-speed electric motors, microturbines for distributed power generation, and turbojet engines. However, the foil thrust bearing has not received a comparable level of research and has therefore become the weak link of oil-free turbomachinery. In an effort both to provide machine designers with basic performance parameters and to elucidate the underlying physics of foil thrust bearings, NASA Glenn Research Center has completed an effort to experimentally measure the performance of simple gas foil thrust bearings in air. The database includes simple bump-foil-supported thrust bearings with full geometry and manufacturing techniques available to the user. Test conditions consist of air at ambient pressure, temperatures up to 500 C, and rotational speeds to 55,000 rpm. A complete set of axial load, frictional torque, and rotational speed data is presented for two different compliant sub-structures and inter-pad gaps. Data obtained from commercially available foil thrust bearings, both with and without active cooling, are presented for comparison. A significant observation made possible by this data set is the speed-load capacity characteristic of foil thrust bearings. Whereas for the foil journal bearing the load capacity increases linearly with rotational speed, the foil thrust bearing operates in the hydrodynamic high-speed limit.
In this case, the load capacity is constant and in fact often decreases with speed if other factors such as thermal conditions and runner distortions are permitted to dominate the bearing performance.
2010-05-01
Type of Lubrication for a Tilting Pad Thrust Bearing," ASME Journal of Lubrication Technology, 96 Ser F (1), pp. 22-27. [9] Gregory, R.S., 1974...1986, "Measurements of Maximum Temperature in Tilting-Pad Thrust Bearings," Technical Preprints - Presented at the ASLE 41st Annual Meeting. (ASLE...Safar [7] provides a modified Reynolds number analysis on hydrostatic thrust bearing performance parameters including the effects of tilt. Finally, San
2008-06-24
CMG torque amplification property in which a small amount of CMG gimbal motor input torque results in a relatively large slewing torque gives it a...properties these actuators provide [74–77]. 2.1.3 Magnetic Levitation and Bearing Technology Magnetic bearings for flywheel rotor suspension have a rich...2.1: Example of Magnetic and Ball Bearing Properties [21] Bearing ID OD Height Radial Static Max Speed (mm) (mm) (mm) Capacity (N) (RPM) MB-R-25-205 25
Application of high performance asynchronous socket communication in power distribution automation
NASA Astrophysics Data System (ADS)
Wang, Ziyu
2017-05-01
With the development of information technology and the Internet, and the growing demand for electricity, the stable and reliable operation of the power system has long been the goal of power grid workers. With the advent of the era of big data, power data will gradually become an important means of guaranteeing the safe and reliable operation of the power grid. How to efficiently and robustly receive the data transmitted by data acquisition devices, so that the power distribution automation system can make sound decisions quickly, is therefore a central pursuit in the electric power industry. In this paper, some existing problems in power system communication are analysed, and, with the help of network technology, a set of solutions called Asynchronous Socket Technology is proposed for network communication that must sustain high concurrency and high throughput. The paper also looks ahead to the development of power distribution automation in the era of big data and artificial intelligence.
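The pattern the paper describes, one server robustly receiving data from many acquisition devices at once, maps naturally onto asynchronous sockets. A minimal sketch using Python's asyncio streams; the device protocol, port, and ACK framing are illustrative assumptions, not details from the paper:

```python
import asyncio

async def handle_device(reader, writer):
    # One lightweight coroutine per data-acquisition device, so a single
    # thread can serve many concurrent connections without dedicating an
    # OS thread to each socket.
    while data := await reader.read(1024):
        # A real system would parse the payload and hand it to the
        # distribution-automation decision layer; here we just acknowledge.
        writer.write(b"ACK")
        await writer.drain()
    writer.close()
    await writer.wait_closed()

async def main():
    server = await asyncio.start_server(handle_device, "127.0.0.1", 9000)
    async with server:
        await server.serve_forever()

# To run the server: asyncio.run(main())
```

Because each `await` yields control back to the event loop, a slow or silent device never blocks the others, which is the high-concurrency property the paper is after.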
Quantifying the Modern City: Emerging Technologies and Big Data for Active Living Research.
Adlakha, Deepti
2017-01-01
Opportunities and infrastructure for active living are an important aspect of a community's design, livability, and health. Features of the built environment influence active living and population levels of physical activity, but objective study of the built environment's influence on active living behaviors is challenging. The use of emerging technologies for active living research affords new and promising means to obtain objective data on physical activity behaviors and improve the precision and accuracy of measurements. This is significant for physical activity promotion because precise measurements can enable detailed examinations of where, when, and how physical activity behaviors actually occur, thus enabling more effective targeting of particular behavior settings and environments. The aim of this focused review is to provide an overview of trends in emerging technologies that can profoundly change our ability to understand environmental determinants of active living. It discusses novel technological approaches and big data applications to measure and track human behaviors that may have broad applications across the fields of urban planning, public health, and spatial epidemiology.
The Datafication of Everything - Even Toilets.
Lun, Kwok-Chan
2018-04-22
Health informatics has benefitted from the development of Info-Communications Technology (ICT) over the last fifty years. Advances in ICT in healthcare have now started to spur advances in Data Technology, as hospital information systems, electronic health and medical records, mobile devices, social media and the Internet of Things (IoT) are making a substantial impact on the generation of data. It is timely for healthcare institutions to recognize data as a corporate asset and promote a data-driven culture within the institution. It is both strategic and timely for IMIA, as an international organization in health informatics, to take the lead in promoting a data-driven culture in healthcare organizations. This can be achieved by expanding the terms of reference of its existing Working Group on Data Mining and Big Data Analysis to include (1) data analytics with special reference to healthcare, (2) big data tools and solutions, (3) bridging information technology and data technology and (4) data quality issues and challenges. Georg Thieme Verlag KG Stuttgart.
GEOSS: Addressing Big Data Challenges
NASA Astrophysics Data System (ADS)
Nativi, S.; Craglia, M.; Ochiai, O.
2014-12-01
In the sector of Earth Observation, the explosion of data is due to many factors including: new satellite constellations, the increased capabilities of sensor technologies, social media, crowdsourcing, and the need for multidisciplinary and collaborative research to face Global Changes. In this area, there are many expectations and concerns about Big Data. Vendors have attempted to use this term for their commercial purposes. It is necessary to understand whether Big Data is a radical shift or an incremental change for the existing digital infrastructures. This presentation tries to explore and discuss the impact of Big Data challenges and new capabilities on the Global Earth Observation System of Systems (GEOSS) and particularly on its common digital infrastructure called GCI. GEOSS is a global and flexible network of content providers allowing decision makers to access an extraordinary range of data and information at their desk. The impact of the Big Data dimensionalities (commonly known as 'V' axes: volume, variety, velocity, veracity, visualization) on GEOSS is discussed. The main solutions and experimentation developed by GEOSS along these axes are introduced and analyzed. GEOSS is a pioneering framework for global and multidisciplinary data sharing in the Earth Observation realm; its experience on Big Data is valuable for the many lessons learned.
Software Developed for Analyzing High- Speed Rolling-Element Bearings
NASA Technical Reports Server (NTRS)
Fleming, David P.
2005-01-01
COBRA-AHS (Computer Optimized Ball & Roller Bearing Analysis--Advanced High Speed, J.V. Poplawski & Associates, Bethlehem, PA) is used for the design and analysis of rolling element bearings operating at high speeds under complex mechanical and thermal loading. The code estimates bearing fatigue life by calculating three-dimensional subsurface stress fields developed within the bearing raceways. It provides a state-of-the-art interactive design environment for bearing engineers within a single easy-to-use design-analysis package. The code analyzes flexible or rigid shaft systems containing up to five bearings acted upon by radial, thrust, and moment loads in 5 degrees of freedom. Bearing types include high-speed ball, cylindrical roller, and tapered roller bearings. COBRA-AHS is the first major upgrade in 30 years of such commercially available bearing software. The upgrade was developed under a Small Business Innovation Research contract from the NASA Glenn Research Center, and incorporates the results of 30 years of NASA and industry bearing research and technology.
Roller Bearing Health Monitoring Using CPLE Frequency Analysis Method
NASA Technical Reports Server (NTRS)
Jong, Jen-Yi; Jones, Jess H.
2007-01-01
This paper describes a unique vibration signature analysis technique - Coherence Phase Line Enhancer (CPLE) Frequency Analysis - for roller bearing health monitoring. Defects of roller bearings (e.g., wear, foreign debris, or cracks in the bearing support structure) can cause small shifts in the bearing characteristic frequencies due to minor changes in bearing geometry. Such frequency shifts are often too small to detect with the conventional Power Spectral Density (PSD) due to its frequency bandwidth limitation. The CPLE technology has been evolving over the last few years and has culminated in the introduction of a new and novel frequency spectrum, which is fully described in this paper. CPLE uses a "key phasor" or speed probe as a preprocessor for the analysis. With the aid of this key phasor, CPLE can develop a two-dimensional frequency spectrum that preserves both amplitude and phase, which is not normally obtained using conventional frequency analysis. This two-dimensional frequency transformation results in several newly defined spectral functions, i.e., CPLE-PSD, CPLE-Coherence, and CPLE-Frequency. This paper uses CPLE frequency analysis to detect subtle, low-level bearing-related signals in the High Pressure Fuel Pump (HPFP) of the Space Shuttle Main Engine (SSME). For many rotating machinery applications, a key phasor is an essential measurement used in the detection of bearing-related signatures. There are times, however, when a key phasor is not available, e.g., during flight of any of the SSME turbopumps, or on the SSME High Pressure Oxygen Turbopump (HPOTP), where no speed probe is present. In this case, the CPLE analysis approach can still be applied by using a novel Pseudo Key Phasor (PKP) technique to reconstruct a 1/Rev PKP signal directly from external vibration measurements. This paper develops this Pseudo Key Phasor technique and applies it to SSME vibration data.
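The small characteristic-frequency shifts CPLE is designed to resolve are shifts in the classical rolling-element bearing defect frequencies, which follow directly from bearing kinematics. A sketch of those standard formulas; the geometry values below are illustrative, not SSME turbopump data:

```python
import math

def bearing_frequencies(shaft_hz, n_balls, ball_d, pitch_d, contact_deg=0.0):
    """Classical rolling-element bearing characteristic frequencies (Hz)."""
    ratio = (ball_d / pitch_d) * math.cos(math.radians(contact_deg))
    return {
        "BPFO": 0.5 * n_balls * shaft_hz * (1 - ratio),   # outer-race defect
        "BPFI": 0.5 * n_balls * shaft_hz * (1 + ratio),   # inner-race defect
        "BSF": 0.5 * (pitch_d / ball_d) * shaft_hz * (1 - ratio ** 2),  # ball spin
        "FTF": 0.5 * shaft_hz * (1 - ratio),              # cage (fundamental train)
    }

# Illustrative geometry: 12 balls, 10 mm ball diameter on a 60 mm pitch
# diameter, shaft spinning at 600 Hz (36,000 rpm).
freqs = bearing_frequencies(shaft_hz=600.0, n_balls=12, ball_d=10.0, pitch_d=60.0)
```

A minor wear-induced change in `ball_d` or `pitch_d` shifts every one of these lines by a fraction of a percent, exactly the kind of shift a conventional PSD's bin width can hide.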
Feltus, Frank A.; Breen, Joseph R.; Deng, Juan; Izard, Ryan S.; Konger, Christopher A.; Ligon, Walter B.; Preuss, Don; Wang, Kuang-Ching
2015-01-01
In the last decade, high-throughput DNA sequencing has become a disruptive technology and pushed the life sciences into a distributed ecosystem of sequence data producers and consumers. Given the power of genomics and declining sequencing costs, biology is an emerging “Big Data” discipline that will soon enter the exabyte data range when all subdisciplines are combined. These datasets must be transferred across commercial and research networks in creative ways since sending data without thought can have serious consequences on data processing time frames. Thus, it is imperative that biologists, bioinformaticians, and information technology engineers recalibrate data processing paradigms to fit this emerging reality. This review attempts to provide a snapshot of Big Data transfer across networks, which is often overlooked by many biologists. Specifically, we discuss four key areas: 1) data transfer networks, protocols, and applications; 2) data transfer security including encryption, access, firewalls, and the Science DMZ; 3) data flow control with software-defined networking; and 4) data storage, staging, archiving and access. A primary intention of this article is to orient the biologist in key aspects of the data transfer process in order to frame their genomics-oriented needs to enterprise IT professionals. PMID:26568680
The Path of the Blind Watchmaker: A Model of Evolution
2011-04-06
computational biology has now reached the point that astronomy reached when it began to look backward in time to the Big Bang. Our goal is to look backward in...treatment. We claim that computational biology has now reached the point that astronomy reached when it began to look backward in time to the Big...evolutionary process itself, in fact, created it. When astronomy reached a critical mass of theory, technology, and observational data, astronomers
Fall 2014 Data-Intensive Systems
2014-10-29
Oct 2014 © 2014 Carnegie Mellon University Big Data Systems NoSQL and horizontal scaling are changing architecture principles by creating...University Status LEAP4BD • Ready to pilot QuABase • Prototype is complete – covers 8 NoSQL/NewSQL implementations • Completing validation testing Big...machine learning to automate population of knowledge base • Initial focus on NoSQL/NewSQL technology domain • Extend to create knowledge bases in other
Can big business save health care?
Dunn, Philip
2007-01-01
Corporate America has decided to stop bellyaching about the cost and quality of the health care it helps buy for its employees. Now it's taking concrete action. Large employers such as Wal-Mart, Oracle, Cisco, BP America and many, many others are pressuring providers to meet performance standards, adopt information technology and transform the efficiency of their operations. Big Business wants value for its buck, and it's now putting money where its mouth is.
Welsh, Elaine; Jirotka, Marina; Gavaghan, David
2006-06-15
We examine recent developments in cross-disciplinary science and contend that a 'Big Science' approach is increasingly evident in the life sciences, facilitated by a breakdown of the traditional barriers between academic disciplines and the application of technologies across these disciplines. The first fruits of 'Big Biology' are beginning to be seen in, for example, genomics, (bio)-nanotechnology and systems biology. We suggest that this has profound implications for the research process and presents challenges in technological design, in the provision of infrastructure and training, in the organization of research groups, and in providing suitable research funding mechanisms and reward systems. These challenges need to be addressed if the promise of this approach is to be fully realized. In this paper, we draw on the work of social scientists to understand how these developments in science and technology relate to organizational culture, organizational change and the context of scientific work. We seek to learn from previous technological developments that seemed to offer similar potential for organizational and social change.
A peek into the future of radiology using big data applications.
Kharat, Amit T; Singhal, Shubham
2017-01-01
Big data refers to the extremely large volumes of data available in the radiology department. Big data is identified by four Vs - Volume, Velocity, Variety, and Veracity. By applying different algorithmic tools to convert raw data into transformed data in such large datasets, there is a possibility of understanding and using radiology data to gain new knowledge and insights. Big data analytics consists of 6Cs - Connection, Cloud, Cyber, Content, Community, and Customization. The global per-capita capacity to store digital information has roughly doubled every 40 months since the 1980s. By using big data, the planning and implementation of radiological procedures in radiology departments can be given a great boost. Potential future applications of big data include scheduling of scans, creating patient-specific personalized scanning protocols, radiologist decision support, emergency reporting, virtual quality assurance for the radiologist, etc. Big data applications can be targeted at images to support the analytic process. Screening software tools designed on big data can be used to highlight a region of interest, such as subtle changes in parenchymal density, a solitary pulmonary nodule, or focal hepatic lesions, by plotting its multidimensional anatomy. Following this, we can run more complex applications, such as three-dimensional multiplanar reconstruction (MPR), volumetric rendering (VR), and curved planar reconstruction, which consume higher system resources, on targeted data subsets rather than querying the complete cross-sectional imaging dataset. This pre-emptive selection of the dataset can substantially reduce system requirements such as memory and server load, and provide prompt results. However, a word of caution: big data should not become "dump data" due to inadequate and poor analysis and unstructured, improperly stored data.
In the near future, big data can ring in the era of personalized and individualized healthcare.
Intelligent technologies in process of highly-precise products manufacturing
NASA Astrophysics Data System (ADS)
Vakhidova, K. L.; Khakimov, Z. L.; Isaeva, M. R.; Shukhin, V. V.; Labazanov, M. A.; Ignatiev, S. A.
2017-10-01
One of the main inspection methods for the surface layer of bearing parts is eddy current testing. Surface layer defects of bearing parts, such as burns and cracks, are reflected in the results of scans of the rolling surfaces. A previously developed method for detecting defects from an image of the raceway was quite effective, but its processing algorithm is complicated and takes about 12-16 s. The real non-stationary signals from an eddy current transducer (ECT) consist of short-time high-frequency and long-time low-frequency components, so their analysis requires a transformation that provides different windows for different frequencies. The wavelet transform meets these conditions. On this basis, a methodology for automatically detecting and recognizing local defects in the surface layer of bearing parts has been developed using wavelet analysis with integral estimates. Some defects are recognized from the amplitude component; otherwise, an automatic transition to recognition from the phase component of the information signals (IS) is carried out. The use of intelligent technologies in the manufacture of bearing parts will, firstly, significantly improve the quality of bearings and, secondly, significantly improve production efficiency by reducing (or eliminating) rejections in the manufacture of products, increasing the period of normal operation of the technological equipment (the inter-adjustment period), implementing a system of flexible equipment maintenance, and reducing production costs.
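The time-frequency splitting the authors get from the wavelet transform can be illustrated with a single-level Haar decomposition in pure Python. The synthetic signal and the simple max-magnitude estimate below are illustrative stand-ins for the ECT signal and the integral estimates used in the paper, whose exact wavelet and thresholds are not given here:

```python
def haar_step(signal):
    """One level of an (unnormalized) Haar wavelet transform: pairwise
    averages give the low-frequency approximation; pairwise half-differences
    give the high-frequency detail coefficients."""
    approx = [(signal[i] + signal[i + 1]) / 2 for i in range(0, len(signal) - 1, 2)]
    detail = [(signal[i] - signal[i + 1]) / 2 for i in range(0, len(signal) - 1, 2)]
    return approx, detail

# Synthetic raceway scan: a smooth (here flat) profile with one short,
# sharp spike standing in for a local defect such as a burn or crack.
scan = [0.0] * 32
scan[20] = 1.0

approx, detail = haar_step(scan)

# A local defect appears as an outlier in the detail coefficients; a simple
# threshold on the maximum magnitude flags it (threshold is illustrative).
defect_detected = max(abs(d) for d in detail) > 0.25
```

A short spike barely perturbs the low-frequency approximation but stands out sharply in the detail coefficients, which is why a windowed transform suits these non-stationary signals better than a global Fourier spectrum.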
Biosecurity in the age of Big Data: a conversation with the FBI.
You, Edward; Kozminski, Keith G
2015-11-05
New scientific frontiers and emerging technologies within the life sciences pose many global challenges to society. Big Data is a premier example, especially with respect to individual, national, and international security. Here a Special Agent of the Federal Bureau of Investigation discusses the security implications of Big Data and the need for security in the life sciences. © 2015 Kozminski. This article is distributed by The American Society for Cell Biology under license from the author(s). Two months after publication it is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).
Big data in pharmacy practice: current use, challenges, and the future.
Ma, Carolyn; Smith, Helen Wong; Chu, Cherie; Juarez, Deborah T
2015-01-01
Pharmacy informatics is defined as the use and integration of data, information, knowledge, technology, and automation in the medication-use process for the purpose of improving health outcomes. The term "big data" has been coined and is often defined by three Vs: volume, velocity, and variety. This paper describes three major areas in which pharmacy utilizes big data, including: 1) informed decision making (clinical pathways and clinical practice guidelines); 2) improved care delivery in health care settings such as hospitals and community pharmacy practice settings; and 3) quality performance measurement for the Centers for Medicare and Medicaid and medication management activities such as tracking medication adherence and medication reconciliation.
The 2025 Big "G" Geriatrician: Defining Job Roles to Guide Fellowship Training.
Simpson, Deborah; Leipzig, Rosanne M; Sauvigné, Karen
2017-10-01
Changes in health care that are already in progress, including value- and population-based care, use of new technologies for care, big data and machine learning, and the patient as consumer and decision maker, will determine the job description for geriatricians practicing in 2025. Informed by these future certainties, 115 geriatrics educators attending the 2016 Donald W. Reynolds Foundation Annual meeting identified five 2025 geriatrician job roles: complexivist; consultant; health system leader and innovator; functional preventionist; and educator for big "G" and little "g" providers. By identifying these job roles, geriatrics fellowship training can be preemptively redesigned. © 2017, Copyright the Authors Journal compilation © 2017, The American Geriatrics Society.
NASA Astrophysics Data System (ADS)
Dong, Yumin; Xiao, Shufen; Ma, Hongyang; Chen, Libo
2016-12-01
Cloud computing and big data have become the engines driving the development of current information technology (IT). However, security protection has become increasingly important for cloud computing and big data and is a problem that must be solved for cloud computing to develop further. The theft of identity authentication information remains a serious threat to the security of cloud computing. Attackers who obtain identity authentication information can intrude into cloud computing services, threatening the security of data from multiple perspectives. Therefore, this study proposes a model for cloud computing protection and management based on quantum authentication, introduces the principle of quantum authentication, and deduces the quantum authentication process. In theory, quantum authentication technology can be applied in cloud computing for security protection. Quantum states cannot be cloned; thus, the method is more secure and reliable than classical ones.
DOE Office of Scientific and Technical Information (OSTI.GOV)
McCormick, Frederick B.; Shalf, John; Mitchell, Alan
This report captures the initial conclusions of the DOE seven-National-Lab team collaborating on the "Solving the Information Technology Energy Challenge Beyond Moore's Law" initiative from the DOE Big Idea Summit III held in April of 2016. The seven Labs held a workshop in Albuquerque, NM in late July 2016 and gathered 40 researchers into 5 working groups: 4 groups spanning the levels of the co-design framework shown below, and a 5th working group focused on extending and advancing manufacturing approaches and coupling their constraints to all of the framework levels. These working groups have identified unique capabilities within the Labs to support the key challenges of this Beyond Moore's Law Computing (BMC) vision, as well as example first steps and potential roadmaps for technology development.
Entomological Collections in the Age of Big Data.
Short, Andrew Edward Z; Dikow, Torsten; Moreau, Corrie S
2018-01-07
With a million described species and more than half a billion preserved specimens, the large scale of insect collections is unequaled by those of any other group. Advances in genomics, collection digitization, and imaging have begun to more fully harness the power that such large data stores can provide. These new approaches and technologies have transformed how entomological collections are managed and utilized. While genomic research has fundamentally changed the way many specimens are collected and curated, advances in technology have shown promise for extracting sequence data from the vast holdings already in museums. Efforts to mainstream specimen digitization have taken root and have accelerated traditional taxonomic studies as well as distribution modeling and global change research. Emerging imaging technologies such as microcomputed tomography and confocal laser scanning microscopy are changing how morphology can be investigated. This review provides an overview of how the realization of big data has transformed our field and what may lie in store.
The Opportunity and Challenge of The Age of Big Data
NASA Astrophysics Data System (ADS)
Yunguo, Hong
2017-11-01
The arrival of the big data age has steadily expanded the scale of China's information industry, creating favorable conditions for the growth of information technology and computer networks. Based on big data, computer system service functions are becoming more and more complete, and the efficiency of data processing in these systems is improving, which provides an important guarantee for the implementation of production plans across industries. At the same time, the rapid development of fields such as the Internet of Things, social tools, and cloud computing, together with widening information channels, has increased the amount of data and expanded the influence of the big data age. We need to face the opportunities and challenges of the big data age correctly and use data and information resources effectively. On this basis, this paper studies the opportunities and challenges of the era of big data.
The application of Big Data in medicine: current implications and future directions.
Austin, Christopher; Kusumoto, Fred
2016-10-01
Since the mid-1980s, the world has experienced an unprecedented explosion in the capacity to produce, store, and communicate data, primarily in digital formats. Simultaneously, access to computing technologies in the form of the personal computer, smartphone, and other handheld devices has mirrored this growth. With these enhanced capabilities of data storage and rapid computation as well as real-time delivery of information via the internet, the average daily consumption of data by an individual has grown exponentially. Unbeknownst to many, Big Data has silently crept into our daily routines and, with continued development of cheap data storage and availability of smart devices both regionally and in developing countries, the influence of Big Data will continue to grow. This influence has also carried over to healthcare. This paper will provide an overview of Big Data, its benefits, potential pitfalls, and the projected impact on the future of medicine in general and cardiology in particular.
Kruse, Christian
2018-06-01
This review covers current practices and technologies within the scope of "Big Data" that can further our understanding of diabetes mellitus and osteoporosis from large volumes of data. "Big Data" techniques involving supervised machine learning, unsupervised machine learning, and deep learning image analysis are presented with examples from the current literature. Supervised machine learning can allow us to better predict diabetes-induced osteoporosis and understand the relative importance of predictors in diabetes-affected bone tissue. Unsupervised machine learning can allow us to understand patterns in data linking diabetic pathophysiology and altered bone metabolism. Image analysis using deep learning can allow us to be less dependent on surrogate predictors and to use large volumes of images to classify diabetes-induced osteoporosis and predict future outcomes directly from images. "Big Data" techniques herald new possibilities to understand diabetes-induced osteoporosis and advance our current ability to classify, understand, and predict this condition.
Systems biology for nursing in the era of big data and precision health.
Founds, Sandra
2017-12-02
The systems biology framework was previously synthesized with the person-environment-health-nursing metaparadigm. The purpose of this paper is to present a nursing discipline-specific perspective on the association of systems biology with big data and precision health. The fields of systems biology, big data, and precision health are overviewed, from their origins through their expansions, with examples of what nurses are doing in each area of science. Technological advances continue to expand omics and other varieties of big data that inform a person's phenotype and health outcomes for precision care. Meanwhile, millions of participants in the United States are being recruited for health-care research initiatives aimed at building the information commons of digital health data. Implications and opportunities abound via conceptualizing the integration of these fields through the nursing metaparadigm. Copyright © 2017 Elsevier Inc. All rights reserved.
ERIC Educational Resources Information Center
Eisenberg, Mike; Johnson, Doug; Berkowitz, Bob
2010-01-01
There is clear and widespread agreement among the public and educators that all students need to be proficient technology users. Technology literacy is among the attributes that appear in nearly every set of "21st Century Skills." However, while districts spend a great deal of money on technology, there seems to be only a vague notion of what…
Computer Technology and Social Issues.
ERIC Educational Resources Information Center
Garson, G. David
Computing involves social issues and political choices. Issues such as privacy, computer crime, gender inequity, disemployment, and electronic democracy versus "Big Brother" are addressed in the context of efforts to develop a national public policy for information technology. A broad range of research and case studies are examined in an…
Gearbox Instrumentation for the Investigation of Bearing Axial Cracking
DOE Office of Scientific and Technical Information (OSTI.GOV)
Keller, Jonathan A; Lambert, Scott R
Failures in gearbox bearings have been the primary source of reliability issues for wind turbine drivetrains, leading to costly downtime and unplanned maintenance. The most common failure mode is attributed to so-called axial cracks or white-etching cracks, which primarily affect the intermediate and high-speed-stage bearings. The high-speed-shaft and bearing loads and sliding will be measured with a specially instrumented gearbox installed in a 1.5-megawatt turbine at the National Wind Technology Center in an upcoming test campaign. Additional instrumentation will also measure the tribological environment of these bearings, including bearing temperatures, lubricant temperature and water content, air temperature and humidity, and stray electrical current across the bearings. This paper fully describes the instrumentation package and summarizes initial results.
OpenCourseWare and Open Educational Resources: The Next Big Thing in Technology-Enhanced Education?
ERIC Educational Resources Information Center
Caudill, Jason G.
2012-01-01
OpenCourseWare (OCW) and Open Educational Resources (OER) are two new and closely related educational technologies. Both provide open access to learning materials for students and instructors via the Internet. These are for the moment still very young technologies. While they have grown dramatically in just ten years there is still relatively…
NASA Technical Reports Server (NTRS)
Sindlinger, R. S.
1977-01-01
A 3-axis active attitude control system with only one rotating part was developed using a momentum wheel with magnetic gimballing capability as a torque actuator for all three body axes. A brief description of magnetic bearing technology is given. It is concluded that, based on this technology, an integrated energy storage/attitude control system with one pair of counterrotating rings could reduce the complexity and weight of conventional systems.
Courses of Action to Optimize Heavy Bearings Cages
NASA Astrophysics Data System (ADS)
Szekely, V. G.
2016-11-01
Global expansion in the industrial, economic, and technological context creates the need to develop products, technologies, processes, and methods which ensure increased performance, lower manufacturing costs, and alignment of the main costs with the elementary values corresponding to utilization. The development trend of the heavy bearing industry and the wide use of bearings make it necessary to choose the most appropriate material for a given application in order to meet the cumulative requirements of durability, reliability, strength, etc. Evaluation of commonly known or new materials is a fundamental criterion for choosing materials based on cost, machinability, and the technological process. To ensure the most effective basis for the decision regarding the heavy bearing cage, the functions of the product are established in a first stage, and in a further step a comparative analysis of the materials is made to establish the best materials that satisfy the product functions. The decision for selecting the most appropriate material rests largely on weighing the material costs against the manufacturing process during which the half-finished material becomes a finished product. The study is oriented towards a creative approach, especially towards innovation and reengineering, using specific techniques and methods applied in inventics. The main target is to find new efficient and reliable constructive and/or technological solutions consistent with the concept of sustainable development.
Boubela, Roland N.; Kalcher, Klaudius; Huf, Wolfgang; Našel, Christian; Moser, Ewald
2016-01-01
Technologies for scalable analysis of very large datasets have emerged in the domain of internet computing, but are still rarely used in neuroimaging despite the existence of data and research questions in need of efficient computation tools especially in fMRI. In this work, we present software tools for the application of Apache Spark and Graphics Processing Units (GPUs) to neuroimaging datasets, in particular providing distributed file input for 4D NIfTI fMRI datasets in Scala for use in an Apache Spark environment. Examples for using this Big Data platform in graph analysis of fMRI datasets are shown to illustrate how processing pipelines employing it can be developed. With more tools for the convenient integration of neuroimaging file formats and typical processing steps, big data technologies could find wider endorsement in the community, leading to a range of potentially useful applications especially in view of the current collaborative creation of a wealth of large data repositories including thousands of individual fMRI datasets. PMID:26778951
Creation a Geo Big Data Outreach and Training Collaboratory for Wildfire Community
NASA Astrophysics Data System (ADS)
Altintas, I.; Sale, J.; Block, J.; Cowart, C.; Crawl, D.
2015-12-01
A major challenge for the geoscience community is the training and education of the current and next generation of big data geoscientists. In wildfire research, there is an increasing number of tools, middleware, and techniques for data science related to wildfires. The necessary computing infrastructures are often within reach, and most of the software tools for big data are freely available. What has been lacking is a transparent platform and training program to produce data science experts who can use these integrated tools effectively. Producing scientists well versed in taking advantage of big data technologies in geoscience applications is of critical importance to the future of research and knowledge advancement. To address this critical need, we are developing learning modules that teach process-based thinking, capture the value of end-to-end systems of reusable blocks of knowledge, and integrate the tools and technologies used in big data analysis in an intuitive manner. WIFIRE is an end-to-end cyberinfrastructure for dynamic data-driven simulation, prediction, and visualization of wildfire behavior. To this end, we are openly extending an environment we have built for "big data training" (biobigdata.ucsd.edu) with similar MOOC-based approaches for the wildfire community. We are building an environment that includes training modules for distributed platforms and systems, Big Data concepts, and scalable workflow tools, along with other basics of data science including data management, reproducibility, and sharing of results. We also plan to provide teaching modules with analytical and dynamic data-driven wildfire behavior modeling case studies which address the needs not only of standards-based K-12 science education but also of a well-educated and informed citizenry. Another part of our outreach mission is to educate our community on all aspects of wildfire research.
One of the most successful ways of accomplishing this is through high school and undergraduate student internships. Students have worked closely with WIFIRE researchers on various projects including the development of statistical models of wildfire ignition probabilities for southern California, and the development of a smartphone app for crowd-sourced wildfire reporting through social networks such as Twitter and Facebook.
The Prospect of Internet of Things and Big Data Analytics in Transportation System
NASA Astrophysics Data System (ADS)
Noori Hussein, Waleed; Kamarudin, L. M.; Hussain, Haider N.; Zakaria, A.; Badlishah Ahmed, R.; Zahri, N. A. H.
2018-05-01
The Internet of Things (IoT), the emerging technology that describes how data, people, and interconnected physical objects act on communicated information, and big data analytics have been adopted by diverse domains for varying purposes. Manufacturing, agriculture, banking, oil and gas, healthcare, retail, hospitality, and food services are a few of the sectors that have adopted and massively utilized IoT and big data analytics. The transportation industry is also an early adopter, with significant attendant effects on its processes of shipment tracking, freight monitoring, and transparent warehousing. This is recorded in countries such as England, Singapore, Portugal, and Germany, while Malaysia is currently assessing the potential and researching a purpose-driven adoption and implementation. Based on a review of related literature, this paper presents a summary of the inherent prospects of adopting IoT and big data analytics in the Malaysian transportation system. An efficient and safe port environment, predictive maintenance and remote management, and a boundary-less software platform and connected ecosystem, among others, are inherent benefits of IoT and big data analytics for the Malaysian transportation system.
STIFTER, Janet; YAO, Yingwei; LOPEZ, Karen Dunn; KHOKHAR, Ashfaq; WILKIE, Diana J.; KEENAN, Gail M.
2015-01-01
The influence of the staffing variable nurse continuity on patient outcomes has been rarely studied and with inconclusive results. Multiple definitions and an absence of systematic methods for measuring the influence of continuity have resulted in its exclusion from nurse-staffing studies and conceptual models. We present a new conceptual model and an innovative use of health information technology to measure nurse continuity and to demonstrate the potential for bringing the results of big data science back to the bedside. Understanding the power of big data to address critical clinical issues may foster a new direction for nursing administration theory development. PMID:26244480
NASA Technical Reports Server (NTRS)
Zirin, H.; Tanaka, K.
1972-01-01
Analysis is made of observations of the August 1972 flares at Big Bear and Tel Aviv, involving monochromatic movies, magnetograms, and spectra. In each flare the observations fit a model of particle acceleration in the chromosphere, with emission produced by impact and by heating from the energetic electrons and protons. The region showed twisted flux and high gradients from birth, and the flares appear to be due to strong magnetic shears and gradients across the neutral line produced by sunspot motions. Post-flare loops show a strong change from sheared, force-free fields parallel to the neutral line to potential-field-like loops perpendicular to the neutral line above the surface.
1990-01-01
Department of the Army, RAND Corp. Technical review by David K. Horne and Robert W. Tinney. ...changes to the Sixth Quadrennial Review of Military Compensation. viii COMPARISON OF RETENTION PATTERNS FOR NATIONAL GUARD AND ARMY RESERVE UNITS ...duty time: movies, volleyball, and time to go to the PX at Big Bear. Because of the work schedule, the soldiers did not have any free time in the
Health care prices, the federal budget, and economic growth.
Monaco, R M; Phelps, J H
1995-01-01
Rising health care spending, led by rising prices, has had an enormous impact on the economy, especially on the federal budget. Our work shows that if rapid growth in health care prices continues, under current institutional arrangements, real economic growth and employment will be lower during the next two decades than if health price inflation were somehow reduced. How big the losses are and which sectors bear the brunt of the costs vary depending on how society chooses to fund the federal budget deficit that stems from the rising cost of federal health care programs.
Nanotechnology: The Incredible Invisible World
ERIC Educational Resources Information Center
Roberts, Amanda S.
2011-01-01
The concept of nanotechnology was first introduced in 1959 by Richard Feynman at a meeting of the American Physical Society. Nanotechnology opens the door to an exciting new science/technology/engineering field. The possibilities for the uses of this technology should inspire the imagination to think big. Many are already pursuing such feats…
Adult Literacy and Technology Newsletter. Vol. 3, Nos. 1-4.
ERIC Educational Resources Information Center
Gueble, Ed, Ed.
1989-01-01
This document consists of four issues of a newsletter focused on the spectrum of technology use in literacy instruction. The first issue contains the following articles: "Five 'Big' Systems and One 'Little' Option" (Weisberg); "Computer Use Patterns at Blackfeet Community College" (Hill); "Software Review: Educational Activities' Science Series"…
Learning and Teaching Information Technology--Computer Skills in Context. ERIC Digest.
ERIC Educational Resources Information Center
Eisenberg, Michael B.; Johnson, Doug
This digest describes an integrated approach to teaching computer skills in K-12 schools. The introductory section discusses the importance of integrating information skills into the curriculum. "Technology Skills for Information Problem Solving: A Curriculum Based on the Big6 Skills Approach" (Michael B. Eisenberg, Doug Johnson, and…
Getting Results: Small Changes, Big Cohorts and Technology
ERIC Educational Resources Information Center
Kenney, Jacqueline L.
2012-01-01
This paper presents an example of constructive alignment in practice. Integrated technology supports were deployed to increase the consistency between learning objectives, activities and assessment and to foster student-centred, higher-order learning processes in the unit. Modifications took place over nine iterations of a second-year Marketing…
ERIC Educational Resources Information Center
Roberts-Mahoney, Heather; Means, Alexander J.; Garrison, Mark J.
2016-01-01
Advanced by powerful venture philanthropies, educational technology companies, and the US Department of Education, a growing movement to apply "big data" through "learning analytics" to create "personalized learning" is currently underway in K-12 education in the United States. While scholars have offered various…
ERIC Educational Resources Information Center
Milner, Jacob
2005-01-01
In districts big and small across the U.S., students, teachers, and administrators alike have come to appreciate the benefits of wireless technology. Because the technology delivers Internet signals on airborne radio frequencies, wireless networking allows users of all portable devices to move freely on a school's campus and stay connected to the…
NASA Technical Reports Server (NTRS)
DellaCorte, Christopher; Radil, Kevin C.; Bruckner, Robert J.; Howard, S. Adam
2007-01-01
Foil gas bearings are self-acting hydrodynamic bearings made from sheet metal foils comprised of at least two layers. The innermost top foil layer traps a gas pressure film that supports a load while a layer or layers underneath provide an elastic foundation. Foil bearings are used in many lightly loaded, high-speed turbo-machines such as compressors used for aircraft pressurization, and small micro-turbines. Foil gas bearings provide a means to eliminate the oil system leading to reduced weight and enhanced temperature capability. The general lack of familiarity of the foil bearing design and manufacturing process has hindered their widespread dissemination. This paper reviews the publicly available literature to demonstrate the design, fabrication and performance testing of both first and second generation bump style foil bearings. It is anticipated that this paper may serve as an effective starting point for new development activities employing foil bearing technology.
Gao, Zheyu; Lin, Jing; Wang, Xiufeng; Xu, Xiaoqiang
2017-05-24
Rolling bearings are widely used in rotating equipment. Detection of bearing faults is of great importance to guarantee the safe operation of mechanical systems. Acoustic emission (AE), as one of the bearing monitoring technologies, is sensitive to weak signals and performs well in detecting incipient faults. Therefore, AE is widely used in monitoring the operating status of rolling bearings. This paper utilizes the Empirical Wavelet Transform (EWT) to decompose AE signals adaptively into mono-components, followed by calculation of the correlated kurtosis (CK) at certain time intervals of these components. By comparing these CK values, the resonant frequency of the rolling bearing can be determined. The fault characteristic frequencies are then found from the envelope spectrum. Both a simulated signal and rolling bearing AE signals are used to verify the effectiveness of the proposed method. The results show that the new method performs well in identifying the bearing fault frequency under strong background noise.
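The envelope step at the end of the pipeline above can be illustrated with a short numpy/scipy sketch. A hedge is needed: the paper's EWT decomposition and correlated-kurtosis band selection are not reproduced here; a plain Hilbert-transform envelope spectrum on a synthetic fault signal stands in for them, and the sampling rate, resonance frequency, and fault frequency are invented for the demonstration.

```python
import numpy as np
from scipy.signal import hilbert

fs = 20000                       # sampling rate, Hz (assumed)
bpfo = 100.0                     # assumed fault characteristic frequency, Hz
t = np.arange(0, 1.0, 1.0 / fs)

# Synthetic AE-like signal: repetitive impulses at the fault rate, each
# exciting a 4 kHz resonance, buried in background noise.
sig = np.zeros_like(t)
for k in range(int(bpfo)):
    idx = int(k * fs / bpfo)
    n = np.arange(200)
    sig[idx:idx + 200] += np.exp(-n / 30.0) * np.sin(2 * np.pi * 4000 * n / fs)
rng = np.random.default_rng(0)
sig += 0.2 * rng.standard_normal(len(t))

# Envelope via the Hilbert transform, then its spectrum.
envelope = np.abs(hilbert(sig))
spectrum = np.abs(np.fft.rfft(envelope - envelope.mean()))
freqs = np.fft.rfftfreq(len(envelope), 1.0 / fs)

# The dominant low-frequency envelope line sits at the fault frequency.
band = (freqs > 10) & (freqs < 500)
detected = freqs[band][np.argmax(spectrum[band])]
```

Even though the raw spectrum is dominated by the 4 kHz resonance and noise, the envelope spectrum recovers the 100 Hz impact repetition rate, which is the quantity compared against the bearing's theoretical fault frequencies.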
Acoustic detection of rail car roller bearing defects. Phase III, System evaluation test.
DOT National Transportation Integrated Search
2003-08-01
In July 1999, Transportation Technology Center, Inc. (TTCI), a subsidiary of the Association of American Railroads (AAR), conducted a system evaluation test as part of the Federal Railroad Administration's (FRA) Improved Freight Car Roller Bearing ...
EarthServer: Cross-Disciplinary Earth Science Through Data Cube Analytics
NASA Astrophysics Data System (ADS)
Baumann, P.; Rossi, A. P.
2016-12-01
The unprecedented increase of imagery, in-situ measurements, and simulation data produced by Earth (and planetary) science observation missions bears a rich, yet unleveraged, potential for gaining insights by integrating such diverse datasets and transforming scientific questions into actual queries to data, formulated in a standardized way. The intercontinental EarthServer [1] initiative is demonstrating new directions for flexible, scalable Earth science services based on innovative NoSQL technology. Researchers from Europe, the US, and Australia have teamed up to rigorously implement the concept of the datacube. Such a datacube may have spatial and temporal dimensions (such as a satellite image time series) and may unite an unlimited number of scenes. Independently of whatever efficient data structuring a server network may perform internally, users (scientists, planners, decision makers) will always see just a few datacubes they can slice and dice. EarthServer has established client [2] and server technology for such spatio-temporal datacubes. The underlying scalable array engine, rasdaman [3,4], enables direct interaction, including 3-D visualization, common EO data processing, and general analytics. Services exclusively rely on the open OGC "Big Geo Data" standards suite, the Web Coverage Service (WCS). Conversely, EarthServer has shaped and advanced WCS based on the experience gained. The first phase of EarthServer has advanced scalable array database technology into 150+ TB services. Currently, petabyte datacubes are being built for ad-hoc and cross-disciplinary querying, e.g. using climate, Earth observation, and ocean data. We will present the EarthServer approach, its impact on OGC / ISO / INSPIRE standardization, and its platform technology, rasdaman. References: [1] Baumann, et al.
(2015) DOI: 10.1080/17538947.2014.1003106. [2] Hogan, P. (2011) NASA World Wind, Proceedings of the 2nd International Conference on Computing for Geospatial Research & Applications, ACM. [3] Baumann, P., et al. (2014) In Proc. 10th ICDM, 194-201. [4] Dumitru, A., et al. (2014) In Proc. ACM SIGMOD Workshop on Data Analytics in the Cloud (DanaC'2014), 1-4.
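The "slice and dice" interaction with a datacube is typically expressed as a WCS 2.0 GetCoverage request with one subset parameter per trimmed or sliced axis. The sketch below only constructs such a request URL; the endpoint and coverage name are hypothetical, while the key-value parameter names follow the WCS 2.0 convention.

```python
from urllib.parse import urlencode

# Hypothetical rasdaman/WCS endpoint and coverage name; real deployments differ.
endpoint = "https://example.org/rasdaman/ows"

def getcoverage_url(coverage, subsets, fmt="application/netcdf"):
    """Build a WCS 2.0 GetCoverage URL that slices a datacube by axis subsets.

    `subsets` maps axis name -> (lo, hi); hi=None means a slice point
    rather than a trim interval.
    """
    params = [
        ("service", "WCS"),
        ("version", "2.0.1"),
        ("request", "GetCoverage"),
        ("coverageId", coverage),
        ("format", fmt),
    ]
    params += [("subset", f"{axis}({lo},{hi})" if hi is not None else f"{axis}({lo})")
               for axis, (lo, hi) in subsets.items()]
    return endpoint + "?" + urlencode(params)

# Spatial trim on Lat/Long plus a time-slice on the (assumed) "ansi" axis.
url = getcoverage_url(
    "SST_datacube",
    {"Lat": ("40", "50"), "Long": ("-10", "5"), "ansi": ('"2015-06-01"', None)},
)
```

However large the cube is server-side, the client only ever names the axes and the sub-ranges it wants, which is the user-facing simplicity the abstract describes.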
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mumpower, J.L.
There are strong structural similarities between risks from technological hazards and big-purse state lottery games. Risks from technological hazards are often described as low-probability, high-consequence negative events. State lotteries could be equally well characterized as low-probability, high-consequence positive events. Typical communications about state lotteries provide a virtual strategic textbook for opponents of risky technologies. The same techniques can be used to sell lottery tickets or sell opposition to risky technologies. Eight basic principles are enumerated.
LVFS: A Big Data File Storage Bridge for the HPC Community
NASA Astrophysics Data System (ADS)
Golpayegani, N.; Halem, M.; Mauoka, E.; Fonseca, L. F.
2015-12-01
Merging Big Data capabilities into High Performance Computing architecture starts at the file storage level. Heterogeneous storage systems are emerging that offer enhanced features for dealing with Big Data, such as the IBM GPFS storage system's integration into Hadoop MapReduce. Taking advantage of these capabilities requires file storage systems to be adaptive and to accommodate these new storage technologies. We present the extension of the Lightweight Virtual File System (LVFS), currently running as the production system for the MODIS Level 1 and Atmosphere Archive and Distribution System (LAADS), to incorporate a flexible plugin architecture that allows easy integration of new HPC hardware and/or software storage technologies without disrupting workflows or system architectures, and with only minimal impact on existing tools. We consider two essential aspects provided by the LVFS plugin architecture that the future HPC community will need. First, it allows for the seamless integration of new and emerging hardware technologies that are significantly different from existing ones, such as Seagate's Kinetic disks and Intel's 3D XPoint non-volatile storage. Second is the transparent and instantaneous conversion between new software technologies and various file formats. With most current storage systems, a switch in file format would require costly reprocessing and a near doubling of storage requirements. We will install LVFS on UMBC's IBM iDataPlex cluster with a heterogeneous storage architecture utilizing local, remote, and Seagate Kinetic storage as a case study. LVFS merges different kinds of storage architectures to show users a uniform layout and therefore prevents any disruption in workflows, architecture design, or tool usage. We will show how LVFS converts HDF data, produced by applying machine learning algorithms to XCO2 Level 2 data from the OCO-2 satellite to produce CO2 surface fluxes, into GeoTIFF for visualization.
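The plugin idea above, one uniform path layout routed to interchangeable storage backends, can be sketched in a few lines. This is an illustrative toy, not the actual LVFS interface; all class and method names here are invented.

```python
# Toy plugin-style virtual file system, loosely modeled on the LVFS
# idea described above. The real LVFS API is not shown in the text,
# so every name here is hypothetical.

class StoragePlugin:
    """Backend interface: each plugin knows how to read one store."""
    def read(self, path: str) -> bytes:
        raise NotImplementedError

class LocalPlugin(StoragePlugin):
    def __init__(self, blobs): self.blobs = blobs
    def read(self, path): return self.blobs[path]

class KineticPlugin(StoragePlugin):
    """Stand-in for an object store such as Seagate Kinetic disks."""
    def __init__(self, objects): self.objects = objects
    def read(self, path): return self.objects[path]

class VirtualFS:
    """Routes uniform paths to whichever backend claims the prefix,
    so tools see one layout regardless of where the bytes live."""
    def __init__(self):
        self.mounts = {}

    def mount(self, prefix, plugin):
        self.mounts[prefix] = plugin

    def read(self, path):
        # Longest-prefix match picks the most specific mount point.
        for prefix, plugin in sorted(self.mounts.items(),
                                     key=lambda kv: -len(kv[0])):
            if path.startswith(prefix):
                return plugin.read(path[len(prefix):])
        raise FileNotFoundError(path)

vfs = VirtualFS()
vfs.mount("/local/", LocalPlugin({"granule.hdf": b"hdf-bytes"}))
vfs.mount("/kinetic/", KineticPlugin({"granule.tif": b"tiff-bytes"}))
print(vfs.read("/local/granule.hdf"))
```

Adding a new hardware backend then means writing one plugin class, with no change to the tools that call `read`.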
Changes in Hardware in Order to Accommodate Compliant Foil Air Bearings of a Larger Size
NASA Technical Reports Server (NTRS)
Zeszotek, Michelle
2004-01-01
Compliant foil air bearings are at the forefront of the Oil-Free turbomachinery revolution of supporting gas turbine engines with air-lubricated hydrodynamic bearings. Foil air bearings have existed for almost fifty years, yet their commercialization has been confined to relatively small, high-speed systems characterized by low temperatures and loads, such as air cycle machines, turbocompressors, and micro-turbines. Recent breakthroughs in foil air bearing design and solid lubricant coating technology have caused a resurgence of research toward applying Oil-Free technology to more demanding applications on the scale of small and mid-range aircraft gas turbine engines. To foster the transition of Oil-Free technology into gas turbine engines, in-house experiments need to be performed on foil air bearings to further the understanding of their complex operating principles. During my internship at NASA Glenn in the summer of 2003, a series of tests was performed to determine the internal temperature profile in a compliant bump-type foil journal air bearing operating at room temperature under various speed and load conditions. From these tests, a temperature profile was compiled, indicating that the circumferential thermal gradients were negligible. The tests further indicated that both journal rotational speed and radial load are responsible for heat generation, with speed playing the more significant role in the magnitude of the temperatures. As a result of the findings from the summer 2003 tests, it was decided that further testing would be needed, but with a bearing of a larger diameter: the bearing diameter would be increased from two inches to three inches. All of the existing test apparatus was designed specifically for a two-inch-diameter bearing. Thus, my project for the summer of 2004 was to focus specifically on the scatter shield placed around the test rig while running the bearings.
Essentially, I was to design a scatter shield that would accommodate the three-inch bearing and also meet all safety requirements. Furthermore, the new scatter shield had to house a heater used for high-speed and high-temperature testing. Using SolidWorks, a computer-aided modeling program, I accomplished the task set out for me and designed the new scatter shield; I also guided the fabrication process. As a result of this containment shield being designed, the Oil-Free turbomachinery team now has the ability to test bearings of larger diameters. Finally, it is expected that these tests will provide information useful for the validation of future analytical modeling codes.
Karampinas, Panagiotis K; Papadelis, Eustratios G; Vlamis, John A; Basiliadis, Hlias; Pneumaticos, Spiros G
2017-07-01
Young patients feel that maintaining sport activities after total hip arthroplasty is an important part of their quality of life. Most hip surgeons allow patients to return to low-impact activities, but significant caution is advised for high-impact activities. The purpose of this study was to compare and evaluate the postoperative return to daily living habits and sport activities following short-metaphyseal hip arthroplasty and high-functional total hip arthroplasties (resurfacing and big-femoral-head arthroplasties). In a comparative study design, 48 patients (55 hips) were enrolled in three groups: short-metaphyseal arthroplasties, high-functional resurfacing arthroplasties, and big-femoral-head arthroplasties. Each patient underwent a clinical examination, was evaluated with the Harris Hip Score, WOMAC, SF-36, UCLA activity score, satisfaction VAS, and anteroposterior and lateral hip X-rays, and was followed in an outpatient setting for 2 years. Statistical analysis revealed no notable differences between the three groups in their demographic data; however, significant differences were found between the preoperative and postoperative clinical scores of each group. We also failed to reveal any significant differences when comparing the clinical scores of all three groups at the final 2-year postoperative control. The overall outcome of all three groups was similar: all patients were satisfied and returned to their previous level of sport activities. Short-metaphyseal hip arthroplasties allowed young patients intending to return to previous and even high-impact sport activities to do so, similar to high-functional resurfacing and big-femoral-head arthroplasties. Short stems with hard-on-hard bearing surfaces might become an alternative to standard stems and hip resurfacing.
Milazzo, Mary Louise; Cajimat, Maria N B; Mauldin, Matthew R; Bennett, Stephen G; Hess, Barry D; Rood, Michael P; Conlan, Christopher A; Nguyen, Kiet; Wekesa, J Wakoli; Ramos, Ronald D; Bradley, Robert D; Fulhorst, Charles F
2015-02-01
The objective of this study was to advance our knowledge of the epizootiology of Bear Canyon virus and other Tacaribe serocomplex viruses (Arenaviridae) associated with wild rodents in California. Antibody (immunoglobulin G [IgG]) to a Tacaribe serocomplex virus was found in 145 (3.6%) of 3977 neotomine rodents (Cricetidae: Neotominae) captured in six counties in southern California. The majority (122 or 84.1%) of the 145 antibody-positive rodents were big-eared woodrats (Neotoma macrotis) or California mice (Peromyscus californicus). The 23 other antibody-positive rodents included a white-throated woodrat (N. albigula), desert woodrat (N. lepida), Bryant's woodrats (N. bryanti), brush mice (P. boylii), cactus mice (P. eremicus), and deer mice (P. maniculatus). Analyses of viral nucleocapsid protein gene sequence data indicated that Bear Canyon virus is associated with N. macrotis and/or P. californicus in Santa Barbara County, Los Angeles County, Orange County, and western Riverside County. Together, analyses of field data and antibody prevalence data indicated that N. macrotis is the principal host of Bear Canyon virus. Last, the analyses of viral nucleocapsid protein gene sequence data suggested that the Tacaribe serocomplex virus associated with N. albigula and N. lepida in eastern Riverside County represents a novel species (tentatively named "Palo Verde virus") in the genus Arenavirus.
Detection and Characterisation of Meteors as a Big Data Citizen Science project
NASA Astrophysics Data System (ADS)
Gritsevich, M.
2017-12-01
Out of around 50,000 meteorites currently known to science, the atmospheric passage was recorded instrumentally in only about 30 cases with the potential to derive their atmospheric trajectories and pre-impact heliocentric orbits. Similarly, while observations of meteors add thousands of new entries per month to existing databases, it is extremely rare that they lead to meteorite recovery. Meteor studies thus represent an excellent example of a Big Data citizen science project, where progress in the field largely depends on the prompt identification and characterisation of meteor events as well as on extensive and valuable contributions by amateur observers. Over the last couple of decades, technological advancements in observational techniques have yielded drastic improvements in the quality, quantity and diversity of meteor data, while even more ambitious instruments are about to become operational. This empowers meteor science to broaden its experimental and theoretical horizons and seek more advanced scientific goals. We review some of the developments that push meteor science into the Big Data era, which requires more complex methodological approaches through interdisciplinary collaborations with other branches of physics and computer science. We argue that meteor science should become an integral part of large surveys in astronomy, aeronomy and space physics, and tackle the complexity of the micro-physics of meteor plasma and its interaction with the atmosphere. The recent increased interest in meteor science triggered by the Chelyabinsk fireball helps in building the case for technologically and logistically more ambitious meteor projects. This requires developing new methodological approaches in meteor research, with Big Data science and close collaboration between citizen science, geoscience and astronomy as critical elements.
We discuss possibilities for improvements and promote an opportunity for collaboration in meteor science within the currently established EU COST BigSkyEarth http://bigskyearth.eu/ network.
Active-Controlled Fluid Film Based on Wave-Bearing Technology
NASA Technical Reports Server (NTRS)
Dimofte, Florin; Hendricks, Robert C.
2011-01-01
It has been known since 1967 that the steady-state and dynamic performance of a wave bearing, including its stability, are highly dependent on the wave amplitude. A wave-bearing profile can be readily obtained by elastically distorting the stationary bearing sleeve surface. The force that distorts the elastic sleeve surface could be an applied force or pressure. The magnitude and response of the distorting force would be defined by the relation between the bearing surface stiffness and the bearing pressure, or load, in a feedback loop controller. Using devices such as piezoelectric or other electromechanical elements, one could step-control or fully control the bearing. The selection between these systems depends on the manner in which the distortion forces are applied, the running speed, and the reaction time of the feedback loop. With these techniques, both liquid- (oil-) and gas- (air-) lubricated wave bearings could be controlled. This report gives some examples of the dependency of the bearing's performance on the wave amplitude. The analysis was also confirmed experimentally.
Hybrid hydrostatic/ball bearings in high-speed turbomachinery
NASA Technical Reports Server (NTRS)
Nielson, C. E.
1983-01-01
A high-speed, high-pressure liquid hydrogen turbopump was designed, fabricated, and tested under a previous contract. This design was then modified to incorporate hybrid hydrostatic/ball bearings on both the pump end and the turbine end, replacing the original conventional ball bearing packages. The design, analysis, turbopump modification, assembly, and testing of the turbopump with hybrid bearings are presented here. Initial design considerations and a rotordynamic performance analysis were made to define expected turbopump operating characteristics and are reported. The results of testing the turbopump to speeds of 9215 rad/s (88,000 rpm) using a wide range of hydrostatic bearing supply pressures are presented. The hydrostatic bearing test data and the rotordynamic behavior of the turbopump were closely analyzed and are included in the report. The testing of hybrid hydrostatic/ball bearings on a turbopump to these high-speed requirements has indicated that the configuration concept is feasible. The program has provided a great deal of information on the technology requirements for integrating the hybrid bearing into high-speed turbopump designs for improved bearing life.
77 FR 67808 - President's Council of Advisors on Science and Technology (PCAST)
Federal Register 2010, 2011, 2012, 2013, 2014
2012-11-14
... where understandings from the domains of science, technology, and innovation may bear on the policy... Science and Technology, and, Director, Office of Science and Technology Policy, Executive Office of the... update on its study of the Networking and Information Technology Research and Development (NITRD) program...
76 FR 70781 - President's Council of Advisors on Science and Technology
Federal Register 2010, 2011, 2012, 2013, 2014
2011-11-15
... OFFICE OF SCIENCE AND TECHNOLOGY POLICY President's Council of Advisors on Science and Technology... understandings from the domains of science, technology, and innovation may bear on the policy choices before the President. PCAST is administered by the Office of Science and Technology Policy (OSTP). PCAST is co-chaired...
76 FR 70779 - President's Council of Advisors on Science and Technology
Federal Register 2010, 2011, 2012, 2013, 2014
2011-11-15
... OFFICE OF SCIENCE AND TECHNOLOGY POLICY President's Council of Advisors on Science and Technology... understandings from the domains of science, technology, and innovation may bear on the policy choices before the President. PCAST is administered by the Office of Science and Technology Policy (OSTP). PCAST is co-chaired...
76 FR 75919 - President's Council of Advisors on Science and Technology Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2011-12-05
... OFFICE OF SCIENCE AND TECHNOLOGY POLICY President's Council of Advisors on Science and Technology Meeting AGENCY: Office of Science and Technology Policy. ACTION: Notice of meeting. SUMMARY: This notice... understandings from the domains of science, technology, and innovation may bear on the policy choices before the...
76 FR 70780 - President's Council of Advisors on Science and Technology
Federal Register 2010, 2011, 2012, 2013, 2014
2011-11-15
... OFFICE OF SCIENCE AND TECHNOLOGY POLICY President's Council of Advisors on Science and Technology... domains of science, technology, and innovation may bear on the policy choices before the President. PCAST is administered by the Office of Science and Technology Policy (OSTP). PCAST is co-chaired by Dr...
76 FR 72224 - President's Council of Advisors on Science and Technology Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2011-11-22
... OFFICE OF SCIENCE AND TECHNOLOGY POLICY President's Council of Advisors on Science and Technology Meeting AGENCY: Office of Science and Technology Policy. ACTION: Notice of meeting. SUMMARY: This notice... domains of science, technology, and innovation may bear on the policy choices before the President. PCAST...
DEVELOPING THE TRANSDISCIPLINARY AGING RESEARCH AGENDA: NEW DEVELOPMENTS IN BIG DATA.
Callaghan, Christian William
2017-07-19
In light of dramatic advances in big data analytics and their application in certain scientific fields, new potentialities exist for breakthroughs in aging research. Translating these potentialities into research outcomes for aging populations, however, remains a challenge, as the underlying technologies that have enabled exponential increases in 'big data' have not yet enabled a commensurate era of 'big knowledge,' or similarly exponential increases in biomedical breakthroughs. Debates also reveal differences in the literature, with some arguing that big data analytics heralds a new era associated with the 'end of theory,' one that makes the scientific method obsolete: correlation supersedes causation, and science can advance without theory and hypothesis testing. Others argue that theory cannot be subordinate to data, no matter how comprehensive data coverage ultimately becomes. Given these two tensions, namely between exponential increases in data absent exponential increases in biomedical research outputs, and between the promise of comprehensive data coverage and data-driven inductive versus theory-driven deductive modes of enquiry, this paper provides a critical review of theory and literature that offers useful perspectives on developments in big data analytics and their theoretical implications for aging research.
Heat Treatment Used to Strengthen Enabling Coating Technology for Oil-Free Turbomachinery
NASA Technical Reports Server (NTRS)
Edmonds, Brian J.; DellaCorte, Christopher
2002-01-01
The PS304 high-temperature solid lubricant coating is a key enabling technology for Oil- Free turbomachinery propulsion and power systems. Breakthroughs in the performance of advanced foil air bearings and improvements in computer-based finite element modeling techniques are the key technologies enabling the development of Oil-Free aircraft engines being pursued by the Oil-Free Turbomachinery team at the NASA Glenn Research Center. PS304 is a plasma spray coating applied to the surface of shafts operating against foil air bearings or in any other component requiring solid lubrication at high temperatures, where conventional materials such as graphite cannot function.
Magnetic bearings for a high-performance optical disk buffer, volume 1
NASA Technical Reports Server (NTRS)
Hockney, Richard; Adler, Karen; Anastas, George, Jr.; Downer, James; Flynn, Frederick; Goldie, James; Gondhalekar, Vijay; Hawkey, Timothy; Johnson, Bruce
1990-01-01
The innovation investigated in this project was the application of magnetic bearing technology to the translator head of an optical-disk data storage device. Both capability for space-based applications and improved performance are expected to result. The Phase 1 effort produced: (1) detailed specifications for both the translator-head and rotary-spindle bearings; (2) candidate hardware configurations for both bearings, with detailed definition for the translator head; (3) required characteristics for the magnetic bearing control loops; (4) position sensor selection; and (5) definition of the required electronic functions. The principal objective of Phase 2 was the design, fabrication, assembly, and test of the magnetic bearing system for the translator head. The scope of work included: (1) mechanical design of each of the required components; (2) electrical design of the required circuitry; (3) fabrication of the component parts and breadboard electronics; (4) generation of a test plan; and (5) integration of the prototype unit and performance testing. The project has confirmed the applicability of magnetic bearing technology to suspension of the translator head of the optical disk device, and demonstrated the achievement of all performance objectives. The magnetic bearing control loops perform well, achieving 100 Hz nominal bandwidth with phase margins between 37 and 63 degrees. The worst-case position resolution is 0.02 micron in the displacement loops and 1 microradian in the rotation loops. The system is very robust to shock disturbances, recovering smoothly even when collisions occur between the translator and the frame. The unique start-up/shut-down circuit has proven very effective.
Dave, Anushree
2017-12-01
This is a review of Patrick Meier's 2015 book, Digital Humanitarians: How Big Data Is Changing the Face of Humanitarian Response. The book explores the role of technologies such as high-resolution satellite imagery, online social media, drones, and artificial intelligence in humanitarian responses during disasters such as the 2010 Haiti earthquake. In this analysis, the book is examined using a humanitarian health ethics perspective.
Design and development of a medical big data processing system based on Hadoop.
Yao, Qin; Tian, Yu; Li, Peng-Fei; Tian, Li-Li; Qian, Yang-Ming; Li, Jing-Song
2015-03-01
Secondary use of medical big data is increasingly popular in healthcare services and clinical research. Understanding the logic behind medical big data reveals tendencies in hospital information technology and is of great significance for hospital information systems that are designing and expanding services. Big data has four characteristics, Volume, Variety, Velocity, and Value (the 4 Vs), that make traditional standalone systems incapable of processing these data. Apache Hadoop MapReduce is a promising software framework for developing applications that process vast amounts of data in parallel on large clusters of commodity hardware in a reliable, fault-tolerant manner. With the Hadoop framework and the MapReduce application program interface (API), we can more easily develop our own MapReduce applications to run on a Hadoop cluster that can scale from a single node to thousands of machines. This paper investigates a practical case of a Hadoop-based medical big data processing system. We developed this system to intelligently process medical big data and uncover features of hospital information system user behaviors. The paper studies user behaviors regarding the various data produced by different hospital information systems in daily work. We also built a five-node Hadoop cluster to execute distributed MapReduce algorithms. Our distributed algorithms show promise in facilitating efficient data processing of medical big data in healthcare services and clinical research compared with single nodes. Additionally, with medical big data analytics, we can design our hospital information systems to be much more intelligent and easier to use by making personalized recommendations.
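The map/shuffle/reduce model the abstract relies on can be shown in miniature without a cluster. The sketch below simulates it in-process in Python; a real job would use Hadoop's Java Mapper/Reducer classes or a framework such as mrjob, and the log format here is invented for the example.

```python
from itertools import groupby

# In-process illustration of the MapReduce model: count events per
# user from hospital-information-system-style log lines (invented
# format: "user,system,action").

def mapper(record):
    # One log line -> (user, 1) pair, like a Hadoop map() emit.
    user, system, action = record.split(",")
    yield user, 1

def reducer(user, counts):
    # Sum all counts for one user, like a Hadoop reduce().
    yield user, sum(counts)

def run_mapreduce(records):
    # Map phase: flatten all emitted key-value pairs.
    pairs = [kv for r in records for kv in mapper(r)]
    # Shuffle phase: sort and group intermediate pairs by key.
    pairs.sort(key=lambda kv: kv[0])
    out = {}
    for user, group in groupby(pairs, key=lambda kv: kv[0]):
        for k, v in reducer(user, (c for _, c in group)):
            out[k] = v
    return out

logs = ["alice,EMR,open", "bob,LIS,query", "alice,EMR,update"]
print(run_mapreduce(logs))  # {'alice': 2, 'bob': 1}
```

On a real Hadoop cluster the map and reduce calls run on different nodes and the shuffle moves data between them, but the contract per function is exactly this.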
High-Resolution Infrared Filter System for Solar Spectroscopy and Polarimetry
NASA Astrophysics Data System (ADS)
Cao, W.; Ma, J.; Wang, J.; Goode, P. R.; Wang, H.; Denker, C.
2003-05-01
We report on the design of an imaging filter system working in the near infrared (NIR) at 1.56 μm to obtain monochromatic images and to probe weak magnetic fields in different layers of the deep photosphere with high temporal and spatial resolution at Big Bear Solar Observatory (BBSO). This filter system consists of an interference filter, a birefringent filter, and a Fabry-Pérot etalon. As the narrowest element of the system, the infrared Fabry-Pérot etalon plays an important role in achieving narrow-band transmission and high throughput, maintaining wavelength tunability, and assuring stability and reliability. In this poster, we outline a set of methods for the evaluation and calibration of the near infrared Fabry-Pérot etalon. Two-dimensional characteristic maps of the etalon, including full-width-at-half-maximum (FWHM), effective finesse, peak transmission, free spectral range, flatness, roughness, stability, and repeatability, were obtained with laboratory equipment. Finally, using these results, a detailed analysis of the filter performance for the Zeeman-sensitive Fe I 1.5648 μm and Fe I 1.5652 μm lines is presented. These results will benefit the design of the NIR spectro-polarimeter of the Advanced Technology Solar Telescope (ATST).
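Two of the etalon figures of merit listed above are linked by standard textbook relations: effective finesse is the free spectral range divided by the FWHM, and the reflectivity-limited finesse follows from the mirror reflectivity. The numbers below are illustrative, not the BBSO etalon's measured values.

```python
import math

def effective_finesse(fsr: float, fwhm: float) -> float:
    """Effective finesse = free spectral range / FWHM (same units)."""
    return fsr / fwhm

def reflectivity_finesse(R: float) -> float:
    """Ideal (reflectivity-limited) finesse of a Fabry-Perot etalon:
    F = pi * sqrt(R) / (1 - R), for mirror reflectivity R."""
    return math.pi * math.sqrt(R) / (1.0 - R)

# Example: a 0.35 nm free spectral range with a 0.01 nm bandpass.
f_eff = effective_finesse(0.35, 0.01)
print(f_eff)  # 35.0

# For comparison, 95% reflective coatings alone would allow ~61;
# defects (flatness, roughness) pull the effective value below this.
print(round(reflectivity_finesse(0.95), 1))
```

The gap between the reflectivity-limited and effective values is exactly why the flatness and roughness maps mentioned in the abstract matter.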
Fluorine lubricated bearing technology
NASA Technical Reports Server (NTRS)
Mallaire, F. R.
1973-01-01
An experimental program was conducted to evaluate and select materials for ball bearings intended for use in liquid fluorine and/or FLOX. The ability of three different ball-separator materials, each containing nickel, to form and transfer a nickel fluoride film to provide effective lubrication at the required areas of a ball bearing operating in liquid fluorine was evaluated. In addition, solid lubrication of a ball bearing operating in liquid fluorine by either a fused fluoride coating applied to all surfaces of the ball separator or by a fluoride impregnation of porous sintered material ball separators was evaluated. Less bearing wear occurred when tests were conducted in the less reactive FLOX. Bearings fabricated from any of the materials tested would have relatively short wear lives and would require frequent replacement in a reusable engine.
ERIC Educational Resources Information Center
Middle Tennessee State Univ., Murfreesboro.
This proceedings of the sixth annual Mid-South Instructional Technology Conference contains the following papers: "They're Not Just Big Kids: Motivating Adult Learners" (Karen Jarrett Thoms); "A Computer Integrated Biology Laboratory Experience" (James B. Kring); "Building Web Sites for Mathematics Courses: Some Answers to…
Emerging Materials Technologies That Matter to Manufacturers
NASA Technical Reports Server (NTRS)
Misra, Ajay K.
2015-01-01
A brief overview of emerging materials technologies. Exploring the weight reduction benefit of replacing Carbon Fiber with Carbon Nanotube (CNT) in Polymer Composites. Review of the benign purification method developed for CNT sheets. The future of manufacturing will include the integration of computational material design and big data analytics, along with Nanomaterials as building blocks.
ERIC Educational Resources Information Center
Hartong, Sigrid
2016-01-01
The past few decades have witnessed a global enforcement of "governance by data" in education policy, including a significant increase of assessments and quantified evaluation. Within this context, this article focuses particularly on the intensifying evolvement of new (digital) information technologies and "mediated"…
ERIC Educational Resources Information Center
Earls, Alan R.
2000-01-01
Explores privacy issues raised by information technology at colleges and universities. Drawing on accounts and opinions of faculty and staff members, provides examples of current practices and policies on Internet and e-mail use and discusses the possible need for more developed policies. (EV)
Are Schools Getting a Big Enough Bang for Their Education Technology Buck?
ERIC Educational Resources Information Center
Boser, Ulrich
2013-01-01
Far too often, school leaders fail to consider how technology might dramatically improve teaching and learning, and schools frequently acquire digital devices without discrete learning goals and ultimately use these devices in ways that fail to adequately serve students, schools, or taxpayers. Because of a growing debate concerning spending on…
Using Gaming to Motivate Today's Technology-Dependent Students
ERIC Educational Resources Information Center
Petkov, Marin; Rogers, George E.
2011-01-01
In the past several decades, technology has become a big part of American society. It has changed the way people interact with one another as well as how they proceed with everyday life. However, K-12 educational systems have been resistive to change, with most American schools still using traditional instruction in the classroom, consisting…
Quantifying the Modern City: Emerging Technologies and Big Data for Active Living Research
Adlakha, Deepti
2017-01-01
Opportunities and infrastructure for active living are an important aspect of a community’s design, livability, and health. Features of the built environment influence active living and population levels of physical activity, but objective study of the built environment influence on active living behaviors is challenging. The use of emerging technologies for active living research affords new and promising means to obtain objective data on physical activity behaviors and improve the precision and accuracy of measurements. This is significant for physical activity promotion because precise measurements can enable detailed examinations of where, when, and how physical activity behaviors actually occur, thus enabling more effective targeting of particular behavior settings and environments. The aim of this focused review is to provide an overview of trends in emerging technologies that can profoundly change our ability to understand environmental determinants of active living. It discusses novel technological approaches and big data applications to measure and track human behaviors that may have broad applications across the fields of urban planning, public health, and spatial epidemiology. PMID:28611973
Efficient data management tools for the heterogeneous big data warehouse
NASA Astrophysics Data System (ADS)
Alekseev, A. A.; Osipova, V. V.; Ivanov, M. A.; Klimentov, A.; Grigorieva, N. V.; Nalamwar, H. S.
2016-09-01
Traditional RDBMSs have served normalized data structures well for decades, but the technology is not optimal for data processing and analysis in data-intensive fields such as social networks, the oil and gas industry, and experiments at the Large Hadron Collider. Several challenges have recently been raised concerning the scalability of data-warehouse-like workloads run against the transactional schema, in particular for the analysis of archived data or the aggregation of data for summary and accounting purposes. The paper evaluates new database technologies, HBase, Cassandra, and MongoDB, commonly referred to as NoSQL databases, for handling messy, varied, and large amounts of data. The evaluation considers the performance, throughput, and scalability of these technologies for several scientific and industrial use cases. This paper outlines the technologies and architectures needed for processing Big Data, as well as a description of the back-end application that implements data migration from an RDBMS to a NoSQL data warehouse, the NoSQL database organization, and how it could be useful for further data analytics.
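The core of such a migration is turning normalized rows into self-describing documents. The sketch below does this with Python's built-in sqlite3 standing in for the source RDBMS; the schema is invented, and a real target would be MongoDB, HBase, or Cassandra via their client libraries rather than an in-memory list.

```python
import json
import sqlite3

# Source RDBMS stand-in: an in-memory SQLite database with an
# invented accounting-style table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE jobs (id INTEGER PRIMARY KEY, user TEXT, cpu_sec REAL);
    INSERT INTO jobs VALUES (1, 'alice', 12.5), (2, 'bob', 3.0);
""")
conn.row_factory = sqlite3.Row  # rows become name-addressable

def migrate(table):
    """Turn each relational row into a JSON-ready document dict,
    tagging it with its origin so the document is self-describing."""
    docs = []
    for row in conn.execute(f"SELECT * FROM {table}"):
        doc = dict(row)            # column name -> value
        doc["_collection"] = table  # denormalized provenance metadata
        docs.append(doc)
    return docs

collection = migrate("jobs")
print(json.dumps(collection[0]))
```

In a production migration the `docs` list would be replaced by batched inserts into the NoSQL store, and the denormalization step would typically also embed related rows (joins) into each document.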
NASA Astrophysics Data System (ADS)
Wang, Haimin; Liu, C.
2012-05-01
In recent studies by Pariat, Antiochos and DeVore (2009, 2010), fan-separatrix topology and magnetic reconnection at the null point were simulated and found to produce homologous jets. This motivates us to search for axisymmetric magnetic structure and associated flaring/jetting activity. Using high-resolution (approximately 0.15 arcsec per pixel) and high-cadence (approximately 15 s) H-alpha center/off-band observations obtained from the recently digitized films of Big Bear Solar Observatory, we were able to identify five large circular flares with associated surges. All the events exhibit a central parasitic magnetic field surrounded by opposite polarity, forming a circular polarity inversion line (PIL). Consequently, a compact flare kernel at the center is surrounded by a circular ribbon, and together with the upward-ejecting dark surge, these seem to depict a dome-like magnetic structure. Very interestingly, (1) the circular ribbon brightens sequentially rather than simultaneously, (2) the central compact flare kernel shows obvious motion, and (3) a remote, elongated, co-temporal flare ribbon at a region with the same polarity as the central parasitic site is seen in the series of four homologous events on 1991 March 17 and 18. The remote ribbon is about 120 arcsec away from the jet location. Moreover, magnetic reconnection across the circular PIL is evident from the magnetic flux cancellation. These rarely observed homologous surges with circular as well as central and remote flare ribbons provide valuable evidence concerning the dynamics of magnetic reconnection in a null-point topology. This study is dedicated to Professor Hal Zirin, the founder of Big Bear Solar Observatory, who passed away on January 3, 2012.
Responses of Florida panthers to recreational deer and hog hunting
Janis, Michael W.; Clark, Joseph D.
2002-01-01
Big Cypress National Preserve constitutes approximately one-third of the range of the endangered Florida panther (Puma concolor coryi). Because recreational hunting is allowed in Big Cypress National Preserve, we examined 8 response variables (activity rates, movement rates, predation success, home-range size, home-range shifts, proximity to off-road vehicle trails, use of areas with concentrated human activity, and habitat selection) to evaluate how Florida panthers respond to human activity associated with deer and hog hunting. Data consisted of panther radiolocations collected since 1981 by the Florida Fish and Wildlife Conservation Commission and the National Park Service, which we augmented with radiolocations and activity monitoring from 1994 to 1998. A split-plot (treatment and control) study design with repeated measures of the variables for each panther taken before, during, and after the hunting season was used. We did not detect responses to hunting for variables most directly related to panther energy intake or expenditure (i.e., activity rates, movement rates, predation success of females; P>0.10). However, panthers reduced their use of Bear Island (P=0.021), an area of concentrated human activity, and were found farther from off-road vehicle trails (P≤0.001) during the hunting season, which was indicative of a reaction to human disturbance. Whereas the reaction to human activity on off-road vehicle trails probably has minor biological implications and may be linked to prey behavior, the decreased use of Bear Island is most likely a direct reaction to human activity and resulted in increased use of adjacent private lands. Future habitat loss on those private lands could exacerbate the negative consequences of this response by panthers.
Johnston, M.J.S.; Linde, A.T.; Agnew, D.C.
1994-01-01
High-precision strain was observed with a borehole dilational strainmeter in the Devil's Punchbowl during the 11:58 UT 28 June 1992 MW 7.3 Landers earthquake and the large Big Bear aftershock (MW 6.3). The strainmeter is installed at a depth of 176 m in the fault zone, approximately midway between the surface traces of the San Andreas and Punchbowl faults, and about 100 km from the 85-km-long Landers rupture. We have asked three questions: whether unusual amplified strains, indicating precursive slip or high fault compliance, occurred on the faults ruptured by the Landers earthquake or in the San Andreas fault zone before and during the earthquake; whether the static offsets for both the Landers and Big Bear earthquakes agree with expectations from geodetic and seismologic models of the ruptures and with observations from a nearby two-color geodimeter network; and whether postseismic behavior indicated continued slip on the Landers rupture or locally triggered slip on the San Andreas. We show that the strain observed during the earthquake at this instrument exhibits no apparent amplification effects. There are no indications of precursive strain in these data due to either local slip on the San Andreas or precursive slip on the eventual Landers rupture. The observations are generally consistent with models of the earthquake in which fault geometry and slip have the same form as that determined by either inversion of the seismic data or inversion of geodetically determined ground displacements produced by the earthquake. Finally, there are some indications of minor postseismic behavior, particularly during the month following the earthquake.
Researchers take on challenges and opportunities to mine "Big Data" for answers to complex biological questions. Learn how bioinformatics uses advanced computing, mathematics, and technological platforms to store, manage, analyze, and understand data.
Bioinformatics clouds for big data manipulation.
Dai, Lin; Gao, Xin; Guo, Yan; Xiao, Jingfa; Zhang, Zhang
2012-11-28
As advances in life sciences and information technology exert a profound influence on bioinformatics, given its interdisciplinary nature, the field is experiencing a new leap forward from in-house computing infrastructure to utility-supplied cloud computing delivered over the Internet, in order to handle the vast quantities of biological data generated by high-throughput experimental technologies. Albeit relatively new, cloud computing promises to address big data storage and analysis issues in the bioinformatics field. Here we review extant cloud-based services in bioinformatics, classify them into Data as a Service (DaaS), Software as a Service (SaaS), Platform as a Service (PaaS), and Infrastructure as a Service (IaaS), and present our perspectives on the adoption of cloud computing in bioinformatics. This article was reviewed by Frank Eisenhaber, Igor Zhulin, and Sandor Pongor.
Selecting Security Technology Providers
ERIC Educational Resources Information Center
Schneider, Tod
2009-01-01
The world of security technology holds great promise, but it is fraught with opportunities for expensive missteps and misapplications. The quality of the security technology consultants and system integrators one uses will have a direct bearing on how well his school masters this complex subject. Security technology consultants help determine…
Can history improve big bang health reform? Commentary.
Marchildon, Gregory P
2018-07-01
At present, the professional skills of the historian are rarely relied upon when health policies are being formulated. There are numerous reasons for this, one of which is the natural desire of decision-makers to break with the past when enacting big bang policy change. This article identifies the strengths professional historians bring to bear on policy development using the establishment and subsequent reform of universal health coverage as an example. Historians provide pertinent and historically informed context; isolate the forces that have historically allowed for major reform; and separate the truly novel reforms from those attempted or implemented in the past. In addition, the historian's use of primary sources allows potentially new and highly salient facts to guide the framing of the policy problem and its solution. This paper argues that historians are critical for constructing a viable narrative of the establishment and evolution of universal health coverage policies. The lack of this narrative makes it difficult to achieve an accurate assessment of systemic gaps in coverage and access, and the design or redesign of universal health coverage that can successfully close these gaps.
Bagshaw, Sean M; Goldstein, Stuart L; Ronco, Claudio; Kellum, John A
2016-01-01
The world is immersed in "big data". Big data has brought about radical innovations in the methods used to capture, transfer, store, and analyze the vast quantities of data generated every minute of every day. At the same time, however, it has also become far easier and relatively inexpensive to do so. Rapidly transforming, integrating, and applying this large volume and variety of data are what underlie the future of big data. The application of big data and predictive analytics in healthcare holds great promise to drive innovation, reduce cost, and improve patient outcomes, health services operations, and value. Acute kidney injury (AKI) may be an ideal syndrome from which various dimensions and applications built within the context of big data may influence the structure of services delivery, care processes, and outcomes for patients. The use of innovative forms of "information technology" was originally identified by the Acute Dialysis Quality Initiative (ADQI) in 2002 as a core concept in need of attention to improve the care and outcomes for patients with AKI. For this 15th ADQI consensus meeting, held on September 6-8, 2015 in Banff, Canada, five topics focused on AKI and acute renal replacement therapy were developed where extensive applications for the use of big data were recognized and/or foreseen. In this series of articles in the Canadian Journal of Kidney Health and Disease, we describe the output from these discussions.
A peek into the future of radiology using big data applications
Kharat, Amit T.; Singhal, Shubham
2017-01-01
Big data refers to the extremely large volumes of data available in the radiology department. Big data is identified by four Vs: Volume, Velocity, Variety, and Veracity. By applying different algorithmic tools and converting raw data to transformed data in such large datasets, there is a possibility of understanding and using radiology data to gain new knowledge and insights. Big data analytics consists of 6Cs: Connection, Cloud, Cyber, Content, Community, and Customization. The global technological prowess and per-capita capacity to save digital information has roughly doubled every 40 months since the 1980s. By using big data, the planning and implementation of radiological procedures in radiology departments can be given a great boost. Potential future applications of big data include scheduling of scans, creating patient-specific personalized scanning protocols, radiologist decision support, emergency reporting, virtual quality assurance for the radiologist, etc. Big data applications can be targeted at images by supporting the analytic process. Screening software tools designed on big data can be used to highlight a region of interest, such as subtle changes in parenchymal density, a solitary pulmonary nodule, or focal hepatic lesions, by plotting its multidimensional anatomy. Following this, we can run more complex applications such as three-dimensional multiplanar reconstruction (MPR), volumetric rendering (VR), and curved planar reconstruction, which consume higher system resources, on targeted data subsets rather than querying the complete cross-sectional imaging dataset. This pre-emptive selection of the dataset can substantially reduce system requirements such as system memory and server load, and provide prompt results. However, a word of caution: big data should not become "dump data" through inadequate and poor analysis and non-structured, improperly stored data.
In the near future, big data can ring in the era of personalized and individualized healthcare. PMID:28744087
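The "targeted data subset" idea above, running heavy reconstructions only on a region of interest rather than on the whole cross-sectional dataset, can be sketched in a few lines. This is a pure-Python illustration only; the function name and the nested-list volume layout are hypothetical, not a radiology API.

```python
def crop_roi(volume, z, y, x, size):
    """Extract a size^3 sub-volume from a CT-like volume.

    volume is a nested list indexed as [z][y][x]. Expensive downstream
    steps (e.g. MPR or volume rendering) can then be run on the small
    cropped block instead of the full dataset, cutting memory and load.
    """
    return [
        [row[x:x + size] for row in plane[y:y + size]]
        for plane in volume[z:z + size]
    ]
```

In a real pipeline the region of interest would come from a screening tool's detection step; the point of the sketch is only that subsetting happens before, not after, the resource-hungry processing.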
ERIC Educational Resources Information Center
Laurillard, Diana
1985-01-01
Reports an evaluation of the Teddy Bear disc, an interactive videodisc developed at the Open University for a second-level course in metallurgy and materials technology. Findings from observation of students utilizing the videodisc are reviewed; successful design features and design problems are considered; and development costs are outlined. (MBR)
JPRS Report, China, Red Flag, Number 11, 1 June 1988
1988-07-27
the brand , quality, and special products which are badly needed on the market, the products which are needed by big industries, the planting and...of the professional and technological positions of the scientific and technologi- cal personnel, help the scientific and technological per- sonnel...the scale of capital construction for agriculture has been reduced. If this situation is not changed, the position of agriculture as the foundation
Heuer, R.-D.
2018-02-19
Summer Student Lecture Programme Introduction. The mission of CERN: push back the frontiers of knowledge, e.g. the secrets of the Big Bang: what was the matter like within the first moments of the Universe's existence? You have to develop new technologies for accelerators and detectors (also information technology, with the Web and the GRID, and medicine, with diagnosis and therapy). There are three key technology areas at CERN: accelerating, particle detection, and large-scale computing.
ERIC Educational Resources Information Center
Taylor, James C.
For more than 80 years, jobs in the United States have been designed by people for others. For most of these years, the experts in job design have placed the production technology above the job holder in importance. Since the 1950s, many jobs have been redesigned around new, computer-based technology. Often, the net effect has been to make those…
Pellet to Part Manufacturing System for CNCs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Roschli, Alex C.; Love, Lonnie J.; Post, Brian K.
Oak Ridge National Laboratory’s Manufacturing Demonstration Facility worked with Hybrid Manufacturing Technologies to develop a compact prototype composite additive manufacturing head that can effectively extrude injection molding pellets. The head interfaces with conventional CNC machine tools enabling rapid conversion of conventional machine tools to additive manufacturing tools. The intent was to enable wider adoption of Big Area Additive Manufacturing (BAAM) technology and combine BAAM technology with conventional machining systems.
What is big data? A consensual definition and a review of key research topics
NASA Astrophysics Data System (ADS)
De Mauro, Andrea; Greco, Marco; Grimaldi, Michele
2015-02-01
Although Big Data is a trending buzzword in both academia and industry, its meaning is still shrouded in considerable conceptual vagueness. The term is used to describe a wide range of concepts: from the technological ability to store, aggregate, and process data, to the cultural shift that is pervasively invading business and society, both of which are drowning in information overload. The lack of a formal definition has led research to evolve along multiple, inconsistent paths. Furthermore, the existing ambiguity among researchers and practitioners undermines efficient development of the subject. In this paper we review the existing literature on Big Data and analyze its previous definitions in order to pursue two results: first, to provide a summary of the key research areas related to the phenomenon, identifying emerging trends and suggesting opportunities for future development; second, to provide a consensual definition of Big Data, by synthesizing common themes of existing works and patterns in previous definitions.
Buckling analysis of Big Dee Vacuum Vessel
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lightner, S.; Gallix, R.
1983-12-01
A simplified three-dimensional shell buckling analysis of the GA Technologies Inc. Big Dee Vacuum Vessel (V/V) was performed using the finite element program TRICO. A coarse-mesh linear elastic model, which accommodated the support boundary conditions, was used to determine the buckling mode shape under a uniform external pressure. Using this buckling mode shape, refined models were used to calculate the linear buckling load (P_crit) more accurately. Several different designs of the Big Dee V/V were considered in this analysis. The supports for the V/V were equally spaced radial pins at the outer diameter of the mid-plane. For all the cases considered, the buckling mode was axisymmetric in the toroidal direction. Therefore, it was possible to use only a small angular sector of a toric shell for the refined analysis. P_crit for the Big Dee is about 60 atm for a uniform external pressure. Also investigated in this analysis were the effects of geometrical imperfections and non-uniform pressure distributions.
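Linear buckling loads from a finite-element code such as TRICO are commonly sanity-checked against closed-form shell results. As a hedged illustration only (a thin spherical shell, not the Big Dee's toroidal geometry, and arbitrary material numbers rather than the vessel's), the classical critical external pressure can be computed in a few lines:

```python
import math

def sphere_buckling_pressure(E, t, R, nu=0.3):
    """Classical linear critical external pressure for a thin spherical
    shell: p_cr = 2*E*t^2 / (R^2 * sqrt(3*(1 - nu^2))).

    A hand-check against finite-element linear buckling loads, not a
    substitute for them. E in Pa, t and R in metres; returns Pa.
    """
    return 2.0 * E * t**2 / (R**2 * math.sqrt(3.0 * (1.0 - nu**2)))

# Illustrative numbers only (steel-like modulus, 20 mm wall, 2 m radius):
p_cr = sphere_buckling_pressure(E=200e9, t=0.02, R=2.0)
```

Real vessels buckle well below such linear estimates because of geometric imperfections, which is precisely why the analysis above also studied imperfection effects.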
Comparing modelling techniques when designing VPH gratings for BigBOSS
NASA Astrophysics Data System (ADS)
Poppett, Claire; Edelstein, Jerry; Lampton, Michael; Jelinsky, Patrick; Arns, James
2012-09-01
BigBOSS is a Stage IV Dark Energy instrument based on the Baryon Acoustic Oscillations (BAO) and Redshift Distortions (RSD) techniques, using spectroscopic data of 20 million ELG and LRG galaxies at 0.5<=z<=1.6 in addition to several hundred thousand QSOs at 0.5<=z<=3.5. When designing BigBOSS instrumentation, it is imperative to maximize throughput whilst maintaining a resolving power of between R=1500 and 4000 over a wavelength range of 360-980 nm. Volume Phase Holographic (VPH) gratings have been identified as a key technology that will enable the efficiency requirement to be met; however, it is important to be able to accurately predict their performance. In this paper we quantitatively compare different modelling techniques in order to assess the parameter space over which they are more capable of accurately predicting measured performance. Finally, we present baseline parameters for grating designs that are most suitable for the BigBOSS instrument.
Adapting bioinformatics curricula for big data.
Greene, Anna C; Giffin, Kristine A; Greene, Casey S; Moore, Jason H
2016-01-01
Modern technologies are capable of generating enormous amounts of data that measure complex biological systems. Computational biologists and bioinformatics scientists are increasingly being asked to use these data to reveal key systems-level properties. We review the extent to which curricula are changing in the era of big data. We identify key competencies that scientists dealing with big data are expected to possess across fields, and we use this information to propose courses to meet these growing needs. While bioinformatics programs have traditionally trained students in data-intensive science, we identify areas of particular biological, computational and statistical emphasis important for this era that can be incorporated into existing curricula. For each area, we propose a course structured around these topics, which can be adapted in whole or in parts into existing curricula. In summary, specific challenges associated with big data provide an important opportunity to update existing curricula, but we do not foresee a wholesale redesign of bioinformatics training programs. © The Author 2015. Published by Oxford University Press.
Research on the Construction of Remote Sensing Automatic Interpretation Symbol Big Data
NASA Astrophysics Data System (ADS)
Gao, Y.; Liu, R.; Liu, J.; Cheng, T.
2018-04-01
Remote sensing automatic interpretation symbols (RSAIS) are an inexpensive and fast means of providing precise in-situ information for image interpretation and accuracy assessment. This study designed a scientific and precise RSAIS data characterization method, as well as a distributed, cloud-architecture method for massive data storage. Additionally, it introduced offline and online data update modes and a dynamic data evaluation mechanism, with the aim of creating an efficient approach for constructing RSAIS big data. Finally, a national RSAIS database with more than 3 million samples covering 86 land types was constructed during 2013-2015, based on the National Geographic Conditions Monitoring Project of China, and has been updated annually since 2016. The RSAIS big data has proven to be a good method for large-scale image interpretation and field validation. It is also notable that it has the potential to support automatic image interpretation with the assistance of deep learning technology in the remote sensing big data era.
Adapting bioinformatics curricula for big data
Greene, Anna C.; Giffin, Kristine A.; Greene, Casey S.
2016-01-01
Modern technologies are capable of generating enormous amounts of data that measure complex biological systems. Computational biologists and bioinformatics scientists are increasingly being asked to use these data to reveal key systems-level properties. We review the extent to which curricula are changing in the era of big data. We identify key competencies that scientists dealing with big data are expected to possess across fields, and we use this information to propose courses to meet these growing needs. While bioinformatics programs have traditionally trained students in data-intensive science, we identify areas of particular biological, computational and statistical emphasis important for this era that can be incorporated into existing curricula. For each area, we propose a course structured around these topics, which can be adapted in whole or in parts into existing curricula. In summary, specific challenges associated with big data provide an important opportunity to update existing curricula, but we do not foresee a wholesale redesign of bioinformatics training programs. PMID:25829469
Beyond Einstein: From the Big Bang to Black Holes
NASA Astrophysics Data System (ADS)
White, N.
Beyond Einstein is a science-driven program of missions, education and outreach, and technology, addressing three questions: What powered the Big Bang? What happens to space, time, and matter at the edge of a Black Hole? What is the mysterious Dark Energy pulling the universe apart? To address the science objectives, Beyond Einstein contains several interlinked elements. The strategic missions Constellation-X and LISA primarily investigate the nature of black holes. Constellation-X is a spectroscopic observatory that uses X-ray emitting atoms as clocks to follow the fate of matter falling into black holes. LISA, the first space-based gravitational wave observatory, uses gravitational waves to measure the dynamic structure of space and time around black holes. Moderate-sized probes are fully competed, peer-reviewed missions (300M-450M) launched every 3-5 years to address focused science goals: 1) determine the nature of the Dark Energy that dominates the universe, 2) search for the signature of the beginning of the Big Bang in the microwave background, and 3) take a census of Black Holes of all sizes and ages in the universe. The final element is a Technology Program to enable ultimate Vision Missions (after 2015) to directly detect gravitational waves echoing from the beginning of the Big Bang, and to directly image matter near the event horizon of a Black Hole. An associated Education and Public Outreach Program will inspire the next generation of scientists, and support national science standards and benchmarks.
IBM Watson: How Cognitive Computing Can Be Applied to Big Data Challenges in Life Sciences Research.
Chen, Ying; Elenee Argentinis, J D; Weber, Griff
2016-04-01
Life sciences researchers are under pressure to innovate faster than ever. Big data offer the promise of unlocking novel insights and accelerating breakthroughs. Ironically, although more data are available than ever, only a fraction is being integrated, understood, and analyzed. The challenge lies in harnessing volumes of data, integrating the data from hundreds of sources, and understanding their various formats. New technologies such as cognitive computing offer promise for addressing this challenge because cognitive solutions are specifically designed to integrate and analyze big datasets. Cognitive solutions can understand different types of data such as lab values in a structured database or the text of a scientific publication. Cognitive solutions are trained to understand technical, industry-specific content and use advanced reasoning, predictive modeling, and machine learning techniques to advance research faster. Watson, a cognitive computing technology, has been configured to support life sciences research. This version of Watson includes medical literature, patents, genomics, and chemical and pharmacological data that researchers would typically use in their work. Watson has also been developed with specific comprehension of scientific terminology so it can make novel connections in millions of pages of text. Watson has been applied to a few pilot studies in the areas of drug target identification and drug repurposing. The pilot results suggest that Watson can accelerate identification of novel drug candidates and novel drug targets by harnessing the potential of big data. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
Dinov, Ivo D
2016-01-01
Managing, processing and understanding big healthcare data is challenging, costly and demanding. Without a robust fundamental theory for representation, analysis and inference, a roadmap for uniform handling and analyzing of such complex data remains elusive. In this article, we outline various big data challenges, opportunities, modeling methods and software techniques for blending complex healthcare data, advanced analytic tools, and distributed scientific computing. Using imaging, genetic and healthcare data we provide examples of processing heterogeneous datasets using distributed cloud services, automated and semi-automated classification techniques, and open-science protocols. Despite substantial advances, new innovative technologies need to be developed that enhance, scale and optimize the management and processing of large, complex and heterogeneous data. Stakeholder investments in data acquisition, research and development, computational infrastructure and education will be critical to realize the huge potential of big data, to reap the expected information benefits and to build lasting knowledge assets. Multi-faceted proprietary, open-source, and community developments will be essential to enable broad, reliable, sustainable and efficient data-driven discovery and analytics. Big data will affect every sector of the economy, and its hallmark will be 'team science'.
NASA Astrophysics Data System (ADS)
Silk, Joseph
Our universe was born billions of years ago in a hot, violent explosion of elementary particles and radiation: the big bang. What do we know about this ultimate moment of creation, and how do we know it? Drawing upon the latest theories and technology, this new edition of The Big Bang is a sweeping, lucid account of the event that set the universe in motion. Joseph Silk begins his story with the first microseconds of the big bang, and continues through the evolution of stars, galaxies, clusters of galaxies, and quasars, into the distant future of our universe. He also explores the fascinating evidence for the big bang model and recounts the history of cosmological speculation. Revised and updated, this new edition features all the most recent astronomical advances, including: photos and measurements from the Hubble Space Telescope, Cosmic Background Explorer satellite (COBE), and Infrared Space Observatory; the latest estimates of the age of the universe; new ideas in string and superstring theory; recent experiments on neutrino detection; new theories about the presence of dark matter in galaxies; new developments in the theory of the formation and evolution of galaxies; and the latest ideas about black holes, wormholes, quantum foam, and multiple universes.
Carbothermic reduction behaviors of Ti-Nb-bearing Fe concentrate from Bayan Obo ore in China
NASA Astrophysics Data System (ADS)
Wang, Guang; Du, Ya-xing; Wang, Jing-song; Xue, Qing-guo
2018-01-01
To support the development of technology to utilize low-grade Ti-Nb-bearing Fe concentrate, the reduction of the concentrate by coal was systematically investigated in the present paper. A liquid phase formed when the Ti-Nb-bearing Fe concentrate/coal composite pellet was reduced at temperatures greater than 1100°C. The addition of CaCO3 improved the reduction rate when the slag basicity was less than 1.0 and inhibited the formation of the liquid phase. Mechanical milling obviously increased the metallization degree compared with that of the standard pellet when reduced under the same conditions. Evolution of the mineral phase composition and microstructure of the reduced Ti-Nb-bearing Fe concentrate/coal composite pellet at 1100°C were analyzed by X-ray diffraction and scanning electron microscopy-energy-dispersive spectroscopy. The volume shrinkage value of the reduced Ti-Nb-bearing Fe concentrate/coal composite pellet with a basicity of 1.0 was approximately 35.2% when the pellet was reduced at 1100°C for 20 min, which enhanced the external heat transfer to the lower layers when reduced in a practical rotary hearth furnace. The present work provides key parameters and mechanism understanding for the development of carbothermic reduction technology of a Ti-Nb-bearing Fe concentrate incorporated in a pyrometallurgical utilization flow sheet.
Implementing Reliability Centered Maintenance (RCM) with State of the Art PT&I Technologies
NASA Technical Reports Server (NTRS)
Hollis, Sean; Sasser, Chase
2016-01-01
Building on the work that was started two decades ago, Jacobs Space Operations Group has utilized state-of-the-art PT&I technologies to assess the current condition of the assets they manage under the Test and Operations Support Contract (TOSC). Specifically, the Asset Management department leveraged the benefits of ultrasound technology to quantify a motor issue in the Liquid Oxygen Storage Area and troubleshoot the sources prior to loading the tank to perform Verification and Validation (V&V) activities. This technology was efficient, easy to implement, and provided system engineers with data on a possible source of the problem. In situations where legacy motors exhibit unexpected noises, it may seem easier to remove and refurbish the motor and replace the bearings, because that solution resolves most of the common causes of the noise. However, that solution would have involved additional spending and may not have solved issues stemming from the foundation, if those existed. By utilizing the ultrasound equipment provided by UE Systems, the sound profiles allowed the Jacobs TOSC team to determine that the issue resembled a faulty bearing. After replacing the bearing, the unexpected noise ceased.
Assessment of 25 kW free-piston Stirling technology alternatives for solar applications
NASA Technical Reports Server (NTRS)
Erbeznik, Raymond M.; White, Maurice A.; Penswick, L. B.; Neely, Ronald E.; Ritter, Darren C.; Wallace, David A.
1992-01-01
The final design, construction, and testing of a 25-kW free-piston advanced Stirling conversion system (ASCS) are examined. The final design of the free-piston hydraulic ASCS consists of five subsystems: heat transport subsystem (solar receiver and pool boiler), free-piston hydraulic Stirling engine, hydraulic subsystem, cooling subsystem, and electrical and control subsystem. Advantages and disadvantages are identified for each technology alternative. Technology alternatives considered are gas bearings vs flexure bearings, stationary magnet linear alternator vs moving magnet linear alternator, and seven different control options. Component designs are generated using available in-house procedures to meet the requirements of the free-piston Stirling convertor configurations.
High Speed Operation and Testing of a Fault Tolerant Magnetic Bearing
NASA Technical Reports Server (NTRS)
DeWitt, Kenneth; Clark, Daniel
2004-01-01
Research activities undertaken to upgrade the fault-tolerant facility, continue testing high-speed fault-tolerant operation, and assist in commissioning the high-temperature (1000 degrees F) thrust magnetic bearing are described. The fault-tolerant magnetic bearing test facility was upgraded to operate at up to 40,000 RPM. The necessary upgrades included new state-of-the-art position sensors with high-frequency modulation and new power edge filtering of amplifier outputs. A comparison study of the new sensors and the previous system was done, as well as a noise assessment of the sensor-to-controller signals. A comparison study of power edge filtering for amplifier-to-actuator signals was also done; this information is valuable for all position sensing and motor actuation applications. After these facility upgrades were completed, the rig is believed to be capable of 40,000 RPM operation, though this has yet to be demonstrated. Other upgrades included verification and upgrading of safety shielding, and upgrading of control algorithms. The rig will now also be used to demonstrate motoring capabilities, and control algorithms are in the process of being created. Recently, an extreme-temperature thrust magnetic bearing was designed from the ground up. The thrust bearing was designed to fit within the existing high-temperature facility. The retrofit began near the end of the summer of 2004 and continues currently. Contract staff authored a NASA TM entitled "An Overview of Magnetic Bearing Technology for Gas Turbine Engines", containing a compilation of bearing data as it pertains to operation in the regime of the gas turbine engine and a presentation of how magnetic bearings can become a viable candidate for use in future engine technology.
A Dynamic Competition Simulation for Worldwide Big-size TV Market Using Lotka-Volterra Model
NASA Astrophysics Data System (ADS)
Chen, Wu-Tung Terry; Li, Yiming; Hung, Chih-Young
2009-08-01
Technological innovation is characterized by the substitution of new technologies for mature ones in the development of new products, processes, and techniques. The global TV market is seeing a downward price spiral for FPD (flat panel display) TVs, the replacement of CRT by LCD, and consumers' migration to larger screens. The LCD-TV market started in Japan in 2003 and took off globally in 2005. LCD panel production is moving toward larger sizes. In the 35″-39″ size market, the price/performance ratio of LCD-TV is better than that of PDP. The purpose of this paper is to estimate the demand function of worldwide big-size (35″-39″) TVs, including LCD and PDP, with explicit consideration of market competition. The demand function was estimated using the Lotka-Volterra model, a well-known competitive diffusion model. The results exhibit a predator-prey relationship in which the PDP market was preyed upon by the LCD product. In addition, the coefficients of the difference equations of the Lotka-Volterra model in this analysis are used to forecast the future market for big-size LCD and PDP.
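The coupled difference equations underlying this kind of analysis can be sketched as follows; the coefficients and starting values here are purely illustrative, not the values fitted in the paper.

```python
# Discrete-time Lotka-Volterra competition between two display technologies.
# All coefficients below are illustrative, NOT fitted to real market data.

def lv_step(lcd, pdp,
            a_lcd=0.5, b_lcd=0.005, c_lcd=0.0,    # LCD: growth, self-limit, harm from PDP
            a_pdp=0.2, b_pdp=0.005, c_pdp=0.01):  # PDP: growth, self-limit, harm from LCD
    """One step of the coupled difference equations:
       N1' = N1 + N1*(a1 - b1*N1 - c1*N2)
       N2' = N2 + N2*(a2 - b2*N2 - c2*N1)"""
    lcd_next = lcd + lcd * (a_lcd - b_lcd * lcd - c_lcd * pdp)
    pdp_next = pdp + pdp * (a_pdp - b_pdp * pdp - c_pdp * lcd)
    return lcd_next, pdp_next

def simulate(lcd0, pdp0, steps):
    lcd, pdp = lcd0, pdp0
    for _ in range(steps):
        lcd, pdp = lv_step(lcd, pdp)
    return lcd, pdp

# With c_pdp > 0 and c_lcd = 0, LCD acts as the "predator": PDP shipments
# collapse once LCD volume grows, mirroring the paper's qualitative finding.
lcd_final, pdp_final = simulate(1.0, 20.0, 80)
```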
A Multi-scale, Multi-Model, Machine-Learning Solar Forecasting Technology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hamann, Hendrik F.
The goal of the project was the development and demonstration of a significantly improved solar forecasting technology (short name: Watt-sun), which leverages new big data processing technologies and machine-learnt blending of different models and forecast systems. The technology aimed to demonstrate major advances in accuracy as measured by existing and new metrics, which themselves were developed as part of this project. Finally, the team worked with Independent System Operators (ISOs) and utilities to integrate the forecasts into their operations.
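One simple instance of machine-learnt blending is a least-squares weighted combination of model forecasts fitted against historical observations; the data below is a toy illustration, not Watt-sun's data or its actual blending method.

```python
import numpy as np

# Toy historical record: two model forecasts and the corresponding observations.
# Here the observations happen to be an exact 70/30 mixture of the two models,
# so the learned blend weights should recover [0.7, 0.3].
f1 = np.array([1.0, 2.0, 3.0, 4.0])   # forecasts from model 1
f2 = np.array([2.0, 1.0, 0.5, 3.0])   # forecasts from model 2
obs = 0.7 * f1 + 0.3 * f2             # observed values

# Fit blend weights w minimizing || [f1 f2] @ w - obs ||^2
X = np.column_stack([f1, f2])
w, *_ = np.linalg.lstsq(X, obs, rcond=None)

# Blended forecast for a new pair of model outputs
blended = w @ np.array([2.5, 1.0])
```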
Space Station alpha joint bearing
NASA Technical Reports Server (NTRS)
Everman, Michael R.; Jones, P. Alan; Spencer, Porter A.
1987-01-01
Perhaps the most critical structural system aboard the Space Station is the Solar Alpha Rotary Joint, which helps align the power generation system with the sun. The joint must provide structural support and controlled rotation to the outboard transverse booms as well as power and data transfer across the joint. The Solar Alpha Rotary Joint is composed of two transition sections and an integral, large-diameter bearing. The alpha joint bearing design presents a particularly interesting problem because of its large size and the need for high reliability, stiffness, and on-orbit maintainability. The discrete roller bearing developed is a novel refinement of cam follower technology. It offers thermal compensation and ease of on-orbit maintenance that are not found in conventional rolling element bearings. How the bearing design evolved is summarized. Driving requirements are reviewed, alternative concepts assessed, and the selected design is described.
NASA Astrophysics Data System (ADS)
Lary, D. J.
2013-12-01
A big data case study is described in which multiple datasets from several satellites, high-resolution global meteorological data, social media, and in-situ observations are combined using machine learning on a distributed cluster with an automated workflow. The resulting global particulate dataset is relevant to global public health studies and could not be produced without the use of the multiple big datasets, in-situ data, and machine learning. To greatly reduce development time and enhance functionality, a high-level language capable of parallel processing (Matlab) has been used. Key considerations for the system are high-speed access, given the large data volume; persistence of the large data volumes; and a precise process-time scheduling capability.
Using Big Data to Discover Diagnostics and Therapeutics for Gastrointestinal and Liver Diseases
Wooden, Benjamin; Goossens, Nicolas; Hoshida, Yujin; Friedman, Scott L.
2016-01-01
Technologies such as genome sequencing, gene expression profiling, proteomic and metabolomic analyses, electronic medical records, and patient-reported health information have produced large amounts of data, from various populations, cell types, and disorders (big data). However, these data must be integrated and analyzed if they are to produce models or concepts about physiologic function or mechanisms of pathogenesis. Many of these data are available to the public, allowing researchers anywhere to search for markers of specific biologic processes or therapeutic targets for specific diseases or patient types. We review recent advances in the fields of computational and systems biology, and highlight opportunities for researchers to use big data sets in the fields of gastroenterology and hepatology, to complement traditional means of diagnostic and therapeutic discovery. PMID:27773806
Neural Network Control of a Magnetically Suspended Rotor System
NASA Technical Reports Server (NTRS)
Choi, Benjamin; Brown, Gerald; Johnson, Dexter
1997-01-01
Magnetic bearings offer significant advantages because of their noncontact operation, which can reduce maintenance. Higher speeds, no friction, no lubrication, weight reduction, precise position control, and active damping make them far superior to conventional contact bearings. However, there are technical barriers that limit the application of this technology in industry. One of them is the need for a nonlinear controller that can overcome the system nonlinearity and uncertainty inherent in magnetic bearings. This paper discusses the use of a neural network as a nonlinear controller that circumvents system nonlinearity. A neural network controller was well trained and successfully demonstrated on a small magnetic bearing rig. This work demonstrated the feasibility of using a neural network to control nonlinear magnetic bearings and systems with unknown dynamics.
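As a rough sketch of the idea (not the controller from the paper), a small feedforward network can map a rotor position error and its rate of change to a coil-current command; the architecture and weights below are hand-picked for illustration, not trained on any real bearing.

```python
import math

# Illustrative single-hidden-layer neural controller for one magnetic bearing
# axis. Inputs: position error and error rate; output: coil-current command.
# The tanh hidden layer gives a smooth, saturating (hence nonlinear) control law.
W1 = [[4.0, 0.5], [-4.0, -0.5]]   # hidden weights, rows act on (error, error_rate)
B1 = [0.0, 0.0]                   # hidden biases
W2 = [2.0, -2.0]                  # output weights

def nn_control(error, error_rate):
    hidden = [math.tanh(w[0] * error + w[1] * error_rate + b)
              for w, b in zip(W1, B1)]
    return sum(wo * h for wo, h in zip(W2, hidden))
```

By construction the command is zero at zero error, opposes the sign of the error, and saturates for large errors, which is the qualitative behavior wanted from a nonlinear bearing controller.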
Priming the Pump for Big Data at Sentara Healthcare.
Kern, Howard P; Reagin, Michael J; Reese, Bertram S
2016-01-01
Today's healthcare organizations are facing significant demands with respect to managing population health, demonstrating value, and accepting risk for clinical outcomes across the continuum of care. The patient's environment outside the walls of the hospital and physician's office, and outside the electronic health record (EHR), has a substantial impact on clinical care outcomes. The use of big data is key to understanding factors that affect the patient's health status and to enhancing clinicians' ability to anticipate how the patient will respond to various therapies. Big data is essential to delivering sustainable, high-quality, value-based healthcare, as well as to the success of new models of care such as clinically integrated networks (CINs) and accountable care organizations. Sentara Healthcare, based in Norfolk, Virginia, has been an early adopter of the technologies that have readied us for our big data journey: EHRs, telehealth-supported electronic intensive care units, and telehealth primary care support through MDLIVE. Although we would not say Sentara is at the cutting edge of the big data trend, it certainly is among the fast followers. Use of big data in healthcare is still at an early stage compared with other industries. Tools for data analytics are maturing, but traditional challenges such as heightened data security and limited human resources remain the primary focus for regional health systems seeking to improve care and reduce costs. Sentara primarily makes actionable use of big data in our CIN, Sentara Quality Care Network, and at our health plan, Optima Health. Big data projects can be expensive, and justifying the expense organizationally has often been easier in times of crisis. We have developed an analytics strategic plan, separate from but aligned with corporate system goals, to ensure optimal investment and management of this essential asset.
Blackboard Customers Consider Alternatives
ERIC Educational Resources Information Center
Young, Jeffrey R.
2008-01-01
Blackboard has become the Microsoft of higher-education technology, say many campus-technology officials, and they do not mean the comparison as a compliment. To them the company is not only big, but also pushy, and many of them love to hate it. This article reports that a growing number of colleges are switching to Moodle, a free, open-source…
ERIC Educational Resources Information Center
Al-Qirim, Nabeel; Rouibah, Kamel; Tarhini, Ali; Serhani, Mohamed Adel; Yammahi, Aishah Rashid; Yammahi, Maraim Ahmed
2018-01-01
This research investigates the personality characteristics of Information Technology students (CIT) in UAE University (UAEU) and how such features impact their IT learning. To achieve this objective, this research attempts to explain the impact of the Big-5 factors on learning using survey research. Results from 179 respondents suggested that…
Space Technology and Earth System Science
NASA Technical Reports Server (NTRS)
Habib, Shahid
2011-01-01
Science must continue to drive technology development. Partnering and data sharing among nations are very important to maximize the cost benefits of such investments. Climate change and adaptability will be big challenges for the next several decades: (1) the frequency and locations of natural disasters; (2) economic and social impacts, which can be global; and (3) water resources and management.
ERIC Educational Resources Information Center
Chalmers, Christina; Carter, Merilyn; Cooper, Tom; Nason, Rod
2017-01-01
Although education experts are increasingly advocating the incorporation of integrated Science, Technology, Engineering, and Mathematics (STEM) curriculum units to address limitations in much current STEM teaching and learning, a review of the literature reveals that more often than not such curriculum units are not mediating the construction of…
IT Certification: Still Valuable after All These Years
ERIC Educational Resources Information Center
Venator, John
2006-01-01
Though the rapid evolution of technology makes it difficult to pinpoint "the next big thing," the outlook for information technology (IT) employment appears to be promising. In fact, several categories of IT jobs currently face shortages of qualified workers, a troubling trend that is projected to get worse over the next five to seven years. These…
NASA Technical Reports Server (NTRS)
Kosovichev, A. G.
1996-01-01
The layer of transition from the nearly rigid rotation of the radiative interior to the latitudinal differential rotation of the convection zone plays a significant role in the internal dynamics of the Sun. Using rotational splitting coefficients of the p-mode frequencies, obtained during 1986-1990 at the Big Bear Solar Observatory, we have found that the thickness of the transitional layer is 0.09 +/- 0.04 solar radii (63 +/- 28 Mm), and that most of the transition occurs beneath the adiabatically stratified part of the convection zone, as suggested by the dynamo theories of the 22 yr solar activity cycle.
Science and Technology Review, January-February 1997
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
Table of contents: accelerators at Livermore; the B-Factory and the Big Bang; assessing exposure to radiation; next generation of computer storage; and a powerful new tool to detect clandestine nuclear tests.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-06-15
... OFFICE OF SCIENCE AND TECHNOLOGY POLICY President's Council of Advisors on Science and Technology... understandings from the domains of science, technology, and innovation may bear on the policy choices before the President. PCAST is administered by the Office of Science and Technology Policy (OSTP). PCAST is co-chaired...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-04-12
... OFFICE OF SCIENCE AND TECHNOLOGY POLICY President's Council of Advisors on Science and Technology... science, technology, and innovation may bear on the policy choices before the President. PCAST is administered by the Office of Science and Technology Policy (OSTP). PCAST is co-chaired by Dr. John P. Holdren...
Future Vision - Emerging Technologies and Their Transformational Potential on the Energy Industry
NASA Technical Reports Server (NTRS)
Fredrickson, Steven E.
2015-01-01
Where will Digital Energy be in ten years? To look that far ahead, we need to broadly consider how artificial intelligence, robotics, big data, nanotechnology, internet-of-things and other rapidly evolving and interrelated technologies will shape mankind's future. A panel of innovative visionary leaders from inside and outside the energy industry will discuss the emerging technologies that will shape the future of industrial operations over the next decade.
JPRS Report, Soviet Union. World Economy & International Relations, No. 12, December 1988.
1989-04-18
conditions of scientific and technological progress is considered to be the main factor influencing such process. Leading companies are trying to...departments, groups etc. Project teams are widely used when a new kind of product is developed or when R and D or technological problems are to be...medium-sized partners are integrated into a single scientific and technological entity by big companies. Cooperation in production is also important
Consideration of Alternate Working Fluid Properties in Gas Lubricated Foil Journal Bearings
NASA Technical Reports Server (NTRS)
Smith, Matthew J.
2004-01-01
The Oil-Free Turbomachinery Program at the NASA Glenn Research Center is committed to revolutionary improvements in the performance, efficiency, and reliability of turbomachinery propulsion systems. One of the key breakthroughs by which this goal is being achieved is the maturation of air-lubricated foil bearing technology. Through experimental testing, foil bearings have demonstrated a variety of exceptional qualities that give them an important role in the future of rotordynamic lubrication. Most of the work done with foil bearings thus far has considered ambient air at atmospheric pressure as the working, or lubricating, fluid in the bearing. However, special applications of oil-free technology require the use of air at non-standard ambient conditions or entirely different working fluids. The NASA Jupiter Icy Moons Orbiter program presents power generation needs far beyond those of any previous space exploration effort. The proposed spacecraft will require significant power generation to provide the propulsion necessary to reach the moons of Jupiter and navigate between them. Once there, extensive scientific research will be conducted that will also impose significant power requirements. Such extreme needs require exploring a new method for power generation in space. A proposed solution involves a Brayton-cycle nuclear fission reactor. The nature of this application requires reliable performance of all reactor components for many years of operation under demanding conditions. This includes the bearings, which will operate with an alternative working fluid: a mixture of helium and xenon gases commonly known as HeXe. This fluid has transport and thermal properties that differ significantly from those of air, and the effect of these property differences on bearing performance must be considered. One of the most promising applications of oil-free technology is in aircraft turbine engines.
Eliminating the oil supply systems from aircraft engines will lead to significant weight and maintenance reductions. In such applications, the lubricating fluid will be high-altitude air, at much lower pressure than at sea level. Again, this property change will alter bearing performance, and analysis is required to quantify the effect. The study of these alternate working fluid properties will be conducted in two ways: analytically and experimentally. Analytical research will include the use of a mathematical code that can predict film thickness profiles for various ambient conditions. Estimates of load capacity can be made from the film thickness trends. These values will then be compared with those obtained from classical rigid-bearing analysis. Experimental research will include testing a foil bearing at a variety of ambient air pressures. The analytical and experimental data will be compared to draw conclusions about bearing performance under alternate working fluid properties.
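The sensitivity of a gas film to ambient pressure can be illustrated with the standard compressibility ("bearing") number for a gas journal bearing, Lambda = 6*mu*omega*(R/c)^2 / p_a, which grows as ambient pressure drops; the property values below are rough illustrative figures, not design data from this work.

```python
# Compressibility ("bearing") number for a gas journal bearing.
# Lower ambient pressure (e.g., high-altitude air) raises Lambda,
# changing the film behavior and hence bearing performance.

def bearing_number(mu, omega, R, c, p_a):
    """mu: gas viscosity [Pa*s], omega: shaft speed [rad/s],
       R: journal radius [m], c: radial clearance [m], p_a: ambient pressure [Pa]."""
    return 6.0 * mu * omega * (R / c) ** 2 / p_a

mu_air = 1.8e-5          # air viscosity near room temperature (approximate)
omega = 3000.0           # shaft speed, rad/s (~28,650 rpm)
R, c = 0.025, 25e-6      # 25 mm journal radius, 25 micron radial clearance

sea_level = bearing_number(mu_air, omega, R, c, 101325.0)  # sea-level ambient
altitude = bearing_number(mu_air, omega, R, c, 30000.0)    # ~9 km ambient
```

Since Lambda scales as 1/p_a, the same bearing at this altitude operates at roughly 3.4 times the sea-level bearing number, which is the kind of property change the abstract says must be analyzed.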
Nanobiotech in big pharma: a business perspective.
Würmseher, Martin; Firmin, Lea
2017-03-01
Since the early 2000s, numerous publications have presented major scientific opportunities that can be achieved through integrating insights from the area of nanotech into biotech (nanobiotech). This paper aims to explore the economic significance that nanobiotech has gained in the established pharmaceutical industry (big pharma). The empirical investigation draws on patent data as well as product revenue data; and to put the results into perspective, the amounts are compared with the established/traditional biotech sector. The results indicate that the new technology still plays only a minor role - at least from a commercial perspective.
Sustainable urban water systems in rich and poor cities--steps towards a new approach.
Newman, P
2001-01-01
The 'big pipes in, big pipes out' approach to urban water management was developed in the 19th century for a particular linear urban form. Large, sprawling, car-dependent cities are pushing this approach to new limits in rich cities, and it has never worked in poor cities. An alternative, which uses new small-scale technology and is more community-based, is suggested for both rich and poor countries. The Sydney Olympics and a demonstration project in Java show that the approach can work.
Transportation Big Data: Unbiased Analysis and Tools to Inform Sustainable Transportation Decisions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Today, transportation operation and energy systems data are generated at an unprecedented scale. The U.S. Department of Energy's National Renewable Energy Laboratory (NREL) is the go-to source for expertise in providing data and analysis to inform industry and government transportation decision making. The lab's teams of data experts and engineers are mining and analyzing large sets of complex data -- or 'big data' -- to develop solutions that support the research, development, and deployment of market-ready technologies that reduce fuel consumption and greenhouse gas emissions.
Gathering the forgotten voices: an oral history of the CFHT's early years
NASA Astrophysics Data System (ADS)
Laychak, Mary Beth; Bryson, Liz
2011-06-01
They came to the Big Island from as far away as Murrumbeena, Australia, and as near by as Hilo, Hawaii. They were progeny of Scottish coal miners, French physicists, Chicago truck drivers, Japanese samurai and Big Island cane workers. Together, these men and women would build and commission one of the most dynamic and productive 3.6 meter telescopes in the world that remains in the forefront of science and technology. The CFHT oral history DVD preserves the stories of the first decade and a half of the observatory.
Cryostatless high temperature supercurrent bearings for rocket engine turbopumps
NASA Technical Reports Server (NTRS)
Rao, Dantam K.; Dill, James F.
1989-01-01
The rocket engine systems examined include SSME, ALS, and CTV systems. The liquid hydrogen turbopumps in the SSME and ALS vehicle systems are identified as potentially attractive candidates for the development of supercurrent bearings, since the temperature around the bearings is about 30 K, considerably lower than the 95 K transition temperatures of HTS materials. At these temperatures, current HTS materials are shown to be capable of developing significantly higher current densities. This higher current density capability makes the development of supercurrent bearings for rocket engines an attractive proposition. These supercurrent bearings are also shown to offer significant advantages over conventional bearings used in rocket engines. They can increase life and reliability over rolling element bearings because of noncontact operation. They offer lower power loss than conventional fluid film bearings. Compared to conventional magnetic bearings, they can reduce the weight of controllers significantly and require lower power because of the use of persistent currents. In addition, four technology areas that require further attention have been identified: supercurrent bearing conceptual design verification; HTS magnet fabrication and testing; cryosensor and controller development; and rocket engine environmental compatibility testing.
NASA Technical Reports Server (NTRS)
Della-Corte, Christopher
2012-01-01
Foil gas bearings are a key technology in many commercial and emerging oil-free turbomachinery systems. These bearings are nonlinear and have been difficult to model analytically in terms of performance characteristics such as load capacity, power loss, stiffness, and damping. Previous investigations led to an empirically derived method to estimate load capacity. This method has been a valuable tool in system development. The current work extends this tool concept to include rules for stiffness and damping coefficient estimation. It is expected that these rules will further accelerate the development and deployment of advanced oil-free machines operating on foil gas bearings.
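The empirically derived load-capacity method referenced here has a published rule-of-thumb form, W_max = D_coeff * (L*D) * (D*N), with L and D the bearing length and diameter in inches, N the speed in krpm, and D_coeff an empirical coefficient in lbf/(in^3*krpm) spanning roughly 0.3 (early designs) to about 1.0 (advanced designs). The sketch below applies this form with illustrative numbers; consult the original rule for design use.

```python
# Rule-of-thumb foil air journal bearing load capacity (illustrative sketch):
#   W_max = D_coeff * (L * D) * (D * N)
# d_coeff : empirical load capacity coefficient [lbf/(in^3*krpm)], ~0.3 to ~1.0
# length_in, diam_in : bearing length and journal diameter [in]
# speed_krpm : shaft speed [krpm]

def foil_bearing_load_capacity(d_coeff, length_in, diam_in, speed_krpm):
    return d_coeff * (length_in * diam_in) * (diam_in * speed_krpm)

# An advanced-generation 1.5 in x 1.5 in bearing at 30 krpm:
w_max = foil_bearing_load_capacity(1.0, 1.5, 1.5, 30.0)   # lbf
```

Note the linear scaling with speed: doubling N doubles the estimated load capacity, which is what makes the rule convenient for quick sizing during system development.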
Bioinformatics clouds for big data manipulation
2012-01-01
As advances in life sciences and information technology bring profound influences on bioinformatics due to its interdisciplinary nature, bioinformatics is experiencing a new leap forward from in-house computing infrastructure to utility-supplied cloud computing delivered over the Internet, in order to handle the vast quantities of biological data generated by high-throughput experimental technologies. Albeit relatively new, cloud computing promises to address big data storage and analysis issues in the bioinformatics field. Here we review extant cloud-based services in bioinformatics, classify them into Data as a Service (DaaS), Software as a Service (SaaS), Platform as a Service (PaaS), and Infrastructure as a Service (IaaS), and present our perspectives on the adoption of cloud computing in bioinformatics. This article was reviewed by Frank Eisenhaber, Igor Zhulin, and Sandor Pongor. PMID:23190475
[Omics and big data: major advances towards the personalized medicine of the future?]
Scheen, A J
2015-01-01
The increasing interest in personalized medicine has evolved together with two major technological advances. First, new-generation, rapid, and less expensive DNA sequencing methods, combined with remarkable progress in molecular biology, have led to the post-genomic era (transcriptomics, proteomics, metabolomics). Second, the refinement of computing tools now allows the immediate analysis of huge amounts of data (especially those resulting from the omics approaches) and thus creates a new universe for medical research: analysis by computerized modelling. This article for scientific communication and popularization briefly describes the main advances in these two fields. These technological advances are combined with those occurring in communication, which makes possible the development of artificial intelligence. Together, these major advances will most probably form the grounds of the future personalized medicine.
The NASA Beyond Einstein Program
NASA Technical Reports Server (NTRS)
White, Nicholas E.
2006-01-01
Einstein's legacy is incomplete; his theory of General Relativity raises -- but cannot answer -- three profound questions: What powered the Big Bang? What happens to space, time, and matter at the edge of a black hole? And what is the mysterious dark energy pulling the Universe apart? The Beyond Einstein program within NASA's Office of Space Science aims to answer these questions, employing a series of missions linked by powerful new technologies and complementary approaches towards shared science goals. The Beyond Einstein program has three linked elements which advance science and technology towards two visions: to directly detect gravitational wave signals from the earliest possible moments of the Big Bang, and to image the event horizon of a black hole. The central element is a pair of Einstein Great Observatories, Constellation-X and LISA. Constellation-X is a powerful new X-ray observatory dedicated to X-ray spectroscopy. LISA is the first space-based gravitational wave detector. These powerful facilities will blaze new paths to the questions about black holes, the Big Bang, and dark energy. The second element is a series of competitively selected Einstein Probes, each focused on one of the science questions, including a mission dedicated to resolving the dark energy mystery. The third element is a program of technology development, theoretical studies, and education. The Beyond Einstein program is a new element in the proposed NASA budget for 2004. This talk will give an overview of the program and the missions contained within it.