Social Data Analytics Using Tensors and Sparse Techniques
ERIC Educational Resources Information Center
Zhang, Miao
2014-01-01
The development of internet and mobile technologies is driving an earthshaking social media revolution. It brings the internet world a huge amount of social media content, such as images, videos, comments, etc. This massive media content and these complicated social structures require analytic expertise to transform the flood of information into…
Handling of huge multispectral image data volumes from a spectral hole burning device (SHBD)
NASA Astrophysics Data System (ADS)
Graff, Werner; Rosselet, Armel C.; Wild, Urs P.; Gschwind, Rudolf; Keller, Christoph U.
1995-06-01
We use chlorin-doped polymer films at low temperatures as the primary imaging detector. Based on the principles of persistent spectral hole burning, this system is capable of storing spatial and spectral information simultaneously in one exposure with extremely high resolution. The sun as an extended light source has been imaged onto the film. The information recorded amounts to tens of GBytes. This data volume is read out by scanning the frequency of a tunable dye laser and reading the images with a digital CCD camera. For acquisition, archival, processing, and visualization, we use MUSIC (MUlti processor System with Intelligent Communication), a single instruction multiple data parallel processor system equipped with the necessary I/O facilities. The huge amount of data requires the development of sophisticated algorithms to efficiently calibrate the data and to extract useful and new information for solar physics.
An Internet of Things platform architecture for supporting ambient assisted living environments.
Tsirmpas, Charalampos; Kouris, Ioannis; Anastasiou, Athanasios; Giokas, Kostas; Iliopoulou, Dimitra; Koutsouris, Dimitris
2017-01-01
Internet of Things (IoT) is the logical further development of today's Internet, enabling a huge number of devices to communicate, compute, sense and act. IoT sensors placed in Ambient Assisted Living (AAL) environments enable context awareness and allow the support of the elderly in their daily routines, ultimately allowing an independent and safe lifestyle. The vast amount of data that is generated and exchanged between the IoT nodes requires innovative context modeling approaches that go beyond currently used models. The current paper presents and evaluates an open interoperable platform architecture that utilizes the technical characteristics of IoT and handles the large amount of generated data, as a solution to the technical requirements of AAL applications.
ERIC Educational Resources Information Center
Lenkeit, Jenny
2013-01-01
Educational effectiveness research often appeals to "value-added models (VAM)" to gauge the impact of schooling on student learning net of the effect of student background variables. A huge number of cross-sectional studies do not, however, meet VAM's requirement for longitudinal data. "Contextualised attainment models (CAM)"…
Current Status of Astrometry Satellite missions in Japan: JASMINE project series
NASA Astrophysics Data System (ADS)
Yano, T.; Gouda, N.; Kobayashi, Y.; Tsujimoto, T.; Hatsutori, Y.; Murooka, J.; Niwa, Y.; Yamada, Y.
Astrometry satellites have common technological issues. (A) Astrometry satellites are required to measure the positions of stars with high accuracy from the huge amount of data gathered during the observational period. (B) High stabilization of the thermal environment in the telescope is required. (C) Attitude-pointing stability of these satellites with sub-pixel accuracy is also required. Measuring the positions of stars from a huge amount of data is the essence of astrometry. Systematic errors must be adequately excluded for each star image in order to obtain accurate positions. We have carried out a centroiding experiment for determining the positions of stars from about 10,000 images. The following two points are important issues for the mission system of JASMINE in order to achieve our aim. For small-JASMINE, we require thermal stabilization of the telescope in order to obtain a high astrometric accuracy of about 10 micro-arcsec. In order to accomplish a measurement of the positions of stars with high accuracy, we must model the distortion of the image on the focal plane with an accuracy of less than 0.1 nm. We have shown numerically that the above requirement is achieved if the thermal variation is within about 1 K / 0.75 h. We also require an attitude-pointing stability of about 200 mas / 7 s. The utilization of a tip-tilt mirror will make it possible to achieve such stable pointing.
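The centroiding step mentioned above amounts to estimating a star's position from its pixel image with sub-pixel accuracy. A minimal sketch of one common estimator, an intensity-weighted centre of mass, is given below; this is an illustrative assumption, not necessarily the algorithm used by the JASMINE team, and the window size and background handling are made up for the example.

```python
import numpy as np

def centroid(image, background=0.0):
    """Estimate a star's sub-pixel position as the intensity-weighted
    centre of mass of a small, background-subtracted image window."""
    img = np.clip(image.astype(float) - background, 0.0, None)
    total = img.sum()
    if total == 0:
        raise ValueError("no flux above background in this window")
    ys, xs = np.indices(img.shape)
    return (xs * img).sum() / total, (ys * img).sum() / total  # (x, y) in pixels

# Toy example: a Gaussian star centred at (x, y) = (12.3, 8.7) in a 24x16 window.
ys, xs = np.indices((16, 24))
star = 1000.0 * np.exp(-((xs - 12.3) ** 2 + (ys - 8.7) ** 2) / (2 * 1.5 ** 2))
print(centroid(star))  # close to (12.3, 8.7)
```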
Glasses-free large size high-resolution three-dimensional display based on the projector array
NASA Astrophysics Data System (ADS)
Sang, Xinzhu; Wang, Peng; Yu, Xunbo; Zhao, Tianqi; Gao, Xing; Xing, Shujun; Yu, Chongxiu; Xu, Daxiong
2014-11-01
To realize natural three-dimensional (3D) video display without eye-wear, similar to real life, a huge amount of 3D spatial information is normally required to increase the number of views and to provide smooth motion parallax. However, the minimum 3D information needed by the eyes should be used, to reduce the requirements on display devices and processing time. For 3D display with smooth motion parallax similar to a holographic stereogram, the size of the virtual viewing slit should be smaller than the pupil size of the eye at the largest viewing distance. To increase the resolution, two glasses-free 3D display systems, rear- and front-projection, are presented based on space multiplexing with a micro-projector array and specially designed 3D diffuse screens with a size above 1.8 m × 1.2 m. The displayed clear depth is larger than 1.5 m. The flexibility of digitized recording and reconstruction based on the 3D diffuse screen relieves the limitations of conventional 3D display technologies and can realize fully continuous, natural 3D display. In the display system, aberration is well suppressed and low crosstalk is achieved.
ERIC Educational Resources Information Center
Mason, Lucia; Pluchino, Patrik; Ariasi, Nicola
2014-01-01
Students search the Web frequently for many purposes, one of which is to search for information for academic assignments. Given the huge amount of easily accessible online information, they are required to develop new reading skills and become more able to effectively evaluate the reliability of web sources. This study investigates the distribution of…
Digital pathology: DICOM-conform draft, testbed, and first results.
Zwönitzer, Ralf; Kalinski, Thomas; Hofmann, Harald; Roessner, Albert; Bernarding, Johannes
2007-09-01
Hospital information systems are state of the art nowadays. Therefore, Digital Pathology, also labelled as Virtual Microscopy, has gained increased attention. Triggered by radiology, standardized information models and workflows have been defined worldwide based on DICOM. However, DICOM-conform integration of Digital Pathology into existing clinical information systems imposes new problems requiring specific solutions concerning the huge amount of data as well as the special structure of the data to be managed, transferred, and stored. We implemented a testbed to realize and evaluate the workflow of digitized slides from acquisition to archiving. The experiences led to the draft of a DICOM-conform information model that accounted for extensions, definitions, and technical requirements necessary to integrate digital pathology into a hospital-wide DICOM environment. Slides were digitized, compressed, and could be viewed remotely. Real-time transfer of the huge amount of data was optimized using streaming techniques. Compared to a recent discussion in the DICOM Working Group for Digital Pathology (WG26), our experiences led to a preference for JPEG2000/JPIP-based streaming of the whole slide image. The results showed that digital pathology is feasible, but strong efforts by users and vendors are still necessary to integrate Digital Pathology into existing information systems.
ERIC Educational Resources Information Center
Choi, Samuel P. M.; Lam, S. S.; Li, Kam Cheong; Wong, Billy T. M.
2018-01-01
While learning analytics (LA) practices have been shown to be practical and effective, most of them require a huge amount of data and effort. This paper reports a case study which demonstrates the feasibility of practising LA at a low cost for instructors to identify at-risk students in an undergraduate business quantitative methods course.…
Exploring Cloud Computing for Large-scale Scientific Applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lin, Guang; Han, Binh; Yin, Jian
This paper explores cloud computing for large-scale data-intensive scientific applications. Cloud computing is attractive because it provides hardware and software resources on demand, which relieves the burden of acquiring and maintaining a huge amount of resources that may be used only once by a scientific application. However, unlike typical commercial applications that often require only a moderate amount of ordinary resources, large-scale scientific applications often need to process enormous amounts of data in the terabyte or even petabyte range and require special high-performance hardware with low-latency connections to complete computation in a reasonable amount of time. To address these challenges, we build an infrastructure that can dynamically select high-performance computing hardware across institutions and dynamically adapt the computation to the selected resources to achieve high performance. We have also demonstrated the effectiveness of our infrastructure by building a systems biology application and an uncertainty quantification application for carbon sequestration, which can efficiently utilize data and computation resources across several institutions.
Distributed and parallel approach for handle and perform huge datasets
NASA Astrophysics Data System (ADS)
Konopko, Joanna
2015-12-01
Big Data refers to the dynamic, large and disparate volumes of data coming from many different sources (tools, machines, sensors, mobile devices), often uncorrelated with each other. It requires new, innovative and scalable technology to collect, host and analytically process the vast amount of data. A proper architecture for a system that processes huge data sets is needed. In this paper, a comparison of distributed and parallel system architectures is presented using the example of the MapReduce (MR) Hadoop platform and a parallel database platform (DBMS). This paper also analyzes the problem of extracting and handling valuable information from petabytes of data. Both paradigms, MapReduce and parallel DBMS, are described and compared. A hybrid architecture approach is also proposed, which could be used to solve the analyzed problem of storing and processing Big Data.
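As a minimal illustration of the MapReduce side of this comparison (with hypothetical records and keys, not the paper's benchmark data), the sketch below expresses a simple aggregation as map and reduce functions over partitioned input; the same query would be a one-line GROUP BY in a parallel DBMS.

```python
from collections import defaultdict
from itertools import chain

# Hypothetical partitioned input: (sensor_id, reading) records held on different nodes.
partitions = [
    [("s1", 2.0), ("s2", 3.5), ("s1", 1.0)],
    [("s2", 0.5), ("s3", 4.0), ("s1", 2.5)],
]

def map_phase(records):
    # Emit (key, value) pairs; here simply (sensor, reading).
    for sensor, reading in records:
        yield sensor, reading

def reduce_phase(key, values):
    # Aggregate all values for one key; here the average reading per sensor.
    values = list(values)
    return key, sum(values) / len(values)

# "Shuffle": group mapper output by key, as the framework would do between phases.
groups = defaultdict(list)
for key, value in chain.from_iterable(map_phase(p) for p in partitions):
    groups[key].append(value)

print(dict(reduce_phase(k, v) for k, v in groups.items()))

# Equivalent parallel-DBMS formulation, for comparison:
#   SELECT sensor_id, AVG(reading) FROM readings GROUP BY sensor_id;
```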
Set-Membership Identification for Robust Control Design
1993-04-28
...the system G can be regarded as having no memory of events prior to t = 1, the initial time. ... Also in our application, the size of the matrices involved is quite large and special attention should be paid to memory management and algorithmic implementation; otherwise huge amounts of memory will be required to perform the optimization even for modest values of M and N.
Intrusion Prevention and Detection in Grid Computing - The ALICE Case
NASA Astrophysics Data System (ADS)
Gomez, Andres; Lara, Camilo; Kebschull, Udo
2015-12-01
Grids allow users flexible, on-demand usage of computing resources through remote communication networks. A remarkable example of a Grid in High Energy Physics (HEP) research is used in the ALICE experiment at the European Organization for Nuclear Research (CERN). Physicists can submit jobs to process the huge amount of particle collision data produced by the Large Hadron Collider (LHC). Grids face complex security challenges. They are interesting targets for attackers seeking huge computational resources. Since users can execute arbitrary code in the worker nodes on the Grid sites, special care must be taken in this environment. Automatic tools to harden and monitor this scenario are required. Currently, there is no integrated solution for this requirement. This paper describes a new security framework that allows execution of job payloads in a sandboxed context. It also allows process behaviour monitoring to detect intrusions, even when new attack methods or zero-day vulnerabilities are exploited, by means of a Machine Learning approach. We plan to implement the proposed framework as a software prototype that will be tested as a component of the ALICE Grid middleware.
Ntofon, Okung-Dike; Channegowda, Mayur P; Efstathiou, Nikolaos; Rashidi Fard, Mehdi; Nejabati, Reza; Hunter, David K; Simeonidou, Dimitra
2013-02-25
In this paper, a novel Software-Defined Networking (SDN) architecture is proposed for high-end Ultra High Definition (UHD) media applications. UHD media applications require huge amounts of bandwidth that can only be provided by high-capacity optical networks. In addition, there are requirements for control frameworks capable of delivering effective application performance with efficient network utilization. A novel SDN-based Controller that tightly integrates application-awareness with network control and management is proposed for such applications. An OpenFlow-enabled test-bed demonstrator is reported with performance evaluations of advanced online and offline media- and network-aware schedulers.
NASA Astrophysics Data System (ADS)
Christofori, E.; Bierwagen, J.
2013-07-01
Recording cultural heritage objects using terrestrial laser scanning has become more and more popular over the last years. Since terrestrial laser scanning system (TLS) manufacturers have strongly increased the amount and speed of data captured with a single scan at each system upgrade while cutting down system costs, the use of TLS systems for recording cultural heritage is an option worth considering alongside traditional methods such as photogrammetry. TLS systems can be a great tool for capturing complex cultural heritage objects within a short amount of time, but they can be a nightmare to handle in further processing if not used correctly during capture. Furthermore, TLS systems still have to be regarded as survey equipment, even though some manufacturers promote them as everyday tools. They have to be used in an intelligent way, keeping in mind the client's and the individual cultural object's needs. Thus, the efficient use of TLS systems for data recording becomes a relevant topic in order to deal with the huge amount of data the systems collect while recording. Even small projects can turn into huge point cloud datasets that end users such as architects or archaeologists cannot handle, as their technical equipment does not meet the requirements of the dataset, nor do they have the software tools to use the data, since current software tools are still highly priced. Even the necessary interpretation of the dataset can be a tough task if the people who have to work with the point cloud are not properly trained to understand TLS and the results it creates. The use of TLS systems has to take into account the project requirements of the individual heritage object, such as the required accuracy, standards for levels of detail (e.g. "Empfehlungen für die Baudokumentation", Günther Eckstein, Germany), the required kind of deliverables (visualization, 2D drawings, true deformation drawings, 3D models, BIM or 4D animations), as well as the project's budget, restrictions and special conditions of the object. If used in the right way, TLS will fulfil all requests and furthermore create additional recording, deliverable and financial benefits. Christofori und Partner has been working with TLS systems on cultural heritage objects since 2005, trying to optimize the use of these systems (including combining different systems such as TLS, photogrammetry or new techniques) as well as creating usable deliverables the clients (owners, conservators, designers and the public) can work with.
Who owns Australia's water--elements of an effective regulatory model.
McKay, J
2003-01-01
This paper identifies and describes a number of global trends in regulatory theory and legal scholarship. It points out the huge level of complexity demanded by globalisation; the unfortunate complication of this is legal indeterminacy. The legal indeterminacy springs from the desire to amend and alter existing models. That has been the thrust of the Council of Australian Governments changes, which adapt and add huge amounts of complexity to a flawed system. This paper argues that an effective water regulatory model requires a fundamental re-examination of the concept of water ownership and a capturing by the State of the right to allocate rainfall. This foundation is effective and the way forward to deal with the key issues in this transition phase. The second key element of an effective regulatory model is the concept of performance-based assessment. This requires information and schemes to be set up to work out ways to monitor and evaluate the performance of the utility on selected criteria. For Australia at present there is a dire lack of agreed criteria on these key issues, and these have the potential to pull apart the whole process. The key issues are indigenous rights, governance issues, public participation, alteration of pre-existing rights and incorporation of environmental requirements.
Development of a Database System for Data Obtained by Hyper Suprime-Cam on the Subaru Telescope
NASA Astrophysics Data System (ADS)
Yamada, Y.; Takata, T.; Furusawa, H.; Okura, Y.; Koike, M.; Yamanoi, H.; Yasuda, N.; Bickerton, S.; Katayama, N.; Mineo, S.; Lupton, R.; Bosch, J.; Loomis, C.; Miyatake, H.; Price, P.; Smith, K.; Lang, D.
2014-05-01
We are developing a database system for the Hyper Suprime-Cam (HSC) data on the Subaru Telescope in preparation for the HSC Survey. Since HSC has a huge field of view (1.5 degree diameter), it will produce a huge amount of data. Here, we make a brief report on the prototype of our database.
Next Generation Mass Memory Architecture
NASA Astrophysics Data System (ADS)
Herpel, H.-J.; Stahle, M.; Lonsdorfer, U.; Binzer, N.
2010-08-01
Future Mass Memory units will have to cope with various demanding requirements driven by onboard instruments (optical and SAR) that generate a huge amount of data (>10 Tbit) at a data rate >6 Gbps. For the downlink, data rates around 3 Gbps will be feasible using the latest Ka-band technology together with Variable Coding and Modulation (VCM) techniques. These high data rates and storage capacities need to be effectively managed. Therefore, data structures and data management functions have to be improved and adapted to existing standards like the Packet Utilisation Standard (PUS). In this paper we present a highly modular and scalable architectural approach for mass memories in order to support a wide range of mission requirements.
DICOMGrid: a middleware to integrate PACS and EELA-2 grid infrastructure
NASA Astrophysics Data System (ADS)
Moreno, Ramon A.; de Sá Rebelo, Marina; Gutierrez, Marco A.
2010-03-01
Medical images provide a lot of information for physicians, but the huge amount of data produced by medical imaging equipment in a modern health institution is not yet completely explored to its full potential. Nowadays medical images are used in hospitals mostly as part of routine activities, while their intrinsic value for research is underestimated. Medical images can be used for the development of new visualization techniques, new algorithms for patient care and new image processing techniques. These research areas usually require the use of huge volumes of data to obtain significant results, along with enormous computing capabilities. Such qualities are characteristic of grid computing systems such as the EELA-2 infrastructure. Grid technologies allow the sharing of data on a large scale in a safe and integrated environment and offer high computing capabilities. In this paper we describe DicomGrid, used to store and retrieve medical images, properly anonymized, that can be used by researchers to test new processing techniques, using the computational power offered by grid technology. A prototype of DicomGrid is under evaluation and permits the submission of jobs into the EELA-2 grid infrastructure while offering a simple interface that requires minimal understanding of the grid operation.
Storage and Retrieval of Large RDF Graph Using Hadoop and MapReduce
NASA Astrophysics Data System (ADS)
Farhan Husain, Mohammad; Doshi, Pankil; Khan, Latifur; Thuraisingham, Bhavani
Handling huge amounts of data scalably has been a matter of concern for a long time. The same is true for semantic web data. Current semantic web frameworks lack this ability. In this paper, we describe a framework that we built using Hadoop to store and retrieve large numbers of RDF triples. We describe our schema to store RDF data in the Hadoop Distributed File System. We also present our algorithms to answer a SPARQL query. We make use of Hadoop's MapReduce framework to actually answer the queries. Our results reveal that we can store huge amounts of semantic web data in Hadoop clusters built mostly from cheap commodity-class hardware and still answer queries fast enough. We conclude that ours is a scalable framework, able to handle large amounts of RDF data efficiently.
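The abstract does not spell out the authors' exact HDFS schema; the sketch below only illustrates the general idea of predicate-partitioned triple storage and a reduce-side join for one SPARQL pattern, with all predicates, URIs and file layout as illustrative assumptions.

```python
from collections import defaultdict

# Toy triple store partitioned by predicate, as one might lay out files in HDFS
# (e.g. one file per predicate). All predicates and resource names are made up.
store = {
    "worksFor": [("alice", "acme"), ("bob", "acme"), ("carol", "globex")],
    "locatedIn": [("acme", "london"), ("globex", "paris")],
}

def query_people_in(city):
    """Answer: SELECT ?p WHERE { ?p worksFor ?c . ?c locatedIn <city> }
    as a reduce-side join on the shared variable ?c."""
    grouped = defaultdict(lambda: {"people": [], "in_city": False})
    # "Map": emit the join key ?c from both predicate partitions.
    for person, company in store["worksFor"]:
        grouped[company]["people"].append(person)
    for company, where in store["locatedIn"]:
        if where == city:
            grouped[company]["in_city"] = True
    # "Reduce": join per key.
    return [p for g in grouped.values() if g["in_city"] for p in g["people"]]

print(query_people_in("london"))  # ['alice', 'bob']
```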
Immunoinformatics: an integrated scenario
Tomar, Namrata; De, Rajat K
2010-01-01
Genome sequencing of humans and other organisms has led to the accumulation of huge amounts of data, which include immunologically relevant data. A large volume of clinical data has been deposited in several immunological databases and as a result immunoinformatics has emerged as an important field which acts as an intersection between experimental immunology and computational approaches. It not only helps in dealing with the huge amount of data but also plays a role in defining new hypotheses related to immune responses. This article reviews classical immunology, different databases and prediction tools. It also describes applications of immunoinformatics in designing in silico vaccination and immune system modelling. All these efforts save time and reduce cost. PMID:20722763
NASA Astrophysics Data System (ADS)
Efstathiou, Nectarios; Skitsas, Michael; Psaroudakis, Chrysostomos; Koutras, Nikolaos
2017-09-01
Nowadays, video surveillance cameras are used for the protection and monitoring of a huge number of facilities worldwide. An important element in such surveillance systems is the use of aerial video streams originating from onboard sensors located on Unmanned Aerial Vehicles (UAVs). Video surveillance using UAVs represents a vast amount of video to be transmitted, stored, analyzed and visualized in real time. As a result, the introduction and development of systems able to handle huge amounts of data becomes a necessity. In this paper, a new approach for the collection, transmission and storage of aerial videos and metadata is introduced. The objective of this work is twofold: first, the integration of the appropriate equipment in order to capture and transmit real-time video including metadata (i.e. position coordinates, target) from the UAV to the ground and, second, the utilization of the ADITESS Versatile Media Content Management System (VMCMS-GE) for storing the video stream and the appropriate metadata. Beyond storage, VMCMS-GE provides other efficient management capabilities such as searching and processing of videos, along with video transcoding. For the evaluation and demonstration of the proposed framework, we execute a use case in which the surveillance of critical infrastructure and the detection of suspicious activities is performed. Transcoding of the collected video is a subject of this evaluation as well.
Enabling Graph Appliance for Genome Assembly
DOE Office of Scientific and Technical Information (OSTI.GOV)
Singh, Rina; Graves, Jeffrey A; Lee, Sangkeun
2015-01-01
In recent years, there has been a huge growth in the amount of genomic data available as reads generated from various genome sequencers. The number of reads generated can be huge, ranging from hundreds to billions of nucleotides, each read varying in size. Assembling such large amounts of data is one of the challenging computational problems for both biomedical and data scientists. Most of the genome assemblers developed have used de Bruijn graph techniques. A de Bruijn graph represents a collection of read sequences by billions of vertices and edges, which require large amounts of memory and computational power to store and process. This is the major drawback to de Bruijn graph assembly. Massively parallel, multi-threaded, shared-memory systems can be leveraged to overcome some of these issues. The objective of our research is to investigate the feasibility and scalability issues of de Bruijn graph assembly on Cray's Urika-GD system; Urika-GD is a high-performance graph appliance with a large shared memory and a massively multithreaded custom processor designed for executing SPARQL queries over large-scale RDF data sets. However, to the best of our knowledge, there is no research on representing a de Bruijn graph as an RDF graph or finding Eulerian paths in RDF graphs using SPARQL for potential genome discovery. In this paper, we address the issues involved in representing de Bruijn graphs as RDF graphs and propose an iterative querying approach for finding Eulerian paths in large RDF graphs. We evaluate the performance of our implementation on real-world Ebola genome datasets and illustrate how genome assembly can be accomplished with Urika-GD using iterative SPARQL queries.
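As a minimal sketch of the underlying idea (in plain Python rather than SPARQL over Urika-GD, with toy reads and with repeated k-mer edges collapsed for simplicity), the code below builds a de Bruijn graph from reads as (prefix, suffix) edges, which could equally be stored as RDF triples, and walks an Eulerian path with Hierholzer's algorithm; the paper's iterative-SPARQL formulation replaces the in-memory walk with repeated graph queries.

```python
from collections import defaultdict

def de_bruijn_edges(reads, k):
    """Return the set of (prefix, suffix) edges, i.e. the statements one could
    also store as RDF triples (prefix) -[overlaps]-> (suffix)."""
    edges = set()  # real assemblers keep edge multiplicities; collapsed here
    for read in reads:
        for i in range(len(read) - k + 1):
            kmer = read[i:i + k]
            edges.add((kmer[:-1], kmer[1:]))
    return edges

def eulerian_path(edges):
    """Hierholzer's algorithm on a small, in-memory de Bruijn graph."""
    adj, out_deg, in_deg = defaultdict(list), defaultdict(int), defaultdict(int)
    for u, v in edges:
        adj[u].append(v)
        out_deg[u] += 1
        in_deg[v] += 1
    # Start at a node with one more outgoing than incoming edge, if any.
    start = next((n for n in adj if out_deg[n] - in_deg[n] == 1), next(iter(adj)))
    stack, path = [start], []
    while stack:
        node = stack[-1]
        if adj[node]:
            stack.append(adj[node].pop())
        else:
            path.append(stack.pop())
    return path[::-1]

reads = ["ACGTG", "CGTGC", "GTGCA"]                 # toy reads
path = eulerian_path(de_bruijn_edges(reads, k=3))
print(path[0] + "".join(n[-1] for n in path[1:]))   # reassembled: ACGTGCA
```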
Profiling and Improving I/O Performance of a Large-Scale Climate Scientific Application
NASA Technical Reports Server (NTRS)
Liu, Zhuo; Wang, Bin; Wang, Teng; Tian, Yuan; Xu, Cong; Wang, Yandong; Yu, Weikuan; Cruz, Carlos A.; Zhou, Shujia; Clune, Tom;
2013-01-01
Exascale computing systems are soon to emerge, and they will pose great challenges with respect to the huge gap between computing and I/O performance. Many large-scale scientific applications play an important role in our daily life. The huge amounts of data generated by such applications require highly parallel and efficient I/O management policies. In this paper, we adopt a mission-critical scientific application, GEOS-5, as a case to profile and analyze the communication and I/O issues that are preventing applications from fully utilizing the underlying parallel storage systems. Through detailed architectural and experimental characterization, we observe that current legacy I/O schemes incur significant network communication overheads and are unable to fully parallelize the data access, thus degrading applications' I/O performance and scalability. To address these inefficiencies, we redesign its I/O framework along with a set of parallel I/O techniques to achieve high scalability and performance. Evaluation results on the NASA Discover cluster show that our optimization of GEOS-5 with ADIOS has led to significant performance improvements compared to the original GEOS-5 implementation.
Impact of ICT on Performance of Construction Companies in Slovakia
NASA Astrophysics Data System (ADS)
Mesároš, Peter; Mandičák, Tomáš
2017-10-01
Information and communication technologies have become a part of the management tools in modern companies. The construction industry and its participants deal with a serious requirement for processing the huge amount of information on construction projects, including design, construction, time and cost parameters, economic efficiency and sustainability. To fulfil this requirement, companies have to use appropriate ICT tools. The aim of the paper is to examine the impact of ICT exploitation on the performance of construction companies. The impact of BIM tools, ERP systems and controlling systems on cost and profit indicators will be measured on a sample of 85 companies from the construction industry in Slovakia. Enterprise size, enterprise ownership and role in the construction process will be set as independent variables for the statistical analysis. The results will be considered for different groups of companies.
A novel data storage logic in the cloud
Mátyás, Bence; Szarka, Máté; Járvás, Gábor; Kusper, Gábor; Argay, István; Fialowski, Alice
2016-01-01
Databases which store and manage long-term scientific information related to life science are used to store huge amounts of quantitative attributes. Introduction of a new entity attribute requires modification of the existing data tables and of the programs that use these data tables. The solution is to increase the number of virtual data tables while the number of screens remains the same. The main objective of the present study was to introduce a logic called Joker Tao (JT), which provides universal data storage for cloud-based databases. It means that all types of input data can be interpreted as an entity and an attribute at the same time, in the same data table. PMID:29026521
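The abstract describes the Joker Tao idea only at a high level; the sketch below shows the general entity-attribute-value flavour of such a single universal table, in which every stored item can act both as an entity and as an attribute of another row. The table and column names are illustrative assumptions, not the published schema.

```python
import sqlite3

# One universal table: every fact is an (entity, attribute, value) row, so adding
# a new attribute never requires altering the schema or the application code.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE facts (entity TEXT, attribute TEXT, value TEXT)")

rows = [
    ("sample_42", "organism", "E. coli"),
    ("sample_42", "od600", "0.73"),
    ("od600", "unit", "absorbance"),   # an attribute used as an entity itself
]
con.executemany("INSERT INTO facts VALUES (?, ?, ?)", rows)

# Query all attributes of one entity; the same table also holds metadata
# describing those attributes.
for row in con.execute(
    "SELECT attribute, value FROM facts WHERE entity = ?", ("sample_42",)
):
    print(row)
```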
Towards an automated intelligence product generation capability
NASA Astrophysics Data System (ADS)
Smith, Alison M.; Hawes, Timothy W.; Nolan, James J.
2015-05-01
Creating intelligence information products is a time-consuming and difficult process for analysts faced with identifying key pieces of information relevant to a complex set of information requirements. Complicating matters, these key pieces of information exist in multiple modalities scattered across data stores, buried in huge volumes of data. This results in the current predicament analysts find themselves in: information retrieval and management consume huge amounts of time that could be better spent performing analysis. The persistent growth in data accumulation rates will only increase the amount of time spent on these tasks without a significant advance in automated solutions for information product generation. We present a product generation tool, Automated PrOduct Generation and Enrichment (APOGEE), which aims to automate the information product creation process in order to shift the bulk of the analysts' effort from data discovery and management to analysis. APOGEE discovers relevant text, imagery, video, and audio for inclusion in information products using semantic and statistical models of unstructured content. APOGEE's mixed-initiative interface, supported by highly responsive backend mechanisms, allows analysts to dynamically control the product generation process, ensuring a maximally relevant result. The combination of these capabilities results in significant reductions in the time it takes analysts to produce information products while helping to increase overall coverage. Through evaluation with a domain expert, APOGEE has been shown to have the potential to cut product generation time by 20x. The result is a flexible end-to-end system that can be rapidly deployed in new operational settings.
Modeling of biological intelligence for SCM system optimization.
Chen, Shengyong; Zheng, Yujun; Cattani, Carlo; Wang, Wanliang
2012-01-01
This article summarizes some methods from biological intelligence for the modeling and optimization of supply chain management (SCM) systems, including genetic algorithms, evolutionary programming, differential evolution, swarm intelligence, artificial immune systems, and other biological intelligence related methods. An SCM system is adaptive, dynamic and open self-organizing, maintained by flows of information, materials, goods, funds, and energy. Traditional methods for modeling and optimizing complex SCM systems require huge amounts of computing resources, and biological intelligence-based solutions can often provide valuable alternatives for efficiently solving problems. The paper summarizes the recent related methods for the design and optimization of SCM systems, covering the most widely used genetic algorithms and other evolutionary algorithms. PMID:22162724
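As a minimal, generic illustration of the first family of methods mentioned (a plain genetic algorithm applied to a toy ordering-cost function, not any specific SCM model from the surveyed literature), the sketch below evolves order quantities for a small set of warehouses:

```python
import random

random.seed(0)
DEMAND = [120, 80, 200, 150]          # toy per-warehouse demand (assumed)
HOLDING, SHORTAGE = 1.0, 4.0          # toy unit costs (assumed)

def cost(orders):
    # Penalise over-ordering (holding cost) and under-ordering (shortage cost).
    return sum(HOLDING * max(o - d, 0) + SHORTAGE * max(d - o, 0)
               for o, d in zip(orders, DEMAND))

def mutate(orders):
    return [max(0, o + random.randint(-20, 20)) for o in orders]

def crossover(a, b):
    return [random.choice(pair) for pair in zip(a, b)]

population = [[random.randint(0, 300) for _ in DEMAND] for _ in range(30)]
for _ in range(200):
    population.sort(key=cost)
    parents = population[:10]                      # simple truncation selection
    population = parents + [mutate(crossover(random.choice(parents),
                                             random.choice(parents)))
                            for _ in range(20)]

best = min(population, key=cost)
print(best, cost(best))   # order quantities converge towards the demand vector
```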
Exploitation and Benefits of BIM in Construction Project Management
NASA Astrophysics Data System (ADS)
Mesároš, Peter; Mandičák, Tomáš
2017-10-01
BIM is increasingly entering the awareness of the construction industry. BIM is the process of creating and managing data about a building during its life cycle. BIM has become a part of the management tools in modern construction companies. Construction projects have a number of participants. This makes construction project management a difficult process, with a serious requirement for processing the huge amount of information involved, including design, construction, time and cost parameters, economic efficiency and sustainability. Progressive information and communication technologies support cost management and the management of construction projects. One of them is Building Information Modelling. The aim of the paper is to examine the impact of BIM exploitation and its benefits on construction project management in Slovak companies.
Efficient boundary hunting via vector quantization
NASA Astrophysics Data System (ADS)
Diamantini, Claudia; Panti, Maurizio
2001-03-01
A great amount of information about a classification problem is contained in those instances falling near the decision boundary. This intuition dates back to the earliest studies in pattern recognition, and to the more recent adaptive approaches to so-called boundary hunting, such as the work of Aha et al. on Instance-Based Learning and the work of Vapnik et al. on Support Vector Machines. The latter work is of particular interest, since theoretical and experimental results ensure the accuracy of boundary reconstruction. However, its optimization approach has heavy computational and memory requirements, which limits its application to huge amounts of data. In this paper we describe an alternative approach to boundary hunting based on adaptive labeled quantization architectures. The adaptation is performed by a stochastic gradient algorithm for the minimization of the error probability. Error probability minimization guarantees an accurate approximation of the optimal decision boundary, while the use of a stochastic gradient algorithm provides an efficient method to reach such an approximation. In the paper, comparisons to Support Vector Machines are considered.
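A minimal sketch of the general idea of adapting labelled codevectors so that they settle near the decision boundary is shown below; this is an LVQ1-style stochastic update on toy data, which only approximates the paper's own algorithm (a stochastic gradient on the error probability itself).

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy two-class data separated along the first coordinate.
X = np.vstack([rng.normal(-1, 0.5, (200, 2)), rng.normal(+1, 0.5, (200, 2))])
y = np.array([0] * 200 + [1] * 200)

# Two labelled codevectors (prototypes) per class, randomly initialised.
protos = rng.normal(0, 1, (4, 2))
labels = np.array([0, 0, 1, 1])

lr = 0.05
for epoch in range(30):
    for i in rng.permutation(len(X)):
        d = np.linalg.norm(protos - X[i], axis=1)
        w = d.argmin()                                   # winning codevector
        sign = 1.0 if labels[w] == y[i] else -1.0        # attract or repel
        protos[w] += sign * lr * (X[i] - protos[w])      # stochastic update

def predict(x):
    return labels[np.linalg.norm(protos - x, axis=1).argmin()]

print(np.mean([predict(x) == t for x, t in zip(X, y)]))  # training accuracy
```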
Self-Assembled Smart Nanocarriers for Targeted Drug Delivery.
Cui, Wei; Li, Junbai; Decher, Gero
2016-02-10
Nanostructured drug-carrier systems promise numerous benefits for drug delivery. They can be engineered to precisely control drug-release rates or to target specific sites within the body with a specific amount of therapeutic agent. However, to achieve the best therapeutic effects, the systems should be designed for carrying the optimum amount of a drug to the desired target where it should be released at the optimum rate for a specified time. Despite numerous attempts, fulfilling all of these requirements in a synergistic way remains a huge challenge. The trend in drug delivery is consequently directed toward integrated multifunctional carrier systems, providing selective recognition in combination with sustained or triggered release. Capsules as vesicular systems enable drugs to be confined for controlled release. Furthermore, carriers modified with recognition groups can enhance the capability of encapsulated drug efficacy. Here, recent advances are reviewed regarding designing and preparing assembled capsules with targeting ligands or size controllable for selective recognition in drug delivery. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Principles of gene microarray data analysis.
Mocellin, Simone; Rossi, Carlo Riccardo
2007-01-01
The development of several gene expression profiling methods, such as comparative genomic hybridization (CGH), differential display, serial analysis of gene expression (SAGE), and gene microarray, together with the sequencing of the human genome, has provided an opportunity to monitor and investigate the complex cascade of molecular events leading to tumor development and progression. The availability of such large amounts of information has shifted the attention of scientists towards a nonreductionist approach to biological phenomena. High throughput technologies can be used to follow changing patterns of gene expression over time. Among them, gene microarray has become prominent because it is easier to use, does not require large-scale DNA sequencing, and allows for the parallel quantification of thousands of genes from multiple samples. Gene microarray technology is rapidly spreading worldwide and has the potential to drastically change the therapeutic approach to patients affected with tumor. Therefore, it is of paramount importance for both researchers and clinicians to know the principles underlying the analysis of the huge amount of data generated with microarray technology.
Faster sequence homology searches by clustering subsequences.
Suzuki, Shuji; Kakuta, Masanori; Ishida, Takashi; Akiyama, Yutaka
2015-04-15
Sequence homology searches are used in various fields. New sequencing technologies produce huge amounts of sequence data, which continuously increase the size of sequence databases. As a result, homology searches require large amounts of computational time, especially for metagenomic analysis. We developed a fast homology search method based on database subsequence clustering and implemented it as GHOSTZ. This method clusters similar subsequences from a database to perform an efficient seed search and ungapped extension by reducing alignment candidates based on the triangle inequality. The database subsequence clustering technique achieved an ∼2-fold increase in speed without a large decrease in search sensitivity. When measured with metagenomic data, GHOSTZ is ∼2.2-2.8 times faster than RAPSearch and ∼185-261 times faster than BLASTX. The source code is freely available for download at http://www.bi.cs.titech.ac.jp/ghostz/. Contact: akiyama@cs.titech.ac.jp. Supplementary data are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press.
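The sketch below illustrates the general flavour of database-subsequence clustering for seed search: fixed-length subsequences are grouped around representatives, and a query seed is compared to cluster members only when the representative is close enough, which the triangle inequality justifies. The distance measure, seed length and thresholds are illustrative assumptions, not GHOSTZ's actual parameters.

```python
from collections import defaultdict

SEED = 4          # illustrative seed (subsequence) length
RADIUS = 1        # max Hamming distance from a cluster representative
THRESHOLD = 1     # max Hamming distance accepted for a seed hit

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

def subsequences(seq, k=SEED):
    return [seq[i:i + k] for i in range(len(seq) - k + 1)]

def cluster(db_seqs):
    """Greedily cluster database subsequences around representatives."""
    clusters = defaultdict(list)          # representative -> (subsequence, source)
    for seq in db_seqs:
        for sub in subsequences(seq):
            rep = next((r for r in clusters if hamming(r, sub) <= RADIUS), sub)
            clusters[rep].append((sub, seq))
    return clusters

def seed_hits(query, clusters):
    hits = set()
    for qsub in subsequences(query):
        for rep, members in clusters.items():
            # Triangle inequality: if the representative is already too far,
            # no member within RADIUS of it can be within THRESHOLD of qsub.
            if hamming(qsub, rep) > RADIUS + THRESHOLD:
                continue
            hits.update(seq for sub, seq in members if hamming(qsub, sub) <= THRESHOLD)
    return hits

db = ["MKTAYIAKQR", "MKSAYIATQR", "GGGGSSGGGG"]   # toy protein sequences
print(seed_hits("MKTAYI", cluster(db)))           # finds the two related sequences
```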
ERIC Educational Resources Information Center
Flannery, Maura C.
2004-01-01
New material discovered in the study of cell research is presented for the benefit of biology teachers. Huge amounts of data are being generated in fields like cellular dynamics, and it is felt that people's understanding of the cell is becoming much more complex and detailed.
Quasar Drenched in Water Vapor Artist Concept
2012-08-31
This artist's concept illustrates a quasar, or feeding black hole, similar to APM 08279+5255, where astronomers discovered huge amounts of water vapor. Gas and dust likely form a torus around the central black hole, with clouds of charged gas above and below.
1994-02-08
reasons. Second, they provide an important bridge between the huge amount of work on categorization and the much smaller body of work on problem...subjects talking more about relevant information compared to the neutral problems (i.e., a higher proportion of the protocol statements concerned information
Fiber in access technologies and network convergence: an opportunity for optical integration
NASA Astrophysics Data System (ADS)
Ghiggino, Pierpaolo C.
2008-11-01
Broadband networks are among the fastest growing segments in telecom. The initial and still very significant push originated with xDSL technologies, and indeed a significant amount of research and development is still occurring in this field, with impressive results that allow a remarkable use of the installed copper infrastructure well beyond its originally planned bandwidth capabilities. However, it is clear that ultimately a more suitable fiber-based infrastructure will be needed in order to reduce both operational and network technology costs. Such cost reduction is inevitable, as the added value to end users is only related to services, and these cannot be priced outside a sensible window, whilst the related bandwidth increase is much more dramatic and its huge variability must be met with little or no cost impact by the network and its operation. Fiber in access has indeed the potential to cope with a huge bandwidth demand for many years to come, as its inherent bandwidth capabilities are only just tapped by current service requirements. However, the whole technology supply chain must follow in line. In particular, optical technology must brace itself to cope with the required much larger deployment and greater cost effectiveness, whilst at the same time delivering performance suitable for the bandwidth increase offered in the longer term by the fiber medium. This paper looks at these issues and discusses the opportunities for a new class of optical devices making use of the progress in optical integration.
Significance and integration of molecular diagnostics in the framework of veterinary practice.
Aranaz, Alicia
2015-01-01
The field of molecular diagnostics in veterinary practice is rapidly evolving. An array of molecular techniques of different complexity is available to facilitate the fast and specific diagnosis of animal diseases. The choice of the adequate technique depends on the mission and attributions of the laboratory and requires knowledge of both the molecular biology basis and its limitations. The ability to quickly detect pathogens and their characteristics allows for precise decision-making and targeted measures such as prophylaxis, appropriate therapy, and biosafety plans to control disease outbreaks. In practice, taking advantage of the huge amount of data that can be obtained using molecular techniques highlights the need for collaboration between veterinarians in the laboratory and practitioners.
NASA Astrophysics Data System (ADS)
Tauanov, Z.; Abylgazina, L.; Spitas, C.; Itskos, G.; Inglezakis, V.
2017-09-01
Coal fly ash (CFA) is a waste by-product of coal combustion. Kazakhstan has vast coal deposits and is a major consumer of coal, and hence produces huge amounts of CFA annually. The government aims to recycle and effectively utilize this waste by-product. Thus, a detailed study of the physical and chemical properties of the material is required, as the data available in the literature are either outdated or not applicable to recently produced CFA samples. The full mineralogical, microstructural and thermal characterization of three types of coal fly ash (CFA) produced in two large Kazakhstani power plants is reported in this work. The properties of the CFAs were compared between samples as well as with published values.
A data storage and retrieval model for Louisiana traffic operations data : final report.
DOT National Transportation Integrated Search
1995-09-01
The type and amount of data managed by the Louisiana Department of Transportation and Development (DOTD) are huge. In many cases, these data are used to perform traffic engineering studies and highway safety analyses, among others. At the present tim...
... wanting to go to parties or out for dinner) What Is Bulimia? Instead of starving themselves, people who have bulimia nervosa (say: boo-LEE-mee-uh nur-VOH-suh) will binge and purge . That means they will binge (that is, eat a huge amount of food, like a tub ...
Celesti, Antonio; Fazio, Maria; Romano, Agata; Bramanti, Alessia; Bramanti, Placido; Villari, Massimo
2018-05-01
The Open Archival Information System (OAIS) is a reference model for organizing people and resources in a system, and it is already adopted in care centers and medical systems to efficiently manage clinical data, medical personnel, and patients. Archival storage systems are typically implemented using traditional relational database systems, but relation-oriented technology strongly limits the efficiency of managing huge amounts of patients' clinical data, especially in emerging cloud-based, distributed environments. In this paper, we present an OAIS healthcare architecture that can manage a huge amount of HL7 clinical documents in a scalable way. Specifically, it is based on a NoSQL column-oriented Data Base Management System deployed in the cloud, thus benefiting from big tables and wide rows available over a virtual distributed infrastructure. We developed a prototype of the proposed architecture at the IRCCS, and we evaluated its efficiency in a real case study.
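The abstract does not name the specific column-oriented DBMS; the sketch below only illustrates the wide-row layout such systems encourage, with one row per patient and one column per document version, using a plain in-memory dictionary rather than a real NoSQL client. All keys and field names are illustrative assumptions.

```python
from datetime import datetime, timezone

# Wide-row layout: row key = patient id, column key = (document type, timestamp),
# value = the raw HL7/CDA payload. A column-family store keeps such rows sorted
# and sharded across nodes; a dict-of-dicts stands in for it here.
table = {}

def put_document(patient_id, doc_type, payload):
    ts = datetime.now(timezone.utc).isoformat()
    table.setdefault(patient_id, {})[(doc_type, ts)] = payload

def latest(patient_id, doc_type):
    columns = table.get(patient_id, {})
    matching = sorted(k for k in columns if k[0] == doc_type)
    return columns[matching[-1]] if matching else None

put_document("patient-001", "lab-report", "<ClinicalDocument>v1</ClinicalDocument>")
put_document("patient-001", "lab-report", "<ClinicalDocument>v2</ClinicalDocument>")
print(latest("patient-001", "lab-report"))   # the most recent version
```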
NASA Technical Reports Server (NTRS)
2007-01-01
Our 'constant' sun is really more like a spherical sea of incredibly hot plasma, changing all the time. Astronomers like to keep a good eye on it, so no dramatic change goes by unnoticed. One amazing occurrence happened on Dec 7, 2007 and was seen by one of the two STEREO satellites. STEREO, as you recall, consists of a pair of satellites which observe the sun from different angles and allow astronomers to get a '3-D' view of the solar atmosphere and solar outflows. On December 7 one of the STEREO satellites captured a view (in the extreme ultraviolet part of the electromagnetic spectrum) of a Coronal Mass Ejection that released a huge amount of energy into the solar atmosphere, and a huge amount of matter into interplanetary space. A sort of atmospheric 'sunquake'. One result of this 'sunquake' was the production of a giant wave rippling through almost the entire solar atmosphere. The image above shows a snapshot of this unbelievable wave, slightly enhanced for viewability. Don't miss the movie. What damps the wave?
Behaviour of Belle II ARICH Hybrid Avalanche Photo-Detector in magnetic field
NASA Astrophysics Data System (ADS)
Kindo, H.; Adachi, I.; Dolenec, R.; Hataya, K.; Iori, S.; Iwata, S.; Kakuno, H.; Kataura, R.; Kawai, H.; Kobayashi, T.; Konno, T.; Korpar, S.; Križan, P.; Kumita, T.; Mrvar, M.; Nishida, S.; Ogawa, K.; Ogawa, S.; Pestotnik, R.; Šantelj, L.; Sumiyoshi, T.; Tabata, M.; Yonenaga, M.; Yusa, Y.
2017-12-01
The proximity-focusing Aerogel Ring-Imaging Cherenkov detector (ARICH) has been designed to separate kaons from pions in the forward end-cap of the Belle II spectrometer. The detector will be placed in a 1.5 T magnetic field and must be immune to it. In the ARICH R&D, we address this problem with a new device called the Hybrid Avalanche Photo-Detector (HAPD), developed by Hamamatsu Photonics. Recently the production of about 500 HAPDs was completed. We tested the HAPDs in a magnetic field at KEK. We found that some HAPDs have a significant amount of dead time, which reaches up to 30% in the worst case. The dead time is caused by very large (more than 10,000 times larger than a single-photon signal) and frequent (∼5 Hz) signals, which paralyse the electronics. The huge signals are observed in about 30% of the HAPDs. To identify the origin and understand the mechanism, we performed some extra tests of the HAPDs. We found a strange dependence of the huge signals on the APD bias voltage. If we reduce the bias voltage applied to one of the 4 APDs by 10 V, the frequency of the huge signals is much reduced. On the other hand, if we reduce the voltage of all 4 APDs, the huge signals do not decrease, or even increase in some cases. We also found that the huge signals seem to be related to the vacuum inside the HAPD. We present the observation of the huge signals of the HAPDs in the magnetic field, and our strategy to manage them.
Real-time high-level video understanding using data warehouse
NASA Astrophysics Data System (ADS)
Lienard, Bruno; Desurmont, Xavier; Barrie, Bertrand; Delaigle, Jean-Francois
2006-02-01
High-level video content analysis such as video surveillance is often limited by the computational aspects of automatic image understanding, i.e. it requires huge computing resources for reasoning processes like categorization and huge amounts of data to represent knowledge of objects, scenarios and other models. This article explains how to design and develop a "near real-time adaptive image datamart", used first as a decision-support system for vision algorithms, and then as a mass storage system. Using the RDF specification as the storage format for vision-algorithm metadata, we can optimise data warehouse concepts for video analysis, add processes able to adapt the current model, and pre-process data to speed up queries. In this way, when new data is sent from a sensor to the data warehouse for long-term storage, using remote procedure calls embedded in object-oriented interfaces to simplify queries, it is processed and the in-memory data model is updated. After some processing, possible interpretations of this data can be returned to the sensor. To demonstrate this new approach, we present typical scenarios applied to this architecture, such as people tracking and event detection in a multi-camera network. Finally, we show how this system becomes a high-semantic data container for external data mining.
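As a small, hedged illustration of storing vision-algorithm metadata as RDF and querying it back (using the rdflib library with an invented namespace and vocabulary, not the system's actual schema):

```python
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import RDF, XSD

VS = Namespace("http://example.org/videosurveillance#")   # invented vocabulary
g = Graph()

# One detection event produced by a vision algorithm, stored as RDF metadata.
detection = URIRef("http://example.org/events/evt-001")
g.add((detection, RDF.type, VS.PersonDetected))
g.add((detection, VS.camera, Literal("cam-12")))
g.add((detection, VS.timestamp, Literal("2006-01-15T10:32:00", datatype=XSD.dateTime)))
g.add((detection, VS.confidence, Literal(0.91, datatype=XSD.float)))

# Later, a reasoning process (or a sensor asking for context) queries the warehouse.
results = g.query(
    """
    SELECT ?evt ?cam WHERE {
        ?evt a vs:PersonDetected ;
             vs:camera ?cam ;
             vs:confidence ?c .
        FILTER (?c > 0.8)
    }
    """,
    initNs={"vs": VS},
)
for row in results:
    print(row.evt, row.cam)
```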
ELECTRICITY GENERATION FROM ANAEROBIC WASTEWATER TREATMENT IN MICROBIAL FUEL CELLS (MFCS) - PHASE I
Municipal wastewater treatment plants represent a huge energy ‘sink’ in the United States. Estimates are that these plants consume up to 3 percent of the total amount of power consumed annually. Ironically, the wastewater is concentrated with materials (carbohydrates) which ...
Privacy Preserving PCA on Distributed Bioinformatics Datasets
ERIC Educational Resources Information Center
Li, Xin
2011-01-01
In recent years, new bioinformatics technologies, such as gene expression microarray, genome-wide association study, proteomics, and metabolomics, have been widely used to simultaneously identify a huge number of human genomic/genetic biomarkers, generate a tremendously large amount of data, and dramatically increase the knowledge on human…
Personalized Recommender System for Digital Libraries
ERIC Educational Resources Information Center
Omisore, M. O.; Samuel, O. W.
2014-01-01
The huge amount of information available online has given rise to personalization and filtering systems. Recommender systems (RS) constitute a specific type of information filtering technique that present items according to user's interests. In this research, a web-based personalized recommender system capable of providing learners with books that…
Intellectual Property: a powerful tool to develop biotech research
Giugni, Diego; Giugni, Valter
2010-01-01
Summary: Today biotechnology is perhaps the most important technology field because of its strong health and food implications. However, due to the nature of this technology, a huge amount of investment is needed to sustain experimentation costs. Consequently, investors aim to safeguard their investments as much as possible. Intellectual Property, and in particular patents, has been demonstrated to constitute a powerful tool to help them. Moreover, patents represent an extremely important means of disclosing biotechnology inventions. Patentable biotechnology inventions involve products such as nucleotide and amino acid sequences, microorganisms, processes or methods for modifying said products, uses for the manufacture of medicaments, etc. There are several ways to protect inventions, but all follow the three main patentability requirements: novelty, inventive step and industrial application. PMID:21255349
NASA Astrophysics Data System (ADS)
Chiappa, Pierangelo
Bandwidth-hungry services, such as higher-speed Internet, voice over IP (VoIP), and IPTV, allow people to exchange and store huge amounts of data among worldwide locations. In the age of global communications, domestic users, companies, and organizations around the world generate new content, making bandwidth needs grow exponentially along with the need for new services. These bandwidth and connectivity demands represent a concern for operators, who require innovative technologies to be ready for scaling. To respond efficiently to these demands, Alcatel-Lucent is moving fast toward photonic integrated circuit technologies as the key to delivering the best performance at the lowest "bit per second" cost. This article describes Alcatel-Lucent's contribution in strategic directions or achievements, as well as possible new developments.
Real-time UNIX in HEP data acquisition
NASA Astrophysics Data System (ADS)
Buono, S.; Gaponenko, I.; Jones, R.; Mapelli, L.; Mornacchi, G.; Prigent, D.; Sanchez-Corral, E.; Skiadelli, M.; Toppers, A.; Duval, P. Y.; Ferrato, D.; Le Van Suu, A.; Qian, Z.; Rondot, C.; Ambrosini, G.; Fumagalli, G.; Aguer, M.; Huet, M.
1994-12-01
Today's experimentation in high energy physics is characterized by an increasing need for sensitivity to rare phenomena and complex physics signatures, which require the use of huge and sophisticated detectors and consequently a high-performance readout and data acquisition system. Multi-level triggering, hierarchical data collection and an ever-increasing amount of processing power distributed throughout the data acquisition layers impose a number of features on the software environment, especially the need for a high level of standardization. Real-time UNIX seems, today, the best solution for the platform independence, operating system interface standards and real-time features necessary for data acquisition in HEP experiments. We present the results of the evaluation, in a realistic application environment, of a real-time UNIX operating system: the EP/LX real-time UNIX system.
Human Microbiome Acquisition and Bioinformatic Challenges in Metagenomic Studies
2018-01-01
The study of the human microbiome has become a very popular topic. Our microbial counterpart, in fact, appears to play an important role in human physiology and health maintenance. Accordingly, microbiome alterations have been reported in an increasing number of human diseases. Despite the huge amount of data produced to date, little is known about how a microbial dysbiosis effectively contributes to a specific pathology. To fill in this gap, other approaches for microbiome study, more comprehensive than 16S rRNA gene sequencing, i.e., shotgun metagenomics and metatranscriptomics, are becoming more widely used. Methods standardization and the development of specific pipelines for data analysis are required to contribute to and increase our understanding of the human microbiome's relationship with health and disease status. PMID:29382070
Radiomics: Extracting more information from medical images using advanced feature analysis
Lambin, Philippe; Rios-Velazquez, Emmanuel; Leijenaar, Ralph; Carvalho, Sara; van Stiphout, Ruud G.P.M.; Granton, Patrick; Zegers, Catharina M.L.; Gillies, Robert; Boellard, Ronald; Dekker, André; Aerts, Hugo J.W.L.
2015-01-01
Solid cancers are spatially and temporally heterogeneous. This limits the use of invasive biopsy-based molecular assays but gives huge potential to medical imaging, which has the ability to capture intra-tumoural heterogeneity in a non-invasive way. During the past decades, medical imaging innovations, with new hardware, new imaging agents and standardised protocols, have allowed the field to move towards quantitative imaging. Therefore, the development of automated and reproducible analysis methodologies to extract more information from image-based features is also a requirement. Radiomics, the high-throughput extraction of large amounts of image features from radiographic images, addresses this problem and is one of the approaches that hold great promise but need further validation in multi-centric settings and in the laboratory. PMID:22257792
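As a minimal illustration of what "image features" means in this context, the sketch below computes a few first-order intensity statistics inside a tumour mask with numpy; real radiomics pipelines extract hundreds of standardized shape, intensity and texture features, and the toy image and mask here are assumptions for the example.

```python
import numpy as np

def first_order_features(image, mask):
    """A few first-order radiomic features over the voxels inside a binary mask."""
    voxels = image[mask > 0].astype(float)
    hist, _ = np.histogram(voxels, bins=32, density=True)
    p = hist[hist > 0] / hist[hist > 0].sum()
    return {
        "mean": voxels.mean(),
        "variance": voxels.var(),
        "skewness": ((voxels - voxels.mean()) ** 3).mean() / voxels.std() ** 3,
        "entropy": -(p * np.log2(p)).sum(),
    }

# Toy 3D "scan" and spherical "tumour" mask.
z, y, x = np.indices((32, 32, 32))
image = np.random.default_rng(0).normal(100, 20, (32, 32, 32))
mask = ((x - 16) ** 2 + (y - 16) ** 2 + (z - 16) ** 2) < 8 ** 2
print(first_order_features(image, mask))
```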
Personal Digital Information Archiving among Students of Social Sciences and Humanities
ERIC Educational Resources Information Center
Krtalic, Maja; Marcetic, Hana; Micunovic, Milijana
2016-01-01
Introduction: As both academic citizens and active participants in information society who use information, students produce huge amounts of personal digital data and documents. It is therefore important to raise questions about their awareness, responsibility, tendencies and activities they undertake to preserve their collective digital heritage.…
USDA-ARS?s Scientific Manuscript database
The rheological properties of modified waxy starch and waxy starch-polyacrylamide graft copolymers prepared by reactive extrusion were investigated. Both materials can absorb huge amounts of water and form gels. The modified waxy starch and waxy starch-polyacrylamide graft copolymer gels all exhibite...
ERIC Educational Resources Information Center
Horvath, Thomas
2005-01-01
In 1986, Lake Nyos, a volcanic lake in Cameroon, released a huge amount of carbon dioxide gas, killing over 1,700 people in the surrounding area. This case study, developed for use in a limnology or aquatic biology course, explores that event, introducing students to concepts relating to lake formation, thermal stratification, and dissolved gases.…
[Pulmonary Carcinosarcoma Presenting Hemothorax Caused by Pleural Invasion;Report of a Case].
Kazawa, Nobukata; Shibamoto, Yuta; Kitabayashi, Yukiya; Ishihara, Yumi; Gotoh, Taeko; Sawada, Yuusuke; Inukai, Ryo; Tsujimura, Takashi; Hattori, Hideo; Niimi, Akio; Nakanishi, Ryoichi; Kitaichi, Masanori
2016-11-01
A 71-year-old man presented with hemothorax, cough, sputum and worsening dyspnea. Chest X-ray and computed tomography (CT) revealed a huge tumor in the right upper lobe with hematoma and a small amount of gas, suggesting hemopneumothorax. No apparent lymphadenopathy or intrapulmonary metastases were observed. The tumor showed little enhancement on contrast-enhanced CT. Resection of the tumor was performed, and the pathological evaluation revealed a carcinosarcoma (adenocarcinoma + osteosarcoma), pT3N0 (stage IIB), G4, pl2. Sarcomatoid carcinoma such as carcinosarcoma should be considered as a possible cause of hemothorax when making a diagnosis of a hemorrhagic, hypovascular, huge lung tumor.
NASA Astrophysics Data System (ADS)
Dittmar, N.; Haberstroh, Ch.; Hesse, U.; Krzyzowski, M.
2016-04-01
The transfer of liquid helium (LHe) into mobile dewars or transport vessels is a common and unavoidable process at LHe decant stations. During this transfer considerable amounts of LHe evaporate due to heat leak and pressure drop. The helium gas generated in this way needs to be collected and reliquefied, which requires a huge amount of electrical energy. Therefore, the design of transfer lines used at LHe decant stations has been optimised to establish an LHe transfer with minor evaporation losses, which increases the overall efficiency and capacity of LHe decant stations. This paper presents the experimental results achieved during the thermohydraulic optimisation of a flexible LHe transfer line. An extensive measurement campaign with a set of dedicated transfer lines equipped with pressure and temperature sensors led to unique experimental data on this specific transfer process. The experimental results cover the heat leak, the pressure drop, the transfer rate, the outlet quality, and the cool-down and warm-up behaviour of the examined transfer lines. Based on the obtained results, the design of the considered flexible transfer line has been optimised, featuring reduced heat leak and pressure drop.
Parallel computing in genomic research: advances and applications
Ocaña, Kary; de Oliveira, Daniel
2015-01-01
Today's genomic experiments have to process the so-called "biological big data" that is now reaching the size of Terabytes and Petabytes. To process this huge amount of data, scientists may require weeks or months if they use their own workstations. Parallelism techniques and high-performance computing (HPC) environments can be applied to reduce the total processing time and to ease the management, treatment, and analysis of this data. However, running bioinformatics experiments in HPC environments such as clouds, grids, clusters, and graphics processing units requires expertise from scientists to integrate computational, biological, and mathematical techniques and technologies. Several solutions have already been proposed to allow scientists to process their genomic experiments using HPC capabilities and parallelism techniques. This article presents a systematic literature review surveying the most recently published research involving genomics and parallel computing. Our objective is to gather the main characteristics, benefits, and challenges that can be considered by scientists when running their genomic experiments to benefit from parallelism techniques and HPC capabilities. PMID:26604801
MPEG-7 based video annotation and browsing
NASA Astrophysics Data System (ADS)
Hoeynck, Michael; Auweiler, Thorsten; Wellhausen, Jens
2003-11-01
The huge amount of multimedia data produced worldwide requires annotation in order to enable universal content access and to provide content-based search-and-retrieval functionalities. Since manual video annotation can be time consuming, automatic annotation systems are required. We review recent approaches to content-based indexing and annotation of videos for different kinds of sports and describe our approach to automatic annotation of equestrian sports videos. We especially concentrate on MPEG-7 based feature extraction and content description, where we apply different visual descriptors for cut detection. Further, we extract the temporal positions of single obstacles on the course by analyzing MPEG-7 edge information. Having determined single shot positions as well as the visual highlights, the information is jointly stored with meta-textual information in an MPEG-7 description scheme. Based on this information, we generate content summaries which can be utilized in a user interface to provide content-based access to the video stream, and also for media browsing on a streaming server.
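The cut-detection step mentioned above lends itself to a simple illustration. The following sketch (plain NumPy on synthetic grey-level frames; the threshold, frame sizes and function names are assumptions, and this is not the authors' MPEG-7 descriptor pipeline) flags a cut wherever consecutive frame histograms differ strongly:

```python
import numpy as np

def frame_histogram(frame, bins=16):
    """Normalised grey-level histogram of one frame (H x W uint8 array)."""
    hist, _ = np.histogram(frame, bins=bins, range=(0, 256))
    return hist / hist.sum()

def detect_cuts(frames, threshold=0.4):
    """Flag a cut wherever consecutive histograms differ strongly (L1 distance)."""
    cuts = []
    prev = frame_histogram(frames[0])
    for i, frame in enumerate(frames[1:], start=1):
        cur = frame_histogram(frame)
        if np.abs(cur - prev).sum() > threshold:
            cuts.append(i)
        prev = cur
    return cuts

# Synthetic demo: 20 dark frames followed by 20 bright frames -> one cut at index 20.
rng = np.random.default_rng(0)
frames = [rng.integers(0, 60, (120, 160), dtype=np.uint8) for _ in range(20)] + \
         [rng.integers(180, 255, (120, 160), dtype=np.uint8) for _ in range(20)]
print(detect_cuts(frames))   # e.g. [20]
```

In a real system the grey-level histogram would be replaced by the chosen MPEG-7 visual descriptor, but the detection logic (distance between consecutive descriptors against a threshold) stays the same.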
Tools for Analyzing Computing Resource Management Strategies and Algorithms for SDR Clouds
NASA Astrophysics Data System (ADS)
Marojevic, Vuk; Gomez-Miguelez, Ismael; Gelonch, Antoni
2012-09-01
Software defined radio (SDR) clouds centralize the computing resources of base stations. The computing resource pool is shared between radio operators and dynamically loads and unloads digital signal processing chains for providing wireless communications services on demand. In particular, each new user session request requires the allocation of computing resources for executing the corresponding SDR transceivers. The huge amount of computing resources of SDR cloud data centers and the numerous session requests at certain hours of the day require efficient computing resource management. We propose a hierarchical approach, where the data center is divided into clusters that are managed in a distributed way. This paper presents a set of computing resource management tools for analyzing computing resource management strategies and algorithms for SDR clouds. We use the tools to evaluate different strategies and algorithms. The results show that more sophisticated algorithms can achieve higher resource occupations and that a tradeoff exists between cluster size and algorithm complexity.
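The hierarchical strategy described above ultimately reduces to placing transceiver chains onto clusters with limited capacity. A toy first-fit allocator (hypothetical cluster names, capacities and demands, not the authors' tools or algorithms) can illustrate the basic allocation decision:

```python
# Toy first-fit allocation of SDR transceiver chains onto computing clusters.
# Cluster sizes and session demands are illustrative, not from the paper.
clusters = {"c1": 100.0, "c2": 100.0, "c3": 100.0}   # free capacity (e.g. GFLOPS)

def allocate(session_demand, clusters):
    """Place a session on the first cluster with enough free capacity."""
    for name, free in clusters.items():
        if free >= session_demand:
            clusters[name] = free - session_demand
            return name
    return None   # request rejected: no cluster can host the session

for demand in [40, 70, 55, 80, 30]:
    print(demand, "->", allocate(demand, clusters))
```

More sophisticated policies (best-fit, load balancing across clusters) trade extra computation for higher resource occupation, which is exactly the tradeoff the paper analyses.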
A case report of surgical debulking for a huge mass of elephantiasis neuromatosa
Hoshi, Manabu; Ieguchi, Makoto; Taguchi, Susumu; Yamasaki, Shinya
2009-01-01
Achievement of a safe outcome for an extensive mass with hypervascularity in the extremities requires a surgical team skilled in musculoskeletal oncology. We report debulking surgery for a huge mass of elephantiasis neuromatosa in the right leg of a 56-year-old man using the novel Ligasure® vessel sealing system. PMID:21139882
Liu, Lin
2013-03-01
Dynamics of lipid bodies and plastids in chili pepper fruits during ripening were investigated by means of transmission electron microscopy. The mesocarp of chili pepper fruits consists of collenchyma, normal parenchyma, and huge-celled parenchyma. In mature green fruits, plastids contain numerous thylakoids that are well organized into grana in collenchyma, a strikingly huge amount of starch and irregularly organized thylakoids in normal parenchyma, and simple tubes rather than thylakoids in huge-celled parenchyma. These morphological features suggest that the plastids are chloroplasts in collenchyma, chloroamyloplasts in normal parenchyma, and proplastids in huge-celled parenchyma. As fruits ripen to red, plastids in all cell types convert to chromoplasts and, concomitantly, lipid bodies accumulate in both cytoplasm and chromoplasts. Cytosolic lipid bodies are lined up in a regular layer adjacent to the plasma membrane. The cytosolic lipid body consists of a core surrounded by a membrane. The core is comprised of a more electron-dense central part enclosed by a slightly less electron-dense peripheral layer. Plastidial lipid bodies in collenchyma, normal parenchyma, and endodermis initiate as plastoglobuli, which in turn convert to rod-like structures. Therefore, plastidial lipid bodies are more dynamic than cytosolic lipid bodies. Both cytosolic and plastidial lipid bodies contain rich unsaturated lipids. Copyright © 2012 Elsevier Ltd. All rights reserved.
Virtual Resources Centers and Their Role in Small Rural Schools.
ERIC Educational Resources Information Center
Freitas, Candido Varela de; Silva, Antonio Pedro da
Virtual resources centers have been considered a pedagogical tool since the increasing development of electronic means allowed for the storage of huge amounts of information and its easy retrieval. Bearing in mind the need for enhancing the appearance of those centers, a discipline of "Management of Resources Centers" was included in a…
A Survey of Stemming Algorithms in Information Retrieval
ERIC Educational Resources Information Center
Moral, Cristian; de Antonio, Angélica; Imbert, Ricardo; Ramírez, Jaime
2014-01-01
Background: During the last fifty years, improved information retrieval techniques have become necessary because of the huge amount of information people have available, which continues to increase rapidly due to the use of new technologies and the Internet. Stemming is one of the processes that can improve information retrieval in terms of…
ERIC Educational Resources Information Center
Gunal, Serkan
2008-01-01
Digital libraries play a crucial role in distance learning. Nowadays, they are one of the fundamental information sources for the students enrolled in this learning system. These libraries contain huge amounts of instructional data (text, audio and video) offered by the distance learning program. Organization of the digital libraries is…
Merged data models for multi-parameterized querying: Spectral data base meets GIS-based map archive
NASA Astrophysics Data System (ADS)
Naß, A.; D'Amore, M.; Helbert, J.
2017-09-01
Current and upcoming planetary missions deliver a huge amount of different data (remote sensing data, in-situ data, and derived products). Within this contribution we present how different data(bases) can be managed and merged to enable multi-parameterized querying based on their common spatial context.
Academic Analytics: Anatomy of an Exploratory Essay
ERIC Educational Resources Information Center
Ferreira, Sérgio André; Andrade, António
2016-01-01
Investment in technological subsystems to support the activity of teaching and learning and the various areas of the life of Higher Education Institutions (HEI) is of increasing importance in the implementation of the policy and strategy of these organizations. Each of these subsystems collects a huge amount of data that, if properly organized,…
ERIC Educational Resources Information Center
Hinton, Geoffrey
2014-01-01
It is possible to learn multiple layers of non-linear features by backpropagating error derivatives through a feedforward neural network. This is a very effective learning procedure when there is a huge amount of labeled training data, but for many learning tasks very few labeled examples are available. In an effort to overcome the need for…
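Since the abstract concerns backpropagating error derivatives through a feedforward network, a minimal NumPy sketch of that procedure may help. It is purely illustrative (one hidden layer, a toy XOR-like task, assumed hyperparameters) and does not reproduce Hinton's experiments:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(256, 2))                        # toy inputs
y = (X[:, 0] * X[:, 1] > 0).astype(float)[:, None]   # non-linear (XOR-like) target

W1 = rng.normal(scale=0.5, size=(2, 8)); b1 = np.zeros(8)
W2 = rng.normal(scale=0.5, size=(8, 1)); b2 = np.zeros(1)
lr = 0.5

for epoch in range(2000):
    h = np.tanh(X @ W1 + b1)                          # hidden non-linear features
    p = 1 / (1 + np.exp(-(h @ W2 + b2)))              # sigmoid output
    grad_out = (p - y) / len(X)                       # d(cross-entropy)/d(output logit)
    grad_W2 = h.T @ grad_out
    grad_z1 = grad_out @ W2.T * (1 - h ** 2)          # backpropagate through tanh
    grad_W1 = X.T @ grad_z1
    W2 -= lr * grad_W2; b2 -= lr * grad_out.sum(0)
    W1 -= lr * grad_W1; b1 -= lr * grad_z1.sum(0)

print("training accuracy:", ((p > 0.5) == y).mean())
```

With plenty of labeled examples this works well; the abstract's point is precisely that such purely supervised training struggles when labeled data is scarce.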
DIY Analytics for Postsecondary Students
ERIC Educational Resources Information Center
Arndt, Timothy; Guercio, Angela
2014-01-01
Recently organizations have begun to realize the potential value in the huge amounts of raw, constantly fluctuating data sets that they generate and, with the help of advances in storage and processing technologies, collect. This leads to the phenomenon of big data. This data may be stored in structured format in relational database systems, but…
Knowledge, Democracy, and the Internet
ERIC Educational Resources Information Center
Mößner, Nicola; Kitcher, Philip
2017-01-01
The internet has considerably changed epistemic practices in science as well as in everyday life. Apparently, this technology allows more and more people to get access to a huge amount of information. Some people even claim that the internet leads to a "democratization of knowledge." In the following text, we will analyze this statement.…
Challenges Facing Chinese PE Curriculum Reform--Teachers' Talk
ERIC Educational Resources Information Center
Jin, Aijing
2009-01-01
China has attracted a huge amount of interest from around the world over the last two decades because of its rapid and vigorous development. Rapid economic growth has brought with it significant structural reforms in all trades and professions across China. Within this context of rapid social change, the Chinese basic education system has been…
Planning the World History Course: A Reasoned Approach to Omission
ERIC Educational Resources Information Center
Weinland, Thomas P.
2012-01-01
Planning a world history course presents a nearly impossible task. One cannot complete a world history course, or even a European history course, without casting a huge amount of historical information onto the curriculum planning scrapheap. An emphasis on the twentieth century means leaving out significant information from earlier times. "But how…
[The classification of ceramics according to its chemical nature and its method of production].
Moureau, Thomas; Bouhy, Alice; Raepsaet, Nicolas; Vanheusden, Alain
2006-01-01
Nowadays, a huge number of design and manufacturing systems that allow the realisation of all-ceramic restorations, using different ceramics, are available on the market. The purpose of this article is to suggest a classification of the most commonly used ceramics and to describe a few laboratory processes used in our university.
Large-scale retrieval for medical image analytics: A comprehensive review.
Li, Zhongyu; Zhang, Xiaofan; Müller, Henning; Zhang, Shaoting
2018-01-01
Over the past decades, medical image analytics was greatly facilitated by the explosion of digital imaging techniques, where huge amounts of medical images were produced with ever-increasing quality and diversity. However, conventional methods for analyzing medical images have achieved limited success, as they are not capable of tackling the huge amount of image data. In this paper, we review state-of-the-art approaches for large-scale medical image analysis, which are mainly based on recent advances in computer vision, machine learning and information retrieval. Specifically, we first present the general pipeline of large-scale retrieval and summarize the challenges and opportunities of medical image analytics on a large scale. Then, we provide a comprehensive review of algorithms and techniques relevant to major processes in the pipeline, including feature representation, feature indexing, searching, etc. On the basis of existing work, we introduce the evaluation protocols and multiple applications of large-scale medical image retrieval, with a variety of exploratory and diagnostic scenarios. Finally, we discuss future directions of large-scale retrieval, which can further improve the performance of medical image analysis. Copyright © 2017 Elsevier B.V. All rights reserved.
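The "feature representation, feature indexing, searching" pipeline mentioned above can be illustrated with a minimal sketch: random vectors stand in for learned image descriptors and scikit-learn's nearest-neighbour index plays the role of the retrieval engine (an assumption made for illustration, not one of the methods reviewed in the paper):

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(0)
features = rng.normal(size=(10_000, 128))   # stand-in for per-image feature vectors
index = NearestNeighbors(n_neighbors=5, metric="euclidean").fit(features)

query = rng.normal(size=(1, 128))           # feature vector of a query image
distances, neighbour_ids = index.kneighbors(query)
print(neighbour_ids[0])                     # ids of the 5 most similar images
```

At truly large scale, exact search like this is typically replaced by approximate indexing (hashing, inverted files, product quantization), which is one of the main themes of the review.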
Visual exploration and analysis of ionospheric scintillation monitoring data: The ISMR Query Tool
NASA Astrophysics Data System (ADS)
Vani, Bruno César; Shimabukuro, Milton Hirokazu; Galera Monico, João Francisco
2017-07-01
Ionospheric scintillations are rapid variations in the phase and/or amplitude of a radio signal as it passes through ionospheric plasma irregularities. The ionosphere is a specific layer of the Earth's atmosphere located approximately between 50 km and 1000 km above the Earth's surface. As Global Navigation Satellite Systems (GNSS) - such as GPS, Galileo, BDS and GLONASS - use radio signals, these variations degrade their positioning service quality. Due to its location, Brazil is one of the places in the world most affected by scintillation. For that reason, ionosphere monitoring stations have been deployed over Brazilian territory since 2011 through cooperative projects between several institutions in Europe and Brazil. Such monitoring stations compose a network that generates a large amount of monitoring data every day. GNSS receivers deployed at these stations - named Ionospheric Scintillation Monitor Receivers (ISMR) - provide scintillation indices and related signal metrics for available satellites dedicated to satellite-based navigation and positioning services. With this monitoring infrastructure, more than ten million observation values are generated and stored every day. Extracting the relevant information from this huge amount of data was a hard process and required the expertise of computer scientists and geoscientists. This paper describes the concepts, design and aspects related to the implementation of the software that has been supporting research on ISMR data - the so-called ISMR Query Tool. Usability and other aspects are also presented via examples of application. This web-based software has been designed and developed to provide insight into the huge amount of ISMR data that is fetched every day on an integrated platform. The software applies and adapts time series mining and information visualization techniques to extend the possibilities of exploring and analyzing ISMR data. The software is available to the scientific community through the World Wide Web, therefore constituting an analysis infrastructure that complements the monitoring one, providing support for researching ionospheric scintillation in the GNSS context. Interested researchers can access the functionalities without cost at http://is-cigala-calibra.fct.unesp.br/, upon online request to the Space Geodesy Study Group of UNESP - Univ Estadual Paulista at Presidente Prudente.
Radio Frequency Interference Detection using Machine Learning.
NASA Astrophysics Data System (ADS)
Mosiane, Olorato; Oozeer, Nadeem; Aniyan, Arun; Bassett, Bruce A.
2017-05-01
Radio frequency interference (RFI) has long plagued radio astronomy, and the problem may be as bad or worse by the time the Square Kilometre Array (SKA) comes online. RFI can be either internal (generated by instruments) or external, originating from intentional or unintentional man-made radio emission. With the huge amount of data that will be available from upcoming radio telescopes, an automated approach to RFI detection will be required. In this paper, as a step towards automating this process, we present the results of applying machine learning techniques to cross-match RFI in Karoo Array Telescope (KAT-7) data. We found that not all the features selected to characterise RFI are always important. We further investigated three machine learning techniques and conclude that the random forest classifier performs best, with a 98% area under the curve and 91% recall in detecting RFI.
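A hedged sketch of the kind of classifier reported above, a random forest scored by area under the ROC curve and recall, is shown below using scikit-learn on synthetic features; the KAT-7 data and the actual feature set are not reproduced here:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score, recall_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for per-sample spectral features with an RFI label.
rng = np.random.default_rng(0)
X = rng.normal(size=(5000, 6))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=5000) > 1).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)

print("AUC:   ", roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1]))
print("recall:", recall_score(y_te, clf.predict(X_te)))
```

The feature-importance scores a random forest exposes (`clf.feature_importances_`) are also the natural way to check the paper's observation that not all selected features matter equally.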
Neonatal heart rate prediction.
Abdel-Rahman, Yumna; Jeremic, Aleksander; Tan, Kenneth
2009-01-01
Technological advances have caused a decrease in the number of infant deaths. Pre-term infants now have a substantially increased chance of survival. One of the mechanisms that is vital to saving the lives of these infants is continuous monitoring and early diagnosis. With continuous monitoring, huge amounts of data are collected, with much information embedded in them. Using statistical analysis, this information can be extracted and used to aid diagnosis and to understand development. In this study we have a large dataset containing over 180 pre-term infants whose heart rates were recorded over the length of their stay in the Neonatal Intensive Care Unit (NICU). We test two types of models, empirical Bayesian and autoregressive moving average, and then attempt to predict future values. The autoregressive moving average model showed better results but required more computation.
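For the autoregressive moving average part, a minimal sketch with statsmodels on a synthetic heart-rate series (assumed model order and data, not the study's 180-infant dataset) could look like this:

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

# Synthetic heart-rate series (beats per minute) standing in for NICU recordings.
rng = np.random.default_rng(0)
noise = rng.normal(scale=2.0, size=500)
hr = 160 + np.convolve(noise, [1.0, 0.6, 0.3], mode="same")   # weakly correlated series

model = ARIMA(hr, order=(2, 0, 1)).fit()   # ARMA(2,1): d=0 means no differencing
forecast = model.forecast(steps=10)        # predict the next 10 samples
print(forecast)
```

Fitting such a model per infant and scoring the one-step-ahead predictions is the kind of computation the abstract describes as more expensive but more accurate than the empirical Bayesian alternative.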
Holzinger, Andreas; Zupan, Mario
2013-06-13
Background: Professionals in the biomedical domain are confronted with an increasing mass of data. Developing methods to assist professional end users in the field of Knowledge Discovery to identify, extract, visualize and understand useful information from these huge amounts of data is a huge challenge. However, there are so many diverse methods and methodologies available that, for biomedical researchers who are inexperienced in the use of even relatively popular knowledge discovery methods, it can be very difficult to select the most appropriate method for their particular research problem. Results: A web application, called KNODWAT (KNOwledge Discovery With Advanced Techniques), has been developed using Java on the Spring 3.1 framework and following a user-centered approach. The software runs on Java 1.6 and above and requires a web server such as Apache Tomcat and a database server such as MySQL Server. For frontend functionality and styling, Twitter Bootstrap was used, as well as jQuery for interactive user interface operations. Conclusions: The framework presented is user-centric, highly extensible and flexible. Since it enables methods to be tested on existing data to assess their suitability and performance, it is especially suitable for inexperienced biomedical researchers who are new to the field of knowledge discovery and data mining. For testing purposes two algorithms, CART and C4.5, were implemented using the WEKA data mining framework. PMID:23763826
A Pedagogical Approach to the Boltzmann Factor through Experiments and Simulations
ERIC Educational Resources Information Center
Battaglia, O. R.; Bonura, A.; Sperandeo-Mineo, R. M.
2009-01-01
The Boltzmann factor is the basis of a huge amount of thermodynamic and statistical physics, both classical and quantum. It governs the behaviour of all systems in nature that are exchanging energy with their environment. To understand why the expression has this specific form involves a deep mathematical analysis, whose flow of logic is hard to…
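For reference, and as standard background rather than a quotation from the article, the Boltzmann factor states that for a system in thermal contact with a reservoir at absolute temperature T, the probability of occupying a state of energy E scales as:

```latex
P(E) \propto e^{-E/k_{\mathrm{B}}T},
\qquad
\frac{P(E_{2})}{P(E_{1})} = e^{-(E_{2}-E_{1})/k_{\mathrm{B}}T}
```

where k_B is Boltzmann's constant; the second form shows that only energy differences compared with k_B T matter, which is the behaviour the proposed experiments and simulations are designed to make tangible.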
Different Aspects of Involving Family in School Life
ERIC Educational Resources Information Center
Blândul, Valentin-Cosmin
2016-01-01
The school has come to lack credibility and is sometimes even abandoned, primarily because, nowadays, it no longer counts in the social hierarchy and is not perceived as a value. Present-day society no longer trusts the educational establishment or the values acquired and ranked by the amount of learning embedded in it. Such an attitude is reflected by the…
Risk analysis and timber investments: a bibliography of theory and applications.
Carol A. Hyldahl; David C. Baumgartner
1991-01-01
Contains a fairly complete set of references to the small but rapidly growing amount of literature directly related to the study of risk in forestry in the United States up to 1989. Also includes representative references for the huge literature of general financial theory dealing with risk. Includes 95 annotated references and 17 additional textbook references...
Modeling MOOC Student Behavior with Two-Layer Hidden Markov Models
ERIC Educational Resources Information Center
Geigle, Chase; Zhai, ChengXiang
2017-01-01
Massive open online courses (MOOCs) provide educators with an abundance of data describing how students interact with the platform, but this data is highly underutilized today. This is in part due to the lack of sophisticated tools to provide interpretable and actionable summaries of huge amounts of MOOC activity present in log data. To address…
Integrating the Core: A New Management Curriculum to Empower Our Students
ERIC Educational Resources Information Center
Brawley, Dorothy; Campbell, Stacy; Desman, Robert; Kolenko, Thomas; Moodie, Douglas
2013-01-01
This paper follows Kennesaw State University's (KSU) faculty journey in developing a new integrated core curriculum for their Management majors that will empower the students and meet the needs of today's employers. Curricula must change to stay current. Depending on the amount of change, this can be a huge undertaking for a department…
Don't Throw It Away!: Raise Recycling Awareness through Communications Project
ERIC Educational Resources Information Center
Lazaros, Edward J.; Shackelford, Ray
2008-01-01
Americans discard a huge amount of material every day. The activity described in this article--determining how much waste is thrown out or recycled in the school cafeteria over a five-day period--dramatically increases students' awareness of this fact of contemporary life. Armed with the information they've gathered, students go on to the…
Why PACS is no longer a four-letter word.
Chopra, R M
2000-01-01
The real value of PACS is not realized until widespread adoption exists among physicians other than interpreting radiologists. Referring physicians at the office level, in the operating room and in other departments must be willing to embrace the reading of images on monitors. That takes time. The payoff for a PACS system is therefore not realized until sometime in the future. Given the huge up-front capital expenditure required of PACS solutions, it is no wonder that the decision has historically been a difficult one to make. Enter the application service provider (ASP). The marriage of the ASP model to PACS seems to be one of the true "killer apps" currently available in the healthcare technology space. An ASP can host and maintain the software inherent in PACS solutions. Images are centrally archived over the short-, medium-, and long-term timeframe, utilizing state-of-art data management facilities. Some ASPs also provide the necessary bandwidth to office sites and the small amount of hardware that is required onsite, such as viewing stations or monitors. Costs for Internet-based image management under the ASP model rely on a pay-as-you-go formula, which may include all software, support, required hardware and bandwidth as part of the service. There may be a minor up-front fee for installation. The ASP pricing model eliminates the huge gamble an organization takes on "big iron" PACS purchases. Those benefits rely on the first rule of finance: a dollar today is worth more than a dollar tomorrow. PACS and ASPs were made for one another. Because the financial benefits of PACS are realized over time, the timing of cash flows is extremely important. Other benefits inherent in the ASP model such as scalability, diminished need for IT personnel, software version integrity and better pricing because of economies of scale are attractive also.
The Mayak Worker Dosimetry System (MWDS-2013): Implementation of the Dose Calculations.
Zhdanov, А; Vostrotin, V; Efimov, А; Birchall, A; Puncher, M
2016-07-15
The calculation of internal doses for the Mayak Worker Dosimetry System (MWDS-2013) involved extensive computational resources due to the complexity and sheer number of calculations required. The required output consisted of a set of 1000 hyper-realizations: each hyper-realization consists of a set (one for each worker) of probability distributions of organ doses. This report describes the hardware components and computational approaches required to make the calculation tractable. Together with the software, this system is referred to here as the 'PANDORA system'. It is based on a commercial SQL server database running on a series of six workstations. A complete run of the entire Mayak worker cohort entailed a huge amount of calculation in PANDORA, and due to the relatively slow speed of writing the data into the SQL server, each run took about 47 days. Quality control was monitored by comparing doses calculated in PANDORA with those in a specially modified version of the commercial software 'IMBA Professional Plus'. Suggestions are also made for increasing calculation and storage efficiency for future dosimetry calculations using PANDORA. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Luque, Joaquín; Larios, Diego F.; Personal, Enrique; Barbancho, Julio; León, Carlos
2016-05-18
Environmental audio monitoring is a huge area of interest for biologists all over the world. This is why several audio monitoring systems have been proposed in the literature, which can be classified into two different approaches: acquisition and compression of all audio patterns in order to send them as raw data to a main server; or specific recognition systems based on audio patterns. The first approach has the drawback of a large amount of information to be stored on a main server, and this information requires a considerable amount of effort to be analyzed. The second approach has the drawback of a lack of scalability when new patterns need to be detected. To overcome these limitations, this paper proposes an environmental Wireless Acoustic Sensor Network architecture based on the use of generic descriptors from the MPEG-7 standard. These descriptors are shown to be suitable for the recognition of different patterns, allowing high scalability. The proposed parameters have been tested to recognize different behaviors of two anuran species that live in Spanish natural parks, the Epidalea calamita and Alytes obstetricans toads, demonstrating high classification performance. PMID:27213375
Expert system shell to reason on large amounts of data
NASA Technical Reports Server (NTRS)
Giuffrida, Gionanni
1994-01-01
Current database management systems (DBMSs) do not provide a sophisticated environment for developing rule-based expert system applications. Some of the new DBMSs come with some sort of rule mechanism; these are active and deductive database systems. However, neither is featured enough to support a full rule-based implementation. On the other hand, current expert system shells do not provide any link with external databases. That is, all the data are kept in the system working memory, and such working memory is maintained in main memory. For some applications the limited size of the available working memory can be a constraint on development; typically these are applications which require reasoning on huge amounts of data. All these data do not fit into the computer's main memory. Moreover, in some cases these data may already be available in database systems and continuously updated while the expert system is running. This paper proposes an architecture which employs knowledge discovery techniques to reduce the amount of data to be stored in main memory; in this architecture a standard DBMS is coupled with a rule-based language. The data are stored in the DBMS. An interface between the two systems is responsible for inducing knowledge from the set of relations. Such induced knowledge is then transferred to the rule-based language's working memory.
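The coupling idea, keeping facts in the DBMS and pulling into working memory only what a rule's condition needs, can be sketched in a few lines; the example below uses sqlite3 and a plain Python predicate as stand-ins (illustrative names and data, not the proposed architecture's actual interface):

```python
import sqlite3

# Facts live in the database, not in the expert-system working memory.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE readings (sensor TEXT, value REAL)")
db.executemany("INSERT INTO readings VALUES (?, ?)",
               [("t1", 95.0), ("t2", 40.0), ("t1", 99.5)])

# A "rule" only loads the small slice of data its condition refers to.
def rule_overheating(db, limit=90.0):
    rows = db.execute("SELECT sensor, value FROM readings WHERE value > ?", (limit,))
    return [f"ALERT: {sensor} reported {value}" for sensor, value in rows]

print(rule_overheating(db))
```

The point of the design is visible even in this toy: the selection happens inside the database engine, so working memory never has to hold the full fact base.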
Revisiting flow maps: a classification and a 3D alternative to visual clutter
NASA Astrophysics Data System (ADS)
Gu, Yuhang; Kraak, Menno-Jan; Engelhardt, Yuri
2018-05-01
Flow maps have long served people in exploring movement by representing origin-destination (OD) data. Due to recent developments in data collection techniques, the amount of movement data is increasing dramatically. With such huge amounts of data, visual clutter in flow maps is becoming a challenge. This paper revisits flow maps, provides an overview of the characteristics of OD data and proposes a classification system for flow maps. For dealing with problems of visual clutter, 3D flow maps are proposed as a potential alternative to 2D flow maps.
Roles and applications of biomedical ontologies in experimental animal science.
Masuya, Hiroshi
2012-01-01
A huge amount of experimental data from past studies has played a vital role in the development of new knowledge and technologies in biomedical science. The importance of computational technologies for the reuse of data, data integration, and knowledge discovery has also increased, providing means of processing large amounts of data. In recent years, information technologies related to "ontologies" have played more significant roles in the standardization, integration, and knowledge representation of biomedical information. This review paper outlines the history of data integration in biomedical science and its recent trends in relation to the field of experimental animal science.
Malecha, Ziemowit M.; Poliński, Jarosław; Chorowski, Maciej
2017-12-01
The transportation of dangerous substances by truck carriers harbors important safety issues in both road and mine tunnels. Even though traffic conditions in road and mine tunnels are different, the potential geometric and hydrodynamic similarities can lead to similar effects from the uncontrolled leakage of the dangerous material. This work was motivated by the design study of the LAGUNA-LBNO (Large Apparatus studying Grand Unification and Neutrino Astrophysics and Long Baseline Neutrino Oscillations) project. The considered neutrino detector requires a huge amount of liquid argon, which must be transported down the tunnel. The present work focuses on the estimation of the most credible incident and the resulting consequences in the case of a truck accident in the tunnel. The approach and tools used in the present work are generic and can be adapted to other similar situations. © 2017 Society for Risk Analysis.
Prioritization of malaria endemic zones using self-organizing maps in the Manipur state of India.
Murty, Upadhyayula Suryanarayana; Srinivasa Rao, Mutheneni; Misra, Sunil
2008-09-01
A huge amount of epidemiological and public health data is available that requires analysis and interpretation with appropriate mathematical tools in order to support existing methods of controlling mosquitoes and mosquito-borne diseases more effectively; data-mining tools are used to make sense of this mass of data. Using data-mining tools, one can develop predictive models, patterns, association rules, and clusters of diseases, which can help decision-makers in controlling the diseases. This paper focuses on the application of data-mining tools, used here for the first time to prioritize the malaria endemic regions of Manipur state by means of Self-Organizing Maps (SOM). The SOM results (two-dimensional images called Kohonen maps) clearly show the visual classification of malaria endemic zones into high, medium and low in the different districts of Manipur, and are discussed in the paper.
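As an illustration of the SOM technique itself (not the study's implementation or its epidemiological data), the following NumPy sketch trains a small Kohonen map on synthetic indicator vectors and assigns each record to its best-matching node:

```python
import numpy as np

rng = np.random.default_rng(0)
data = rng.random((300, 4))            # stand-in for normalised epidemiological indicators

grid_h, grid_w, dim = 5, 5, data.shape[1]
weights = rng.random((grid_h, grid_w, dim))

for t in range(2000):
    lr = 0.5 * (1 - t / 2000)                      # decaying learning rate
    radius = 2.0 * (1 - t / 2000) + 0.5            # decaying neighbourhood radius
    x = data[rng.integers(len(data))]
    # Best-matching unit: grid node whose weight vector is closest to x.
    dists = np.linalg.norm(weights - x, axis=2)
    bmu = np.unravel_index(dists.argmin(), dists.shape)
    # Pull the BMU and its grid neighbours towards x.
    rows, cols = np.indices((grid_h, grid_w))
    grid_dist2 = (rows - bmu[0]) ** 2 + (cols - bmu[1]) ** 2
    influence = np.exp(-grid_dist2 / (2 * radius ** 2))[:, :, None]
    weights += lr * influence * (x - weights)

# Map every record to its BMU; nodes then group areas with similar profiles.
bmus = [np.unravel_index(np.linalg.norm(weights - x, axis=2).argmin(), (grid_h, grid_w))
        for x in data]
print(bmus[:10])
```

In the study's setting, each input vector would describe one area's malaria-related indicators, and nodes of the trained map would correspond to the high, medium and low endemicity groups.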
Accelerating Pathology Image Data Cross-Comparison on CPU-GPU Hybrid Systems
Wang, Kaibo; Huai, Yin; Lee, Rubao; Wang, Fusheng; Zhang, Xiaodong; Saltz, Joel H.
2012-01-01
As an important application of spatial databases in pathology imaging analysis, cross-comparing the spatial boundaries of a huge amount of segmented micro-anatomic objects demands extremely data- and compute-intensive operations, requiring high throughput at an affordable cost. However, the performance of spatial database systems has not been satisfactory since their implementations of spatial operations cannot fully utilize the power of modern parallel hardware. In this paper, we provide a customized software solution that exploits GPUs and multi-core CPUs to accelerate spatial cross-comparison in a cost-effective way. Our solution consists of an efficient GPU algorithm and a pipelined system framework with task migration support. Extensive experiments with real-world data sets demonstrate the effectiveness of our solution, which improves the performance of spatial cross-comparison by over 18 times compared with a parallelized spatial database approach. PMID:23355955
Instances selection algorithm by ensemble margin
NASA Astrophysics Data System (ADS)
Saidi, Meryem; Bechar, Mohammed El Amine; Settouti, Nesma; Chikh, Mohamed Amine
2018-05-01
The main limitation of data mining algorithms is their inability to deal with the huge amount of available data in a reasonable processing time. One solution for producing fast and accurate results is instance and feature selection. This process eliminates noisy or redundant data in order to reduce storage and computational cost without performance degradation. In this paper, a new instance selection approach called the Ensemble Margin Instance Selection (EMIS) algorithm is proposed. This approach is based on the ensemble margin. To evaluate our approach, we have conducted several experiments on different real-world classification problems from the UCI Machine Learning repository. Pixel-based image segmentation is a field where the storage requirements and computational cost of the applied model become high. To address these limitations, we conduct a study based on the application of EMIS and other instance selection techniques to the segmentation and automatic recognition of white blood cells (WBC; nucleus and cytoplasm) in cytological images.
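The following sketch illustrates the general ensemble-margin idea behind such approaches, scoring training instances by the vote margin of a bagging ensemble and keeping a subset; it is not the EMIS algorithm itself, and the dataset, ensemble and selection rule are assumptions:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=2000, n_features=10, random_state=0)
ens = BaggingClassifier(DecisionTreeClassifier(), n_estimators=50, random_state=0).fit(X, y)

# Ensemble margin per instance: fraction of correct votes minus fraction of wrong votes.
votes = np.stack([est.predict(X) for est in ens.estimators_])   # (n_estimators, n_samples)
true_votes = (votes == y).mean(axis=0)
margin = true_votes - (1.0 - true_votes)                        # binary problem

# Keep, e.g., the half of the instances with the smallest margins (closest to the boundary).
keep = np.argsort(margin)[: len(X) // 2]
X_sel, y_sel = X[keep], y[keep]
print(X_sel.shape)
```

Whether low-margin (boundary) or high-margin (confident) instances are kept is a design choice of the particular selection algorithm; the margin computation itself is the shared ingredient.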
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kurtulmus, Erhan; Karaboyacı, Mustafa; Yigitarslan, Sibel
2013-12-16
Pollution from polyethylene terephthalate (PET) occurs in huge amounts due to its widespread use as a packaging material in several industries. Regional pumice has several desirable characteristics, such as a porous structure, low cost and light weight. Considering the insulation requirements approved by the Ministry of Public Works, a composite insulation material consisting of PET and pumice was studied. Sheets of composites differing both in the particle size of the pumice and in the composition of the polymer were produced by a hot-molding technique. Characterization of the new composite material was achieved by measuring its weight, density, flammability, endurance against common acids and bases and against an applied force, heat insulation and water adsorption capacity. The results of the study showed that the produced composite material is an alternative building material due to its desirable characteristics: low weight and low heat conduction.
Cloud access to interoperable IVOA-compliant VOSpace storage
NASA Astrophysics Data System (ADS)
Bertocco, S.; Dowler, P.; Gaudet, S.; Major, B.; Pasian, F.; Taffoni, G.
2018-07-01
Handling, processing and archiving the huge amount of data produced by the new generation of experiments and instruments in Astronomy and Astrophysics are among the more exciting challenges to address in designing the future data management infrastructures and computing services. We investigated the feasibility of a data management and computation infrastructure, available world-wide, with the aim of merging the FAIR data management provided by IVOA standards with the efficiency and reliability of a cloud approach. Our work involved the Canadian Advanced Network for Astronomy Research (CANFAR) infrastructure and the European EGI federated cloud (EFC). We designed and deployed a pilot data management and computation infrastructure that provides IVOA-compliant VOSpace storage resources and wide access to interoperable federated clouds. In this paper, we detail the main user requirements covered, the technical choices and the implemented solutions and we describe the resulting Hybrid cloud Worldwide infrastructure, its benefits and limitations.
Intellectual Property: a powerful tool to develop biotech research.
Giugni, Diego; Giugni, Valter
2010-09-01
Today biotechnology is perhaps the most important technology field because of its strong health and food implications. However, due to the nature of this technology, a huge amount of investment is needed to sustain experimentation costs. Consequently, investors aim to safeguard their investments as much as possible. Intellectual Property, and in particular patents, has been demonstrated to constitute a powerful tool to help them. Moreover, patents represent an extremely important means of disclosing biotechnology inventions. Patentable biotechnology inventions involve products such as nucleotide and amino acid sequences, microorganisms, processes or methods for modifying said products, uses for the manufacture of medicaments, etc. There are several ways to protect inventions, but all follow the three main patentability requirements: novelty, inventive step and industrial application. © 2010 The Authors; Journal compilation © 2010 Society for Applied Microbiology and Blackwell Publishing Ltd.
Should the next standby power target be 0-watt?
DOE Office of Scientific and Technical Information (OSTI.GOV)
Meier, Alan; Siderius, Hans-Paul
The standby power use of appliances continues to consume large amounts of electricity. Considerable success has been achieved in reducing each device's use, but these savings have been offset by a huge increase in the number of products using standby power and by new power requirements for maintaining network connections. Current strategies to reduce standby have limitations and may not be the most appropriate for emerging energy consumption trends. A new strategy for further reductions in standby, the "Standzero" option, encourages electrical products to be designed to operate for short periods without relying on mains-supplied electricity. Energy savings are achieved through enhanced efficiency and by harvesting ambient energy. A sensitivity analysis suggests many appliances could be designed to operate for at least an hour without relying on mains power and, in some cases, may be able to operate indefinitely at zero watts until activated.
BOLDMirror: a global mirror system of DNA barcode data.
Liu, D; Liu, L; Guo, G; Wang, W; Sun, Q; Parani, M; Ma, J
2013-11-01
DNA barcoding is a novel concept for taxonomic identification using short, specific genetic markers and has been applied to study a large number of eukaryotes. The huge amount of data generated by DNA barcoding requires well-organized information systems. Besides the Barcode of Life Data system (BOLD) established in Canada, a mirror system is also important for the international Barcode of Life project (iBOL). For this purpose, we developed BOLDMirror, a global mirror system of DNA barcode data. It is open source and can run on the LAMP (Linux + Apache + MySQL + PHP) environment. BOLDMirror has data synchronization, data representation and statistics modules, and also provides space to store user operation history. BOLDMirror can be accessed at http://www.boldmirror.net and several countries have used it to set up their DNA barcoding sites. © 2012 John Wiley & Sons Ltd.
A Study on the Saving Method of Plate Jigs in Hull Block Butt Welding
NASA Astrophysics Data System (ADS)
Ko, Dae-Eun
2017-11-01
A large number of plate jigs is used for alignment of the welding line and control of welding deformations in the hull block assembly stage. Besides the material cost, the huge number of working man-hours required for the plate jig process is one of the obstacles to productivity growth in a shipyard. In this study, an analysis method was proposed to simulate the welding deformations of a block butt joint with plate jigs in place. Using the proposed analysis method, an example simulation was performed for an actual panel block joint to investigate how plate jig usage can be reduced. Results show that it is possible to achieve the two objectives of hull block accuracy and reduced plate jig usage at the same time by deploying the plate jigs in the right places. The proposed analysis method can be used to establish guidelines for the proper use of plate jigs in the block assembly stage.
A nurse staffing analysis at the largest hospital in the Gulf region
NASA Astrophysics Data System (ADS)
Louly, M.; Gharbi, A.; Azaiez, M. N.; Bouras, A.
2014-12-01
The paper considers a staffing problem at a local hospital. The managers consider themselves understaffed and try to overcome the staffing deficit through overtime rather than hiring additional nurses. However, the huge amount of budget allocated for overtime has become a concern and needs assessment, analysis and justification. Current hospital estimates suggest that the shortage at the hospital level corresponds to 300 full-time equivalent (FTE) nurses, but this deficit is not based on a rigorous scientific approach. This paper presents a staffing model that provides the required scientific evidence on the deficit level. It also gives accurate information on the overtime components. As a result, the suggested staffing model shows that some nursing units are unnecessarily overstaffed. Moreover, the current study reveals that the real deficit is only 215 FTE, resulting in a potential saving of 28%.
All-IP-Ethernet architecture for real-time sensor-fusion processing
NASA Astrophysics Data System (ADS)
Hiraki, Kei; Inaba, Mary; Tezuka, Hiroshi; Tomari, Hisanobu; Koizumi, Kenichi; Kondo, Shuya
2016-03-01
Serendipter is a device that distinguishes and selects very rare particles and cells from a huge population. We are currently designing and constructing the information processing system for a Serendipter. The information processing system for Serendipter is a kind of sensor-fusion system, but with much greater difficulties. To fulfill these requirements, we adopt an all-IP based architecture: the all-IP-Ethernet based data processing system consists of (1) sensors/detectors that directly output data as IP-Ethernet packet streams, (2) aggregation into single Ethernet/TCP/IP streams by an L2 100 Gbps Ethernet switch, and (3) an FPGA board with a 100 Gbps Ethernet interface connected to the switch and a Xeon based server. Circuits in the FPGA include a 100 Gbps Ethernet MAC, buffers and preprocessing, and real-time deep learning circuits using multi-layer neural networks. The proposed all-IP architecture solves existing problems in constructing large-scale sensor-fusion systems.
Design and deployment of an elastic network test-bed in IHEP data center based on SDN
NASA Astrophysics Data System (ADS)
Zeng, Shan; Qi, Fazhi; Chen, Gang
2017-10-01
High energy physics experiments produce huge amounts of raw data, but because network resources are shared, there is no guarantee of available bandwidth for each experiment, which may cause link congestion problems. On the other hand, with the development of cloud computing technologies, IHEP has established a cloud platform based on OpenStack which ensures the flexibility of computing and storage resources, and more and more computing applications have been deployed on virtual machines created by OpenStack. However, under the traditional network architecture, network capacity cannot be requested elastically, which becomes a bottleneck restricting the flexible application of cloud computing. In order to solve the above problems, we propose an elastic cloud data center network architecture based on SDN, and we also design a high performance controller cluster based on OpenDaylight. In the end, we present our current test results.
Splitting parameter yield (SPY): A program for semiautomatic analysis of shear-wave splitting
NASA Astrophysics Data System (ADS)
Zaccarelli, Lucia; Bianco, Francesca; Zaccarelli, Riccardo
2012-03-01
SPY is a Matlab algorithm that analyzes seismic waveforms in a semiautomatic way, providing estimates of the two observables of anisotropy: the shear-wave splitting parameters. We chose to exploit those computational processes that require less intervention by the user, gaining objectivity and reliability as a result. The algorithm combines the covariance-matrix and cross-correlation techniques, and all the computation steps are interspersed with several automatic checks intended to verify the reliability of the yields. The resulting semiautomation brings two new advantages to the field of anisotropy studies: handling a huge amount of data at the same time, and comparing different yields. From this perspective, SPY has been developed in the Matlab environment, which is widespread, versatile, and user-friendly. Our intention is to provide the scientific community with a new monitoring tool for tracking temporal variations of the crustal stress field.
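SPY itself is a Matlab tool; the NumPy fragment below only illustrates the covariance-matrix step, recovering the fast polarization azimuth from the eigenvector of the horizontal-component covariance matrix on synthetic two-component data (all signal parameters are assumptions, and the time-delay and cross-correlation parts of the analysis are omitted):

```python
import numpy as np

# Synthetic two-component horizontal record: a linearly polarised pulse at 30 degrees
# plus noise, standing in for the fast shear wave.
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 500)
pulse = np.exp(-((t - 0.5) ** 2) / 0.002) * np.sin(2 * np.pi * 20 * t)
az_true = np.deg2rad(30.0)
north = pulse * np.cos(az_true) + 0.05 * rng.normal(size=t.size)
east = pulse * np.sin(az_true) + 0.05 * rng.normal(size=t.size)

# Covariance matrix of the two horizontal components in the analysis window.
C = np.cov(np.vstack([north, east]))
eigvals, eigvecs = np.linalg.eigh(C)
fast = eigvecs[:, np.argmax(eigvals)]            # eigenvector of the largest eigenvalue
phi = np.degrees(np.arctan2(fast[1], fast[0]))   # fast polarization azimuth from north
print(round(phi % 180, 1))                       # close to 30
```

The automatic checks the abstract mentions would, in a full analysis, compare this covariance-based estimate with the cross-correlation estimate and flag windows where the two disagree.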
DOT National Transportation Integrated Search
2018-02-01
Qing Lu (ORCID ID 0000-0002-9120-9218) Given a huge amount of annual investment and large inputs of energy and natural resources in pavement maintenance and rehabilitation (M&R) activities, significant environmental improvement and budget saving can ...
USDA's Vick tells radio audience wind farms mean huge water savings
USDA-ARS?s Scientific Manuscript database
Since most of the electricity in the U.S. is generated using coal and natural gas as fuel, almost every wind farm announcement includes the estimated amount of carbon dioxide which was not released to the atmosphere. According to Wikipedia, 2.25 tons of CO2 and 1.14 tons of CO2 were released for eve...
Quantum cutting in nanoparticles producing two green photons
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lorbeer, C; Mudring, Anja -V
2014-01-01
A synthetic route to nanoscale NaGdF4:Ln is presented which allows for quantum cutting based on the Gd-Er-Tb system. This shows that cross-relaxation and other energy transfer processes necessary for multiphoton emission can be achieved in nanoparticles, even though the large surface and the potentially huge number of killer traps would suggest a lack of subsequent emission.
ERIC Educational Resources Information Center
Alyuruk, Hakan; Cavas, Levent
2014-01-01
Genomics and proteomics projects have produced a huge amount of raw biological data including DNA and protein sequences. Although these data have been stored in data banks, their evaluation is strictly dependent on bioinformatics tools. These tools have been developed by multidisciplinary experts for fast and robust analysis of biological data.…
Comparing Different Fault Identification Algorithms in Distributed Power System
NASA Astrophysics Data System (ADS)
Alkaabi, Salim
A power system is a huge, complex system that delivers electrical power from the generation units to the consumers. As the demand for electrical power increases, distributed power generation has been introduced into the power system. Faults may occur in the power system at any time and in different locations. These faults cause huge damage to the system, as they might lead to full failure of the power system. Using distributed generation in the power system has made it even harder to identify the location of faults in the system. The main objective of this work is to test different fault location identification algorithms on a power system with different amounts of power injected by distributed generators. As faults may lead the system to full failure, this is an important area for research. In this thesis, different fault location identification algorithms have been tested and compared while different amounts of power are injected from distributed generators. The algorithms were tested on the IEEE 34-node test feeder using MATLAB, and the results were compared to find when these algorithms might fail and to assess the reliability of these methods.
The Feasibility of Linear Motors and High-Energy Thrusters for Massive Aerospace Vehicles
NASA Astrophysics Data System (ADS)
Stull, M. A.
A combination of two propulsion technologies, superconducting linear motors using ambient magnetic fields and high-energy particle beam thrusters, may make it possible to develop massive aerospace vehicles the size of aircraft carriers. If certain critical thresholds can be attained, linear motors can enable massive vehicles to fly within the atmosphere and can propel them to orbit. Thrusters can do neither, because the power requirements are prohibitive. However, unless superconductors having extremely high critical current densities can be developed, the interplanetary magnetic field is too weak for linear motors to provide sufficient acceleration to reach even nearby planets. On the other hand, high-energy thrusters can provide adequate acceleration using a minimal amount of reaction mass, at achievable levels of power generation. If the requirements for linear motor propulsion can be met, combining the two modes of propulsion could enable huge nuclear-powered spacecraft to reach at least the inner planets of the solar system, the asteroid belt, and possibly Jupiter, in reasonably short times under continuous acceleration, opening them to exploration, resource development and colonization.
A Radio-Map Automatic Construction Algorithm Based on Crowdsourcing
Yu, Ning; Xiao, Chenxian; Wu, Yinfeng; Feng, Renjian
2016-01-01
Traditional radio-map-based localization methods need to sample a large number of location fingerprints offline, which requires a huge amount of human and material resources. To solve the high sampling cost problem, an automatic radio-map construction algorithm based on crowdsourcing is proposed. The algorithm employs the crowdsourced information provided by a large number of users as they walk through the buildings as the source of location fingerprint data. From the variation characteristics of users’ smartphone sensors, the indoor anchors (doors) are identified and their locations are used as reference positions for the whole radio-map. The AP-Cluster method is used to cluster the crowdsourced fingerprints to acquire the representative fingerprints. According to the reference positions and the similarity between fingerprints, the representative fingerprints are linked to their corresponding physical locations and the radio-map is generated. Experimental results demonstrate that the proposed algorithm reduces the cost of fingerprint sampling and radio-map construction and guarantees the localization accuracy. The proposed method does not require users’ explicit participation, which effectively solves the resource-consumption problem when a location fingerprint database is established. PMID:27070623
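The clustering step can be illustrated with a short sketch. The "AP-Cluster" method appears to be affinity-propagation clustering of crowdsourced RSSI fingerprints; the fingerprint array, number of access points and algorithm settings below are illustrative assumptions, not values from the paper.

```python
import numpy as np
from sklearn.cluster import AffinityPropagation

# Hypothetical crowdsourced fingerprints: one row per sample,
# one column per access point (RSSI in dBm). Values are made up.
rng = np.random.default_rng(0)
fingerprints = rng.normal(loc=-70, scale=8, size=(200, 6))

# Affinity propagation selects exemplar fingerprints automatically;
# the exemplars play the role of the "representative fingerprints".
ap = AffinityPropagation(damping=0.9, random_state=0).fit(fingerprints)
representatives = ap.cluster_centers_   # representative fingerprints
labels = ap.labels_                     # cluster id of every sample

print(f"{len(representatives)} representative fingerprints "
      f"selected from {len(fingerprints)} crowdsourced samples")
```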
Processing of Crawled Urban Imagery for Building Use Classification
NASA Astrophysics Data System (ADS)
Tutzauer, P.; Haala, N.
2017-05-01
Recent years have shown a shift from purely geometric 3D city models to data with semantics. This is driven by new applications (e.g. Virtual/Augmented Reality) and also by requirements of concepts like Smart Cities. However, essential urban semantic data such as building use categories is often not available. We present a first step towards bridging this gap by proposing a pipeline that uses crawled urban imagery and links it with ground truth cadastral data as input for automatic building use classification. We aim to extract this city-relevant semantic information automatically from Street View (SV) imagery. Convolutional Neural Networks (CNNs) have proved extremely successful for image interpretation; however, they require a huge amount of training data. The main contribution of the paper is the automatic provision of such training datasets by linking semantic information, as already available from databases provided by national mapping agencies or city administrations, to the corresponding façade images extracted from SV. Finally, we present first investigations with a CNN and an alternative classifier as a proof of concept.
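As a rough illustration of the classification step, the sketch below fine-tunes a pre-trained CNN on façade images labelled with building-use categories. The directory layout, class folders and training settings are assumptions for the sketch; the paper's own network and training protocol may differ.

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

# Hypothetical dataset: facade crops extracted from Street View, sorted into
# one folder per building-use class (e.g. residential/, commercial/, ...).
tfm = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])
train_set = datasets.ImageFolder("facades/train", transform=tfm)
loader = DataLoader(train_set, batch_size=32, shuffle=True)

# Fine-tune an ImageNet-pretrained CNN for building-use classification
# (weights argument requires torchvision >= 0.13).
model = models.resnet18(weights="IMAGENET1K_V1")
model.fc = nn.Linear(model.fc.in_features, len(train_set.classes))

optimizer = torch.optim.SGD(model.parameters(), lr=1e-3, momentum=0.9)
criterion = nn.CrossEntropyLoss()

model.train()
for epoch in range(5):                      # few epochs, sketch only
    for images, labels in loader:
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
```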
In the Service of the National Economy
1960-07-22
…research on hydromechanics, particularly as applied to hydroturbines. The construction project for the hydraulic complex in the region of the Sanhsi…Gorge on the Yangtse River, to be initiated next year, will be on a huge scale. The design and manufacture of the huge hydroturbines require the…designed a small hydroturbine to be operated under low pressure (0.3-1.0 meters). The laws of the aerodynamics of the propeller were taken into account…
Kepper, Nick; Ettig, Ramona; Dickmann, Frank; Stehr, Rene; Grosveld, Frank G; Wedemann, Gero; Knoch, Tobias A
2010-01-01
Especially in the life-science and health-care sectors, huge IT requirements are evident due to the large and complex systems to be analysed and simulated. Grid infrastructures play a rapidly increasing role here for research, diagnostics, and treatment, since they provide the necessary large-scale resources efficiently. Whereas grids were first used for huge number crunching of trivially parallelizable problems, parallel high-performance computing is increasingly required. Here, we show for the prime example of molecular dynamics simulations how the presence of large grid clusters including very fast network interconnects within grid infrastructures now allows parallel high-performance grid computing efficiently and thus combines the benefits of dedicated supercomputing centres and grid infrastructures. The demands for this service class are the highest, since the user group has very heterogeneous requirements: i) two to many thousands of CPUs, ii) different memory architectures, iii) huge storage capabilities, and iv) fast communication via network interconnects are all needed in different combinations and must be considered in a highly dedicated manner to reach the highest performance efficiency. Beyond that, advanced and dedicated i) interaction with users, ii) management of jobs, iii) accounting, and iv) billing not only combine classic with parallel high-performance grid usage but, more importantly, also increase the efficiency of IT resource providers. Consequently, the mere "yes-we-can" becomes a huge opportunity for sectors such as life science and health care, as well as for grid infrastructures, by reaching a higher level of resource efficiency.
Large Scale Document Inversion using a Multi-threaded Computing System
Jung, Sungbo; Chang, Dar-Jen; Park, Juw Won
2018-01-01
Current microprocessor architecture is moving towards multi-core/multi-threaded systems. This trend has led to a surge of interest in using multi-threaded computing devices, such as the Graphics Processing Unit (GPU), for general-purpose computing. We can utilize the GPU in computation as a massive parallel coprocessor because the GPU consists of multiple cores. The GPU is also an affordable, attractive, and user-programmable commodity. Nowadays a flood of information is pouring into the digital domain around the world. A huge volume of data, such as digital libraries, social networking services, e-commerce product data, and reviews, is produced or collected every moment, with dramatic growth in size. Although the inverted index is a useful data structure that can be used for full-text searches or document retrieval, a large number of documents requires a tremendous amount of time to index. The performance of document inversion can be improved by a multi-threaded, multi-core GPU. Our approach is to implement a linear-time, hash-based, single program multiple data (SPMD) document inversion algorithm on the NVIDIA GPU/CUDA programming platform, utilizing the huge computational power of the GPU to develop high-performance solutions for document indexing. Our proposed parallel document inversion system shows 2-3 times faster performance than a sequential system on two different test datasets from PubMed abstracts and e-commerce product reviews. CCS Concepts: • Information systems➝Information retrieval • Computing methodologies➝Massively parallel and high-performance simulations. PMID:29861701
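For reference, a minimal sequential version of document inversion is sketched below; the paper's GPU implementation parallelizes this hash-based term-to-document mapping across threads, and the sample documents here are invented.

```python
from collections import defaultdict

def build_inverted_index(documents):
    """Map each term to the sorted list of document ids containing it."""
    index = defaultdict(set)
    for doc_id, text in enumerate(documents):
        for term in text.lower().split():
            index[term].add(doc_id)
    return {term: sorted(ids) for term, ids in index.items()}

# Invented sample corpus standing in for PubMed abstracts or product reviews.
docs = [
    "gpu accelerated document inversion",
    "parallel inverted index construction on gpu",
    "sequential baseline for document indexing",
]
index = build_inverted_index(docs)
print(index["gpu"])        # -> [0, 1]
print(index["document"])   # -> [0, 2]
```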
Financing College in Hard Times: Work and Student Aid. The CSU Crisis and California's Future
ERIC Educational Resources Information Center
Civil Rights Project / Proyecto Derechos Civiles, 2011
2011-01-01
This report is the third in a series of reports designed to analyze the impact of the fiscal cutbacks on opportunity for higher education in the California State University system, the huge network of 23 universities that provide the greatest amount of Bachelor of Arts (BA) level of education in the state. The first study, "Higher Tuition,…
Adaptive Highlighting of Links to Assist Surfing on the Internet
2002-01-01
Search engines do not offer a satisfactory solution: their indexing cycle is long and creates a time lag of about one month. Moreover, search engines sometimes return a huge number of documents, which is hard to narrow down in order to increase the ratio of relevant information. A novel AI-assisted surfing method, which highlights links during surfing, is studied here. The method makes use
ERIC Educational Resources Information Center
Yorek, Nurettin; Ugulu, Ilker
2015-01-01
In this study, artificial neural networks are suggested as a model that can be "trained" to yield qualitative results out of a huge amount of categorical data. It can be said that this is a new approach applied in educational qualitative data analysis. In this direction, a cascade-forward back-propagation neural network (CFBPN) model was…
Ganges River Delta, Bangladesh, India
1994-11-14
The Ganges River Delta is the largest inter-tidal delta in the world. With its extensive mangrove mud flats, swamp vegetation and sand dunes, it is characteristic of many tropical and subtropical coasts. As seen in this photograph, the tributaries and distributaries of the Ganges and Brahmaputra Rivers deposit huge amounts of silt and clay that create a shifting maze of waterways and islands in the Bay of Bengal.
An Analysis of Lexical Errors of Korean Language Learners: Some American College Learners' Case
ERIC Educational Resources Information Center
Kang, Manjin
2014-01-01
There has been a huge amount of research on the errors of language learners. However, most of it has focused on syntactic errors, and studies of lexical errors are hard to find despite the importance of lexical learning for language learners. The case is even rarer for the Korean language. In line with this background, this study was designed…
Task Assignment Heuristics for Distributed CFD Applications
NASA Technical Reports Server (NTRS)
Lopez-Benitez, N.; Djomehri, M. J.; Biswas, R.; Biegel, Bryan (Technical Monitor)
2001-01-01
CFD applications require high-performance computational platforms: 1. Complex physics and domain configuration demand strongly coupled solutions; 2. Applications are CPU and memory intensive; and 3. Huge resource requirements can only be satisfied by teraflop-scale machines or distributed computing.
NASA Astrophysics Data System (ADS)
2016-07-01
As an astronomer, educator and science advocate at Columbia University in the US, David Helfand has spent his career knocking down faulty arguments and misleading “facts” that cling on despite the huge amount of information available to modern audiences. In his book A Survival Guide to the Misinformation Age, Helfand explains how the same “habits of mind” that make someone a good scientist can also give non-scientists “an antidote to the misinformation glut”.
Designing a Forcenet Information Topology
2004-12-01
…and limitations, but one consistent problem is that by tagging data the amount of bits needed to represent the data grows proportionately. As a…(such as emails home or…) Below are a few important points concerning a huge field of work. a. Dissimilar Redundancy and Reconstitution: A growing concern within the DoD is…
Abundance of He-3 and other solar-wind-derived volatiles in lunar soil
NASA Technical Reports Server (NTRS)
Swindle, Timothy D.
1992-01-01
Volatiles implanted into the lunar regolith by the solar wind are potentially important lunar resources. Wittenberg et al. (1986) have proposed that lunar He-3 could be used as a fuel for terrestrial nuclear fusion reactors. They argue that a fusion scheme involving D and He-3 would be cleaner and more efficient than currently-proposed schemes involving D and T. However, since the terrestrial inventory of He-3 is so small, they suggest that the lunar regolith, with concentrations of the order of parts per billion (by mass) would be an economical source of He-3. Solar-wind implantation is also the primary source of H, C, and N in lunar soil. These elements could also be important, particularly for life support and for propellant production. In a SERC study of the feasibility of obtaining the necessary amount of He-3, Swindle et al. (1990) concluded that the available amount is sufficient for early reactors, at least, but that the mining problems, while not necessarily insurmountable, are prodigious. The volatiles H, C, and N, on the other hand, come in parts per million level abundances. The differences in abundances mean that (1) a comparable amount of H, C, and/or N could be extracted with orders of magnitude smaller operations than required for He-3, and (2) if He-3 extraction ever becomes important, huge quantities of H, C, and N will be produced as by-products.
Galaxies Collide to Create Hot, Huge Galaxy
NASA Technical Reports Server (NTRS)
2009-01-01
This image of a pair of colliding galaxies called NGC 6240 shows them in a rare, short-lived phase of their evolution just before they merge into a single, larger galaxy. The prolonged, violent collision has drastically altered the appearance of both galaxies and created huge amounts of heat, turning NGC 6240 into an 'infrared luminous' active galaxy. A rich variety of active galaxies, with different shapes, luminosities and radiation profiles, exists. These galaxies may be related; astronomers have suspected that they may represent an evolutionary sequence. By catching different galaxies in different stages of merging, a story emerges as one type of active galaxy changes into another. NGC 6240 provides an important 'missing link' in this process. This image was created from combined data from the infrared array camera of NASA's Spitzer Space Telescope at 3.6 and 8.0 microns (red) and visible light from NASA's Hubble Space Telescope (green and blue).
[A new tool for retrieving clinical data from various sources].
Nielsen, Erik Waage; Hovland, Anders; Strømsnes, Oddgeir
2006-02-23
A doctor's tool for extracting clinical data from various sources on groups of hospital patients into one file has been in demand. For this purpose we evaluated Qlikview. Based on clinical information required by two cardiologists, an IT specialist with thorough knowledge of the hospital's data system (www.dips.no) spent 30 days assembling one Qlikview file. Data were also assembled from a pre-hospital ambulance system. The 13 Mb Qlikview file held various information on 12,430 patients admitted to the cardiac unit 26,287 times over the last 21 years. Also included were 530,912 clinical laboratory analyses from these patients during the past five years. Some information required by the cardiologists was inaccessible due to lack of coding or data storage. Some databases could not export their data. Others were encrypted by the software company. A major part of the required data could be extracted to Qlikview. Searches were fast in spite of the huge amount of data. Qlikview could assemble clinical information for doctors from different data systems. Doctors from different hospitals could share and further refine empty Qlikview files for their own use. Once the file is assembled, doctors can, on their own, search for answers to constantly changing clinical questions, also at odd hours.
Depth assisted compression of full parallax light fields
NASA Astrophysics Data System (ADS)
Graziosi, Danillo B.; Alpaslan, Zahir Y.; El-Ghoroury, Hussein S.
2015-03-01
Full parallax light field displays require high pixel density and huge amounts of data. Compression is a necessary tool used by 3D display systems to cope with the high bandwidth requirements. One of the formats adopted by MPEG for 3D video coding standards is the use of multiple views with associated depth maps. Depth maps enable the coding of a reduced number of views, and are used by compression and synthesis software to reconstruct the light field. However, most of the developed coding and synthesis tools target linearly arranged cameras with small baselines. Here we propose to use the 3D video coding format for full parallax light field coding. We introduce a view selection method inspired by plenoptic sampling followed by transform-based view coding and view synthesis prediction to code residual views. We determine the minimal requirements for view sub-sampling and present the rate-distortion performance of our proposal. We also compare our method with established video compression techniques, such as H.264/AVC, H.264/MVC, and the new 3D video coding algorithm, 3DV-ATM. Our results show that our method not only has an improved rate-distortion performance, it also preserves the structure of the perceived light fields better.
A new web-based system to improve the monitoring of snow avalanche hazard in France
NASA Astrophysics Data System (ADS)
Bourova, Ekaterina; Maldonado, Eric; Leroy, Jean-Baptiste; Alouani, Rachid; Eckert, Nicolas; Bonnefoy-Demongeot, Mylene; Deschatres, Michael
2016-05-01
Snow avalanche data in the French Alps and Pyrenees have been recorded for more than 100 years in several databases. The increasing amount of observed data required a more integrative and automated service. Here we report the comprehensive web-based Snow Avalanche Information System newly developed to this end for three important data sets: an avalanche chronicle (Enquête Permanente sur les Avalanches, EPA), an avalanche map (Carte de Localisation des Phénomènes d'Avalanche, CLPA) and a compilation of hazard and vulnerability data recorded on selected paths endangering human settlements (Sites Habités Sensibles aux Avalanches, SSA). These data sets are now integrated into a common database, enabling full interoperability between all different types of snow avalanche records: digitized geographic data, avalanche descriptive parameters, eyewitness reports, photographs, hazard and risk levels, etc. The new information system is implemented through modular components using Java-based web technologies with Spring and Hibernate frameworks. It automates the manual data entry and improves the process of information collection and sharing, enhancing user experience and data quality, and offering new outlooks to explore and exploit the huge amount of snow avalanche data available for fundamental research and more applied risk assessment.
Nakano, Ryohei Thomas; Matsushima, Ryo; Nagano, Atsushi J.; Fukao, Yoichiro; Fujiwara, Masayuki; Kondo, Maki; Nishimura, Mikio; Hara-Nishimura, Ikuko
2012-01-01
The endoplasmic reticulum (ER) has a unique, network-like morphology. The ER structures are composed of tubules, cisternae, and three-way junctions. This morphology is highly conserved among eukaryotes, but the molecular mechanism that maintains ER morphology has not yet been elucidated. In addition, certain Brassicaceae plants develop a unique ER-derived organelle called the ER body. This organelle accumulates large amounts of PYK10, a β-glucosidase, but its physiological functions are still obscure. We aimed to identify a novel factor required for maintaining the morphology of the ER, including ER bodies, and employed a forward-genetic approach using transgenic Arabidopsis thaliana (GFP-h) with fluorescently-labeled ER. We isolated and investigated a mutant (designated endoplasmic reticulum morphology3, ermo3) with huge aggregates and abnormal punctate structures of ER. ERMO3 encodes a GDSL-lipase/esterase family protein, also known as MVP1. Here, we showed that, although ERMO3/MVP1/GOLD36 was expressed ubiquitously, the morphological defects of ermo3 were specifically seen in a certain type of cells where ER bodies developed. Coimmunoprecipitation analysis combined with mass spectrometry revealed that ERMO3/MVP1/GOLD36 interacts with the PYK10 complex, a huge protein complex that is thought to be important for ER body-related defense systems. We also found that the depletion of transcription factor NAI1, a master regulator for ER body formation, suppressed the formation of ER-aggregates in ermo3 cells, suggesting that NAI1 expression plays an important role in the abnormal aggregation of ER. Our results suggest that ERMO3/MVP1/GOLD36 is required for preventing ER and other organelles from abnormal aggregation and for maintaining proper ER morphology in a coordinated manner with NAI1. PMID:23155454
NASA Astrophysics Data System (ADS)
Shen, Tzu-Chiang; Ovando, Nicolás.; Bartsch, Marcelo; Simmond, Max; Vélez, Gastón; Robles, Manuel; Soto, Rubén.; Ibsen, Jorge; Saldias, Christian
2012-09-01
ALMA is the first astronomical project being constructed and operated under an industrial approach due to the huge number of elements involved. In order to achieve the maximum throughput during the engineering and scientific commissioning phase, several production lines have been established to work in parallel. This decision required modification of the original system architecture, in which all the elements are controlled and operated within a unique Standard Test Environment (STE). Advances in the network industry, together with the maturity of the virtualization paradigm, allow us to provide a solution which can replicate the STE infrastructure without changing its network address definition. This is only possible with the Virtual Routing and Forwarding (VRF) and Virtual LAN (VLAN) concepts. The solution allows dynamic reconfiguration of antennas and other hardware across the production lines with minimum time and zero human intervention in the cabling. We also push the virtualization even further: classical rack-mount servers are being replaced and consolidated by blade servers. On top of them, virtualized servers are centrally administered with VMware ESX. Hardware costs and system administration effort will be reduced considerably. This mechanism has been established and operated successfully during the last two years. This experience gave us the confidence to propose a solution to divide the main operation array into subarrays using the same concept, which will introduce huge flexibility and efficiency for ALMA operation and may eventually simplify the complexity of the ALMA core observing software, since there will be no need to deal with subarray complexity at the software level.
NASA Astrophysics Data System (ADS)
Chauhan, Akshansha; Sharma, Manish; Mehdi, Waseem; Singh, Rachita; Mishra, Sunil K.; Singh, Ramesh
An intense fire occurred at the Indian Oil Corporation (IOC) depot located at Sitapur near Jaipur city on 29 October 2009 around 6.00 pm. Flames up to 70 ft high were seen and emission of black plumes was observed over several days. The huge fire killed a few people and injured a dozen more. Soon after this huge fire, people living in the adjoining areas fled, and a spurt of patients complaining of respiratory problems was reported and taken to the nearest hospital for medical care. People living in the surrounding villages suffered eye irritation and rashes and were also rushed to the nearest hospital for emergency care. A huge amount of carbon soot was seen in the atmosphere and was deposited on the fields and houses. Huge emissions of toxic gases such as CO, CO2, SO2 and NOx resulted from the burning oil, although the routine observation data from the Central Pollution Control Board were not made available, so it is difficult to comment on the exact amounts of these toxic gases. These gases modified the atmospheric composition initially over the IOC region and with time dispersed in the direction of the wind towards the south-eastern parts, affecting major cities such as Kota and Gwalior. Soon after the fire, cloudy conditions were observed over Delhi, which is north-east of the IOC depot, with a thick smog which interrupted road and air traffic for a couple of days. An analysis of multi-satellite sensor data (MODIS, AIRS, OMI AURA, AMSER) was carried out. Terra MODIS images (1 km and 250 m resolution) clearly show the dispersion of the plume. The plume moved in a south-east direction due to the dominance of north-westerly winds in the region. Numerous atmospheric parameters (aerosol optical depth, angstrom coefficient, water vapor and CO mixing ratio, total ozone column) and meteorological parameters (air temperature, relative humidity) were found to change. AIRS data show the enhancement of carbon monoxide and changes in atmospheric parameters at around the 500 hPa pressure level in the nearby cities due to dispersion in the direction of the wind. The observed changes in the climatic conditions of Delhi, the health and ecological impacts, and the formation of a thick smog over Delhi will be presented in view of the observed ground and satellite data.
Comprehensive outsourcing biobanking facility to serve the international research community.
Diaferia, Giuseppe R; Biunno, Ida; DeBlasio, Pasquale
2011-06-01
The validity of results from biomarker studies using archived specimens depends on the integrity of the specimens and the manner in which they are collected, processed, and stored. The management of a huge amount of biomaterial generated from research studies and clinical trials is becoming a very demanding task and many organizations are facing the choice between in-house storage and processing and outsourcing some activities. Storage and logistic functions are the prime targets for outsourcing, because to sustain these critical assets organizations must have the expertise, the dedicated qualified personnel, the proper quality control programs, and available resources to fulfill the mandatory requirements to maintain the integrity of the samples. External biobanks are dedicated and certified infrastructures (ISO, GMP, etc.) that apply efficient logistic and shipping activities, use validated standard operating procedures, install appropriate monitoring back-up systems, and, most of all, have room for expansion. Thus, the choice between in-house biobanking and outsourcing cannot be exclusively based on a financial decision; it must also consider (i) type of collection/project, (ii) logistic complexity (number and locations of collection sites), (iii) safety requirements, (iv) functional expertise, and (v) business priorities.
2010-11-01
…that a program is bug free. Also, and this is an important issue in getting people to use them, static checkers tend to have false positives…enormous variety of non-standard dialects took a huge amount of work to get what they describe as full version-specific bug compatibility.) Model…coincident detection by T-cells. Incidentally, this is why it is enough to get rid of T-cells that bind self-peptides without similarly culling B-cells
Cuba: A Short Critical Bibliographic Guide
NASA Astrophysics Data System (ADS)
Basosi, Duccio
An island with a population of approximately eleven million citizens, Cuba has been the topic of a huge number of books and articles by scholars, politicians, artists, tourists and—why not?—foreign undercover agents. A random search in a well-known on-line bookshop gives some 118,000 results for the island's name. In brief, presenting a selection of basic works on Cuba is a very hard task that necessarily leads to difficult choices.
Smart Shop Assistant - Using Semantic Technologies to Improve Online Shopping
NASA Astrophysics Data System (ADS)
Niemann, Magnus; Mochol, Malgorzata; Tolksdorf, Robert
Internet commerce is experiencing rising complexity: not only do more and more products become available online, but the amount of information available on a single product is also constantly increasing. Thanks to Web 2.0 developments it is, in the meantime, quite common to involve customers in the creation of product descriptions and the extraction of additional product information by offering feedback forms and product review sites, users' weblogs and other social web services. To face this situation, one of the main tasks of a future internet will be to aggregate, sort and evaluate this huge amount of information to aid customers in choosing the "perfect" product for their needs.
Astrophysics and Big Data: Challenges, Methods, and Tools
NASA Astrophysics Data System (ADS)
Garofalo, Mauro; Botta, Alessio; Ventre, Giorgio
2017-06-01
Nowadays there is no field of research that is not flooded with data. Among the sciences, astrophysics has always been driven by the analysis of massive amounts of data. The development of new and more sophisticated observation facilities, both ground-based and spaceborne, has made data more and more complex (Variety) and has led to exponential growth in both data Volume (i.e., in the order of petabytes) and Velocity in terms of production and transmission. Therefore, new and advanced processing solutions will be needed to process this huge amount of data. We investigate some of these solutions, based on machine learning models as well as tools and architectures for Big Data analysis, that can be exploited in the astrophysical context.
A Study on Group Key Agreement in Sensor Network Environments Using Two-Dimensional Arrays
Jang, Seung-Jae; Lee, Young-Gu; Lee, Kwang-Hyung; Kim, Tai-Hoon; Jun, Moon-Seog
2011-01-01
These days, with the emergence of the concept of ubiquitous computing, sensor networks that collect, analyze and process all the information through their sensors have become of huge interest. However, sensor network technology is fundamentally built on a wireless communication infrastructure and thus has security weaknesses and limitations such as low computing capacity, limited power supply and price constraints. In this paper, considering the characteristics of the sensor network environment, we propose a group key agreement method using a keyset pre-distribution of two-dimensional arrays that should minimize the exposure of keys and personal information. The key collision problems are resolved by utilizing a polygonal shape’s center of gravity. The method shows that calculating a polygonal shape’s center of gravity requires only a very small amount of calculation from the users. The simple calculation not only increases the group key generation efficiency, but also enhances the sense of security by protecting information between nodes. PMID:22164072
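The key-collision resolution described above relies on computing a polygon's centre of gravity, which each node can do cheaply. The sketch below uses the standard area-weighted (shoelace) centroid formula; the vertex coordinates are illustrative, and the paper's exact construction of the polygon from key positions may differ.

```python
def polygon_centroid(vertices):
    """Centre of gravity of a simple polygon given as [(x, y), ...]."""
    area2 = 0.0   # twice the signed area
    cx = cy = 0.0
    n = len(vertices)
    for i in range(n):
        x0, y0 = vertices[i]
        x1, y1 = vertices[(i + 1) % n]
        cross = x0 * y1 - x1 * y0
        area2 += cross
        cx += (x0 + x1) * cross
        cy += (y0 + y1) * cross
    return cx / (3.0 * area2), cy / (3.0 * area2)

# Illustrative polygon formed from hypothetical key coordinates.
print(polygon_centroid([(0, 0), (4, 0), (4, 2), (0, 2)]))  # -> (2.0, 1.0)
```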
Experimental and numerical study of wastewater pollution in Yuhui channel, Jiashan city
NASA Astrophysics Data System (ADS)
Fu, Lei; Peng, Zhenhua; You, Aiju
2018-02-01
With the economic and social development of China, the huge amount of wastewater has become a serious problem in most Chinese cities. Therefore, the construction of wastewater treatment plants draws much more attention than before. The discharge from a wastewater treatment plant is then considered as a point source in most of the important rivers and channels in China. In this study, a typical wastewater treatment plant extension project is introduced as a case study; a field monitoring experiment was designed and executed to collect the required data, and a two-dimensional model was then established to simulate the water quality downstream of the wastewater treatment plant, with CODCr considered as a typical pollutant in the simulation. The simulation results indicate that different discharge conditions will lead to different CODCr concentrations downstream of the wastewater treatment plant, and an emergency plan should be prepared to minimize the risk of pollution in the channel under unusual and accident conditions.
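To illustrate the kind of transport calculation such a model performs, here is a much-simplified steady-state, plug-flow decay sketch of CODCr downstream of a point discharge. The flow velocity, decay rate and outfall concentration are illustrative assumptions, not values from the study, and the actual 2D model additionally resolves dispersion and lateral mixing.

```python
import numpy as np

# Illustrative parameters (not from the study): plug flow, first-order decay.
u = 0.3          # mean flow velocity in the channel, m/s
k = 0.2 / 86400  # CODCr first-order decay rate, 1/s (0.2 per day)
c0 = 50.0        # CODCr concentration at the outfall, mg/L

x = np.linspace(0, 10_000, 11)          # distance downstream, m
travel_time = x / u                     # seconds of travel from the outfall
c = c0 * np.exp(-k * travel_time)       # steady-state plug-flow decay

for xi, ci in zip(x, c):
    print(f"{xi:7.0f} m  ->  {ci:5.1f} mg/L CODCr")
```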
Meta4: a web application for sharing and annotating metagenomic gene predictions using web services.
Richardson, Emily J; Escalettes, Franck; Fotheringham, Ian; Wallace, Robert J; Watson, Mick
2013-01-01
Whole-genome shotgun metagenomics experiments produce DNA sequence data from entire ecosystems, and provide a huge amount of novel information. Gene discovery projects require up-to-date information about sequence homology and domain structure for millions of predicted proteins to be presented in a simple, easy-to-use system. There is a lack of simple, open, flexible tools that allow the rapid sharing of metagenomics datasets with collaborators in a format they can easily interrogate. We present Meta4, a flexible and extensible web application that can be used to share and annotate metagenomic gene predictions. Proteins and predicted domains are stored in a simple relational database, with a dynamic front-end which displays the results in an internet browser. Web services are used to provide up-to-date information about the proteins from homology searches against public databases. Information about Meta4 can be found on the project website, code is available on Github, a cloud image is available, and an example implementation can be seen at.
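The abstract does not give Meta4's actual schema, so the sketch below is only a guess at the kind of simple relational layout described: one table of predicted proteins and one of predicted domains, with a column for homology annotations pulled in via web services. All table and column names are hypothetical, and Meta4's real database will differ.

```python
import sqlite3

# Hypothetical, simplified layout; Meta4's real schema differs.
conn = sqlite3.connect("meta4_sketch.db")
conn.executescript("""
CREATE TABLE IF NOT EXISTS protein (
    protein_id   TEXT PRIMARY KEY,
    sample       TEXT NOT NULL,
    sequence     TEXT NOT NULL,
    best_homolog TEXT              -- filled in from a homology web service
);
CREATE TABLE IF NOT EXISTS domain (
    domain_id   INTEGER PRIMARY KEY AUTOINCREMENT,
    protein_id  TEXT NOT NULL REFERENCES protein(protein_id),
    name        TEXT NOT NULL,     -- e.g. a predicted domain name
    start_pos   INTEGER,
    end_pos     INTEGER
);
""")
conn.execute("INSERT OR REPLACE INTO protein VALUES (?, ?, ?, ?)",
             ("prot_0001", "sample_A", "MKTLLV...", None))
conn.execute("INSERT INTO domain (protein_id, name, start_pos, end_pos) "
             "VALUES (?, ?, ?, ?)", ("prot_0001", "Glyco_hydro_5", 12, 280))
conn.commit()
```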
SPHINX--an algorithm for taxonomic binning of metagenomic sequences.
Mohammed, Monzoorul Haque; Ghosh, Tarini Shankar; Singh, Nitin Kumar; Mande, Sharmila S
2011-01-01
Compared with composition-based binning algorithms, the binning accuracy and specificity of alignment-based binning algorithms are significantly higher. However, being alignment-based, the latter class of algorithms requires an enormous amount of time and computing resources for binning huge metagenomic datasets. The motivation was to develop a binning approach that can analyze metagenomic datasets as rapidly as composition-based approaches, but nevertheless has the accuracy and specificity of alignment-based algorithms. This article describes a hybrid binning approach (SPHINX) that achieves high binning efficiency by utilizing the principles of both 'composition'- and 'alignment'-based binning algorithms. Validation results with simulated sequence datasets indicate that SPHINX is able to analyze metagenomic sequences as rapidly as composition-based algorithms. Furthermore, the binning efficiency (in terms of accuracy and specificity of assignments) of SPHINX is observed to be comparable with results obtained using alignment-based algorithms. A web server for the SPHINX algorithm is available at http://metagenomics.atc.tcs.com/SPHINX/.
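As a rough illustration of the 'composition' half of such a hybrid scheme, the sketch below assigns a read to the reference clade whose tetranucleotide-frequency profile it is closest to. The reference profiles and sequences are invented, and SPHINX's actual two-phase procedure (composition-based shortlisting followed by alignment-based assignment) is considerably more involved.

```python
from itertools import product
import numpy as np

KMERS = ["".join(p) for p in product("ACGT", repeat=4)]
KMER_INDEX = {k: i for i, k in enumerate(KMERS)}

def tetra_freq(seq):
    """Normalised tetranucleotide-frequency vector of a sequence."""
    counts = np.zeros(len(KMERS))
    for i in range(len(seq) - 3):
        idx = KMER_INDEX.get(seq[i:i + 4])
        if idx is not None:          # skip k-mers with ambiguous bases
            counts[idx] += 1
    total = counts.sum()
    return counts / total if total else counts

# Invented reference profiles (in reality: averaged over many genomes per clade).
references = {
    "clade_A": tetra_freq("ATGCGCGTATGCGCGTATGCGCGT" * 20),
    "clade_B": tetra_freq("AATTTTAAATTTTAAATTTTAAAT" * 20),
}

def bin_read(read):
    """Assign a read to the closest reference profile (Euclidean distance)."""
    v = tetra_freq(read)
    return min(references, key=lambda c: np.linalg.norm(v - references[c]))

print(bin_read("ATGCGCGTATGCGCGTATGCGCGTATGCGCGT"))  # -> clade_A
```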
Hardware-software face detection system based on multi-block local binary patterns
NASA Astrophysics Data System (ADS)
Acasandrei, Laurentiu; Barriga, Angel
2015-03-01
Face detection is an important aspect of biometrics, video surveillance and human-computer interaction. Due to the complexity of the detection algorithms, any face detection system requires a huge amount of computational and memory resources. In this communication, an accelerated implementation of the MB-LBP face detection algorithm targeting low-frequency, low-memory and low-power embedded systems is presented. The resulting implementation is time-deterministic and uses a customizable AMBA IP hardware accelerator. The IP implements the kernel operations of the MB-LBP algorithm and can be used as a universal accelerator for MB-LBP based applications. The IP employs 8 parallel MB-LBP feature evaluator cores, uses a deterministic bandwidth, has a low area profile, and its power consumption is ~95 mW on a Virtex5 XC5VLX50T. The acceleration gain of the resulting implementation is between 5 and 8 times, while the hardware MB-LBP feature evaluation gain is between 69 and 139 times.
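To make the kernel operation concrete, here is a software sketch of a single multi-block LBP code computed with an integral image: the window is split into a 3×3 grid of blocks, each neighbour block's mean intensity is compared with the centre block's, and the eight comparisons form an 8-bit code. The test patch and block size are illustrative; the IP core evaluates many such features in parallel in hardware.

```python
import numpy as np

def integral_image(img):
    """Summed-area table with a zero top row/left column for easy indexing."""
    return np.pad(img, ((1, 0), (1, 0))).cumsum(0).cumsum(1)

def block_sum(ii, y, x, h, w):
    """Sum of img[y:y+h, x:x+w] computed from the integral image."""
    return ii[y + h, x + w] - ii[y, x + w] - ii[y + h, x] + ii[y, x]

def mb_lbp_code(img, y, x, bh, bw):
    """8-bit MB-LBP code for the 3x3 block grid whose top-left corner is (y, x)."""
    ii = integral_image(img.astype(np.int64))
    means = [[block_sum(ii, y + r * bh, x + c * bw, bh, bw) / (bh * bw)
              for c in range(3)] for r in range(3)]
    centre = means[1][1]
    # Clockwise neighbour order starting at the top-left block.
    order = [(0, 0), (0, 1), (0, 2), (1, 2), (2, 2), (2, 1), (2, 0), (1, 0)]
    code = 0
    for bit, (r, c) in enumerate(order):
        if means[r][c] >= centre:
            code |= 1 << bit
    return code

# Illustrative 9x9 test patch with 3x3-pixel blocks.
patch = np.arange(81).reshape(9, 9)
print(f"MB-LBP code: {mb_lbp_code(patch, 0, 0, 3, 3):08b}")
```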
Open loop, auto reversing liquid nitrogen circulation thermal system for thermo vacuum chamber
NASA Astrophysics Data System (ADS)
Naidu, M. C. A.; Nolakha, Dinesh; Saharkar, B. S.; Kavani, K. M.; Patel, D. R.
2012-11-01
In a thermo-vacuum chamber, attaining and controlling low and high temperatures (-100 deg C to +120 deg C) is a very important task. This paper describes the development of an open-loop, auto-reversing, liquid-nitrogen-based thermal system. System specifications, features, the open-loop auto-reversing system, liquid nitrogen flow paths, etc. are discussed in this paper. The thermal system consists of solenoid-operated cryogenic valves, a double-embossed thermal plate (shroud), heating elements, temperature sensors and a PLC. Bulky items like blowers, heating chambers, liquid nitrogen injection chambers, huge pipelines and valves were not used. The entire thermal system is very simple to operate: it is a PLC-based, fully automatic system that auto-tunes to the given set temperatures. The system requires a very nominal amount of liquid nitrogen (approx. 80 liters/hour) while conducting thermo-vacuum tests. This system was integrated into a 1.2 m diameter thermo-vacuum chamber, as part of its augmentation, to conduct extreme temperature cycling tests on passive antenna reflectors of satellites.
High-speed ADC and DAC modules with fibre optic interconnections for telecom satellites
NASA Astrophysics Data System (ADS)
Heikkinen, Veli; Juntunen, Eveliina; Karppinen, Mikko; Kautio, Kari; Ollila, Jyrki; Sitomaniemi, Aila; Tanskanen, Antti; Casey, Rory; Scott, Shane; Gachon, Hélène; Sotom, Michel; Venet, Norbert; Toivonen, Jaakko; Tuominen, Taisto; Karafolas, Nikos
2017-11-01
The flexibility required for future telecom payloads calls for the introduction of more and more digital processing capabilities. Aggregate data throughputs of several Tbps will have to be handled onboard, thus creating the need for effective, high-speed ADC-DSP and DAC-DSP links. ADC and DAC modules with optical interconnections are an attractive option, as they can easily solve the transmission and routing of the expected huge amount of data. This technique will make it possible to increase the bandwidth and/or the number of beams/channels to be treated, or to support advanced digital processing architectures including beam forming. We realised electro-optic ADC and DAC modules containing an 8-bit, 2 GSa/s A/D converter and a 12-bit, 2 GSa/s D/A converter. The 4-channel parallel fibre-optic link employs 850 nm VCSELs and GaAs PIN photodiodes coupled to 50/125 μm fibre ribbon cable. The ADC-DSP and DSP-DAC links both have an aggregate data rate of 25 Gbps. The paper presents the current status of this development.
SEDHI: a new generation of detection electronics for earth observation satellites
NASA Astrophysics Data System (ADS)
Dantes, Didier; Neveu, Claude; Biffi, Jean-Marc; Devilliers, Christophe; Andre, Serge
2017-11-01
Future earth observation optical systems will be more and more demanding in terms of ground sampling distance, swath width, number of spectral bands and duty cycle. Existing architectures of focal planes and video processing electronics are hardly compatible with these new requirements: electronic functions are split into several units, and video processing is limited to frequencies around 5 MHz in order to fulfil the radiometric requirements expected for high-performance image quality systems. This frequency limitation induces a high number of video chains operated in parallel to process the huge amount of pixels at the focal plane output, and leads to unacceptable mass and power consumption budgets. Furthermore, splitting the detection electronics functions into several units (at least one for the focal plane and proximity electronics, and one for the video processing functions) does not optimise production costs: specific development efforts must be performed on critical analogue electronics at each equipment level, and operations of assembly, integration and test are duplicated at the equipment and subsystem levels. Alcatel Space Industries has proposed to CNES a new concept of highly integrated detection electronics (SEDHI) and is developing for CNES a breadboard which will allow its potential to be confirmed. This paper presents the trade-off study performed before the selection of this new concept and summarises the main advantages and drawbacks of each possible architecture. The electrical, mechanical and thermal aspects of the SEDHI concept are described, including the basic technologies: an ASIC for phase shifting of detector clocks, an ASIC for video processing, hybrids, a microchip module, etc. The adaptability to a large range of missions and optical instruments is also discussed.
Huge mediastinal liposarcoma resected by clamshell thoracotomy: a case report.
Toda, Michihito; Izumi, Nobuhiro; Tsukioka, Takuma; Komatsu, Hiroaki; Okada, Satoshi; Hara, Kantaro; Ito, Ryuichi; Shibata, Toshihiko; Nishiyama, Noritoshi
2017-12-01
Liposarcoma is the single most common soft tissue sarcoma. Because mediastinal liposarcomas often grow rapidly and frequently recur locally despite adjuvant chemotherapy and radiotherapy, they require complete excision. Therefore, the feasibility of achieving complete surgical excision must be carefully considered. We here report a case of a huge mediastinal liposarcoma resected via clamshell thoracotomy. A 64-year-old man presented with dyspnea on effort. Cardiomegaly had been diagnosed 6 years previously, but had been left untreated. A computed tomography scan showed a huge (36 cm diameter) anterior mediastinal tumor expanding into the pleural cavities bilaterally. The tumor comprised mostly fatty tissue but contained two solid areas. Echo-guided needle biopsies were performed and a diagnosis of an atypical lipomatous tumor was established by pathological examination of the biopsy samples. Surgical resection was performed via a clamshell incision, enabling en bloc resection of this huge tumor. Although there was no invasion of surrounding organs, the left brachiocephalic vein was resected because it was circumferentially surrounded by tumor and could not be preserved. The tumor weighed 3500 g. Pathologic examination of the resected tumor resulted in a diagnosis of a biphasic tumor comprising dedifferentiated liposarcoma and non-adipocytic sarcoma with necrotic areas. The patient remains free of recurrent tumor 20 months postoperatively. Clamshell incision provides an excellent surgical field and can be performed safely in patients with huge mediastinal liposarcomas.
INFORMATION TECHNOLOGY AND U.S. ENERGY CONSUMPTION: ENERGY HOG, PRODUCTIVITY TOOL, OR BOTH?
Journal Article by John A. "Skip" Laitner. Abstract: A significant debate has emerged with respect to the energy requirements of the Internet. The popular literature has echoed a misleading study that incorrectly suggests the growth of the information economy will require huge amo...
Dinh, Hieu; Rajasekaran, Sanguthevar
2011-07-15
Exact-match overlap graphs have been broadly used in the context of DNA assembly and the shortest superstring problem, where the number of strings n ranges from thousands to billions. The length ℓ of the strings is from 25 to 1000, depending on the DNA sequencing technologies. However, many DNA assemblers using overlap graphs suffer from the need for too much time and space in constructing the graphs. It is nearly impossible for these DNA assemblers to handle the huge amount of data produced by next-generation sequencing technologies, where the number n of strings could be several billions. If the overlap graph is explicitly stored, it would require Ω(n²) memory, which could be prohibitive in practice when n is greater than a hundred million. In this article, we propose a novel data structure using which the overlap graph can be compactly stored. This data structure requires only linear time to construct and linear memory to store. For a given set of input strings (also called reads), we can informally define an exact-match overlap graph as follows. Each read is represented as a node in the graph and there is an edge between two nodes if the corresponding reads overlap sufficiently. A formal description follows. The maximal exact-match overlap of two strings x and y, denoted by ov_max(x, y), is the longest string which is a suffix of x and a prefix of y. The exact-match overlap graph of n given strings of length ℓ is an edge-weighted graph in which each vertex is associated with a string and there is an edge (x, y) of weight ω = ℓ - |ov_max(x, y)| if and only if ω ≤ λ, where |ov_max(x, y)| is the length of ov_max(x, y) and λ is a given threshold. In this article, we show that the exact-match overlap graphs can be represented by a compact data structure that can be stored using at most (2λ-1)(2⌈log n⌉ + ⌈log λ⌉)n bits with a guarantee that the basic operation of accessing an edge takes O(log λ) time. We also propose two algorithms for constructing the data structure for the exact-match overlap graph. The first algorithm runs in O(λℓn log n) worst-case time and requires O(λ) extra memory. The second one runs in O(λℓn) time and requires O(n) extra memory. Our experimental results on a huge amount of simulated data from sequence assembly show that the data structure can be constructed efficiently in time and memory. Our DNA sequence assembler that incorporates the data structure is freely available on the web at http://www.engr.uconn.edu/~htd06001/assembler/leap.zip
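A tiny sketch of the definitions used above, the maximal exact-match overlap ov_max(x, y) and the edge weight ω = ℓ - |ov_max(x, y)| kept only when ω ≤ λ, is shown below. The reads and threshold are invented, and the paper's compact representation and construction algorithms are of course far more sophisticated than this direct check.

```python
def max_overlap(x, y):
    """Longest string that is a suffix of x and a prefix of y."""
    for k in range(min(len(x), len(y)), 0, -1):
        if x[-k:] == y[:k]:
            return x[-k:]
    return ""

def edge_weight(x, y, lam):
    """Edge weight w = len(x) - |ov_max(x, y)|, or None if w > lambda."""
    w = len(x) - len(max_overlap(x, y))
    return w if w <= lam else None

# Invented reads of equal length l = 8 and threshold lambda = 4.
x, y = "ACGTACGT", "TACGTTTT"
print(max_overlap(x, y))        # -> "TACGT"
print(edge_weight(x, y, lam=4)) # -> 3
```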
Beowulf Distributed Processing and the United States Geological Survey
Maddox, Brian G.
2002-01-01
Introduction In recent years, the United States Geological Survey's (USGS) National Mapping Discipline (NMD) has expanded its scientific and research activities. Work is being conducted in areas such as emergency response research, scientific visualization, urban prediction, and other simulation activities. Custom-produced digital data have become essential for these types of activities. High-resolution, remotely sensed datasets are also seeing increased use. Unfortunately, the NMD is also finding that it lacks the resources required to perform some of these activities. Many of these projects require large amounts of computer processing resources. Complex urban-prediction simulations, for example, involve large amounts of processor-intensive calculations on large amounts of input data. This project was undertaken to learn and understand the concepts of distributed processing. Experience was needed in developing these types of applications. The idea was that this type of technology could significantly aid the needs of the NMD scientific and research programs. Porting a numerically intensive application currently being used by an NMD science program to run in a distributed fashion would demonstrate the usefulness of this technology. There are several benefits that this type of technology can bring to the USGS's research programs. Projects can be performed that were previously impossible due to a lack of computing resources. Other projects can be performed on a larger scale than previously possible. For example, distributed processing can enable urban dynamics research to perform simulations on larger areas without making huge sacrifices in resolution. The processing can also be done in a more reasonable amount of time than with traditional single-threaded methods (a scaled version of Chester County, Pennsylvania, took about fifty days to finish its first calibration phase with a single-threaded program). This paper has several goals regarding distributed processing technology. It will describe the benefits of the technology. Real data about a distributed application will be presented as an example of the benefits that this technology can bring to USGS scientific programs. Finally, some of the issues with distributed processing that relate to USGS work will be discussed.
NASA Astrophysics Data System (ADS)
Christoff, Nicole; Jorda, Laurent; Viseur, Sophie; Bouley, Sylvain; Manolova, Agata; Mari, Jean-Luc
2017-04-01
One of the challenges of Planetary Science is to estimate as accurately as possible the age of the geological units that crop out on the different objects in the Solar System. This dating relies on counting the impact craters that cover a given outcrop surface. Using this technique, a chronology of the geological events can be determined and their formation and evolution processes can be understood. Over the last decade, several missions to asteroids and planets, such as Dawn to Vesta and Ceres, Messenger to Mercury, Mars Orbiter and Mars Express, have produced a huge amount of images, from which equally huge DEMs have been generated. Planned missions, such as BepiColombo, will produce an even larger set of images. This rapidly growing amount of visible images and DEMs makes it more and more tedious to identify craters manually. Acquired data will keep growing, which will require more accurate planetary surface analysis. Because of the importance of the problem, many Crater Detection Algorithms (CDAs) have been developed and applied to either image data (2D) or DEMs (2.5D), and rarely to full 3D data such as 3D topographic meshes. We propose a new approach based on the detection of crater rims, which form a characteristic round shape. The proposed approach contains two main steps: 1) each vertex is labelled with the values of the mean and minimal curvatures; 2) this curvature map is injected into a Neural Network (NN) to automatically process the region of interest. As an NN approach, it requires a training set of manually detected craters to estimate the optimal weights of the NN. Once trained, the NN can be applied to the regions of interest to automatically extract all the craters. As a result, it was observed that detecting shapes using a two-dimensional map based on the computation of discrete differential estimators on the 3D mesh is more efficient than using a simple elevation map. This approach significantly reduces the number of false negative detections compared to previous approaches based on 2.5D data processing. The proposed method was validated on a Mars dataset, including numerical topography acquired by the Mars Orbiter Laser Altimeter (MOLA) instrument combined with the Barlow et al. (2000) crater database. Keywords: geometric modeling, mesh processing, neural network, discrete curvatures, crater detection, planetary science.
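A highly simplified sketch of the second step, feeding per-vertex curvature values into a neural network classifier, is given below. The curvature arrays and rim labels are random placeholders standing in for values computed on a real MOLA-derived mesh and for the manually labelled training craters; the paper's network architecture is not specified here.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

# Placeholder training data: one row per mesh vertex, columns are the mean
# and minimal curvature estimated on the 3D mesh; labels mark rim vertices.
rng = np.random.default_rng(42)
curvatures = rng.normal(size=(5000, 2))           # [mean_curv, min_curv]
is_rim = (curvatures[:, 0] > 0.8).astype(int)     # fake "ground truth" rims

clf = MLPClassifier(hidden_layer_sizes=(16, 8), max_iter=500, random_state=0)
clf.fit(curvatures, is_rim)

# Classify the vertices of a new (placeholder) region of interest.
new_vertices = rng.normal(size=(10, 2))
print(clf.predict(new_vertices))                  # 1 = candidate rim vertex
```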
Evolution of wetting layer in InAs/GaAs quantum dot system
Ye, XL; Wang, ZG
2006-01-01
For the InAs/GaAs quantum dot system, the evolution of the wetting layer (WL) with the InAs deposition thickness has been studied by reflectance difference spectroscopy (RDS). Two transitions related to the heavy- and light-holes in the WL have been distinguished in the RD spectra. Taking into account strain and segregation effects, a model has been presented to deduce the InAs amount in the WL and the segregation coefficient of the indium atoms from the transition energies of the heavy- and light-holes. The variation of the InAs amount in the WL and the segregation coefficient are found to depend closely on the growth modes. In addition, the huge dots also exhibit a strong effect on the evolution of the WL. The observed linear dependence of the In segregation coefficient upon the InAs amount in the WL demonstrates that the segregation is enhanced by the strain in the WL.
TransAtlasDB: an integrated database connecting expression data, metadata and variants
Adetunji, Modupeore O; Lamont, Susan J; Schmidt, Carl J
2018-01-01
High-throughput transcriptome sequencing (RNAseq) is the universally applied method for target-free transcript identification and gene expression quantification, generating huge amounts of data. The constraints on accessing such data and interpreting results can be a major impediment in postulating suitable hypotheses, so an innovative storage solution that addresses these limitations, such as hard disk storage requirements, efficiency and reproducibility, is paramount. By offering a uniform data storage and retrieval mechanism, various data can be compared and easily investigated. We present a sophisticated system, TransAtlasDB, which incorporates a hybrid architecture of relational and NoSQL databases for fast and efficient data storage, processing and querying of large datasets from transcript expression analysis, with corresponding metadata as well as gene-associated variants (such as SNPs) and their predicted gene effects. TransAtlasDB provides a data model for accurate storage of the large amount of data derived from RNAseq analysis, together with methods of interacting with the database, either via command-line data management workflows written in Perl, with useful functionalities that simplify the storage and manipulation of the massive amounts of data generated from RNAseq analysis, or through the web interface. The database application is currently modeled to handle analysis data from agricultural species, and will be expanded to include more species groups. Overall, TransAtlasDB aims to serve as an accessible repository for the large, complex results data files derived from RNAseq gene expression profiling and variant analysis. Database URL: https://modupeore.github.io/TransAtlasDB/ PMID:29688361
The Public School Infrastructure Problem: Deteriorating Buildings and Deferred Maintenance
ERIC Educational Resources Information Center
Hunter, Richard C.
2009-01-01
The deterioration of public school buildings is more prevalent in large cities that, because of funding shortfalls, have deferred maintenance and require huge sums to bring their buildings up to acceptable standards. Cities such as New York will require approximately $680 million to address the problem of deferred maintenance for needed painting,…
Portrait Face-Off: Gilbert Stuart vs. Peter Max
ERIC Educational Resources Information Center
Crumpecker, Cheryl
2012-01-01
When art classes are short and infrequent, it is always a challenge to meet required state and national standards. A unit comparing and contrasting Peter Max's Pop art portraits with the realistic style of Gilbert Stuart's presidential portraits provides an opportunity to address a huge number of these requirements. Focus can change with the age…
Risk Management: Earning Recognition with an Automated Safety Program
ERIC Educational Resources Information Center
Lansberry, Linden; Strasburger, Tom
2012-01-01
Risk management is a huge task that requires diligent oversight to avoid penalties, fines, or lawsuits. Add in the burden of limited resources that schools face today, and the challenge of meeting the required training, reporting, compliance, and other administrative issues associated with a safety program is almost insurmountable. Despite an…
Chevrier, Sandy; Boidot, Romain
2014-10-06
The widespread use of Next Generation Sequencing has opened up new avenues for cancer research and diagnosis. NGS will bring huge amounts of new data on cancer, and especially cancer genetics. Current knowledge and future discoveries will make it necessary to study a huge number of genes that could be involved in a genetic predisposition to cancer. In this regard, we developed a Nextera design to study 11 complete genes involved in DNA damage repair. This protocol was developed to safely study 11 genes (ATM, BARD1, BRCA1, BRCA2, BRIP1, CHEK2, PALB2, RAD50, RAD51C, RAD80, and TP53) from promoter to 3'-UTR in 24 patients simultaneously. This protocol, based on transposase technology and gDNA enrichment, gives a great advantage in terms of time for the genetic diagnosis thanks to sample multiplexing. This protocol can be safely used with blood gDNA.
NASA Astrophysics Data System (ADS)
Rahardiantoro, S.; Sartono, B.; Kurnia, A.
2017-03-01
In recent years, DNA methylation has become a key topic for revealing the patterns of many human diseases. Huge amounts of data are an inescapable feature of this field. In addition, some researchers are interested in making predictions based on these huge data sets, especially using regression analysis. Classical approaches fail at this task. Model averaging by Ando and Li [1] could be an alternative approach to face this problem. This research applied model averaging to obtain the best prediction for high-dimensional data. In practice, the case study by Vargas et al. [3], on exposure to aflatoxin B1 (AFB1) and DNA methylation in white blood cells of infants in The Gambia, was used to implement model averaging. The best ensemble model was selected based on the minimum MAPE, MAE, and MSE of the predictions. The result is an ensemble model, obtained by model averaging, with 15 predictors in each candidate model.
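A minimal sketch of the equal-weight variant of this idea is shown below: several candidate linear models, each using a random subset of 15 predictors, are fitted and their predictions averaged, then scored with MAPE, MAE and MSE. The data are simulated placeholders, and Ando and Li's method additionally estimates optimal model weights rather than averaging equally.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_absolute_error, mean_squared_error

rng = np.random.default_rng(1)
n, p, k = 120, 1000, 15                 # samples, predictors, predictors per model
X = rng.normal(size=(n, p))             # placeholder for methylation predictors
y = X[:, :5] @ rng.normal(size=5) + rng.normal(scale=0.5, size=n)
X_tr, X_te, y_tr, y_te = X[:90], X[90:], y[:90], y[90:]

# Fit candidate models on random 15-predictor subsets and average predictions.
preds = []
for _ in range(50):
    cols = rng.choice(p, size=k, replace=False)
    model = LinearRegression().fit(X_tr[:, cols], y_tr)
    preds.append(model.predict(X_te[:, cols]))
y_hat = np.mean(preds, axis=0)          # equal-weight model averaging

mape = np.mean(np.abs((y_te - y_hat) / y_te)) * 100
print(f"MAPE {mape:.1f}%  MAE {mean_absolute_error(y_te, y_hat):.3f}  "
      f"MSE {mean_squared_error(y_te, y_hat):.3f}")
```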
A web-based system for supporting global land cover data production
NASA Astrophysics Data System (ADS)
Han, Gang; Chen, Jun; He, Chaoying; Li, Songnian; Wu, Hao; Liao, Anping; Peng, Shu
2015-05-01
Global land cover (GLC) data production and verification is a complicated, time-consuming and labor-intensive process, requiring huge amounts of imagery and ancillary data and involving many people, often from different geographic locations. The efficient integration of various kinds of ancillary data and effective collaborative classification in large-area land cover mapping require advanced supporting tools. This paper presents the design and development of a web-based system for supporting 30-m resolution GLC data production by combining geo-spatial web services and Computer Supported Collaborative Work (CSCW) technology. Based on an analysis of the functional and non-functional requirements of GLC mapping, a three-tier system model is proposed with four major parts, i.e., multisource data resources, data and function services, interactive mapping and production management. The prototyping and implementation of the system have been realised by a combination of Open Source Software (OSS) and commercially available off-the-shelf systems. This web-based system not only facilitates the integration of the heterogeneous data and services required by GLC data production, but also provides online access, visualization and analysis of the images, ancillary data and interim 30-m global land-cover maps. The system further supports online collaborative quality check and verification workflows. It has been successfully applied to China's 30-m resolution GLC mapping project, and has significantly improved the efficiency of GLC data production and verification.
Knight-Jones, T J D; Rushton, J
2013-11-01
Although a disease of low mortality, the global impact of foot and mouth disease (FMD) is colossal due to the huge numbers of animals affected. This impact can be separated into two components: (1) direct losses due to reduced production and changes in herd structure; and (2) indirect losses caused by the costs of FMD control, poor access to markets and limited use of improved production technologies. This paper estimates that the annual impact of FMD in terms of visible production losses and vaccination in endemic regions alone amounts to between US$6.5 and 21 billion. In addition, outbreaks in FMD-free countries and zones cause losses of >US$1.5 billion a year. FMD impacts are not the same throughout the world: FMD is highly contagious and the actions of one farmer affect the risk of FMD occurring on other holdings; thus sizeable externalities are generated. Control therefore requires coordination within and between countries. These externalities imply that FMD control produces a significant amount of public goods, justifying the need for national and international public investment. Equipping poor countries with the tools needed to control FMD will involve the long-term development of state veterinary services that in turn will deliver wider benefits to a nation, including the control of other livestock diseases. Copyright © 2013 Elsevier B.V. All rights reserved.
An Analysis of Earth Science Data Analytics Use Cases
NASA Technical Reports Server (NTRS)
Shie, Chung-Lin; Kempler, Steve
2014-01-01
The increase in the number, volume, and sources of globally available Earth science data measurements and datasets has afforded Earth scientists and applications researchers unprecedented opportunities to study our Earth in ever more sophisticated ways. In fact, the NASA Earth Observing System Data Information System (EOSDIS) archives have doubled from 2007 to 2014, to 9.1 PB (Ramapriyan, 2009; and https://earthdata.nasa.gov/about/system-performance). In addition, other US agencies, international programs, field experiments, ground stations, and citizen scientists provide a plethora of additional sources for studying Earth. Co-analyzing huge amounts of heterogeneous data to glean non-obvious information is a daunting task. Earth science data analytics (ESDA) is the process of examining large amounts of data of a variety of types to uncover hidden patterns, unknown correlations and other useful information. It can include Data Preparation, Data Reduction, and Data Analysis. Through work associated with the Earth Science Information Partners (ESIP) Federation, a collection of Earth science data analytics use cases has been gathered and analyzed for the purpose of extracting the types of Earth science data analytics employed, and the requirements for data analytics tools and techniques yet to be implemented, based on use case needs. The ESIP-generated use case template, the ESDA use cases, use case types, and a preliminary use case analysis (a work in progress) will be presented.
JPRS Report, Soviet Union, Political Affairs.
1988-01-14
Lenin, because neither his photos nor painted portraits can do it the way the moving pictures can. On film he is all motion, his expression and moods ...which his adult life has not yet managed to deal with. They are not discussed in school and if an incomprehensible statement about overcoming some...people's judge's work and get rid of the huge amount of unproductive labor. We fill more pages with handwriting in a day than many typists will use. I
Big Data Analytics for a Smart Green Infrastructure Strategy
NASA Astrophysics Data System (ADS)
Barrile, Vincenzo; Bonfa, Stefano; Bilotta, Giuliana
2017-08-01
As is well known, Big Data is a term for data sets so large or complex that traditional data processing applications are not sufficient to process them. The term "Big Data" often refers to the use of predictive analytics, user behavior analytics, or other advanced data analytics methods that extract value from data, and rarely to a particular size of data set. This is especially true for the huge amount of Earth Observation data that satellites constantly orbiting the Earth transmit daily.
NASA Astrophysics Data System (ADS)
Konishi, Tsuyoshi; Tanida, Jun; Ichioka, Yoshiki
1995-06-01
A novel technique, the visual-area coding technique (VACT), for the optical implementation of fuzzy logic with the capability of visualization of the results is presented. This technique is based on the microfont method and is considered to be an instance of digitized analog optical computing. Huge amounts of data can be processed in fuzzy logic with the VACT. In addition, real-time visualization of the processed result can be accomplished.
2012-01-01
adult respiratory distress syndrome (ALI/ARDS) occur after fractures in a sporadic entity often termed "fat embolism syndrome." Fat embolism ...We then went on to evaluate the role of fracture injuries in mobilizing mtDNA from human tissue trauma. As proposed, we used discarded human samples...to show that long bone fractures (very common in combatants) and their repair by clinical reamed nailing operations mobilized huge amounts of mtDNA
Collective purchase behavior toward retail price changes
NASA Astrophysics Data System (ADS)
Ueno, Hiromichi; Watanabe, Tsutomu; Takayasu, Hideki; Takayasu, Misako
2011-02-01
By analyzing a huge amount of point-of-sale data collected from Japanese supermarkets, we find power-law relationships between price and sales numbers. The estimated values of the exponents of these power laws depend on the category of products; however, they are independent of the stores, thereby implying the existence of universal human purchase behavior. The scatter of sales numbers around these power laws is generally approximated by log-normal distributions, implying that there are hidden random parameters which might proportionally affect the purchase activity.
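A power law of the form sales ≈ a·price^b can be estimated from such data with a straight-line fit in log-log space; the sketch below does this on invented toy numbers (the actual exponents and product categories come only from the point-of-sale data analyzed in the paper).

```python
import numpy as np

# Toy data (assumed): price points and corresponding sales numbers for one product category
price = np.array([98, 105, 120, 138, 158, 180, 205, 238], dtype=float)
sales = np.array([5200, 4300, 3100, 2300, 1700, 1250, 930, 650], dtype=float)

# Fit log(sales) = log(a) + b*log(price), i.e. sales ~ a * price**b
b, log_a = np.polyfit(np.log(price), np.log(sales), 1)
print(f"estimated exponent b = {b:.2f}, prefactor a = {np.exp(log_a):.1f}")

# Residuals in log space: an approximately normal histogram here is consistent
# with log-normal scatter of sales numbers around the power law.
residuals = np.log(sales) - (log_a + b * np.log(price))
print(residuals)
```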
NASA Astrophysics Data System (ADS)
Galison, Peter
2010-02-01
Secrecy in matters of national defense goes back far past antiquity. But our modern form of national secrecy owes a huge amount to the large-scale, systematic, and technical system of scientific secrecy that began in the Radar and Manhattan Projects of World War II and came to its current form in the Cold War. Here I would like to capture some of this trajectory and to present some of the paradoxes and deep conundrums that our secrecy system offers us in the Post-Cold War world.
Data Reduction and Control Software for Meteor Observing Stations Based on CCD Video Systems
NASA Technical Reports Server (NTRS)
Madiedo, J. M.; Trigo-Rodriguez, J. M.; Lyytinen, E.
2011-01-01
The SPanish Meteor Network (SPMN) is performing a continuous monitoring of meteor activity over Spain and neighbouring countries. The huge amount of data obtained by the 25 video observing stations that this network is currently operating made it necessary to develop new software packages to accomplish some tasks, such as data reduction and remote operation of autonomous systems based on high-sensitivity CCD video devices. The main characteristics of this software are described here.
Problems and Prospects of Science Education in Bangladesh
NASA Astrophysics Data System (ADS)
Choudhury, Shamima K.
2009-04-01
Scientific and technological know-how, not the amount of natural resources, determines the development of a country. Bangladesh, with insignificant natural resources and a huge population on a small piece of land, can be developed through scientific and technological means. Whereas it was once the most sought-after subject at secondary and postsecondary levels, science is losing its appeal in an alarming shift of choice. Problems in science education and possible solutions for Bangladesh, which has limited resources for encouraging science education, are presented.
Communication: The simplified generalized entropy theory of glass-formation in polymer melts.
Freed, Karl F
2015-08-07
While a wide range of non-trivial predictions of the generalized entropy theory (GET) of glass-formation in polymer melts agree with a large number of observed universal and non-universal properties of these glass-formers, and even with the dependence of these properties on monomer molecular structure, the huge mathematical complexity of the theory precludes its extension to describe, for instance, the perplexing, complex behavior observed for technologically important polymer films with thickness below ∼100 nm, for which a fundamental molecular theory of the structural relaxation is lacking. The present communication describes a hugely simplified version of the theory, called the simplified generalized entropy theory (SGET), that provides one component necessary for devising a theory for the structural relaxation of thin polymer films; it thereby supplements the first required ingredient, the recently developed Flory-Huggins level theory for the thermodynamic properties of thin polymer films, before the concluding third step of combining all the components into the SGET for thin polymer films. Comparisons between the predictions of the SGET and the full GET for the four characteristic temperatures of glass-formation provide good agreement for a highly non-trivial model system of polymer melts with chains of the structure of poly(n-α olefins), for which the GET has produced good agreement with experiment. The comparisons consider values of the relative backbone and side group stiffnesses such that the glass transition temperature decreases as the amount of excess free volume diminishes, contrary to general expectations but in accord with observations for poly(n-alkyl methacrylates). Moreover, the SGET is sufficiently concise to enable its discussion in a standard course on statistical mechanics or polymer physics.
Health data and data governance.
Hovenga, Evelyn J S; Grain, Heather
2013-01-01
Health is a knowledge industry, based on data collected to support care, service planning, financing and knowledge advancement. Increasingly there is a need to collect, retrieve and use health record information in an electronic format to provide greater flexibility, as this enables retrieval and display of data in multiple locations and formats irrespective of where the data were collected. Electronically maintained records require greater structure and consistency to achieve this. The use of data held in records generated in real time in clinical systems also has the potential to reduce the time it takes to gain knowledge, as there is less need to collect research-specific information; however, this is only possible if data governance principles are applied. Connected devices and information systems are now generating huge amounts of data, as never seen before. The ability to analyse and mine very large amounts of data, "Big Data", provides policy and decision makers with new insights into varied aspects of work and information flow and operational business patterns and trends, drives greater efficiencies, and enables safer and more effective health care. This allows decision makers to apply rules and guidance that have been developed from the knowledge held in many individual patient records, through recognition of triggers based upon that knowledge. In clinical decision support systems, information about the individual is compared with rules based upon knowledge gained from the accumulated information of many patients, to provide guidance at appropriate times in the clinical process. To achieve this, the data in the individual system and the knowledge rules must be represented in a compatible and consistent manner. This chapter describes data attributes; explains the difference between data and information; outlines the requirements for quality data; shows the relevance of health data standards; and describes how data governance impacts the representation of content in systems and the use of that information.
Giant mucinous cystadenocarcinoma of ovary: A case report and review of literature
Katke, Rajshree Dayanand
2016-01-01
Giant cystadenocarcinomas of the ovary are rarely described. Huge ovarian masses are mostly benign, but malignancy should be ruled out by investigations and clinical assessment. Giant cysts require resection because of compressive symptoms or the risk of malignancy, and their management invariably requires laparotomy to prevent perforation and spillage of the cyst fluid into the peritoneal cavity. Here, we present the case of a 42-year-old female with severe and rapidly growing abdominal distension who underwent exploratory laparotomy for excision of a cystic mass. On histology, the mass was found to be a metastatic mucinous cystadenocarcinoma with omental metastasis. The diagnostic and management challenges posed by this unexpected and unusual presentation of an ovarian cystadenocarcinoma are discussed. The main aim of this report is to draw attention to huge ovarian epithelial cysts with unsuspected presentations, helping to reduce the underdiagnosis, misdiagnosis, and mismanagement that might otherwise occur. PMID:27134482
Wonczak, Stephan; Thiele, Holger; Nieroda, Lech; Jabbari, Kamel; Borowski, Stefan; Sinha, Vishal; Gunia, Wilfried; Lang, Ulrich; Achter, Viktor; Nürnberg, Peter
2015-01-01
Next generation sequencing (NGS) has been a great success and is now a standard method of research in the life sciences. With this technology, dozens of whole genomes or hundreds of exomes can be sequenced in rather short time, producing huge amounts of data. Complex bioinformatics analyses are required to turn these data into scientific findings. In order to run these analyses fast, automated workflows implemented on high performance computers are state of the art. While providing sufficient compute power and storage to meet the NGS data challenge, high performance computing (HPC) systems require special care when utilized for high throughput processing. This is especially true if the HPC system is shared by different users. Here, stability, robustness and maintainability are as important for automated workflows as speed and throughput. To achieve all of these aims, dedicated solutions have to be developed. In this paper, we present the tricks and twists that we utilized in the implementation of our exome data processing workflow. It may serve as a guideline for other high throughput data analysis projects using a similar infrastructure. The code implementing our solutions is provided in the supporting information files. PMID:25942438
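The kind of robustness "trick" alluded to above can be illustrated with a small wrapper that retries a failing pipeline step before giving up, a common pattern on shared clusters; the wrapper, retry counts and the commented-out bwa command below are my own illustrative assumptions, not code from the paper's supporting information.

```python
import logging
import subprocess
import time

logging.basicConfig(level=logging.INFO)

def run_step(cmd, retries=3, wait=60):
    """Run one workflow step, retrying on transient failures (e.g. a busy shared filesystem)."""
    for attempt in range(1, retries + 1):
        result = subprocess.run(cmd, capture_output=True, text=True)
        if result.returncode == 0:
            return result.stdout
        logging.warning("step %s failed (attempt %d/%d): %s",
                        cmd[0], attempt, retries, result.stderr.strip())
        time.sleep(wait)
    raise RuntimeError(f"step {cmd[0]} failed after {retries} attempts")

# Hypothetical usage for one exome sample:
# sam = run_step(["bwa", "mem", "ref.fa", "sample_R1.fq.gz", "sample_R2.fq.gz"])
```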
Zhang, Haibo; Zhou, Yang; Huang, Yujuan; Wu, Longhua; Liu, Xinghua; Luo, Yongming
2016-06-01
Protected vegetable farming is a style of high-frequency rotation farming that requires huge amounts of fertilizer to maintain soil fertility. A total of 125 surface soils, covering China from east to west, were sampled for the analysis of 17 antibiotics in order to identify antibiotic contamination caused by long-term manure application. The results indicate that this agricultural land has accumulated statistically significantly higher antibiotic concentrations than conventional open croplands. The maximum oxytetracycline concentration was 8400 μg/kg, the highest level that has ever been reported for oxytetracycline in soils. The residual concentration is determined by both planting duration and manure type. Short-term (<5 years) planting shows the highest residues of tetracyclines and fluoroquinolones in the soils. Organic farming, characterized by applying commercial compost as the single fertilizer, shows the lowest antibiotic residues in the soils on the whole. Principal component analysis suggests that the various combinations of antibiotic compounds in the soil may be used to trace the manure source. The antibiotics in soil may threaten water quality through contamination by diffusion. Ciprofloxacin and sulfachinoxalin are calculated to pose a higher migration risk to surface waters, hence their environmental fate requires further study. Copyright © 2016 Elsevier Ltd. All rights reserved.
Medical image digital archive: a comparison of storage technologies
NASA Astrophysics Data System (ADS)
Chunn, Timothy; Hutchings, Matt
1998-07-01
A cost effective, high capacity digital archive system is one of the remaining key factors that will enable a radiology department to eliminate film as an archive medium. The ever increasing amount of digital image data is creating the need for huge archive systems that can reliably store and retrieve millions of images and hold from a few terabytes of data to possibly hundreds of terabytes. Selecting the right archive solution depends on a number of factors: capacity requirements, write and retrieval performance requirements, scaleability in capacity and performance, conformance to open standards, archive availability and reliability, security, cost, achievable benefits and cost savings, investment protection, and more. This paper addresses many of these issues. It compares and positions optical disk and magnetic tape technologies, which are the predominant archive mediums today. New technologies will be discussed, such as DVD and high performance tape. Price and performance comparisons will be made at different archive capacities, plus the effect of file size on random and pre-fetch retrieval time will be analyzed. The concept of automated migration of images from high performance, RAID disk storage devices to high capacity, Nearline storage devices will be introduced as a viable way to minimize overall storage costs for an archive.
Xie, Hongtu; Shi, Shaoying; Xiao, Hui; Xie, Chao; Wang, Feng; Fang, Qunle
2016-01-01
With the rapid development of one-stationary bistatic forward-looking synthetic aperture radar (OS-BFSAR) technology, the huge amount of remote sensing data presents challenges for real-time imaging processing. In this paper, an efficient time-domain algorithm (ETDA) that accounts for motion errors in OS-BFSAR imaging processing is presented. This method can not only precisely handle the large spatial variances, serious range-azimuth coupling and motion errors, but also greatly improve the imaging efficiency compared with the direct time-domain algorithm (DTDA). Besides, it represents the subimages on polar grids in the ground plane instead of the slant-range plane, and derives the sampling requirements considering motion errors for the polar grids to offer a near-optimum tradeoff between imaging precision and efficiency. First, the OS-BFSAR imaging geometry is built, and the DTDA for OS-BFSAR imaging is provided. Second, the polar grids of the subimages are defined, and the subaperture imaging in the ETDA is derived. The sampling requirements for the polar grids are derived from the point of view of the bandwidth. Finally, the implementation and computational load of the proposed ETDA are analyzed. Experimental results based on simulated and measured data validate that the proposed ETDA outperforms the DTDA in terms of efficiency improvement. PMID:27845757
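For orientation, the direct time-domain approach that the ETDA accelerates amounts to a bistatic backprojection: for every image pixel, sum the echo samples taken at the transmitter-pixel-receiver delay with a matching phase correction. The sketch below is a deliberately naive version on a rectangular grid, assuming a moving transmitter and a stationary receiver with invented geometry variables; the paper's polar-grid subaperture processing and motion-error compensation are not reproduced.

```python
import numpy as np

def bistatic_backprojection(echoes, t_fast, tx_positions, rx_position, xs, ys, fc, c=3e8):
    """Naive direct time-domain imaging; echoes[p] is the range-compressed echo of pulse p."""
    image = np.zeros((len(ys), len(xs)), dtype=complex)
    for p, tx in enumerate(tx_positions):          # moving transmitter, stationary receiver
        for iy, y in enumerate(ys):
            for ix, x in enumerate(xs):
                pixel = np.array([x, y, 0.0])
                # bistatic delay: transmitter -> pixel -> receiver
                tau = (np.linalg.norm(pixel - tx) + np.linalg.norm(pixel - rx_position)) / c
                sample = (np.interp(tau, t_fast, echoes[p].real)
                          + 1j * np.interp(tau, t_fast, echoes[p].imag))
                # phase correction at the carrier frequency, then coherent accumulation
                image[iy, ix] += sample * np.exp(1j * 2 * np.pi * fc * tau)
    return np.abs(image)
```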
[A large-scale accident in Alpine terrain].
Wildner, M; Paal, P
2015-02-01
Due to the geographical conditions, large-scale accidents amounting to mass casualty incidents (MCI) in Alpine terrain regularly present rescue teams with huge challenges. Using an example incident, the specific conditions and typical problems associated with such a situation are presented. The first rescue team members to arrive have the elementary tasks of qualified triage and communication to the control room, which is required to dispatch the necessary additional support. Only with a clear "concept", to which all have to adhere, can the subsequent chaos phase be limited. In this respect, the time factor, compounded by adverse weather conditions or darkness, creates enormous pressure. Additional hazards are frostbite and hypothermia. If priorities can be established in terms of urgency, then treatment and procedure algorithms have proven successful. For evacuation of casualties, helicopter transport should be sought. Due to the low density of hospitals in Alpine regions, it is often necessary to distribute the patients over a wide area. Rescue operations in Alpine terrain have to be performed according to the particular conditions and require rescue teams to have specific knowledge and expertise. The possibility of a large-scale accident should be considered when planning events. With respect to optimization of rescue measures, regular training and exercises are rational, as is the analysis of previous large-scale Alpine accidents.
NASA Astrophysics Data System (ADS)
Uslu, Faruk Sukru
2017-07-01
Oil spills on the ocean surface cause serious environmental, political, and economic problems. Therefore, these catastrophic threats to marine ecosystems require detection and monitoring. Hyperspectral sensors are powerful optical sensors used for oil spill detection with the help of detailed spectral information of materials. However, huge amounts of data in hyperspectral imaging (HSI) require fast and accurate computation methods for detection problems. Support vector data description (SVDD) is one of the most suitable methods for detection, especially for large data sets. Nevertheless, the selection of kernel parameters is one of the main problems in SVDD. This paper presents a method, inspired by ensemble learning, for improving performance of SVDD without tuning its kernel parameters. Additionally, a classifier selection technique is proposed to get more gain. The proposed approach also aims to solve the small sample size problem, which is very important for processing high-dimensional data in HSI. The algorithm is applied to two HSI data sets for detection problems. In the first HSI data set, various targets are detected; in the second HSI data set, oil spill detection in situ is realized. The experimental results demonstrate the feasibility and performance improvement of the proposed algorithm for oil spill detection problems.
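The ensemble idea of combining one-class detectors with different kernel settings, rather than tuning a single kernel parameter, can be sketched as follows; scikit-learn's OneClassSVM is used here as a stand-in for SVDD, and the simulated spectra, gamma values and voting rule are my own assumptions rather than the paper's configuration.

```python
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(1)
background = rng.normal(size=(500, 50))                     # background sea-surface spectra (simulated)
test_pixels = np.vstack([rng.normal(size=(95, 50)),
                         rng.normal(loc=3.0, size=(5, 50))])  # last 5 rows mimic anomalous "oil" pixels

# Ensemble of one-class models with different RBF widths instead of tuning one kernel parameter.
gammas = [0.001, 0.01, 0.1, 1.0]
models = [OneClassSVM(kernel="rbf", gamma=g, nu=0.05).fit(background) for g in gammas]

# Average the +1/-1 votes across the ensemble; negative means "outside the description",
# i.e. a candidate oil-spill pixel.
votes = np.mean([m.predict(test_pixels) for m in models], axis=0)
detections = np.where(votes < 0)[0]
print(detections)
```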
TCGA2BED: extracting, extending, integrating, and querying The Cancer Genome Atlas.
Cumbo, Fabio; Fiscon, Giulia; Ceri, Stefano; Masseroli, Marco; Weitschek, Emanuel
2017-01-03
Data extraction and integration methods are becoming essential to effectively access and take advantage of the huge amounts of heterogeneous genomics and clinical data increasingly available. In this work, we focus on The Cancer Genome Atlas (TCGA), a comprehensive archive of tumoral data containing the results of high-throughput experiments, mainly Next Generation Sequencing, for more than 30 cancer types. We propose TCGA2BED, a software tool to search and retrieve TCGA data and convert them to the structured BED format for their seamless use and integration. Additionally, it supports conversion to the CSV, GTF, JSON, and XML standard formats. Furthermore, TCGA2BED extends TCGA data with information extracted from other genomic databases (i.e., NCBI Entrez Gene, HGNC, UCSC, and miRBase). We also provide and maintain an automatically updated data repository with publicly available Copy Number Variation, DNA-methylation, DNA-seq, miRNA-seq, and RNA-seq (V1, V2) experimental data of TCGA converted into the BED format, and their associated clinical and biospecimen metadata in attribute-value text format. The availability of the valuable TCGA data in BED format reduces the time needed to take advantage of them: it is possible to efficiently and effectively deal with huge amounts of cancer genomic data integratively, and to search, retrieve and extend them with additional information. The BED format facilitates investigators in performing several knowledge discovery analyses on all tumor types in TCGA, with the final aim of understanding pathological mechanisms and aiding cancer treatments.
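To make the BED idea concrete, the sketch below shows one plausible way an expression record could be flattened into a tab-separated, BED-style line, with genomic coordinates first and the experimental value carried in an extra column; the record, coordinates and column layout are illustrative assumptions, not TCGA2BED's exact output format.

```python
# Minimal sketch (not the TCGA2BED tool itself): genomic interval first, then
# name/value/strand columns, so downstream tools can treat expression values
# as interval annotations.
def to_bed_line(chrom, start, end, gene_symbol, strand, expression_value):
    # BED coordinates are 0-based and half-open; extra columns follow the first three.
    return "\t".join([chrom, str(start), str(end), gene_symbol,
                      f"{expression_value:.3f}", strand])

record = {"chrom": "chr17", "start": 7661778, "end": 7687546,
          "gene": "TP53", "strand": "-", "rpkm": 12.417}   # hypothetical values
print(to_bed_line(record["chrom"], record["start"], record["end"],
                  record["gene"], record["strand"], record["rpkm"]))
```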
Bio-charcoal production from municipal organic solid wastes
NASA Astrophysics Data System (ADS)
AlKhayat, Z. Q.
2017-08-01
The economic and environmental problems of handling the increasingly huge amounts of urban and suburban organic municipal solid waste (MSW), from collection to final disposal, together with large fluctuations in the cost of power and other forms of energy for various civilian needs, are studied for Baghdad, the ancient and glamorous capital of Iraq. A simple device is suggested, built and tested that carbonizes the dried organic waste in a simple, environmentally friendly bio-reactor to produce local, economical charcoal capsules with low pollution potential that may be useful for heating, cooking and other municipal purposes. This also helps to solve the solid waste management problem, which consumes huge human and financial resources and causes many serious health and environmental problems. Leftovers from residential areas of different social levels were collected, sorted for organic materials and then dried before being fed into the bio-reactor, where they are burnt and then mixed with small amounts of sucrose extracted from Iraqi-grown sugar cane to produce well-shaped charcoal capsules. The burning process is smoke-free, as the closed burner's exhaust pipe is buried in a hole 1 m underground so that the subsurface soil acts as a natural gas filter. This process has shown excellent performance, handling about 120 kg/day of sorted MSW and producing about 80-100 kg of charcoal capsules with a 200 L reactor volume.
Mechanical Verification of Cryptographic Protocols
NASA Astrophysics Data System (ADS)
Cheng, Xiaochun; Ma, Xiaoqi; Huang, Scott C.-H.; Cheng, Maggie
Information security is playing an increasingly important role in modern society, driven especially by the uptake of the Internet for information transfer. Large amounts of information are transmitted every day through the Internet, which is often the target of malicious attacks. In certain areas, this issue is vital. For example, military departments of governments often transmit a great amount of top-secret data, which, if divulged, could become a huge threat to the public and to national security. Even in our daily life, it is also necessary to protect information. Consider e-commerce systems as an example. No one is willing to purchase anything over the Internet before being assured that all their personal and financial information will always be kept secure and will never be leaked to any unauthorised person or organisation.
Mining key elements for severe convection prediction based on CNN
NASA Astrophysics Data System (ADS)
Liu, Ming; Pan, Ning; Zhang, Changan; Sha, Hongzhou; Zhang, Bolei; Liu, Liang; Zhang, Meng
2017-04-01
Severe convective weather is a type of weather disaster accompanied by heavy rainfall, gusty winds, hail, etc. With recent developments in remote sensing and numerical modeling, high-volume, long-term observational and modeling data have accumulated that capture numerous severe convective events over particular areas and time periods. With these high-volume, high-variety weather data, most existing studies and methods address dynamical laws, cause analysis, rule discovery and prediction enhancement by utilizing the governing equations of fluid dynamics and thermodynamics. In this study, a key-element mining method is proposed for severe convection prediction based on convolutional neural networks (CNNs). It aims to identify the key areas and key elements from huge amounts of historical weather data, including conventional measurements, weather radar and satellite observations, as well as numerical modeling and/or reanalysis data. In this manner, the machine-learning-based method can help human forecasters in their decision-making for operational forecasts of severe convective weather by extracting key information from real-time and historical weather big data. The method first uses computer vision techniques to preprocess the meteorological variables. It then uses information such as radar maps and expert knowledge to annotate all images automatically. Finally, using a CNN model, it can analyze and evaluate each weather element (e.g., particular variables, patterns and features), identify the key areas of those critical weather elements, and help forecasters quickly screen out the key elements from huge amounts of observation data under current weather conditions. Based on the rich weather measurement and model data (up to 10 years) over Fujian province in China, where severe convective weather is very active during the summer months, experimental tests were conducted with the new machine-learning method using CNN models. Analysis of the experimental results and case studies shows that the proposed method has the following benefits for severe convection prediction: (1) it helps forecasters narrow down the scope of analysis and saves lead time for high-impact severe convection; (2) it processes huge amounts of weather big data with machine-learning methods rather than relying only on traditional theory and knowledge, providing a new way to explore and quantify severe convective weather; and (3) it provides machine-learning-based end-to-end analysis and processing with considerable scalability in data volume, accomplishing the analysis without human intervention.
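As a sketch of the CNN side of such a method, the toy network below classifies a multi-channel "weather image" (stacked radar, satellite and reanalysis fields on a grid) and then uses input gradients as a crude importance map over areas and variables; the channel count, layer sizes and the gradient-based "key element" map are assumptions of mine, not the architecture used in the paper.

```python
import torch
import torch.nn as nn

class ConvectionNet(nn.Module):
    """Toy classifier: input is a stack of gridded weather fields, output is severe / not severe."""
    def __init__(self, in_channels=8, n_classes=2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(in_channels, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(64, n_classes)

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))

model = ConvectionNet()
x = torch.randn(1, 8, 64, 64, requires_grad=True)   # one synthetic multi-variable scene
model(x)[0, 1].backward()                            # gradient of the "severe" logit w.r.t. the input
key_map = x.grad.abs().sum(dim=1)                    # per-pixel importance summed over variables
print(key_map.shape)
```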
ERIC Educational Resources Information Center
Ojo, Olugbenga David; Olakulehin, Felix Kayode
2006-01-01
This paper examined the nature of open and distance learning institutions as organizations where synergy of efforts of all personnel is required in order to achieve the aims and objectives of the institution. It explored the huge infrastructural and personnel requirements of distance learning institutions, especially at inception, and the…
Information Pre-Processing using Domain Meta-Ontology and Rule Learning System
NASA Astrophysics Data System (ADS)
Ranganathan, Girish R.; Biletskiy, Yevgen
Around the globe, extraordinary amounts of documents are being created by Enterprises and by users outside these Enterprises. The documents created in the Enterprises constitute the main focus of the present chapter. These documents are used to perform a great deal of machine processing. When using these documents for machine processing, a lack of semantics for the information they contain may cause misinterpretation of the information, thereby inhibiting the productiveness of computer-assisted analytical work. Hence, it would be profitable for the Enterprises to use well-defined domain ontologies which will serve as rich sources of semantics for the information in the documents. These domain ontologies can be created manually, semi-automatically or fully automatically. The focus of this chapter is to propose an intermediate solution which will enable relatively easy creation of these domain ontologies. The process of extracting and capturing domain ontologies from these voluminous documents requires extensive involvement of domain experts and the application of ontology learning methods that are substantially labor intensive; therefore, some intermediate solutions which would assist in capturing domain ontologies must be developed. This chapter proposes a solution in this direction, which involves building a meta-ontology that serves as an intermediate information source for the main domain ontology and as a rapid approach to conceptualizing a domain of interest from huge amounts of source documents. This meta-ontology can be populated with ontological concepts, attributes and relations from documents, and then refined into a better domain ontology either through automatic ontology learning methods or some other relevant ontology building approach.
Cloud4Psi: cloud computing for 3D protein structure similarity searching.
Mrozek, Dariusz; Małysiak-Mrozek, Bożena; Kłapciński, Artur
2014-10-01
Popular methods for 3D protein structure similarity searching, especially those that generate high-quality alignments such as Combinatorial Extension (CE) and Flexible structure Alignment by Chaining Aligned fragment pairs allowing Twists (FATCAT), are still time consuming. As a consequence, performing similarity searches against large repositories of structural data requires increased computational resources that are not always available. Cloud computing provides huge amounts of computational power that can be provisioned on a pay-as-you-go basis. We have developed a cloud-based system that allows scaling of the similarity searching process vertically and horizontally. Cloud4Psi (Cloud for Protein Similarity) was tested in the Microsoft Azure cloud environment and provided good, almost linearly proportional acceleration when scaled out onto many computational units. Cloud4Psi is available as Software as a Service for testing purposes at: http://cloud4psi.cloudapp.net/. For source code and software availability, please visit the Cloud4Psi project home page at http://zti.polsl.pl/dmrozek/science/cloud4psi.htm. © The Author 2014. Published by Oxford University Press.
SILVA tree viewer: interactive web browsing of the SILVA phylogenetic guide trees.
Beccati, Alan; Gerken, Jan; Quast, Christian; Yilmaz, Pelin; Glöckner, Frank Oliver
2017-09-30
Phylogenetic trees are an important tool for studying the evolutionary relationships among organisms. The huge number of available taxa poses difficulties for their interactive visualization, which hampers interaction with users and the collection of feedback for further improvement of the taxonomic framework. The SILVA Tree Viewer is a web application designed for visualizing large phylogenetic trees without requiring the download of any software tool or data files. The SILVA Tree Viewer is based on Web Geographic Information Systems (Web-GIS) technology with a PostgreSQL backend. It enables zoom and pan functionalities similar to Google Maps. The SILVA Tree Viewer enables access to two phylogenetic (guide) trees provided by the SILVA database: the SSU Ref NR99, inferred from high-quality, full-length small subunit sequences clustered at 99% sequence identity, and the LSU Ref, inferred from high-quality, full-length large subunit sequences. The Tree Viewer provides tree navigation, search and browse tools, as well as an interactive feedback system to collect all kinds of requests, ranging from taxonomy to data curation and improving the tool itself.
The application of dynamic programming in production planning
NASA Astrophysics Data System (ADS)
Wu, Run
2017-05-01
Nowadays, with the popularity of computers, industries of all kinds are widely applying information technology, which creates huge demand for a variety of application software. To develop software that meets these needs at the most economical cost and with the best quality, programmers must design efficient algorithms. A superior algorithm not only gets the job done, but also maximizes the benefits while incurring the smallest overhead. As a common algorithmic technique, dynamic programming is used to solve problems with certain optimality properties. When solving problems with many overlapping sub-problems that would otherwise be computed repeatedly, a naive recursive method takes exponential time, whereas dynamic programming can reduce the time complexity to the polynomial level; it is therefore highly efficient compared with other approaches, reducing computational complexity and enriching the computational results. In this paper, we expound the concept, basic elements, properties, core ideas, solution steps and difficulties of dynamic programming, and establish a dynamic programming model of the production planning problem.
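A minimal dynamic programming sketch for a toy production-planning problem (the demand figures and costs are invented, not taken from the paper): meet each period's demand while minimizing setup, production and holding costs, memoizing sub-problems so each state is solved once instead of exponentially many times.

```python
from functools import lru_cache

demand = [3, 2, 4, 2]        # units required per period (assumed data)
setup_cost = 50              # cost of running production in a period
unit_cost = 10               # cost per unit produced
holding_cost = 2             # cost per unit carried in stock for one period
max_stock = sum(demand)

@lru_cache(maxsize=None)
def best(t, stock):
    """Minimum cost to satisfy demand from period t onward with `stock` units on hand."""
    if t == len(demand):
        return 0
    costs = []
    for produce in range(max_stock - stock + 1):
        available = stock + produce
        if available < demand[t]:
            continue                                  # infeasible: demand not met
        leftover = available - demand[t]
        cost = (setup_cost if produce > 0 else 0) + unit_cost * produce
        cost += holding_cost * leftover + best(t + 1, leftover)
        costs.append(cost)
    return min(costs)

# Each (period, stock) state is evaluated once thanks to memoization,
# avoiding the exponential blow-up of plain recursion.
print(best(0, 0))
```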
Kandel, Pragya; Kunwar, Ritu; Lamichhane, Prabhat; Karki, Surendra
2017-02-08
Water sources classified as "improved" may not necessarily provide safe drinking water for householders. We analyzed data from Nepal Multiple Indicator Cluster Survey 2014 to explore the extent of fecal contamination of household drinking water. Fecal contamination was detected in 81.2% (95% confidence interval [CI]: 77.9-84.2) household drinking water from improved sources and 89.6% (95% CI: 80.4-94.7) in water samples from unimproved sources. In adjusted analysis, there was no difference in odds of fecal contamination of household drinking water between improved and unimproved sources. We observed significantly lower odds of fecal contamination of drinking water in households in higher wealth quintiles, where soap and water were available for handwashing and in households employing water treatment. The extent of contamination of drinking water as observed in this study highlights the huge amount of effort required to ensure the provision of safely managed water in Nepal by 2030 as aimed in sustainable development goals. © The American Society of Tropical Medicine and Hygiene.
The history of aggregate development in the denver, Co area
Langer, W.H.
2009-01-01
At the start of the 20th century Denver's population was 203,795. Most streets were unpaved. Buildings were constructed of wood frame or masonry. Transport was by horse-drawn-wagon or rail. Statewide, aggregate consumption was less than 0.25 metric tons per person per year. One hundred years later Denver had a population of 2,365,345. Today Denver is a major metropolitan area at the crossroads of two interstates, home to a new international airport, and in the process of expanding its light rail transit system. The skyline is punctuated with skyscrapers. The urban center is surrounded with edge cities. These changes required huge amounts of aggregate. Statewide, aggregate consumption increased 50 fold to over 13 metric tons per person per year. Denver has a large potential supply of aggregate, but sand and gravel quality decreases downstream from the mountain front and potential sources of crushed stone occur in areas prized for their scenic beauty. These issues, along with urban encroachment and citizen opposition, have complicated aggregate development and have paved a new path for future aggregate development including sustainable resource management and reclamation techniques.
Fernández-Navajas, Ángel; Merello, Paloma; Beltrán, Pedro; García-Diego, Fernando-Juan
2013-01-01
Cultural Heritage preventive conservation requires the monitoring of the parameters involved in the process of deterioration of artworks. Thus, both long-term monitoring of the environmental parameters as well as further analysis of the recorded data are necessary. The long-term monitoring at frequencies higher than 1 data point/day generates large volumes of data that are difficult to store, manage and analyze. This paper presents software which uses a free open source database engine that allows managing and interacting with huge amounts of data from environmental monitoring of cultural heritage sites. It is of simple operation and offers multiple capabilities, such as detection of anomalous data, inquiries, graph plotting and mean trajectories. It is also possible to export the data to a spreadsheet for analyses with more advanced statistical methods (principal component analysis, ANOVA, linear regression, etc.). This paper also deals with a practical application developed for the Renaissance frescoes of the Cathedral of Valencia. The results suggest infiltration of rainwater in the vault and weekly relative humidity changes related with the religious service schedules. PMID:23447005
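The database-backed workflow described above can be sketched with a free engine such as SQLite standing in for the open-source engine the authors used; the table layout, thresholds and the weekly relative-humidity query below are illustrative assumptions, not the paper's actual schema.

```python
import sqlite3

conn = sqlite3.connect("frescoes.db")   # hypothetical monitoring database
conn.execute("""CREATE TABLE IF NOT EXISTS reading (
                  sensor_id INTEGER, ts TEXT, temperature REAL, rh REAL)""")

# Weekly mean relative humidity per sensor, e.g. to look for patterns tied to service schedules.
weekly_rh = conn.execute("""
    SELECT sensor_id, strftime('%Y-%W', ts) AS week, AVG(rh)
    FROM reading
    GROUP BY sensor_id, week""").fetchall()

# Simple anomaly screen: physically implausible or out-of-range readings.
anomalies = conn.execute(
    "SELECT * FROM reading WHERE rh NOT BETWEEN 0 AND 100 "
    "OR temperature < -30 OR temperature > 60").fetchall()
```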
Sign: large-scale gene network estimation environment for high performance computing.
Tamada, Yoshinori; Shimamura, Teppei; Yamaguchi, Rui; Imoto, Seiya; Nagasaki, Masao; Miyano, Satoru
2011-01-01
Our research group is currently developing software for estimating large-scale gene networks from gene expression data. The software, called SiGN, is specifically designed for the Japanese flagship supercomputer "K computer" which is planned to achieve 10 petaflops in 2012, and other high performance computing environments including Human Genome Center (HGC) supercomputer system. SiGN is a collection of gene network estimation software with three different sub-programs: SiGN-BN, SiGN-SSM and SiGN-L1. In these three programs, five different models are available: static and dynamic nonparametric Bayesian networks, state space models, graphical Gaussian models, and vector autoregressive models. All these models require a huge amount of computational resources for estimating large-scale gene networks and therefore are designed to be able to exploit the speed of 10 petaflops. The software will be available freely for "K computer" and HGC supercomputer system users. The estimated networks can be viewed and analyzed by Cell Illustrator Online and SBiP (Systems Biology integrative Pipeline). The software project web site is available at http://sign.hgc.jp/ .
FliPer: checking the reliability of global seismic parameters from automatic pipelines
NASA Astrophysics Data System (ADS)
Bugnet, L.; García, R. A.; Davies, G. R.; Mathur, S.; Corsaro, E.
2017-12-01
Our understanding of stars through asteroseismic data analysis is limited by our ability to take advantage of the huge number of observed stars provided by space missions such as CoRoT, Kepler, K2, and soon TESS and PLATO. Global seismic pipelines provide global stellar parameters such as mass and radius using the mean seismic parameters, as well as the effective temperature. These pipelines are commonly used automatically on thousands of stars observed by K2 for 3 months (and soon TESS for at least ~1 month). However, pipelines are not immune from misidentifying noise peaks and stellar oscillations. Therefore, new validation techniques are required to assess the quality of these results. We present a new metric called FliPer (Flicker in Power), which takes into account the average variability at all measured time scales. The proper calibration of FliPer enables us to obtain good estimates of global stellar parameters such as surface gravity that are robust against the influence of noise peaks and hence are an excellent way to find faults in asteroseismic pipelines.
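Under my reading of the abstract, a FliPer-like quantity is essentially the average power in the spectrum of the light curve above some low-frequency cutoff, with the high-frequency noise level subtracted; the sketch below implements that reading on a raw flux array, and the cutoff, normalization and noise estimate are assumptions rather than the published calibration.

```python
import numpy as np

def fliper_like(flux, cadence_s, f_cut=0.7e-6):
    """Rough FliPer-style metric: mean power above f_cut (Hz) minus a crude noise level."""
    flux = flux / np.mean(flux) - 1.0                    # relative flux variations
    freqs = np.fft.rfftfreq(flux.size, d=cadence_s)      # frequency grid in Hz
    psd = np.abs(np.fft.rfft(flux))**2 * 2 * cadence_s / flux.size
    noise = np.median(psd[freqs > 0.8 * freqs.max()])    # high-frequency (photon-noise) proxy
    return np.mean(psd[freqs > f_cut]) - noise
```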
Perdigón-Melón, J A; Carbajo, J B; Petre, A L; Rosal, R; García-Calvo, E
2010-09-15
A coupled coagulation-Fenton process was applied for the treatment of cosmetic industry effluents. In a first step, FeSO4 was used as the coagulant, and the non-precipitated Fe2+ remaining in solution was used as the catalyst in the subsequent Fenton process. In the coagulation process a huge decrease in total organic carbon (TOC) was achieved, but the high concentration of phenol derivatives was not diminished. The decrease in TOC in the coagulation step significantly reduces the amount of H2O2 required in the Fenton process for phenol depletion. The coupled process, using an H2O2 dose of only 2 g/L, reduced TOC and total phenol to values lower than 40 and 0.10 mg/L, respectively. The short reaction period (less than 15 min) for TOC and phenol degradation bodes well for improving treatment in a continuous regime. The combination of both processes significantly reduced the ecotoxicity of the raw effluent and markedly increased its biodegradability, thus allowing easier treatment by the conventional biological units in conventional sewage treatment plants (STPs). Copyright 2010 Elsevier B.V. All rights reserved.
Can multilinguality improve Biomedical Word Sense Disambiguation?
Duque, Andres; Martinez-Romo, Juan; Araujo, Lourdes
2016-12-01
Ambiguity in the biomedical domain represents a major issue when performing Natural Language Processing tasks over the huge amount of available information in the field. For this reason, Word Sense Disambiguation is critical for achieving accurate systems able to tackle complex tasks such as information extraction, summarization or document classification. In this work we explore whether multilinguality can help to solve the problem of ambiguity, and the conditions required for a system to improve the results obtained by monolingual approaches. Also, we analyze the best ways to generate those useful multilingual resources, and study different languages and sources of knowledge. The proposed system, based on co-occurrence graphs containing biomedical concepts and textual information, is evaluated on a test dataset frequently used in biomedicine. We can conclude that multilingual resources are able to provide a clear improvement of more than 7% compared to monolingual approaches, for graphs built from a small number of documents. Also, empirical results show that automatically translated resources are a useful source of information for this particular task. Copyright © 2016 Elsevier Inc. All rights reserved.
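The co-occurrence intuition can be illustrated with a toy disambiguator: each candidate sense carries a bag of concepts and terms it tends to co-occur with (possibly gathered from several languages), and the sense sharing the most terms with the ambiguous word's context wins. The senses, term sets and scoring below are invented for illustration and are far simpler than the paper's graph-based system.

```python
# Toy multilingual co-occurrence profiles for the ambiguous term "cold"
# (hypothetical terms; a real system would build these from co-occurrence graphs).
sense_profiles = {
    "cold_temperature": {"weather", "winter", "frio", "temperatura", "ice"},
    "cold_illness": {"virus", "cough", "resfriado", "fever", "nasal"},
}

def disambiguate(context_words, profiles):
    """Pick the sense whose profile overlaps most with the context."""
    scores = {sense: len(terms & set(context_words)) for sense, terms in profiles.items()}
    return max(scores, key=scores.get)

print(disambiguate(["patient", "fever", "cough", "complains"], sense_profiles))
```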
A Grid Metadata Service for Earth and Environmental Sciences
NASA Astrophysics Data System (ADS)
Fiore, Sandro; Negro, Alessandro; Aloisio, Giovanni
2010-05-01
Critical challenges for climate modeling researchers are strongly connected with the increasingly complex simulation models and the huge quantities of produced datasets. Future trends in climate modeling will only increase computational and storage requirements. For this reason, the ability to transparently access both computational and data resources for large-scale, complex climate simulations must be considered a key requirement for Earth science and environmental distributed systems. From the data management perspective, (i) the quantity of data will continuously increase, (ii) data will become more and more distributed and widespread, (iii) data sharing/federation will represent a key challenge among different sites distributed worldwide, and (iv) the potential community of users (large and heterogeneous) will be interested in discovering experimental results, searching metadata, browsing collections of files, comparing different results, displaying output, etc. A key element for carrying out data search and discovery, and for managing and accessing huge and distributed amounts of data, is the metadata handling framework. What we propose for the management of distributed datasets is the GRelC service (a data grid solution focusing on metadata management). Unlike classical approaches, the proposed data grid solution is able to address scalability, transparency, security, efficiency and interoperability. The GRelC service we propose provides access to metadata stored in different and widespread data sources (relational databases running on top of MySQL, Oracle, DB2, etc., leveraging SQL as the query language, as well as XML databases - XIndice, eXist, and libxml2-based documents - adopting either XPath or XQuery), providing a strong data virtualization layer in a grid environment. Such a technological solution for distributed metadata management (i) leverages well-known, widely adopted standards (W3C, OASIS, etc.); (ii) supports role-based management (based on VOMS), which increases flexibility and scalability; (iii) provides full support for the Grid Security Infrastructure (authorization, mutual authentication, data integrity, data confidentiality and delegation); (iv) is compatible with existing grid middleware such as gLite and Globus; and finally (v) is currently adopted at the Euro-Mediterranean Centre for Climate Change (CMCC, Italy) to manage the entire CMCC data production activity, as well as in the international Climate-G testbed.
Pharmaceutical R&D in the spotlight: why is there still unmet medical need?
Schmid, Esther F; Smith, Dennis A
2007-12-01
Huge amounts of money and knowledge have been poured into biomedical research for decades. Yet, in some disease areas next to no progress has been made in providing medical treatment. Importantly, it is not only neglected diseases where unmet medical need remains, but many diseases of 'rich' countries are also affected. Occasionally, new therapies exacerbate the medical need gap, such as in cancer. Our paper discusses some of the reasons why this might be and why all of society needs to find solutions to address unmet medical need.
NASA Astrophysics Data System (ADS)
Regel-Rosocka, Magdalena
2018-03-01
The amount of e-waste is growing at about 4% annually, making it the fastest growing waste stream in the industrialized world. Over 50 million tons of e-waste are produced globally each year, and some of it ends up in landfills, posing the danger of toxic chemicals leaking over time. E-waste is also sent to developing countries, where informal processing of waste electrical and electronic equipment (WEEE) causes serious health and pollution problems. A huge interest in the recovery of valuable metals from WEEE is clearly visible in the great number of scientific and popular-scientific publications and government and industrial reports.
Engineering Infrastructures: Problems of Safety and Security in the Russian Federation
NASA Astrophysics Data System (ADS)
Makhutov, Nikolay A.; Reznikov, Dmitry O.; Petrov, Vitaly P.
Modern society cannot exist without stable and reliable engineering infrastructures (EI), whose operation is vital for any national economy. These infrastructures include energy, transportation, water and gas supply systems, telecommunication and cyber systems, etc. Their operation involves storing and processing huge amounts of information, energy and hazardous substances. Ageing infrastructures are deteriorating, with operating conditions declining from normal to emergency and catastrophic. The complexity of engineering infrastructures and their interdependence with other technical systems make them vulnerable to emergency situations triggered by natural and manmade catastrophes or terrorist attacks.
The Sky is for Everyone — Outreach and Education with the Virtual Observatory
NASA Astrophysics Data System (ADS)
Freistetter, F.; Iafrate, G.; Ramella, M.; Aida-Wp5 Team
2010-12-01
The Virtual Observatory (VO) is an international project to collect astronomical data (images, spectra, simulations, mission-logs, etc.), organise them and develop tools that let astronomers access this huge amount of information. The VO not only simplifies the work of professional astronomers, it is also a valuable tool for education and public outreach. For teachers and astronomers who actively promote astronomy to the public, the VO is a great opportunity to access and use real astronomical data, and have a taste of the daily life of astronomers.
[Plagiarism in medical schools, and its prevention].
Annane, Djillali; Annane, Frédérique
2012-09-01
Plagiarism has become very common in universities and medical schools. Undoubtedly, easy access to a huge amount of electronic documents is one explanation for the increasing prevalence of plagiarism among students. While most universities and medical schools have clear statements and rules about plagiarism, available tools for the detection of plagiarism remain inefficient, and dedicated training programs for students and teachers are too scarce. As lack of time is one reason why students choose plagiarism, it should be a main target for educational programs. Copyright © 2012. Published by Elsevier Masson SAS.
Ocean circulation and climate during the past 120,000 years
NASA Astrophysics Data System (ADS)
Rahmstorf, Stefan
2002-09-01
Oceans cover more than two-thirds of our blue planet. The waters move in a global circulation system, driven by subtle density differences and transporting huge amounts of heat. Ocean circulation is thus an active and highly nonlinear player in the global climate game. Increasingly clear evidence implicates ocean circulation in abrupt and dramatic climate shifts, such as sudden temperature changes in Greenland on the order of 5-10 °C and massive surges of icebergs into the North Atlantic Ocean - events that have occurred repeatedly during the last glacial cycle.
The current status of NORM/TENORM industries and establishment of regulatory framework in Korea.
Chang, Byung-Uck; Kim, Yongjae; Oh, Jang-Jin
2011-07-01
During the last several years, a nationwide survey on naturally occurring radioactive material (NORM)/technologically enhanced naturally occurring radioactive material (TENORM) industries has been conducted. Because of the rapid economic growth in Korea, huge amounts of raw materials, including NORM, have been consumed in various industrial areas, and some representative TENORM industries exist in Korea. Recently, the Korean government decided to establish a regulatory framework for natural radiation, including NORM/TENORM, and is making efforts to introduce relevant, publicly accepted regulations on the basis of international safety standards.
NASA Astrophysics Data System (ADS)
Levene, Michael John
In all attempts to emulate the considerable powers of the brain, one is struck by its immense size, parallelism, and complexity. While the fields of neural networks, artificial intelligence, and neuromorphic engineering have all resorted to oversimplifications of this considerable complexity, all three can benefit from the inherent scalability and parallelism of optics. This thesis looks at specific aspects of three modes in which optics, and particularly volume holography, can play a part in neural computation. First, holography serves as the basis of highly parallel correlators, which are the foundation of optical neural networks. The huge input capability of optical neural networks makes them most useful for image processing and image recognition and tracking. These tasks benefit from the shift invariance of optical correlators. In this thesis, I analyze the capacity of correlators, and then present several techniques for controlling the amount of shift invariance. Of particular interest is the Fresnel correlator, in which the hologram is displaced from the Fourier plane. In this case, the amount of shift invariance is limited not just by the thickness of the hologram, but by the distance of the hologram from the Fourier plane. Second, volume holography can provide the huge storage capacity and high-speed, parallel read-out necessary to support large artificial intelligence systems. However, previous methods for storing data in volume holograms have relied on awkward beam-steering or on as-yet nonexistent cheap, wide-bandwidth, tunable laser sources. This thesis presents a new technique, shift multiplexing, which is capable of very high densities but has the advantage of a very simple implementation. In shift multiplexing, the reference wave consists of a focused spot a few millimeters in front of the hologram. Multiplexing is achieved by simply translating the hologram a few tens of microns or less. This thesis describes the theory of how shift multiplexing works, based on an unconventional but very intuitive analysis of the optical far field. A more detailed analysis based on a path-integral interpretation of the Born approximation is also derived. The capacity of shift multiplexing is compared with that of angle and wavelength multiplexing. The last part of this thesis deals with the role of optics in neuromorphic engineering. Up until now, most neuromorphic engineering has involved one or a few VLSI circuits emulating early sensory systems. However, optical interconnects will be required in order to push towards more ambitious goals, such as the simulation of early visual cortex. I describe a preliminary approach to designing such a system, and show how shift multiplexing can be used to simultaneously store and implement the immense interconnections required by such a project.
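As a toy digital counterpart of the Fourier-plane correlator discussed above (not the holographic implementation analyzed in the thesis), the following sketch shows how correlation computed through the Fourier transform yields a peak that tracks the position of a shifted target, i.e., the shift invariance in question:

```python
# Toy digital analogue of a Fourier-plane correlator: cross-correlation via
# FFT gives a peak whose location tracks the shift of the target in the scene
# (the shift invariance discussed in the abstract). Illustrative only.
import numpy as np

rng = np.random.default_rng(0)
target = rng.random((16, 16))

scene = np.zeros((128, 128))
scene[40:56, 70:86] = target            # embed the target at a known offset

# Correlation theorem: corr = IFFT( FFT(scene) * conj(FFT(target)) )
F_scene = np.fft.fft2(scene)
F_target = np.fft.fft2(target, s=scene.shape)   # zero-pad target to scene size
corr = np.fft.ifft2(F_scene * np.conj(F_target)).real

peak = np.unravel_index(np.argmax(corr), corr.shape)
print("correlation peak at", peak)      # expected at (40, 70)
```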
What ails the practice of medicine: The Atlas has shrugged
Mishra, Sundeep
2015-01-01
Health-care providers are currently facing a huge challenge. On one hand, they are expending a huge amount of time and energy on health-care delivery, including time spent upgrading their knowledge and skills (to remain abreast of the field and be able to provide state-of-the-art patient care), sometimes even at the expense of themselves and their families. On the other hand, they are not receiving adequate reimbursement for their efforts. To compound the problem, several “traders” have entered the profession who are well adept in the materialistic approach and have abandoned ethics (which currently happens to be the flavor of society in general), giving a bad name to the whole profession and causing severe grief, embarrassment and even disillusionment to the average physician. The solution to the problem may lie in weeding out these “black sheep”, as well as in the realization by society that the whole profession should not be wrongly labeled; rather, a hard-toiling and morally driven practitioner should be given his/her due worth. PMID:25820040
Williams, Hawys; Spencer, Karen; Sanders, Caroline; Lund, David; Whitley, Edgar A; Kaye, Jane; Dixon, William G
2015-01-13
With one million people treated every 36 hours, routinely collected UK National Health Service (NHS) health data have huge potential for medical research. Advances in data acquisition from electronic patient records (EPRs) mean that such data are increasingly digital and can be anonymised for research purposes. NHS England's care.data initiative recently sought to increase the amount and availability of such data. However, controversy and uncertainty following the care.data public awareness campaign led to a delay in rollout, indicating that the success of EPR data for medical research may be threatened by a loss of patient and public trust. The sharing of sensitive health care data can only be done through maintaining such trust in a constantly evolving ethicolegal and political landscape. We propose that a dynamic consent model, whereby patients can electronically control consent through time and receive information about the uses of their data, provides a transparent, flexible, and user-friendly means to maintain public trust. This could leverage the huge potential of the EPR for medical research and, ultimately, patient and societal benefit.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Feng, Yixiong; Hu, Bingtao; Hao, He
With the development of communication and control technology, intelligent transportation systems have received increasing attention from both industry and academia. Intelligent transportation systems are supported by the Internet of Things, Cyber-Physical Systems, Artificial Intelligence, Cloud Computing and many other technologies, which supply fundamental information for connected and automated vehicles. Although plenty of studies have provided different formulations for intelligent transportation systems, many of them depend on a Master Control Center. However, a centralized control mode requires a huge amount of data transmission and a high level of hardware configuration, and may cause communication delay and privacy leaks. Some distributed architectures have been proposed to overcome the above problems, but systematized technologies to collect and exchange information, process large amounts of data, model the dynamics of vehicles, and safely control the connected and automated vehicles are not explored in detail. In this paper, we propose a novel distributed cyber-physical system for connected and automated vehicles in which every vehicle is modeled as a double integrator, using edge computing to analyze information collected from its nearest neighbors. The vehicles are supposed to travel along a desired trajectory and to maintain a rigid formation geometry. Related methodologies for the proposed system are illustrated, and experiments are conducted showing that the performance of the connected and automated vehicles matches very well with analytic predictions. Some design guidelines and open questions are provided for future study.
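The controller details are not given in the abstract, so the sketch below is only a generic numerical illustration of the double-integrator neighbor-feedback idea it describes: each vehicle integrates an acceleration command computed from the state of the vehicle directly ahead, and the platoon settles into a rigid formation behind a virtual leader. Gains, spacing and speeds are assumed values, not the paper's.

```python
# Generic double-integrator platoon sketch: each follower applies a
# spring-damper-like law toward the vehicle (or leader) directly ahead.
import numpy as np

n, dt, steps = 5, 0.05, 2000
spacing, kp, kd = 10.0, 1.0, 2.0          # assumed gains and desired gap

pos = np.array([-i * 12.0 for i in range(n)])   # initial positions, slightly off
vel = np.zeros(n)

for t in range(steps):
    leader_pos = 20.0 * (t * dt)          # virtual leader moving at 20 m/s
    leader_vel = 20.0
    acc = np.zeros(n)
    for i in range(n):
        ahead_pos = leader_pos if i == 0 else pos[i - 1]
        ahead_vel = leader_vel if i == 0 else vel[i - 1]
        acc[i] = kp * (ahead_pos - pos[i] - spacing) + kd * (ahead_vel - vel[i])
    vel += acc * dt                        # double integrator: x'' = u
    pos += vel * dt

print("final gaps:", np.round(np.diff(-pos), 2))   # each gap settles near `spacing`
```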
Towards Personalized Medicine Mediated by in Vitro Virus-Based Interactome Approaches
Ohashi, Hiroyuki; Miyamoto-Sato, Etsuko
2014-01-01
We have developed a simple in vitro virus (IVV) selection system based on cell-free co-translation, using a highly stable and efficient mRNA display method. The IVV system is applicable to the high-throughput and comprehensive analysis of proteins and protein–ligand interactions. Huge amounts of genomic sequence data have been generated over the last decade. The accumulated genetic alterations and the interactome networks identified within cells represent a universal feature of a disease, and knowledge of these aspects can help to determine the optimal therapy for the disease. The concept of the “integrome” has been developed as a means of integrating large amounts of data. We have developed an interactome analysis method aimed at providing individually-targeted health care. We also consider future prospects for this system. PMID:24756093
Economical and environmentally-friendly approaches for usage of onion (Allium cepa L.) waste.
Sharma, Kavita; Mahato, Neelima; Nile, Shivraj Hariram; Lee, Eul Tal; Lee, Yong Rok
2016-08-10
Onion (Allium cepa L.) is one of the most commonly cultivated crops across the globe, and its production is increasing every year due to increasing consumer demand. Simultaneously, huge amounts of waste are produced from different parts of the onion, which ultimately affect the environment in various ways. Hence, proper usage as well as disposal of this waste is important from the environmental aspect. This review summarizes various usage methods of onion waste material, and processes involved to achieve maximum benefits. Processing industries produce the largest amount of onion waste. Other sources are storage systems, domestic usage and cultivation fields. Particular emphasis has been given to the methods used for better extraction and usage of onion waste under specific topics: viz. organic synthesis, production of biogas, absorbent for pollutants and value added products.
Analyzing huge pathology images with open source software.
Deroulers, Christophe; Ameisen, David; Badoual, Mathilde; Gerin, Chloé; Granier, Alexandre; Lartaud, Marc
2013-06-06
Digital pathology images are increasingly used both for diagnosis and research, because slide scanners are nowadays broadly available and because the quantitative study of these images yields new insights in systems biology. However, such virtual slides build up a technical challenge since the images occupy often several gigabytes and cannot be fully opened in a computer's memory. Moreover, there is no standard format. Therefore, most common open source tools such as ImageJ fail at treating them, and the others require expensive hardware while still being prohibitively slow. We have developed several cross-platform open source software tools to overcome these limitations. The NDPITools provide a way to transform microscopy images initially in the loosely supported NDPI format into one or several standard TIFF files, and to create mosaics (division of huge images into small ones, with or without overlap) in various TIFF and JPEG formats. They can be driven through ImageJ plugins. The LargeTIFFTools achieve similar functionality for huge TIFF images which do not fit into RAM. We test the performance of these tools on several digital slides and compare them, when applicable, to standard software. A statistical study of the cells in a tissue sample from an oligodendroglioma was performed on an average laptop computer to demonstrate the efficiency of the tools. Our open source software enables dealing with huge images with standard software on average computers. They are cross-platform, independent of proprietary libraries and very modular, allowing them to be used in other open source projects. They have excellent performance in terms of execution speed and RAM requirements. They open promising perspectives both to the clinician who wants to study a single slide and to the research team or data centre who do image analysis of many slides on a computer cluster. The virtual slide(s) for this article can be found here: http://www.diagnosticpathology.diagnomx.eu/vs/5955513929846272.
Analyzing huge pathology images with open source software
2013-01-01
Background Digital pathology images are increasingly used both for diagnosis and research, because slide scanners are nowadays broadly available and because the quantitative study of these images yields new insights in systems biology. However, such virtual slides build up a technical challenge since the images occupy often several gigabytes and cannot be fully opened in a computer’s memory. Moreover, there is no standard format. Therefore, most common open source tools such as ImageJ fail at treating them, and the others require expensive hardware while still being prohibitively slow. Results We have developed several cross-platform open source software tools to overcome these limitations. The NDPITools provide a way to transform microscopy images initially in the loosely supported NDPI format into one or several standard TIFF files, and to create mosaics (division of huge images into small ones, with or without overlap) in various TIFF and JPEG formats. They can be driven through ImageJ plugins. The LargeTIFFTools achieve similar functionality for huge TIFF images which do not fit into RAM. We test the performance of these tools on several digital slides and compare them, when applicable, to standard software. A statistical study of the cells in a tissue sample from an oligodendroglioma was performed on an average laptop computer to demonstrate the efficiency of the tools. Conclusions Our open source software enables dealing with huge images with standard software on average computers. They are cross-platform, independent of proprietary libraries and very modular, allowing them to be used in other open source projects. They have excellent performance in terms of execution speed and RAM requirements. They open promising perspectives both to the clinician who wants to study a single slide and to the research team or data centre who do image analysis of many slides on a computer cluster. Virtual slides The virtual slide(s) for this article can be found here: http://www.diagnosticpathology.diagnomx.eu/vs/5955513929846272 PMID:23829479
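NDPITools and LargeTIFFTools are command-line and ImageJ-driven tools, so the snippet below is not their code; it only illustrates the tile-coordinate bookkeeping behind the "mosaic" idea described above, i.e., cutting a gigapixel slide into tiles (optionally overlapping) that each fit in RAM.

```python
# Generic sketch of the "mosaic" idea: split a huge image into tiles with
# optional overlap so each piece fits in memory. Not the NDPITools code.
def tile_grid(width, height, tile_w, tile_h, overlap=0):
    """Yield (x, y, w, h) boxes covering a width x height image."""
    step_x, step_y = tile_w - overlap, tile_h - overlap
    for y in range(0, height, step_y):
        for x in range(0, width, step_x):
            yield x, y, min(tile_w, width - x), min(tile_h, height - y)

# Example: a 120000 x 80000 pixel virtual slide cut into 4096-pixel tiles
tiles = list(tile_grid(120_000, 80_000, 4096, 4096, overlap=128))
print(len(tiles), "tiles; first:", tiles[0], "last:", tiles[-1])
```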
Advanced Optical Burst Switched Network Concepts
NASA Astrophysics Data System (ADS)
Nejabati, Reza; Aracil, Javier; Castoldi, Piero; de Leenheer, Marc; Simeonidou, Dimitra; Valcarenghi, Luca; Zervas, Georgios; Wu, Jian
In recent years, as the bandwidth and the speed of networks have increased significantly, a new generation of network-based applications using the concept of distributed computing and collaborative services is emerging (e.g., Grid computing applications). The use of the available fiber and DWDM infrastructure for these applications is a logical choice, offering huge amounts of cheap bandwidth and ensuring global reach of computing resources [230]. Currently, there is a great deal of interest in deploying optical circuit (wavelength) switched network infrastructure for distributed computing applications that require long-lived wavelength paths and address the specific needs of a small number of well-known users. Typical users are particle physicists who, due to their international collaborations and experiments, generate enormous amounts of data (Petabytes per year). These users require a network infrastructure that can support processing and analysis of large datasets through globally distributed computing resources [230]. However, providing wavelength-granularity bandwidth services is not an efficient and scalable solution for applications and services that address a wider base of user communities with different traffic profiles and connectivity requirements. Examples of such applications may be: scientific collaboration on a smaller scale (e.g., bioinformatics, environmental research), distributed virtual laboratories (e.g., remote instrumentation), e-health, national security and defense, personalized learning environments and digital libraries, and evolving broadband user services (e.g., high-resolution home video editing, real-time rendering, high-definition interactive TV). As a specific example, in e-health services, and in particular mammography applications, stringent network requirements are necessary due to the size and quantity of images produced by remote mammography. Initial calculations have shown that for 100 patients to be screened remotely, the network would have to securely transport 1.2 GB of data every 30 s [230]. From the above it is clear that these types of applications need a new network infrastructure and transport technology that makes large amounts of bandwidth at subwavelength granularity, storage, computation, and visualization resources potentially available to a wide user base for specified time durations. As these types of collaborative and network-based applications evolve to address a wide range and large number of users, it is infeasible to build dedicated networks for each application type or category. Consequently, there should be an adaptive network infrastructure able to support all application types, each with its own access, network, and resource usage patterns. This infrastructure should offer flexible and intelligent network elements and control mechanisms able to deploy new applications quickly and efficiently.
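As a quick check of the quoted mammography figure (assuming decimal units, 1 GB = 10^9 bytes), the sustained per-site throughput works out to

$$ \frac{1.2\ \mathrm{GB}}{30\ \mathrm{s}} = 40\ \mathrm{MB/s} \approx 320\ \mathrm{Mbit/s}, $$

i.e., a small fraction of a typical DWDM wavelength, which is consistent with the argument above for sub-wavelength granularity.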
Identifying, Quantifying, Extracting and Enhancing Implicit Parallelism
ERIC Educational Resources Information Center
Agarwal, Mayank
2009-01-01
The shift of the microprocessor industry towards multicore architectures has placed a huge burden on the programmers by requiring explicit parallelization for performance. Implicit Parallelization is an alternative that could ease the burden on programmers by parallelizing applications "under the covers" while maintaining sequential semantics…
ERIC Educational Resources Information Center
Haapaniemi, Peter
1990-01-01
Describes imaging technology, which allows huge numbers of words and illustrations to be reduced to tiny fraction of space required by originals and discusses current applications. Highlights include image processing system at National Archives; use by banks for high-speed check processing; engineering document management systems (EDMS); folder…
Lessons learnt on the analysis of large sequence data in animal genomics.
Biscarini, F; Cozzi, P; Orozco-Ter Wengel, P
2018-04-06
The 'omics revolution has made a large amount of sequence data available to researchers and the industry. This has had a profound impact in the field of bioinformatics, stimulating unprecedented advancements in this discipline. Mostly, this is looked at from the perspective of human 'omics, in particular human genomics. Plant and animal genomics, however, have also been deeply influenced by next-generation sequencing technologies, with several genomics applications now popular among researchers and the breeding industry. Genomics tends to generate huge amounts of data, and genomic sequence data account for an increasing proportion of big data in biological sciences, due largely to decreasing sequencing and genotyping costs and to large-scale sequencing and resequencing projects. The analysis of big data poses a challenge to scientists, as data gathering currently takes place at a faster pace than data processing and analysis, and the associated computational burden is increasingly taxing, making even simple manipulation, visualization and transfer of data a cumbersome operation. The time consumed by processing and analysing huge data sets may come at the expense of data quality assessment and critical interpretation. Additionally, when analysing lots of data, something is likely to go awry (the software may crash or stop), and it can be very frustrating to track the error. We herein review the most relevant issues related to tackling these challenges and problems, from the perspective of animal genomics, and provide researchers who lack extensive computing experience with guidelines that will help when processing large genomic data sets. © 2018 Stichting International Foundation for Animal Genetics.
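One routine mitigation for the "even simple manipulation is cumbersome" problem described above is to stream large sequence files instead of loading them whole. The sketch below (file name hypothetical) counts reads and bases in a possibly gzipped FASTQ file this way:

```python
# Minimal sketch of streaming a (possibly gzipped) FASTQ file rather than
# loading it into memory. The input file name is hypothetical.
import gzip
from itertools import islice

def fastq_records(path):
    opener = gzip.open if path.endswith(".gz") else open
    with opener(path, "rt") as handle:
        while True:
            block = list(islice(handle, 4))      # FASTQ records span 4 lines
            if len(block) < 4:
                break
            header, seq, _, qual = (line.rstrip("\n") for line in block)
            yield header, seq, qual

n_reads, n_bases = 0, 0
for _, seq, _ in fastq_records("sample_reads.fastq.gz"):
    n_reads += 1
    n_bases += len(seq)
print(n_reads, "reads,", n_bases, "bases")
```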
Contextual descriptors and neural networks for scene analysis in VHR SAR images
NASA Astrophysics Data System (ADS)
Del Frate, Fabio; Picchiani, Matteo; Falasco, Alessia; Schiavon, Giovanni
2016-10-01
The development of SAR technology during the last decade has made it possible to collect a huge amount of data over many regions of the world. In particular, the availability of SAR images from different sensors, with metric or sub-metric spatial resolution, offers novel opportunities in different fields such as land cover, urban monitoring, soil consumption, etc. On the other hand, automatic approaches become crucial for the exploitation of such a huge amount of information. In such a scenario, especially if single-polarization images are considered, the main issue is to select appropriate contextual descriptors, since the backscattering coefficient of a single pixel may not be sufficient to classify an object in the scene. In this paper a comparison among three different approaches for contextual feature definition is presented, so as to design optimum procedures for VHR SAR scene understanding. The first approach is based on the Gray Level Co-Occurrence Matrix, since it is widely accepted and several studies have used it for land cover classification with SAR data. The second approach is based on Fourier spectra and has already been proposed with positive results for this kind of problem; the third one is based on Auto-associative Neural Networks, which have already been proven effective for feature extraction from polarimetric SAR images. The three methods are evaluated in terms of the accuracy of the classified scene when the features extracted using each method are used as input to a neural network classifier and applied to different Cosmo-SkyMed spotlight products.
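For readers unfamiliar with the first descriptor compared in the paper, the following minimal sketch builds a grey-level co-occurrence matrix for a single pixel offset and derives a Haralick-style contrast feature. The quantization to eight levels and the random "SAR patch" are purely illustrative; a real pipeline would sweep several offsets and angles.

```python
# Minimal GLCM illustration: one (dx, dy) offset and one contrast feature.
import numpy as np

def glcm(img, levels, dx=1, dy=0):
    m = np.zeros((levels, levels), dtype=np.float64)
    h, w = img.shape
    for y in range(h - dy):
        for x in range(w - dx):
            m[img[y, x], img[y + dy, x + dx]] += 1
    return m / m.sum()                           # normalize to co-occurrence probabilities

rng = np.random.default_rng(1)
patch = rng.integers(0, 8, size=(32, 32))        # 8-level quantized stand-in patch
P = glcm(patch, levels=8)

i, j = np.meshgrid(np.arange(8), np.arange(8), indexing="ij")
contrast = np.sum(P * (i - j) ** 2)              # classic Haralick contrast
print("contrast:", round(float(contrast), 3))
```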
Odoardi, Sara; Fisichella, Marco; Romolo, Francesco Saverio; Strano-Rossi, Sabina
2015-09-01
The increasing number of new psychoactive substances (NPS) present in the illicit market renders their identification in biological fluids/tissues of great concern for clinical and forensic toxicology. Analytical methods able to detect the huge number of substances that can be used are sought, considering also that many NPS are not detected by the standard immunoassays generally used for routine drug screening. The aim of this work was to develop a method for the screening of different classes of NPS (a total of 78 analytes including cathinones, synthetic cannabinoids, phenethylamines, piperazines, ketamine and analogues, benzofurans, tryptamines) from blood samples. The simultaneous extraction of analytes was performed by dispersive liquid/liquid microextraction (DLLME), a very rapid, cheap and efficient extraction technique that employs microliter amounts of organic solvents. Analyses were performed by a targeted Ultrahigh Performance Liquid Chromatography tandem Mass Spectrometry (UHPLC-MS/MS) method in multiple reaction monitoring (MRM) mode. The method allowed the detection of the studied analytes with limits of detection (LODs) ranging from 0.2 to 2 ng/mL. The proposed DLLME method can be used as an alternative to classical liquid/liquid or solid-phase extraction techniques due to its rapidity, its need for only microliter amounts of organic solvents, its cheapness, and its ability to extract simultaneously a huge number of analytes, including from different chemical classes. The method was then applied to 60 authentic real samples from forensic cases, demonstrating its suitability for the screening of a wide number of NPS. Copyright © 2015 Elsevier B.V. All rights reserved.
de la Piedra, Antonio; Braeken, An; Touhafi, Abdellah
2013-01-01
Typically, commercial sensor nodes are equipped with MCUs clocked at a low frequency (i.e., within the 4–12 MHz range). Consequently, executing cryptographic algorithms in those MCUs generally requires a huge amount of time. In this respect, the required energy consumption can be higher than using a separate accelerator based on a Field-Programmable Gate Array (FPGA) that is switched on when needed. In this manuscript, we present the design of a cryptographic accelerator suitable for an FPGA-based sensor node and compliant with the IEEE 802.15.4 standard. All the embedded resources of the target platform (Xilinx Artix-7) have been maximized in order to provide a cost-effective solution. Moreover, we have added key negotiation capabilities to the IEEE 802.15.4 security suite based on Elliptic Curve Cryptography (ECC). Our results suggest that tailored accelerators based on FPGAs can behave better in terms of energy than contemporary software solutions for motes, such as the TinyECC and NanoECC libraries. In this regard, a point multiplication (PM) can be performed between 8.58- and 15.4-times faster, 3.40- to 23.59-times faster (Elliptic Curve Diffie-Hellman, ECDH) and between 5.45- and 34.26-times faster (Elliptic Curve Integrated Encryption Scheme, ECIES). Moreover, the energy consumption was also improved by a factor of 8.96 (PM). PMID:23899936
El-Said, Waleed A; Yoon, Jinho; Choi, Jeong-Woo
2018-01-01
Discovering new anticancer drugs and screening their efficacy requires a huge amount of resources and time-consuming processes. Fast, sensitive, and nondestructive methods for the in vitro and in vivo detection of anticancer drugs' effects and action mechanisms have been developed to reduce the time and resources required to discover new anticancer drugs. For the in vitro and in vivo detection of the efficiency, distribution, and action mechanism of anticancer drugs, applications of electrochemical techniques such as electrochemical cell chips and optical techniques such as surface-enhanced Raman spectroscopy (SERS) have been developed based on nanostructured surfaces. Research focused on electrochemical cell chips and the SERS technique is reviewed here; electrochemical cell chips based on nanostructured surfaces have been developed for the in vitro detection of cell viability and the evaluation of the effects of anticancer drugs, and have shown a high capability to evaluate the cytotoxic effects of several chemicals at low concentrations. The SERS technique based on nanostructured surfaces has been used as a label-free, simple, and nondestructive technique for the in vitro and in vivo monitoring of the distribution, mechanism, and metabolism of different anticancer drugs at the cellular level. Electrochemical cell chips and the SERS technique based on nanostructured surfaces should be good tools to detect the effects and action mechanisms of anticancer drugs.
Health Recommender Systems: Concepts, Requirements, Technical Basics and Challenges
Wiesner, Martin; Pfeifer, Daniel
2014-01-01
During the last decades huge amounts of data have been collected in clinical databases representing patients' health states (e.g., as laboratory results, treatment plans, medical reports). Hence, digital information available for patient-oriented decision making has increased drastically but is often scattered across different sites. As a solution, personal health record systems (PHRS) are meant to centralize an individual's health data and to allow access for the owner as well as for authorized health professionals. Yet, expert-oriented language, complex interrelations of medical facts and information overload in general pose major obstacles for patients in understanding their own record and drawing adequate conclusions. In this context, recommender systems may supply patients with additional laymen-friendly information helping them to better comprehend their health status as represented by their record. However, such systems must be adapted to cope with the specific requirements of the health domain in order to deliver highly relevant information for patients. They are referred to as health recommender systems (HRS). In this article we give an introduction to health recommender systems and explain why they are a useful enhancement to PHR solutions. Basic concepts and scenarios are discussed and a first implementation is presented. In addition, we outline an evaluation approach for such a system, which is supported by medical experts. The construction of a test collection for case-related recommendations is described. Finally, challenges and open issues are discussed. PMID:24595212
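A toy sketch of the core matching idea (not the implementation presented in the article) is shown below: candidate laymen-friendly articles are ranked by their textual similarity to a patient record, here with TF-IDF and cosine similarity from scikit-learn. The record text and article titles are invented, and a real HRS would add medical terminologies, personalization and the expert-supported evaluation described above.

```python
# Toy content-based matching: rank lay articles by similarity to record terms.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

record = "elevated HbA1c, type 2 diabetes mellitus, metformin 500 mg"
articles = [
    "Understanding your HbA1c result and what it means for diabetes control",
    "Metformin: how this diabetes medicine works and common side effects",
    "Tips for lowering blood pressure through diet and exercise",
]

vec = TfidfVectorizer()
X = vec.fit_transform([record] + articles)
scores = cosine_similarity(X[0], X[1:]).ravel()

for score, title in sorted(zip(scores, articles), reverse=True):
    print(f"{score:.2f}  {title}")
```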
NASA Astrophysics Data System (ADS)
El-Said, Waleed A.; Yoon, Jinho; Choi, Jeong-Woo
2018-04-01
Discovering new anticancer drugs and screening their efficacy requires a huge amount of resources and time-consuming processes. Fast, sensitive, and nondestructive methods for the in vitro and in vivo detection of anticancer drugs' effects and action mechanisms have been developed to reduce the time and resources required to discover new anticancer drugs. For the in vitro and in vivo detection of the efficiency, distribution, and action mechanism of anticancer drugs, applications of electrochemical techniques such as electrochemical cell chips and optical techniques such as surface-enhanced Raman spectroscopy (SERS) have been developed based on nanostructured surfaces. Research focused on electrochemical cell chips and the SERS technique is reviewed here; electrochemical cell chips based on nanostructured surfaces have been developed for the in vitro detection of cell viability and the evaluation of the effects of anticancer drugs, and have shown a high capability to evaluate the cytotoxic effects of several chemicals at low concentrations. The SERS technique based on nanostructured surfaces has been used as a label-free, simple, and nondestructive technique for the in vitro and in vivo monitoring of the distribution, mechanism, and metabolism of different anticancer drugs at the cellular level. Electrochemical cell chips and the SERS technique based on nanostructured surfaces should be good tools to detect the effects and action mechanisms of anticancer drugs.
de la Piedra, Antonio; Braeken, An; Touhafi, Abdellah
2013-07-29
Typically, commercial sensor nodes are equipped with MCUs clocked at a low frequency (i.e., within the 4-12 MHz range). Consequently, executing cryptographic algorithms in those MCUs generally requires a huge amount of time. In this respect, the required energy consumption can be higher than using a separate accelerator based on a Field-Programmable Gate Array (FPGA) that is switched on when needed. In this manuscript, we present the design of a cryptographic accelerator suitable for an FPGA-based sensor node and compliant with the IEEE 802.15.4 standard. All the embedded resources of the target platform (Xilinx Artix-7) have been maximized in order to provide a cost-effective solution. Moreover, we have added key negotiation capabilities to the IEEE 802.15.4 security suite based on Elliptic Curve Cryptography (ECC). Our results suggest that tailored accelerators based on FPGAs can behave better in terms of energy than contemporary software solutions for motes, such as the TinyECC and NanoECC libraries. In this regard, a point multiplication (PM) can be performed between 8.58- and 15.4-times faster, 3.40- to 23.59-times faster (Elliptic Curve Diffie-Hellman, ECDH) and between 5.45- and 34.26-times faster (Elliptic Curve Integrated Encryption Scheme, ECIES). Moreover, the energy consumption was also improved by a factor of 8.96 (PM).
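For orientation, the operations the accelerator offloads look as follows in software. This is only a desktop-class sketch using the widely available `cryptography` package (curve choice and key-derivation parameters are assumptions), not the TinyECC/NanoECC mote code benchmarked in the paper.

```python
# Software reference for the ECC operations mentioned above: an ECDH key
# agreement followed by session-key derivation. NIST P-256 is assumed.
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

node_a = ec.generate_private_key(ec.SECP256R1())
node_b = ec.generate_private_key(ec.SECP256R1())

# Each side combines its private key with the peer's public key
shared_a = node_a.exchange(ec.ECDH(), node_b.public_key())
shared_b = node_b.exchange(ec.ECDH(), node_a.public_key())
assert shared_a == shared_b

# Derive a symmetric session key from the shared secret
session_key = HKDF(algorithm=hashes.SHA256(), length=16,
                   salt=None, info=b"ieee802154-session").derive(shared_a)
print(session_key.hex())
```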
NASA Astrophysics Data System (ADS)
Vastianos, George E.; Argyreas, Nick D.; Xilouris, Chris K.; Thomopoulos, Stelios C. A.
2015-05-01
The field of Homeland Security focuses on air, land, and sea border surveillance in order to prevent illegal activities while facilitating lawful travel and trade. Achieving this goal requires the collaboration of complex decentralized systems and services, and the transfer of huge amounts of information between remote surveillance areas and command and control centers. It becomes obvious that the effectiveness of the provided security depends highly on the available communication capabilities between the interconnected areas. Although nowadays broadband communication between remote places is presumed easy because of the extensive infrastructure inside residential areas, it becomes a real challenge when the required information must be acquired from locations where no infrastructure is available, such as mountain or sea areas. The Integrated Systems Lab of NCSR Demokritos, within the PERSEUS FP7-SEC-2011-261748 project, has developed a wireless broadband telecommunication system that combines different communication channels from sub-GHz to microwave frequencies and provides secure IP connectivity between sea surveillance vessels and the Command and Control Centers (C3). The system was deployed on Fast Patrol Boats of the Hellenic Coast Guard that are used for maritime surveillance at sea borders and tested successfully in two demonstration exercises for irregular migration and smuggling scenarios in the Aegean Archipelagos. This paper describes in detail the system architecture in terms of hardware and software, and the evaluation measurements of the system's communication capabilities.
Information Measures for Statistical Orbit Determination
ERIC Educational Resources Information Center
Mashiku, Alinda K.
2013-01-01
The current Situational Space Awareness (SSA) is faced with a huge task of tracking the increasing number of space objects. The tracking of space objects requires frequent and accurate monitoring for orbit maintenance and collision avoidance using methods for statistical orbit determination. Statistical orbit determination enables us to obtain…
NASA Astrophysics Data System (ADS)
Sakamoto, Shingo X.; Sasa, Shuji; Sawayama, Shuhei; Tsujimoto, Ryo; Terauchi, Genki; Yagi, Hiroshi; Komatsu, Teruhisa
2012-10-01
Seaweed beds are very important habitats for abalones and sea urchins. On the Sanriku Coast, these animals are target species of coastal fisheries. A huge tsunami hit the Sanriku Coast, facing the Pacific Ocean, on 11 March 2011. Fishermen need to know the present situation of seaweed beds and understand the damage the huge tsunami caused to natural environments in order to recover coastal fisheries. We selected Shizugawa Bay as a study site because the abalone catch of Shizugawa Bay ranked first on the Sanriku Coast. To evaluate the impact of the tsunami on seaweed beds, we compared a high-spatial-resolution satellite image of Shizugawa Bay taken before the tsunami with one taken after the tsunami, combining remote sensing with ground surveys. We used two multi-band images from the commercial high-resolution satellite Geoeye-1, taken on 4 November 2009, before the tsunami, and on 22 February 2012, after the tsunami. Although divers observed that the tsunami damaged a very small part of the Eisenia bicyclis distributions on rock substrates at the bay head, this was not clearly observed in the satellite image analysis. On the other hand, we found an increase in seaweed beds after the tsunami from the image analysis. The tsunami broke concrete breakwaters, entrained a large amount of rocks and pebbles from land into the sea, and disseminated them in the bay. Thus, hard substrates suitable for the attachment of seaweeds increased. Ground surveys revealed that seaweeds consisting of E. bicyclis, Sargassum and Laminaria species grew on these hard substrates on the sandy bottom.
Senapati, Tapas; Senapati, Dulal; Singh, Anant Kumar; Fan, Zhen; Kanchanapally, Rajashekhar; Ray, Paresh Chandra
2011-10-07
Contamination of the environment with toxic Hg(II) is now becoming a huge concern throughout the world. Driven by this need, this communication reports for the first time a tryptophan-protected, popcorn-shaped gold nanomaterial-based SERS probe for rapid, easy and highly selective recognition of Hg(II) ions at the 5 ppb level from aqueous solution, with high sensitivity and selectivity over competing analytes. We demonstrate that our SERS assay is capable of measuring the amount of Hg(II) in an alkaline battery. This journal is © The Royal Society of Chemistry 2011
Bioinformatics in proteomics: application, terminology, and pitfalls.
Wiemer, Jan C; Prokudin, Alexander
2004-01-01
Bioinformatics applies data mining, i.e., modern computer-based statistics, to biomedical data. It leverages machine learning approaches, such as artificial neural networks, decision trees and clustering algorithms, and is ideally suited for handling huge amounts of data. In this article, we review the analysis of mass spectrometry data in proteomics, starting with common pre-processing steps and using single decision trees and decision tree ensembles for classification. Special emphasis is put on the pitfall of overfitting, i.e., of generating overly complex single decision trees. Finally, we discuss the pros and cons of the two different decision tree usages.
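The overfitting pitfall highlighted above can be illustrated with a minimal scikit-learn example on synthetic data (not the mass-spectrometry sets analyzed in the article): an unconstrained tree memorizes noisy training data, while a depth-limited tree generalizes better.

```python
# Overfitting illustration: deep vs. depth-limited decision tree on noisy data.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=400, n_features=30, n_informative=5,
                           flip_y=0.15, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

deep = DecisionTreeClassifier(random_state=0).fit(X_tr, y_tr)
shallow = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_tr, y_tr)

print("deep tree    train/test:", round(deep.score(X_tr, y_tr), 2),
      round(deep.score(X_te, y_te), 2))       # perfect fit, weaker test score
print("shallow tree train/test:", round(shallow.score(X_tr, y_tr), 2),
      round(shallow.score(X_te, y_te), 2))    # less training fit, better generalization
```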
Implementation of DFT application on ternary optical computer
NASA Astrophysics Data System (ADS)
Junjie, Peng; Youyi, Fu; Xiaofeng, Zhang; Shuai, Kong; Xinyu, Wei
2018-03-01
Because of its characteristics of a huge number of data bits and low energy consumption, optical computing may be used in applications such as the DFT, which need a lot of computation and can be implemented in parallel. Accordingly, fully parallel as well as partially parallel DFT implementation methods are presented. Based on the resources of a ternary optical computer (TOC), extensive experiments were carried out. Experimental results show that the proposed schemes are correct and feasible. They provide a foundation for further exploration of applications on the TOC that need a large amount of calculation and can be processed in parallel.
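The abstract does not detail the ternary optical encoding, but the fully parallel scheme it refers to is essentially the matrix-vector form of the DFT, whose N² multiply-accumulate operations are mutually independent. The numpy sketch below only illustrates that decomposition and checks it against the FFT; it says nothing about the TOC hardware itself.

```python
# DFT as a matrix-vector product: every row of W @ x is an independent dot
# product, which is what makes a fully parallel evaluation possible.
import numpy as np

N = 8
n = np.arange(N)
W = np.exp(-2j * np.pi * np.outer(n, n) / N)   # DFT matrix, W[k, m] = e^{-2*pi*i*k*m/N}

x = np.random.default_rng(0).random(N)
X_parallel = W @ x
assert np.allclose(X_parallel, np.fft.fft(x))  # agrees with the FFT result
print(np.round(X_parallel, 3))
```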
Plenoptic image watermarking to preserve copyright
NASA Astrophysics Data System (ADS)
Ansari, A.; Dorado, A.; Saavedra, G.; Martinez Corral, M.
2017-05-01
A common camera loses a huge amount of information obtainable from the scene, as it does not record the value of individual rays passing through a point; it merely keeps the summation of the intensities of all the rays passing through that point. Plenoptic images can be exploited to provide a 3D representation of the scene, and watermarking such images can help protect their ownership. In this paper we propose a method for watermarking plenoptic images to achieve this aim. The performance of the proposed method is validated by experimental results, and a compromise is struck between imperceptibility and robustness.
Molecular science for drug development and biomedicine.
Zhong, Wei-Zhu; Zhou, Shu-Feng
2014-11-04
With the avalanche of biological sequences generated in the postgenomic age, molecular science is facing an unprecedented challenge, i.e., how to utilize the huge amount of data in a timely manner to benefit human beings. Stimulated by such a challenge, rapid development has taken place in molecular science, particularly in the areas associated with drug development and biomedicine, both experimental and theoretical. The current thematic issue was launched with a focus on the topic of "Molecular Science for Drug Development and Biomedicine", in the hope of further stimulating more useful techniques and findings from various approaches of molecular science for drug development and biomedicine. [...]
NASA Astrophysics Data System (ADS)
Kosmidis, Kosmas; Kalampokis, Alkiviadis; Argyrakis, Panos
2006-10-01
We use the detrended fluctuation analysis (DFA) and Grassberger-Procaccia (GP) analysis methods in order to study language characteristics. Even though we construct our signals using only word lengths or word frequencies, thereby excluding a huge amount of information from language, the application of GP analysis indicates that linguistic signals may be considered the manifestation of a complex system of high dimensionality, different from random signals or systems of low dimensionality such as the Earth's climate. The DFA method is additionally able to distinguish a natural language signal from a computer code signal. This last result may be useful in the field of cryptography.
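For reference, a compact implementation of the standard DFA recipe used above (profile, piecewise linear detrending, fluctuation versus window size) applied to a surrogate "word length" signal is sketched below; the corpus construction and the GP analysis are not reproduced.

```python
# Standard DFA: cumulative profile, windowed linear detrending, and the
# scaling of the RMS fluctuation with window size.
import numpy as np

def dfa(signal, scales):
    profile = np.cumsum(signal - np.mean(signal))
    flucts = []
    for s in scales:
        n_win = len(profile) // s
        rms = []
        for w in range(n_win):
            seg = profile[w * s:(w + 1) * s]
            t = np.arange(s)
            trend = np.polyval(np.polyfit(t, seg, 1), t)
            rms.append(np.sqrt(np.mean((seg - trend) ** 2)))
        flucts.append(np.mean(rms))
    return np.array(flucts)

rng = np.random.default_rng(0)
word_lengths = rng.integers(1, 12, size=5000).astype(float)   # surrogate text signal
scales = np.array([16, 32, 64, 128, 256])
F = dfa(word_lengths, scales)
alpha = np.polyfit(np.log(scales), np.log(F), 1)[0]
print("DFA exponent alpha ~", round(alpha, 2))   # ~0.5 for uncorrelated data
```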
Yauy, Kevin; Gatinois, Vincent; Guignard, Thomas; Sati, Satish; Puechberty, Jacques; Gaillard, Jean Baptiste; Schneider, Anouck; Pellestor, Franck
2018-01-01
The advent of next-generation sequencing (NGS) was a breakthrough in our knowledge of genome structure. Bioinformatic tools are key to analyzing this huge amount of NGS data and characterizing the three-dimensional organization of chromosomes. This chapter describes the usage of different browsers to explore publicly available online data and to search for possible 3D chromatin changes involved in complex chromosomal rearrangements such as chromothripsis. Their pathogenic impact on clinical phenotype and gene misexpression can also be evaluated with annotated databases.
ERIC Educational Resources Information Center
Perera, Srinath; Babatunde, Solomon Olusola; Zhou, Lei; Pearson, John; Ekundayo, Damilola
2017-01-01
Recognition of the huge variation between professional graduate degree programmes and employer requirements, especially in the construction industry, necessitated a need for assessing and developing competencies that aligned with professionally oriented programmes. The purpose of this research is to develop a competency mapping framework (CMF) in…
An Experimental Study towards Young Adults: Communication Skills Education
ERIC Educational Resources Information Center
Guclu, Sultan
2016-01-01
Problem Statement: Present in every discipline, communication is also hugely important in the health sector. Communication with patients and their relatives requires a more moderate and friendly approach because of their sensitive situation. Developing the communication skills of students in the health sciences will enable them to communicate effectively with patients. In…
Petroleum Sludge as gypsum replacement in cement plants: Its Impact on Cement Strength
NASA Astrophysics Data System (ADS)
Benlamoudi, Ali; Kadir, Aeslina Abdul; Khodja, Mohamed
2017-08-01
Due to the high cost of cement manufacturing and the huge exhaustion of resources it entails, companies are trying to incorporate alternative raw materials or by-products into cement production so as to produce alternative, sustainable cement. Petroleum sludge is a dangerous waste that has serious impacts on soil and groundwater. Given that this sludge contains a high percentage of anhydrite (CaSO4), which is the main component of gypsum (CaSO4.2H2O), it may play the same role as gypsum in strength development. In this research, gypsum was fully (100%) replaced by petroleum sludge in cement production, which led to an increase of 28.8% in UCS values after 28 days of curing. Nevertheless, the burning of this waste emitted a considerable amount of carbon monoxide (CO) gas that needs to be carefully considered prior to using petroleum sludge within cement plants.
Health on the Net Foundation: assessing the quality of health web pages all over the world.
Boyer, Célia; Gaudinat, Arnaud; Baujard, Vincent; Geissbühler, Antoine
2007-01-01
The Internet provides a great amount of information and has become one of the most widely used communication media [1]. However, the problem is no longer finding information but assessing the credibility of the publishers as well as the relevance and accuracy of the documents retrieved from the web. This problem is particularly relevant in the medical area, which has a direct impact on the well-being of citizens. In this paper, we assume that the quality of web pages can be controlled, even when a huge amount of documents has to be reviewed. But this must be supported by both specific automatic tools and human expertise. In this context, we present various initiatives of the Health on the Net Foundation informing citizens about the reliability of medical content on the web.
‘White revolution’ to ‘white pollution’—agricultural plastic film mulch in China
NASA Astrophysics Data System (ADS)
Liu, E. K.; He, W. Q.; Yan, C. R.
2014-09-01
Plastic film mulching has played an important role in Chinese agriculture due to its soil warming and moisture conservation effects. With the help of plastic film mulch technology, grain and cash crop yields have increased by 20-35% and 20-60%, respectively. The area of plastic film coverage in China reached approximately 20 million hectares, and the amount of plastic film used reached 1.25 million tons in 2011. While producing huge benefits, plastic film mulch technology has also brought on a series of pollution hazards. Large amounts of residual plastic film have detrimental effects on soil structure, water and nutrient transport and crop growth, thereby disrupting the agricultural environment and reducing crop production. To control pollution, the Chinese government urgently needs to elevate plastic film standards. Meanwhile, research and development of biodegradable mulch film and multi-functional mulch recovery machinery will help promote effective control and management of residual mulch pollution.
Mining chemical information from open patents
2011-01-01
Linked Open Data presents an opportunity to vastly improve the quality of science in all fields by increasing the availability and usability of the data upon which it is based. In the chemical field, there is a huge amount of information available in the published literature, the vast majority of which is not available in machine-understandable formats. PatentEye, a prototype system for the extraction and semantification of chemical reactions from the patent literature, has been implemented and is discussed. A total of 4444 reactions were extracted from 667 patent documents that comprised 10 weeks' worth of publications from the European Patent Office (EPO), with a precision of 78% and recall of 64% with regard to determining the identity and amount of reactants employed, and an accuracy of 92% with regard to product identification. NMR spectra reported as product characterisation data are additionally captured. PMID:21999425
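The quoted precision and recall follow the usual definitions; the counts below are invented solely to show the computation (they are not the PatentEye evaluation data), but are chosen to reproduce the reported 78% and 64%.

```python
# Precision/recall/F1 from hypothetical extraction counts.
true_positives, false_positives, false_negatives = 78, 22, 44

precision = true_positives / (true_positives + false_positives)   # 0.78
recall = true_positives / (true_positives + false_negatives)      # ~0.64
f1 = 2 * precision * recall / (precision + recall)
print(f"precision={precision:.2f} recall={recall:.2f} F1={f1:.2f}")
```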
Fuzzy Document Clustering Approach using WordNet Lexical Categories
NASA Astrophysics Data System (ADS)
Gharib, Tarek F.; Fouad, Mohammed M.; Aref, Mostafa M.
Text mining refers generally to the process of extracting interesting information and knowledge from unstructured text. This area is growing rapidly, mainly because of the strong need for analysing the huge amount of textual data that resides on internal file systems and the Web. Text document clustering provides an effective navigation mechanism to organize this large amount of data by grouping documents into a small number of meaningful classes. In this paper we propose a fuzzy text document clustering approach using WordNet lexical categories and the fuzzy c-means algorithm. Some experiments are performed to compare the efficiency of the proposed approach with recently reported approaches. Experimental results show that fuzzy clustering leads to strong performance. The fuzzy c-means algorithm outperforms other classical clustering algorithms, such as k-means and bisecting k-means, in both clustering quality and running-time efficiency.
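The algorithmic core of the approach, the fuzzy c-means alternation between centroid and membership updates, can be sketched in a few lines; the toy vectors below merely stand in for the WordNet lexical-category features used in the paper.

```python
# Standard fuzzy c-means updates: weighted centroids, then memberships from
# inverse distances raised to 2/(m-1), normalized per document.
import numpy as np

def fuzzy_c_means(X, c, m=2.0, iters=100, seed=0):
    rng = np.random.default_rng(seed)
    U = rng.random((c, len(X)))
    U /= U.sum(axis=0)                                   # memberships sum to 1 per doc
    for _ in range(iters):
        Um = U ** m
        centers = (Um @ X) / Um.sum(axis=1, keepdims=True)
        dist = np.linalg.norm(X[None, :, :] - centers[:, None, :], axis=2) + 1e-9
        U = 1.0 / (dist ** (2 / (m - 1)))
        U /= U.sum(axis=0)
    return U, centers

X = np.array([[9, 1], [8, 2], [1, 9], [2, 8], [5, 5]], dtype=float)
U, centers = fuzzy_c_means(X, c=2)
print(np.round(U, 2))        # soft memberships; the last document is split between clusters
```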
NASA Astrophysics Data System (ADS)
Bhuyan, S. K.; Samal, S.; Pattnaik, D.; Sahu, A.; Swain, B.; Thiyagarajan, T. K.; Mishra, S. C.
2018-03-01
The environment is being contaminated day by day with the advancement of new technology. One of the primary sources of this contamination is industrial waste. Industrialization is the prime reason behind the prosperity of any country and the means of meeting its material demand. To run these industries, a huge amount of electric power is needed, and hence thermal power plants are set up to serve the purpose. In the present scenario, coal-fired thermal power plants generate a huge quantity of fly ash. The consumption of this industrial waste (fly ash) is continually a major concern for the human race. In recent years, fly ash has been utilized for various purposes, e.g., making bricks, mine reclamation, production of cement, etc. The presence of silica and alumina in fly ash also makes it useful for thermal barrier applications. Plasma spray technology has the advantage of being able to process many types of metal/ceramic minerals, low-grade ore minerals, etc., to make value-added products, and also to deposit ceramics, metals and combinations of these as composite coatings with the desired microstructure and required properties on a range of substrate materials. The present work focuses on the utilization of fly ash mixed with bauxite (an ore mineral) for a high-value application. Fly ash with 10 and 20% bauxite addition is used to deposit plasma spray overlay coatings at different power levels (10-20 kW) on aluminum and mild steel substrates. The adhesion strength and surface roughness of the coatings are evaluated. Phase composition analysis of the coatings was done using X-ray diffraction. The surface morphology of the coatings was studied using a scanning electron microscope (SEM). A maximum adhesion strength of 4.924 MPa is obtained for the fly ash with 10% bauxite composition, coated on mild steel at a 16 kW torch power level. The surface roughness (Ra) of the coatings is found to vary between 10.0102 and 17.2341 microns.
Suplatov, Dmitry; Popova, Nina; Zhumatiy, Sergey; Voevodin, Vladimir; Švedas, Vytas
2016-04-01
Rapid expansion of online resources providing access to genomic, structural, and functional information associated with biological macromolecules opens an opportunity to gain a deeper understanding of the mechanisms of biological processes through the systematic analysis of large datasets. This, however, requires novel strategies to optimally utilize computer processing power. Some methods in bioinformatics and molecular modeling require extensive computational resources. Other algorithms have fast implementations that take at most several hours to analyze a common input on a modern desktop station; however, due to multiple invocations for a large number of subtasks, the full task requires significant computing power. Therefore, an efficient computational solution to large-scale biological problems requires both a wise parallel implementation of resource-hungry methods and a smart workflow to manage multiple invocations of relatively fast algorithms. In this work, new computer software, mpiWrapper, has been developed to accommodate non-parallel implementations of scientific algorithms within the parallel supercomputing environment. The Message Passing Interface has been implemented to exchange information between nodes. Two specialized threads - one for task management and communication, and another for subtask execution - are invoked on each processing unit to avoid deadlock while using blocking calls to MPI. mpiWrapper can be used to launch all conventional Linux applications without the need to modify their original source code and supports resubmission of subtasks on node failure. We show that this approach can be used to process huge amounts of biological data efficiently by running non-parallel programs in parallel mode on a supercomputer. The C++ source code and documentation are available from http://biokinet.belozersky.msu.ru/mpiWrapper .
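mpiWrapper itself is C++; the following much-simplified mpi4py sketch only conveys the master/worker pattern the abstract describes (rank 0 hands out subtasks, the other ranks run an unmodified serial command for each one). The dual-thread design and node-failure resubmission are omitted, and the input file names are hypothetical.

```python
# Simplified master/worker pattern with mpi4py. Run with e.g.:
#   mpiexec -n 4 python wrapper_sketch.py
from mpi4py import MPI
import subprocess

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()
TAG_TASK, TAG_DONE, TAG_STOP = 1, 2, 3

if rank == 0:
    tasks = [f"input_{i:03d}.dat" for i in range(20)]      # hypothetical subtask inputs
    status = MPI.Status()
    active = size - 1
    while active:
        comm.recv(source=MPI.ANY_SOURCE, tag=MPI.ANY_TAG, status=status)
        worker = status.Get_source()
        if tasks:
            comm.send(tasks.pop(), dest=worker, tag=TAG_TASK)
        else:
            comm.send(None, dest=worker, tag=TAG_STOP)
            active -= 1
else:
    comm.send(None, dest=0, tag=TAG_DONE)                   # announce readiness
    while True:
        status = MPI.Status()
        task = comm.recv(source=0, tag=MPI.ANY_TAG, status=status)
        if status.Get_tag() == TAG_STOP:
            break
        # stand-in for an unmodified serial program invoked per subtask
        subprocess.run(["echo", "processing", task], check=True)
        comm.send(None, dest=0, tag=TAG_DONE)
```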
NASA Astrophysics Data System (ADS)
Schulz-von der Gathen, Volker
2015-09-01
Over the last decade a huge variety of atmospheric pressure plasma jets has been developed and applied in plasma medicine. The efficiency of these non-equilibrium plasmas for biological applications is based on the generated amounts of reactive species and radiation. The gas temperatures stay within a range tolerable for temperature-sensitive tissues. The variety of different discharge geometries complicates direct comparison. In addition, in plasma medicine the combination of plasma with reactive components, ambient air, as well as biological tissue - typically also incorporating fluids - results in a complex system. Thus, real progress in plasma medicine requires a profound knowledge of the species, their fluxes, and the processes affecting biological tissues. That will in particular allow the necessary tailoring of the discharge to fit the conditions. The complexity of the problem can only be overcome by a common effort of many groups and requires a comparison of their results. A reference device based on the already well-investigated micro-scaled atmospheric pressure plasma jet is presented. It was developed in the frame of the European COST initiative MP1101 to establish a publicly available, stable and reproducible source with which the required plasma conditions can be investigated. Here we present the design and the ideas behind it. The presentation discusses the requirements for the reference source and its operating conditions. Biological references are also defined by the initiative. A specific part of the talk will be devoted to the reproducibility of results from various samples of the device. Funding by the DFG within the Package Project PAK816 "Plasma Cell Interaction in Dermatology" and the Research Unit FOR 1123 "Physics of Microplasmas" is gratefully acknowledged.
Huang, Kuan-Ju; Shih, Wei-Yeh; Chang, Jui Chung; Feng, Chih Wei; Fang, Wai-Chi
2013-01-01
This paper presents a pipeline VLSI design of a fast singular value decomposition (SVD) processor for a real-time electroencephalography (EEG) system based on on-line recursive independent component analysis (ORICA). Since SVD is used frequently in the computations of the real-time EEG system, a low-latency and high-accuracy SVD processor is essential. During the EEG system process, the proposed SVD processor aims to compute the diagonal, inverse and inverse square root matrices of the target matrices in real time. Generally, SVD requires a huge amount of computation in hardware implementation. Therefore, this work proposes a novel design concept for data flow updating to assist the pipeline VLSI implementation. The SVD processor can greatly improve the feasibility of real-time EEG system applications such as brain-computer interfaces (BCIs). The proposed architecture is implemented using TSMC 90 nm CMOS technology. The sample rate of the raw EEG data is 128 Hz. The core size of the SVD processor is 580 × 580 um², and the operating frequency is 20 MHz. It consumes 0.774 mW of power per execution of the 8-channel EEG system.
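As a floating-point reference for the three quantities the processor is said to deliver (diagonal form, inverse, and inverse square root of a target matrix), the numpy sketch below applies them to a symmetric covariance-like matrix of the kind used in ORICA whitening; the fixed-point, pipelined hardware itself is not modeled.

```python
# SVD-based diagonalization, inverse and inverse square root of a symmetric
# positive definite covariance estimate (the whitening-style operations the
# abstract lists). Pure numpy reference, not the hardware data flow.
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((8, 128))
C = A @ A.T / A.shape[1]                 # 8-channel covariance estimate

U, s, Vt = np.linalg.svd(C)              # diagonal form: C = U diag(s) Vt
C_inv = Vt.T @ np.diag(1.0 / s) @ U.T
C_inv_sqrt = Vt.T @ np.diag(1.0 / np.sqrt(s)) @ U.T

assert np.allclose(C @ C_inv, np.eye(8), atol=1e-8)
# Valid here because C is symmetric positive definite (so U equals V):
assert np.allclose(C_inv_sqrt @ C @ C_inv_sqrt, np.eye(8), atol=1e-8)
print("singular values:", np.round(s, 3))
```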
NASA Astrophysics Data System (ADS)
Maj, P.; Kasiński, K.; Gryboś, P.; Szczygieł, R.; Kozioł, A.
2015-12-01
Integrated circuits designed for specific applications generally use non-standard communication methods. Hybrid pixel detector readout electronics produces a huge amount of data as a result of the large number of frames per second. The data needs to be transmitted to a higher-level system without limiting the ASIC's capabilities. Nowadays, the Camera Link interface is still one of the fastest communication methods, allowing transmission speeds up to 800 MB/s. In order to communicate between a higher-level system and the ASIC with a dedicated protocol, an FPGA with dedicated code is required. The configuration data is received from the PC and written to the ASIC. At the same time, the same FPGA should be able to transmit the data from the ASIC to the PC at very high speed. The camera should be an embedded system enabling autonomous operation and self-monitoring. In the presented solution, at least three different hardware platforms are used: an FPGA, a microprocessor with a real-time operating system, and a PC with end-user software. We present the use of a single software platform for high speed data transfer from a 65k pixel camera to the personal computer.
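To put the quoted 800 MB/s Camera Link figure in perspective, a back-of-the-envelope estimate is given below; the 16-bit pixel depth is an assumption for illustration, not a number from the paper.

```python
# Back-of-the-envelope frame-rate estimate for an 800 MB/s link and a 65k-pixel
# detector; the 16-bit-per-pixel depth is an assumption, not a figure from the paper.
pixels         = 65_536          # "65k pixel camera"
bits_per_pixel = 16              # assumed counter depth
link_bytes_s   = 800e6           # Camera Link, ~800 MB/s

frame_bytes = pixels * bits_per_pixel / 8    # 131,072 bytes per frame
max_fps     = link_bytes_s / frame_bytes     # ~6,100 frames per second
print(f"{frame_bytes/1024:.0f} KiB/frame -> about {max_fps:.0f} fps at link saturation")
```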
NASA Astrophysics Data System (ADS)
Oiknine, Yaniv; August, Isaac Y.; Revah, Liat; Stern, Adrian
2016-05-01
Recently we introduced a Compressive Sensing Miniature Ultra-Spectral Imaging (CS-MUSI) system. The system is based on a single Liquid Crystal (LC) cell and a parallel sensor array, where the liquid crystal cell performs spectral encoding. Within the framework of compressive sensing, the CS-MUSI system is able to reconstruct ultra-spectral cubes from only ~10% of the samples required by a conventional system. Despite the compression, the technique is computationally very demanding, because reconstruction of ultra-spectral images requires processing huge data cubes of Gigavoxel size. Fortunately, the computational effort can be alleviated by using separable operations. An additional way to reduce the reconstruction effort is to perform the reconstructions on patches. In this work, we consider processing on various patch shapes. We present an experimental comparison between various patch shapes chosen to process the ultra-spectral data captured with the CS-MUSI system. The patches may be one-dimensional (1D), for which the reconstruction is carried out pixel-wise in the spatial domain; two-dimensional (2D), working on spatial rows/columns of the ultra-spectral cube; or three-dimensional (3D).
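The pixel-wise (1D-patch) processing mode can be pictured as follows; the random sensing matrix and the ridge-regularized solver below are placeholders standing in for the LC-cell spectral encoding and the actual CS reconstruction algorithm, which the abstract does not specify.

```python
# Structure of pixel-wise (1D-patch) reconstruction: each spatial pixel's spectrum is
# recovered independently from its compressed measurements. The random sensing matrix
# and the ridge-regularized solver are placeholders, not the CS-MUSI algorithm.
import numpy as np

n_bands, n_meas, n_pix = 391, 40, 64 * 64      # ~10% sampling ratio (illustrative)
rng = np.random.default_rng(0)
Phi  = rng.standard_normal((n_meas, n_bands))  # spectral encoding matrix (placeholder)
cube = rng.random((n_pix, n_bands))            # "true" spectra, one row per pixel
Y    = cube @ Phi.T                            # compressed measurements per pixel

lam = 1e-2                                     # regularization weight (assumed)
A = Phi.T @ Phi + lam * np.eye(n_bands)
recon = np.linalg.solve(A, Phi.T @ Y.T).T      # all pixels solved independently
print(recon.shape)                             # (4096, 391) reconstructed cube
```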
Lima, Jakelyne; Cerdeira, Louise Teixeira; Bol, Erick; Schneider, Maria Paula Cruz; Silva, Artur; Azevedo, Vasco; Abelém, Antônio Jorge Gomes
2012-01-01
Improvements in genome sequencing techniques have resulted in the generation of huge volumes of data. As a consequence of this progress, the genome assembly stage demands even more computational power, since the incoming sequence files contain large amounts of data. To speed up the process, it is often necessary to distribute the workload among a group of machines. However, this requires hardware and software solutions specially configured for this purpose. Grid computing tries to simplify this process of aggregating resources, but does not always offer the best possible performance due to the heterogeneity and decentralized management of its resources. Thus, it is necessary to develop software that takes these peculiarities into account. To achieve this, we developed an algorithm that adapts the de novo assembly software ABySS to operate efficiently in grids. We ran ABySS with and without our algorithm in the grid simulator SimGrid. Tests showed that our algorithm is viable, flexible, and scalable even in a heterogeneous environment, and it improved genome assembly time in computational grids without changing assembly quality. PMID:22461785
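As a generic illustration of the kind of heterogeneity-aware workload partitioning such an algorithm must perform (this is not the authors' ABySS-specific scheme), work can be apportioned in proportion to node speed:

```python
# Generic illustration of heterogeneity-aware work partitioning (not the authors'
# ABySS-specific algorithm): read chunks are assigned in proportion to node speed.
def partition_reads(n_reads, node_speeds):
    total = sum(node_speeds.values())
    shares = {n: int(n_reads * s / total) for n, s in node_speeds.items()}
    # distribute the rounding remainder to the fastest nodes first
    leftover = n_reads - sum(shares.values())
    for n in sorted(node_speeds, key=node_speeds.get, reverse=True)[:leftover]:
        shares[n] += 1
    return shares

# hypothetical grid of nodes with relative compute power
print(partition_reads(1_000_000, {"nodeA": 1.0, "nodeB": 2.5, "nodeC": 0.8}))
```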
Real-Time Plasma Process Condition Sensing and Abnormal Process Detection
Yang, Ryan; Chen, Rongshun
2010-01-01
The plasma process is often used in the fabrication of semiconductor wafers. However, due to the lack of real-time etching control, this may result in unacceptable process performance and thus lead to significant waste and lower wafer yield. In order to maximize the product wafer yield, timely and accurate detection of process faults or abnormalities in a plasma reactor is needed. Optical emission spectroscopy (OES) is one of the most frequently used metrologies for in-situ process monitoring. Even though OES has the advantage of non-invasiveness, it produces a huge amount of information. As a result, the data analysis of OES becomes a big challenge. To accomplish real-time detection, this work employed the sigma matching technique, which operates on the time series of the full OES spectrum intensity. First, the response model of a healthy plasma spectrum was developed. Then, we defined a matching rate as an indicator for comparing the difference between the tested wafer's response and the healthy sigma model. The experimental results showed that the proposed method can detect process faults in real time, even in plasma etching tools. PMID:22219683
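A simplified picture of the idea is sketched below, with an illustrative matching-rate definition; the paper's exact healthy-model construction, indicator, and thresholds may differ.

```python
# Simplified sigma-matching illustration: a wafer's OES time series is compared to a
# healthy-reference model (mean and sigma per time/wavelength bin). The matching-rate
# definition here is illustrative; the paper's exact indicator may differ.
import numpy as np

def build_health_model(healthy_runs):              # shape: (runs, time, wavelengths)
    return healthy_runs.mean(axis=0), healthy_runs.std(axis=0)

def matching_rate(test_run, mean, sigma, k=3.0):
    inside = np.abs(test_run - mean) <= k * sigma  # within the k-sigma band?
    return inside.mean()                           # fraction of matching bins

rng = np.random.default_rng(1)
healthy = rng.normal(100.0, 2.0, size=(20, 50, 256))
mean, sigma = build_health_model(healthy)
good_wafer = rng.normal(100.0, 2.0, size=(50, 256))
faulty     = good_wafer + 15.0 * (rng.random((50, 256)) < 0.2)   # injected drift
print(matching_rate(good_wafer, mean, sigma), matching_rate(faulty, mean, sigma))
```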
Chaotic Traversal (CHAT): Very Large Graphs Traversal Using Chaotic Dynamics
NASA Astrophysics Data System (ADS)
Changaival, Boonyarit; Rosalie, Martin; Danoy, Grégoire; Lavangnananda, Kittichai; Bouvry, Pascal
2017-12-01
Graph traversal algorithms find applications in various fields such as routing problems, natural language processing, and database querying. Exploration can be considered a first stepping stone toward knowledge extraction from the graph, which is now a popular topic. Classical solutions such as Breadth First Search (BFS) and Depth First Search (DFS) require huge amounts of memory for exploring very large graphs. In this research, we present a novel memoryless graph traversal algorithm, Chaotic Traversal (CHAT), which integrates chaotic dynamics, via the Lozi map and the Rössler system, to traverse large unknown graphs. To compare the effects of different dynamics on our algorithm, we present an original way to explore a parameter space using a bifurcation diagram with respect to the topological structure of attractors. The resulting algorithm is efficient and non-resource-demanding, and is therefore very suitable for partial traversal of very large and/or unknown environment graphs. The performance of CHAT using the Lozi map is shown to be superior to that of the commonly known random walk, in terms of the number of nodes visited (coverage percentage) and computation time, when the environment is unknown and memory usage is restricted.
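The flavour of a memoryless, chaos-driven traversal can be sketched as follows; the Lozi parameters and the modulo coupling between the chaotic iterate and the neighbour choice are illustrative assumptions, not CHAT's published scheme.

```python
# Memoryless chaotic traversal sketch: a Lozi map (a=1.7, b=0.5) drives the choice of
# the next neighbour. How CHAT couples the chaotic state to the graph is a design
# detail of the paper; the modulo mapping below is only an illustration.
def lozi(x, y, a=1.7, b=0.5):
    return 1.0 - a * abs(x) + y, b * x

def chaotic_walk(adj, start, steps):
    x, y, node, visited = 0.1, 0.1, start, {start}
    for _ in range(steps):
        x, y = lozi(x, y)
        nbrs = adj[node]
        node = nbrs[int(abs(x) * 1e6) % len(nbrs)]   # chaotic iterate picks a neighbour
        visited.add(node)
    return visited

ring = {i: [(i - 1) % 20, (i + 1) % 20] for i in range(20)}   # toy graph
print(len(chaotic_walk(ring, 0, 200)), "of 20 nodes visited")
```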
Fingerprint multicast in secure video streaming.
Zhao, H Vicky; Liu, K J Ray
2006-01-01
Digital fingerprinting is an emerging technology to protect multimedia content from illegal redistribution, where each distributed copy is labeled with unique identification information. In video streaming, huge amounts of data have to be transmitted to a large number of users under stringent latency constraints, so the bandwidth-efficient distribution of uniquely fingerprinted copies is crucial. This paper investigates the secure multicast of anticollusion fingerprinted video in streaming applications and analyzes the performance of the proposed schemes. We first propose a general fingerprint multicast scheme that can be used with most spread spectrum embedding-based multimedia fingerprinting systems. To further improve the bandwidth efficiency, we explore the special structure of the fingerprint design and propose a joint fingerprint design and distribution scheme. From our simulations, the two proposed schemes can reduce the bandwidth requirement by 48% to 87%, depending on the number of users, the characteristics of video sequences, and the network and computation constraints. We also show that under the constraint that all colluders have the same probability of detection, the embedded fingerprints in the two schemes have approximately the same collusion resistance. Finally, we propose a fingerprint drift compensation scheme to improve the quality of the reconstructed sequences at the decoder's side without introducing extra communication overhead.
sRNAtoolboxVM: Small RNA Analysis in a Virtual Machine.
Gómez-Martín, Cristina; Lebrón, Ricardo; Rueda, Antonio; Oliver, José L; Hackenberg, Michael
2017-01-01
High-throughput sequencing (HTS) data for small RNAs (noncoding RNA molecules that are 20-250 nucleotides in length) can now be routinely generated by minimally equipped wet laboratories; however, the bottleneck in HTS-based research has now shifted to the analysis of such huge amounts of data. One of the reasons is that many analysis types require a Linux environment, but the necessary computers, system administrators, and bioinformaticians imply additional costs that often cannot be afforded by small to mid-sized groups or laboratories. Web servers are an alternative that can be used if the data are not subject to privacy issues (which is very often a concern with medical data). However, in any case they are less flexible than stand-alone programs, limiting the number of workflows and analysis types that can be carried out. We show in this protocol how virtual machines can be used to overcome those problems and limitations. sRNAtoolboxVM is a virtual machine that can be executed on all common operating systems through virtualization programs like VirtualBox or VMware, providing the user with a high number of preinstalled programs like sRNAbench for small RNA analysis without the need to maintain additional servers and/or operating systems.
Advantage of four-electrode over two-electrode defibrillators
NASA Astrophysics Data System (ADS)
Bragard, J.; Šimić, A.; Laroze, D.; Elorza, J.
2015-12-01
Defibrillation is the standard clinical treatment used to stop ventricular fibrillation. An electrical device delivers a controlled amount of electrical energy via a pair of electrodes in order to reestablish a normal heart rate. We propose a technique that is a combination of biphasic shocks applied with a four-electrode system rather than the standard two-electrode system. We use a numerical model of a one-dimensional ring of cardiac tissue in order to test and evaluate the benefit of this technique. We compare three different shock protocols, namely a monophasic and two types of biphasic shocks. The results obtained by using a four-electrode system are compared quantitatively with those obtained with the standard two-electrode system. We find that a huge reduction in defibrillation threshold is achieved with the four-electrode system. For the most efficient protocol (asymmetric biphasic), we obtain a reduction in excess of 80% in the energy required for a defibrillation success rate of 90%. The mechanisms of successful defibrillation are also analyzed. This reveals that the advantage of asymmetric biphasic shocks with four electrodes lies in the duration of the cathodal and anodal phase of the shock.
Latif, Rabia; Abbas, Haider; Assar, Saïd
2014-11-01
Wireless Body Area Networks (WBANs) have emerged as a promising technology that has shown enormous potential in improving the quality of healthcare, and has thus found a broad range of medical applications from ubiquitous health monitoring to emergency medical response systems. The huge amount of highly sensitive data collected and generated by WBAN nodes requires a scalable and secure storage and processing infrastructure. Given the limited resources of WBAN nodes for storage and processing, the integration of WBANs and cloud computing may provide a powerful solution. However, despite the benefits of cloud-assisted WBAN, several security issues and challenges remain. Among these, data availability is the most nagging security issue. The most serious threat to data availability is a distributed denial of service (DDoS) attack that directly affects the all-time availability of a patient's data. The existing solutions for standalone WBANs and sensor networks are not applicable in the cloud. The purpose of this review paper is to identify the most threatening types of DDoS attacks affecting the availability of a cloud-assisted WBAN and review the state-of-the-art detection mechanisms for the identified DDoS attacks.
A Split-Path Schema-Based RFID Data Storage Model in Supply Chain Management
Fan, Hua; Wu, Quanyuan; Lin, Yisong; Zhang, Jianfeng
2013-01-01
In modern supply chain management systems, Radio Frequency IDentification (RFID) technology has become an indispensable sensor technology and massive RFID data sets are expected to become commonplace. More and more space and time are needed to store and process such huge amounts of RFID data, and there is an increasing realization that the existing approaches cannot satisfy the requirements of RFID data management. In this paper, we present a split-path schema-based RFID data storage model. With a data separation mechanism, the massive RFID data produced in supply chain management systems can be stored and processed more efficiently. Then a tree structure-based path splitting approach is proposed to intelligently and automatically split the movement paths of products. Furthermore, based on the proposed new storage model, we design the relational schema to store the path information and time information of tags, and some typical query templates and SQL statements are defined. Finally, we conduct various experiments to measure the effect and performance of our model and demonstrate that it performs significantly better than the baseline approach in both the data expression and path-oriented RFID data query performance. PMID:23645112
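To make the path-splitting idea concrete, a toy prefix-tree view of product movement paths is sketched below; this is illustrative only, and the paper's split-path relational schema, node tables, and SQL query templates are not reproduced here.

```python
# Toy prefix-tree view of product movement paths; the divergence points it reports are
# natural candidates for splitting paths, in the spirit of the tree-based approach.
from collections import defaultdict

def build_path_tree(paths):
    tree = lambda: defaultdict(tree)
    root = tree()
    for path in paths:                 # e.g. ["factory", "dc1", "storeA"]
        node = root
        for location in path:
            node = node[location]
    return root

def split_points(node, prefix=()):
    """Yield prefixes where paths diverge (candidate split points)."""
    if len(node) > 1:
        yield prefix
    for loc, child in node.items():
        yield from split_points(child, prefix + (loc,))

paths = [["factory", "dc1", "storeA"], ["factory", "dc1", "storeB"],
         ["factory", "dc2", "storeC"]]
print(list(split_points(build_path_tree(paths))))
```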
Multi-level and hybrid modelling approaches for systems biology.
Bardini, R; Politano, G; Benso, A; Di Carlo, S
2017-01-01
Over the last decades, high-throughput techniques have allowed the extraction of huge amounts of data from biological systems, unveiling more of their underlying complexity. Biological systems encompass a wide range of space and time scales, functioning according to flexible hierarchies of mechanisms that create an intertwined and dynamic interplay of regulations. This becomes particularly evident in processes such as ontogenesis, where regulative assets change according to process context and timing, making structural phenotype and architectural complexities emerge from a single cell through local interactions. The information collected from biological systems is naturally organized according to the functional levels composing the system itself. In systems biology, biological information often comes from overlapping but different scientific domains, each one having its own way of representing the phenomena under study. That is, the different parts of the system to be modelled may be described with different formalisms. For a model to have improved accuracy and to serve as a good knowledge base, it should encompass different system levels and suitably handle the corresponding formalisms. Models which are both multi-level and hybrid satisfy both these requirements, making them a very useful tool in computational systems biology. This paper reviews some of the main contributions in this field.
Zrelli, K; Barilero, T; Cavatore, E; Berthoumieux, H; Le Saux, T; Croquette, V; Lemarchand, A; Gosse, C; Jullien, L
2011-04-01
Biological samples exhibit huge molecular diversity over large concentration ranges. Titrating a given compound in such mixtures is often difficult, and innovative strategies emphasizing selectivity are thus demanded. To overcome limitations inherent to thermodynamics, we here present a generic technique where discrimination relies on the dynamics of interaction between the target of interest and a probe introduced in excess. Considering an ensemble of two-state exchanging reactants submitted to temperature modulation, we first demonstrate that the amplitude of the out-of-phase concentration oscillations is maximum for every compound involved in a reaction whose equilibrium constant is equal to unity and whose relaxation time is equal to the inverse of the excitation angular frequency. Taking advantage of this feature, we next devise a highly specific detection protocol and validate it using a microfabricated resistive heater and an epifluorescence microscope, as well as labeled oligonucleotides to model species displaying various dynamic properties. As expected, quantification of a sought for strand is obtained even if interfering reagents are present in similar amounts. Moreover, our approach does not require any separation and is compatible with imaging. It could then benefit some of the numerous binding assays performed every day in life sciences.
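The condition quoted above, maximal out-of-phase amplitude when the equilibrium constant equals unity and the relaxation time equals the inverse of the excitation angular frequency, follows from the standard linear-response form for a two-state exchange under a small periodic perturbation. The expression below is a sketch of that textbook result; the prefactors (reaction enthalpy, modulation amplitude) that the abstract does not give are omitted.

```latex
% Out-of-phase response of a two-state exchange A <-> B with equilibrium constant K and
% relaxation time \tau, driven at angular frequency \omega (prefactors omitted):
\delta c_{\mathrm{out}}(\omega) \;\propto\;
\frac{K}{(1+K)^{2}} \cdot \frac{\omega\tau}{1+\omega^{2}\tau^{2}},
\qquad \text{maximal at } K = 1 \text{ and } \omega\tau = 1 .
```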
Comparative and Quantitative Global Proteomics Approaches: An Overview
Deracinois, Barbara; Flahaut, Christophe; Duban-Deweer, Sophie; Karamanos, Yannis
2013-01-01
Proteomics has become a key tool for the study of biological systems. The comparison between two different physiological states allows the cellular and molecular mechanisms involved in a biological process to be unravelled. Proteomics can confirm the presence of proteins suggested by their mRNA content and provides a direct measure of the quantity present in a cell. Global and targeted proteomics strategies can be applied. Targeted proteomics strategies limit the number of features that will be monitored and then optimise the methods to obtain the highest sensitivity and throughput for a huge number of samples. The advantage of global proteomics strategies is that no hypothesis is required, other than a measurable difference in one or more protein species between the samples. Global proteomics methods attempt to separate, quantify, and identify all the proteins from a given sample. This review highlights only the different techniques of separation and quantification of proteins and peptides, in view of a comparative and quantitative global proteomics analysis. The in-gel and off-gel quantification of proteins will be discussed as well as the corresponding mass spectrometry technology. The overview focuses on the widespread techniques, while keeping in mind that each approach is modular and often overlaps with the others. PMID:28250403
Needs of ergonomic design at control units in production industries.
Levchuk, I; Schäfer, A; Lang, K-H; Gebhardt, Hj; Klussmann, A
2012-01-01
Over the last decades, an increasing use of innovative technologies in manufacturing areas has been observed. A huge amount of physical workload was removed by the change from conventional machine tools to computer-controlled units. CNC systems have spread throughout current production processes. Because of this, machine operators today mostly have an observational function. This has caused an increase in static work (e.g., standing, sitting) and cognitive demands (e.g., process observation). Machine operators carry high responsibility, because mistakes may lead to human injuries as well as product losses - and consequently to high monetary losses for the company as well. For a CNC machine, being usable often means being efficient. Intuitive usability and an ergonomic organization of CNC workplaces can be an essential basis for reducing the risk of operating failures as well as physical complaints (e.g., pain or disease caused by bad body posture during work). In contrast to conventional machines, CNC machines are equipped with both hardware and software. Intuitive and clear operation of CNC systems is a requirement for quickly learning new systems. Within this study, a survey was carried out among trainees learning the operation of CNC machines.
NASA Astrophysics Data System (ADS)
Deng, M.; di, L.
2007-12-01
Data integration and analysis are the foundation of scientific investigation in Earth science. In the past several decades, huge amounts of Earth science data have been collected, mainly through remote sensing. Those data have become a treasure for Earth science research. Training students to discover and use the huge volume of Earth science data in research has become one of the most important steps in making a student a qualified scientist. Developed by a NASA-funded project, the GeoBrain system has adopted and implemented the latest Web services and knowledge management technologies to provide innovative methods for publishing, accessing, visualizing, and analyzing geospatial data and for building and sharing geoscience knowledge. It provides a data-rich online learning and research environment enabled by the wealth of data and information available from the NASA Earth Observing System (EOS) Data and Information System (EOSDIS). Students, faculty members, and researchers from institutes worldwide can easily access, analyze, and model with the huge amount of NASA EOS data just as if they possessed such vast resources locally on their desktops. Although still in development, the GeoBrain system has been operational since 2005. A number of educational materials have been developed to facilitate the use of GeoBrain as a powerful tool for Earth science education at both undergraduate and graduate levels. Thousands of online higher-education users worldwide have used GeoBrain services. A number of faculty members at multiple universities have been funded as GeoBrain education partners to explore the use of GeoBrain in classroom teaching and student research. By summarizing and analyzing the feedback from the online users and the education partners, this presentation reports user experiences with GeoBrain in Earth science teaching and research. The feedback on classroom use of GeoBrain has demonstrated that GeoBrain is very useful for facilitating the transition of both undergraduate and graduate students from learners to investigators. The feedback has also shown that the system can improve teaching effectiveness, refine students' learning habits, and inspire students' interest in pursuing Earth sciences as a career. The interaction with the education users of GeoBrain provides much-needed guidance and lessons learned for the future development and promotion of GeoBrain.
NASA Astrophysics Data System (ADS)
Lee, Jonghyun; Yoon, Hongkyu; Kitanidis, Peter K.; Werth, Charles J.; Valocchi, Albert J.
2016-07-01
Characterizing subsurface properties is crucial for reliable and cost-effective groundwater supply management and contaminant remediation. With recent advances in sensor technology, large volumes of hydrogeophysical and geochemical data can be obtained to achieve high-resolution images of subsurface properties. However, characterization with such a large amount of information incurs prohibitive computational costs associated with "big data" processing and numerous large-scale numerical simulations. To tackle such difficulties, the principal component geostatistical approach (PCGA) has been proposed as a "Jacobian-free" inversion method that requires far fewer forward simulation runs per iteration than the number of unknown parameters and measurements needed in traditional inversion methods. PCGA can be conveniently linked to any multiphysics simulation software with independent parallel executions. In this paper, we extend PCGA to handle a large number of measurements (e.g., 10⁶ or more) by constructing a fast preconditioner whose computational cost scales linearly with the data size. For illustration, we characterize the heterogeneous hydraulic conductivity (K) distribution in a laboratory-scale 3-D sand box using about 6 million transient tracer concentration measurements obtained using magnetic resonance imaging. Since each individual observation has little information on the K distribution, the data were compressed by the zeroth temporal moment of breakthrough curves, which is equivalent to the mean travel time under the experimental setting. Only about 2000 forward simulations in total were required to obtain the best estimate with corresponding estimation uncertainty, and the estimated K field captured key patterns of the original packing design, showing the efficiency and effectiveness of the proposed method.
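The moment-based compression step can be illustrated with a synthetic breakthrough curve. The normalized first moment computed below is the usual mean-travel-time estimate; the paper's exact moment choice follows its own experimental setting, so the curve and normalization here are for illustration only.

```python
# Temporal moments of a breakthrough curve; their ratio is the mean travel time used
# to compress ~6 million concentration measurements into one value per curve
# (synthetic Gaussian curve here, for illustration only).
import numpy as np

t = np.linspace(0.0, 100.0, 501)                 # time axis (s)
c = np.exp(-0.5 * ((t - 40.0) / 8.0) ** 2)       # synthetic breakthrough curve

m0 = np.trapz(c, t)                              # zeroth temporal moment
m1 = np.trapz(t * c, t)                          # first temporal moment
print("mean travel time =", m1 / m0)             # ~40 s for this curve
```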
Environmental Metagenomics: The Data Assembly and Data Analysis Perspectives
NASA Astrophysics Data System (ADS)
Kumar, Vinay; Maitra, S. S.; Shukla, Rohit Nandan
2015-03-01
Novel gene finding is one of the emerging fields in environmental research. In the past decades, research focused mainly on the discovery of microorganisms capable of degrading a particular compound. Many methods are available in the literature for the cultivation and screening of such novel microorganisms. All of these methods are efficient for screening microbes that can be cultivated in the laboratory. Microorganisms that live in extreme conditions such as hot springs, frozen glaciers, and acid mine drainage cannot be cultivated in the laboratory; this is because of incomplete knowledge about their growth requirements, such as temperature, nutrients, and their mutual dependence on each other. The microbes that can be cultivated correspond to less than 1% of the total microbes present on Earth; the remaining 99%, the uncultivated majority, remains inaccessible. Metagenomics transcends the culture requirements of microbes. In metagenomics, DNA is directly extracted from environmental samples such as soil, seawater, and acid mine drainage, followed by construction and screening of a metagenomic library. With the ongoing research, a huge amount of metagenomic data is accumulating. Understanding these data is an essential step in extracting novel genes of industrial importance. Various bioinformatics tools have been designed to analyze and annotate the data produced from the metagenome. The bioinformatic requirements of metagenomic data analysis are different in theory and practice. This paper reviews the tools available for metagenomic data analysis and the capabilities of such tools: what they can do and their web availability.
Chiu, Mei Choi; Pun, Chi Seng; Wong, Hoi Ying
2017-08-01
Investors interested in the global financial market must analyze financial securities internationally. Making an optimal global investment decision involves processing a huge amount of data for a high-dimensional portfolio. This article investigates the big data challenges of two mean-variance optimal portfolios: continuous-time precommitment and constant-rebalancing strategies. We show that both optimized portfolios implemented with the traditional sample estimates converge to the worst performing portfolio when the portfolio size becomes large. The crux of the problem is the estimation error accumulated from the huge dimension of stock data. We then propose a linear programming optimal (LPO) portfolio framework, which applies a constrained ℓ 1 minimization to the theoretical optimal control to mitigate the risk associated with the dimensionality issue. The resulting portfolio becomes a sparse portfolio that selects stocks with a data-driven procedure and hence offers a stable mean-variance portfolio in practice. When the number of observations becomes large, the LPO portfolio converges to the oracle optimal portfolio, which is free of estimation error, even though the number of stocks grows faster than the number of observations. Our numerical and empirical studies demonstrate the superiority of the proposed approach. © 2017 Society for Risk Analysis.
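For readers who want the shape of the optimization, a generic constrained-ℓ1 template in the spirit of a Dantzig-selector construction is given below; this is only a sketch, and the paper's exact constraint set, target vector, and tuning may differ.

```latex
% Generic constrained-l1 template for a sparse, stable portfolio weight vector
% (a sketch; the paper's exact formulation may differ):
\hat{\mathbf{w}} \;=\; \operatorname*{arg\,min}_{\mathbf{w}\in\mathbb{R}^{p}} \; \|\mathbf{w}\|_{1}
\quad \text{subject to} \quad
\left\| \widehat{\Sigma}\,\mathbf{w} - \mathbf{b} \right\|_{\infty} \;\le\; \lambda ,
```

where Σ̂ is the sample covariance matrix, b encodes the theoretical optimal control, and λ trades sparsity against fidelity; with the ℓ∞ constraint the problem can be rewritten as a linear program, which is at least consistent with the "linear programming optimal" name.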
High Pressure Gas Filled RF Cavity Beam Test at the Fermilab MuCool Test Area
DOE Office of Scientific and Technical Information (OSTI.GOV)
Freemire, Ben
2013-05-01
The high energy physics community is continually looking to push the limits with respect to the energy and luminosity of particle accelerators. In the realm of leptons, only electron colliders have been built to date. Compared to hadrons, electrons lose a large amount of energy when accelerated in a ring through synchrotron radiation. A solution to this problem is to build long, straight accelerators for electrons, which has been done with great success. With a new generation of lepton colliders being conceived, building longer, more powerful accelerators is not the most enticing option. Muons have been proposed as an alternative particle to electrons. Muons lose less energy to synchrotron radiation and a Muon Collider can provide luminosity within a much smaller energy range than a comparable electron collider. This allows a circular collider to be built with higher attainable energy than any present electron collider. As part of the accelerator, but separate from the collider, it would also be possible to allow the muons to decay to study neutrinos. The possibility of a high energy, high luminosity muon collider and an abundant, precise source of neutrinos is an attractive one. The technological challenges of building a muon accelerator are many and diverse. Because the muon is an unstable particle, a muon beam must be cooled and accelerated to the desired energy within a short amount of time. This requirement places strict requisites on the type of acceleration and focusing that can be used. Muons are generated as tertiary beams with a huge phase space, so strong magnetic fields are required to capture and focus them. Radio frequency (RF) cavities are needed to capture, bunch and accelerate the muons. Unfortunately, traditional vacuum RF cavities have been shown to break down in the magnetic fields necessary for capture and focusing.
NASA Astrophysics Data System (ADS)
Panda, K. P.; Jha, M. K.; Sharma, S. P.
2017-12-01
Various parts of the world face an acute shortage of groundwater. Various approaches are followed to solve groundwater problems; interlinking of rivers is one of them. The southern part of the West Bengal province of India receives a huge amount of rainfall (about 1200 mm annually). Despite this rainfall, some parts of the area are problematic for groundwater occurrence. Characterization of the aquifers in this area is very important for sustainable development of water supply and artificial recharge schemes. An electrical resistivity survey was performed at regular intervals from Kharagpur (north) to the Subarnrekha River (south) to map the lithological variations in this area. It covers a distance of around 25 kilometers from Kharagpur (22°19'7.3"N, 87°18'40"E) to the Subarnrekha River (22°15'49.4"N, 87°16'45.1"E). Vertical electrical sounding is a robust method for locating suitable areas for artificial recharge and for characterizing aquifers. Resistivity soundings were carried out at intervals of 2 to 3 kilometers. The subsurface resistivity distribution has been interpreted using the very fast simulated annealing (VFSA) global optimization technique. The study reveals that the northern part of the area is problematic and does not have suitable aquifer systems. The resistivity distribution is favourable in the southern part of the area and corresponds to clayey sand. The interpreted resistivity in the northern part of the area is relatively high and reveals an impervious laterite layer. In the southern part of the area, resistivity varies between 5 and 10 Ohm-m at depths below 80 m. Based on the resistivity model, different types of geologic units are classified and the zones of interest for aquifers have been demarcated.
Direct AUC optimization of regulatory motifs.
Zhu, Lin; Zhang, Hong-Bo; Huang, De-Shuang
2017-07-15
The discovery of transcription factor binding site (TFBS) motifs is essential for untangling the complex mechanism of genetic variation under different developmental and environmental conditions. Among the huge number of computational approaches for de novo identification of TFBS motifs, discriminative motif learning (DML) methods have proven promising for harnessing the discovery power of the huge amount of accumulated high-throughput binding data. However, they have to sacrifice accuracy for speed and can fail to fully utilize the information in the input sequences. We propose a novel algorithm called CDAUC for optimizing DML-learned motifs based on the area under the receiver-operating characteristic curve (AUC) criterion, which has been widely used in the literature to evaluate the significance of extracted motifs. We show that when the considered AUC loss function is optimized in a coordinate-wise manner, the cost function of each resultant sub-problem is a piece-wise constant function, whose optimal value can be found exactly and efficiently. Further, a key step of each iteration of CDAUC can be efficiently solved as a computational geometry problem. Experimental results on real world high-throughput datasets illustrate that CDAUC outperforms competing methods for refining DML motifs, while being one order of magnitude faster. Meanwhile, preliminary results show that CDAUC may also be useful for improving the interpretability of convolutional kernels generated by emerging deep learning approaches for predicting TF sequence specificities. CDAUC is available at: https://drive.google.com/drive/folders/0BxOW5MtIZbJjNFpCeHlBVWJHeW8 . dshuang@tongji.edu.cn. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
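The sketch below shows AUC evaluation in its Mann-Whitney form together with a naive coordinate-wise refinement of scoring weights. CDAUC replaces the grid search with an exact enumeration of the breakpoints of the piece-wise constant objective; the grid search here only approximates that step, and the feature matrices are synthetic stand-ins for motif scores.

```python
# AUC (Mann-Whitney) evaluation plus a naive coordinate-wise refinement of scoring
# weights. The grid search is a simplified stand-in for CDAUC's exact breakpoint search.
import numpy as np

def auc(scores_pos, scores_neg):
    # fraction of (positive, negative) pairs ranked correctly (ties count 1/2)
    diff = scores_pos[:, None] - scores_neg[None, :]
    return ((diff > 0).sum() + 0.5 * (diff == 0).sum()) / diff.size

def coordinate_refine(X_pos, X_neg, w, grid=np.linspace(-2, 2, 81), sweeps=3):
    for _ in range(sweeps):
        for j in range(len(w)):
            best_v, best_auc = w[j], auc(X_pos @ w, X_neg @ w)
            for v in grid:                       # CDAUC enumerates breakpoints exactly
                w[j] = v
                a = auc(X_pos @ w, X_neg @ w)
                if a > best_auc:
                    best_v, best_auc = v, a
            w[j] = best_v
    return w

rng = np.random.default_rng(0)
X_pos = rng.normal(0.5, 1.0, (50, 4))            # features of bound sequences
X_neg = rng.normal(0.0, 1.0, (80, 4))            # features of background sequences
w = coordinate_refine(X_pos, X_neg, np.zeros(4))
print(auc(X_pos @ w, X_neg @ w))
```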
NASA Astrophysics Data System (ADS)
Greulich, Karl-Otto; Monajembashi, Shamci; Celeda, D.; Endlich, N.; Eickhoff, Holger; Hoyer, Carsten; Leitz, G.; Weber, Gerd; Scheef, J.; Rueterjans, H.
1994-12-01
Genomes of higher organisms are larger than one typically expects. For example, the DNA of a single human cell is almost two meters long, and the DNA in the human body covers the Earth-Sun distance approximately 140 times. This is often not considered in typical molecular biological approaches to DNA diagnostics, where usually only DNA of the length of a gene is investigated. Also, one basic aspect of sequencing the human genome is not really solved: the problem of how to prepare the huge amounts of DNA required. Approaches from biomedical optics combined with new developments in single molecule biotechnology may at least contribute some pieces of the puzzle. A large genome can be partitioned into portions comprising approximately 1% of the whole DNA using a laser microbeam. The single DNA fragment can be amplified by the polymerase chain reaction in order to obtain a sufficient number of molecules for conventional DNA diagnostics or for analysis by octanucleotide hybridization. When not amplified by biotechnological processes, the individual DNA molecule can be visualized in the light microscope and can be manipulated and dissected with the laser microbeam trap. The DNA probes obtained by single molecule biotechnology can be employed for fluorescence in situ hybridization, and DNA can be introduced into plant cells and subcellular structures even when other techniques fail. Since the laser microbeam trap allows working in the interior of a cell without opening it, subcellular structures can be manipulated. For example, in algae, such structures can be moved out of their original position and used to study intracellular viscosities.
Challenges for Wireless Mesh Networks to provide reliable carrier-grade services
NASA Astrophysics Data System (ADS)
von Hugo, D.; Bayer, N.
2011-08-01
Provision of mobile and wireless services today within a competitive environment and driven by a huge amount of steadily emerging new services and applications is both challenge and chance for radio network operators. Deployment and operation of an infrastructure for mobile and wireless broadband connectivity generally requires planning effort and large investments. A promising approach to reduce expenses for radio access networking is offered by Wireless Mesh Networks (WMNs). Here traditional dedicated backhaul connections to each access point are replaced by wireless multi-hop links between neighbouring access nodes and few gateways to the backbone employing standard radio technology. Such a solution provides at the same time high flexibility in both deployment and the amount of offered capacity and shall reduce overall expenses. On the other hand currently available mesh solutions do not provide carrier grade service quality and reliability and often fail to cope with high traffic load. EU project CARMEN (CARrier grade MEsh Networks) was initiated to incorporate different heterogeneous technologies and new protocols to allow for reliable transmission over "best effort" radio channels, to support a reliable mobility and network management, self-configuration and dynamic resource usage, and thus to offer a permanent or temporary broadband access at high cost efficiency. The contribution provides an overview on preliminary project results with focus on main technical challenges from a research and implementation point of view. Especially impact of mesh topology on the overall system performance in terms of throughput and connection reliability and aspects of a dedicated hybrid mobility management solution will be discussed.
The Zone of Proximal Development in the Learning of Mathematics
ERIC Educational Resources Information Center
Siyepu, Sibawu
2013-01-01
South Africa has a huge shortage of skilled workers in various fields such as engineering, applied sciences, accountancy, architecture, medicine and law. Mathematics is a requirement for entry in these careers to enable learners to grasp the content of various subjects in these disciplines. Despite that, in South Africa, learners' performance in…
Communicative Discourse in Second Language Classrooms: From Building Skills to Becoming Skillful
ERIC Educational Resources Information Center
Suleiman, Mahmoud
2013-01-01
The dynamics of the communicative discourse is a natural process that requires an application of a wide range of skills and strategies. In particular, linguistic discourse and the interaction process have a huge impact on promoting literacy and academic skills in all students especially English language learners (ELLs). Using interactive…
selectSNP – An R package for selecting SNPs optimal for genetic evaluation
USDA-ARS?s Scientific Manuscript database
There has been a huge increase in the number of SNPs in the public repositories. This has made it a challenge to design low and medium density SNP panels, which requires careful selection of available SNPs considering many criteria, such as map position, allelic frequency, possible biological functi...
ICT-Enabled Distance Education in Community Development in the Philippines
ERIC Educational Resources Information Center
Ramos, Angelo Juan; Nangit, Genevieve; Ranga, Adelina I.; Trinona, Jerome
2007-01-01
The Philippines, an archipelago of 7,100 islands in Southeast Asia, faces new opportunities as it competes with the rest of the world in the information and communication technology (ICT) arena. New industries including call centers, foreign medical transcription services, and the hugely popular multiplayer online gaming systems require the…
Interior Pathways to Dissipation of Mesoscale Energy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nadiga, Balasubramanya T.
This talk at Goethe University asks: What powers the overturning circulation? How does the ocean circulation equilibrate? There is a huge reservoir of energy sitting in the interior ocean. Can fluid dynamic instabilities contribute to the mixing required to drive the global overturning circulation? The study is designed to eliminate distinguished horizontal surfaces such as the bottom boundary layer and the surface layer.
NASA Astrophysics Data System (ADS)
Yamakoshi, T.; Shimizu, Y.; Osanai, N.; Sasahara, K.; Tamura, K.; Doshida, S.; Tsutsui, K.
2009-04-01
On March 26, 2006, a gigantic landslide occurred on the caldera wall of Mt. Bawakaraeng, Indonesia. This paper quantitatively shows the temporal change in gully erosion and sediment yield from the huge amount of the deposit of the landslide by analyzing satellite images. Firstly, the landslide buried the original river channel completely. In the next year, gully erosion dominated the entire landslide deposit, and parts of the gully bed were found to have eroded by up to 60 m. The total amount of sediment discharged from the landslide deposit was estimated to be 36 million m3. In the second year after the landslide, the severe widespread degradation almost ceased and river bed aggradation started to occur in some places. The total amount of discharged sediment drastically decreased and was estimated to be 8.3 million m3. In the third year, the total amount of sediment discharge declined further. On the other hand, satellite-derived DEMs showed that the width of gullies has increased. The drastic decrease in sediment discharge might have occurred because of the reduction in the erosive force applied by water flow whose depth was inevitably reduced as a result of the widening of gully channels.
NASA Astrophysics Data System (ADS)
Lee, Khil-Ha; Kim, Sung-Wook; Kim, Sang-Hyun
2014-05-01
Many volcanic craters and calderas are filled with large amounts of water that can pose significant flood hazards to downstream communities due to their high elevation and the potential for catastrophic releases of water. Recent reports have identified the Baekdusan volcano, located on the border between China and North Korea, as a potentially active volcano. Since the Millennium Eruption around 1000 AD, smaller eruptions have occurred at roughly 100-year intervals, with the last one in 1903. Sudden releases of the huge volumes of water stored in temporarily elevated caldera lakes are a recurrent feature of volcanic environments, due to the ease with which outlet channels are blocked by, and re-cut through, unwelded pyroclastic deposits. The volcano has recently been showing signs of waking from a century-long slumber. Volcanic floods, including breakouts from volcanic lakes, can affect communities beyond the areas immediately affected by a volcanic eruption and cause significant hydrological hazards, because floods from lake-filled calderas may be particularly large and high. Although a number of case studies have been presented in the literature, investigation of the underlying physical processes is required, as well as a method for interpreting the process of the rapid release of water stored in a caldera lake. The development of forecasting techniques to prevent and minimize economic and social damage is urgently needed. This study focuses on constructing a map of the flood hazard triggered by magma effusion at the Baekdusan volcano. A physically based uplift model was developed to compute the amount of water released and the time to peak flow. The ordinary differential equation was solved numerically using the finite difference method, and the Newton-Raphson iteration method was used to solve the nonlinear equation. The magma effusion rate into the caldera lake follows examples from other volcanic activity. As a result, the hydrograph serves as the upper boundary condition when the hydrodynamic model FLO-2D is run to simulate channel routing downstream and give the maximum water level. Once probable inundation areas are identified from the huge volume of water in the caldera lake, the unique geography, and the limited control capability, a potential hazard assessment can be presented. The study will contribute to building a geohazard map for decision-makers and practitioners. Keywords: Volcanic flood, Caldera lake, Hazard assessment, Magma effusion Acknowledgement This research was supported by a grant [NEMA-BAEKDUSAN-2012-1-2] from the Volcanic Disaster Preparedness Research Center sponsored by the National Emergency Management Agency of Korea.
NASA Astrophysics Data System (ADS)
Lambert, I. B.
2012-04-01
This presentation will consider the adequacy of global uranium and thorium resources to meet realistic nuclear power demand scenarios over the next half century. It is presented on behalf of, and based on evaluations by, the Uranium Group - a joint initiative of the OECD Nuclear Energy Agency and the International Atomic Energy Agency, of which the author is a Vice Chair. The Uranium Group produces a biennial report on Uranium Resources, Production and Demand based on information from some 40 countries involved in the nuclear fuel cycle, which also briefly reviews thorium resources. Uranium: In 2008, world production of uranium amounted to almost 44,000 tonnes (tU). This supplied approximately three-quarters of world reactor requirements (approx. 59,000 tU), the remainder being met by previously mined uranium (so-called secondary sources). Information on availability of secondary sources - which include uranium from excess inventories, dismantling nuclear warheads, tails and spent fuel reprocessing - is incomplete, but such sources are expected to decrease in market importance after 2013. In 2008, the total world Reasonably Assured plus Inferred Resources of uranium (recoverable at less than 130/kgU) amounted to 5.4 million tonnes. In addition, it is clear that there are vast amounts of uranium recoverable at higher costs in known deposits, plus many as yet undiscovered deposits. The Uranium Group has concluded that the uranium resource base is more than adequate to meet projected high-case requirements for nuclear power for at least half a century. This conclusion does not assume increasing replacement of uranium by fuels from reprocessing current reactor wastes, or by thorium, nor greater reactor efficiencies, which are likely to ameliorate future uranium demand. However, progressively increasing quantities of uranium will need to be mined, against a backdrop of the relatively small number of producing facilities around the world, geopolitical uncertainties and strong opposition to growth of nuclear power in a number of quarters - it is vital that the market provides incentives for exploration and development of environmentally sustainable mining operations. Thorium: World Reasonably Assured plus Inferred Resources of thorium are estimated at over 2.2 million tonnes, in hard rock and heavy mineral sand deposits. At least double this amount is considered to occur in as yet undiscovered thorium deposits. Currently, demand for thorium is insignificant, but even a major shift to thorium-fueled reactors would not make significant inroads into the huge resource base over the next half century.
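As a rough static-demand illustration of that conclusion, using only the figures quoted above and ignoring demand growth, secondary supplies, and higher-cost or undiscovered resources:

```python
# Rough static-demand illustration of the "at least half a century" conclusion, using
# only the figures quoted in the abstract (no demand growth, no secondary supply).
identified_resources_tU  = 5.4e6      # Reasonably Assured + Inferred Resources, 2008
annual_reactor_demand_tU = 59_000     # 2008 world reactor requirements
print(identified_resources_tU / annual_reactor_demand_tU, "years at constant demand")
# ~91 years, consistent with "more than adequate ... for at least half a century".
```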
De Luca, Leonardo; Granatelli, Antonino
2017-06-01
A sensation of self-awareness on the relativity of our certainties comes over looking to the huge amount of data on antithrombotic therapies assessed in patients with ST-elevation myocardial infarction (STEMI) undergoing primary percutaneous coronary intervention (pPCI). This sensation can be compared to the so-called "overview effect", a cognitive shift in awareness reported by some astronauts during spaceflight, often while viewing the Earth from orbit. In this review we will mention drugs floated like meteors in the Universe of STEMI treatment and we will discuss the body of evidence on oral and intravenous antithrombotic therapies for patients undergoing pPCI.
Design Considerations for a Web-based Database System of ELISpot Assay in Immunological Research
Ma, Jingming; Mosmann, Tim; Wu, Hulin
2005-01-01
The enzyme-linked immunospot (ELISpot) assay has been a primary tool in immunological research (such as studies of HIV-specific T cell responses). Due to the huge amount of data involved in ELISpot assay testing, a database system is needed for efficient data entry, easy retrieval, secure storage, and convenient data processing. In addition, the NIH has recently issued a policy to promote the sharing of research data (see http://grants.nih.gov/grants/policy/data_sharing). A Web-based database system will definitely benefit data sharing among broad research communities. Here are some design considerations for a database system for the ELISpot assay (DBSEA). PMID:16779326
NASA Astrophysics Data System (ADS)
Aberasturi, M.; Solano, E.; Martín, E.
2015-05-01
Low-mass stars and brown dwarfs (with spectral types M, L, T and Y) are the most common objects in the Milky Way. A complete census of these objects is necessary to test theories of their complex structure and formation processes. In order to increase the number of known objects in the Solar neighborhood (d<30 pc), we have made use of the Virtual Observatory, which allows efficient handling of the huge amount of information available in astronomical databases. We also used the WFC3 on board the Hubble Space Telescope to look for T5+ dwarf binaries.
Competitive speed eating: truth and consequences.
Levine, Marc S; Spencer, Geoffrey; Alavi, Abass; Metz, David C
2007-09-01
The purpose of our investigation was to assess the stomachs of a world-class speed-eating champion and of a control subject during a speed-eating test in our gastrointestinal fluoroscopy suite to determine how competitive speed eaters are able to eat so much so fast. Our observations suggest that successful speed eaters expand the stomach to form an enormous flaccid sac capable of accommodating huge amounts of food. We speculate that professional speed eaters eventually may develop morbid obesity, profound gastroparesis, intractable nausea and vomiting, and even the need for a gastrectomy. Despite its growing popularity, competitive speed eating is a potentially self-destructive form of behavior.
Saying goodbye to optical storage technology.
McLendon, Kelly; Babbitt, Cliff
2002-08-01
The days of using optical disk based mass storage devices for high volume applications like health care document imaging are coming to an end. The price/performance curve for redundant magnetic disks, known as RAID, is now more positive than for optical disks. All types of application systems, across many sectors of the marketplace are using these newer magnetic technologies, including insurance, banking, aerospace, as well as health care. The main components of these new storage technologies are RAID and SAN. SAN refers to storage area network, which is a complex mechanism of switches and connections that allow multiple systems to store huge amounts of data securely and safely.
Measurement of positron annihilation lifetimes for positron burst by multi-detector array
NASA Astrophysics Data System (ADS)
Wang, B. Y.; Kuang, P.; Liu, F. Y.; Han, Z. J.; Cao, X. Z.; Zhang, P.
2018-03-01
With conventional single-detector methods, it is currently impossible to exploit the timing information in a gamma-ray pulse generated within nanoseconds when a high-intensity positron burst annihilates in a target. A state-of-the-art solution to this problem is proposed in this paper. In this approach, a multi-detector array composed of many independent detection cells mounted spherically around the target is designed to measure the time distribution of the annihilation gamma rays generated following, in particular, a positron burst delivering huge numbers of positrons in a short pulse duration, even less than a few nano- or picoseconds.
NASA Astrophysics Data System (ADS)
Brax, Christoffer; Niklasson, Lars
2009-05-01
Maritime Domain Awareness (MDA) is important for both civilian and military applications. An important part of MDA is the detection of unusual vessel activities such as piracy, smuggling, poaching, collisions, etc. Today's interconnected sensor systems provide us with huge amounts of information over large geographical areas, which can make operators reach their cognitive capacity and start to miss important events. We propose an agent-based situation management system that automatically analyses sensor information to detect unusual activity and anomalies. The system combines knowledge-based detection with data-driven anomaly detection. The system is evaluated using information from both radar and AIS sensors.
Ergonomic analysis of construction worker's body postures using wearable mobile sensors.
Nath, Nipun D; Akhavian, Reza; Behzadan, Amir H
2017-07-01
Construction jobs are more labor-intensive than those in other industries. As such, construction workers are often required to exceed their natural physical capability to cope with the increasing complexity and challenges in this industry. Over long periods of time, this sustained physical labor causes bodily injuries to the workers, which in turn inflicts huge losses on the industry in terms of money, time, and productivity. Various safety and health organizations have established rules and regulations that limit the amount and intensity of workers' physical movements to mitigate work-related bodily injuries. A precursor to enforcing and implementing such regulations and improving the ergonomic conditions on the jobsite is to identify the physical risks associated with a particular task. Manually assessing a field activity to identify the ergonomic risks is not trivial and often requires extra effort, which may render it challenging if not impossible. In this paper, a low-cost ubiquitous approach is presented and validated which deploys built-in smartphone sensors to unobtrusively monitor workers' bodily postures and autonomously identify potential work-related ergonomic risks. Results indicate that measurements of trunk and shoulder flexion of a worker derived from smartphone sensor data are very close to the corresponding measurements by observation. The proposed method is applicable to workers in various occupations who are exposed to work-related musculoskeletal disorders (WMSDs) due to awkward postures. Examples include, but are not limited to, industry laborers, carpenters, welders, farmers, health assistants, teachers, and office workers. Copyright © 2017 Elsevier Ltd. All rights reserved.
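A minimal sketch of how trunk inclination can be estimated from a phone's 3-axis accelerometer under quasi-static conditions is shown below; the axis convention, sample values, and absence of filtering are assumptions for illustration, not the authors' processing pipeline.

```python
# Estimating trunk inclination from a smartphone's 3-axis accelerometer while the body
# is quasi-static: the angle between the measured gravity vector and the sensor axis
# assumed to lie along the trunk. Axis conventions and sample values are assumptions.
import numpy as np

def trunk_flexion_deg(accel_xyz, trunk_axis=(0.0, 1.0, 0.0)):
    g = np.asarray(accel_xyz, dtype=float)
    axis = np.asarray(trunk_axis, dtype=float)
    cosang = np.dot(g, axis) / (np.linalg.norm(g) * np.linalg.norm(axis))
    return np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))

upright = [0.2, 9.7, 0.5]     # phone strapped to the torso, y-axis along the spine
bent    = [0.1, 6.9, 6.9]     # torso pitched forward by roughly 45 degrees
print(trunk_flexion_deg(upright), trunk_flexion_deg(bent))
```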
Review of optical wireless communications for data centers
NASA Astrophysics Data System (ADS)
Arnon, Shlomi
2017-10-01
A data center (DC) is a facility, either physical or virtual, for running applications and for the searching, storage, management, and dissemination of information (known as cloud computing), and it consumes a huge amount of energy. A DC includes thousands of servers, communication and storage equipment, and a support system including air conditioning, security and monitoring equipment, and electricity regulation units. Data center operators face the challenge of meeting exponentially increasing demands for network bandwidth without unreasonable increases in operation and infrastructure cost. Meeting these requirements with only a moderate increase in operation and infrastructure cost requires a technology revolution. One way to overcome the shortcomings of traditional static (wired) data center architectures is the use of a hybrid network based on fiber and optical wireless communication (OWC), or free space optics (FSO). The OWC link could be deployed on top of the existing cable/fiber network layer, so that live migration could be done easily and dynamically. In that case the network topology is flexible and adapts quickly to changes in traffic, heat distribution, power consumption, and application characteristics. In addition, OWC could provide an easy way to maintain and scale up data centers. As a result, the total cost of ownership could be reduced and the return on investment increased. In this talk we will review the main OWC technologies applicable to data centers, indicate how energy could be saved using OWC multichannel communication, and discuss the issue of OWC pointing accuracy in the data center scenario.
Workflows for Full Waveform Inversions
NASA Astrophysics Data System (ADS)
Boehm, Christian; Krischer, Lion; Afanasiev, Michael; van Driel, Martin; May, Dave A.; Rietmann, Max; Fichtner, Andreas
2017-04-01
Despite many theoretical advances and the increasing availability of high-performance computing clusters, full seismic waveform inversions still face considerable challenges regarding data and workflow management. While the community has access to solvers which can harness modern heterogeneous computing architectures, the computational bottleneck has fallen to these often manpower-bounded issues that need to be overcome to facilitate further progress. Modern inversions involve huge amounts of data and require a tight integration between numerical PDE solvers, data acquisition and processing systems, nonlinear optimization libraries, and job orchestration frameworks. To this end we created a set of libraries and applications revolving around Salvus (http://salvus.io), a novel software package designed to solve large-scale full waveform inverse problems. This presentation focuses on solving passive source seismic full waveform inversions from local to global scales with Salvus. We discuss (i) design choices for the aforementioned components required for full waveform modeling and inversion, (ii) their implementation in the Salvus framework, and (iii) how it is all tied together by a usable workflow system. We combine state-of-the-art algorithms ranging from high-order finite-element solutions of the wave equation to quasi-Newton optimization algorithms using trust-region methods that can handle inexact derivatives. All is steered by an automated interactive graph-based workflow framework capable of orchestrating all necessary pieces. This naturally facilitates the creation of new Earth models and hopefully sparks new scientific insights. Additionally, and even more importantly, it enhances reproducibility and reliability of the final results.
Data oriented job submission scheme for the PHENIX user analysis in CCJ
NASA Astrophysics Data System (ADS)
Nakamura, T.; En'yo, H.; Ichihara, T.; Watanabe, Y.; Yokkaichi, S.
2011-12-01
The RIKEN Computing Center in Japan (CCJ) has been developed to make it possible to analyze the huge amount of data collected by the PHENIX experiment at RHIC. The collected raw data or reconstructed data are transferred via SINET3 with 10 Gbps bandwidth from Brookhaven National Laboratory (BNL) using GridFTP. The transferred data are first stored in the hierarchical storage management system (HPSS) prior to user analysis. Since the size of the data grows steadily year by year, concentration of access requests at the data servers has become one of the serious bottlenecks. To eliminate this I/O-bound problem, 18 compute nodes with a total of 180 TB of local disk were introduced to store the data in advance. We added some setup to the batch job scheduler (LSF) so that users can specify the required data already distributed to the local disks. The locations of the data are automatically obtained from a database, and jobs are dispatched to the appropriate node which has the required data. To avoid multiple jobs on a node accessing a local disk simultaneously, lock files and access control lists are employed. As a result, each job can handle a local disk exclusively. Indeed, the total throughput was improved drastically compared to the preexisting nodes in CCJ, and users can analyze about 150 TB of data within 9 hours. We report this successful job submission scheme and the features of the PC cluster.
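The data-oriented dispatch idea — look up which node already holds the requested files, send the job there, and serialize disk access with a lock file — can be sketched as follows. This is an illustrative outline, not the CCJ/LSF implementation; the catalogue dictionary, file names, and lock paths are hypothetical.

```python
# Sketch of data-aware job dispatch with per-disk lock files (illustrative, not the CCJ code).
import fcntl
from collections import defaultdict

# Hypothetical catalogue: file name -> node that stores it on local disk.
catalogue = {"run1234.root": "node07", "run1235.root": "node12"}

def dispatch(job_files):
    """Group the requested files by hosting node, mimicking the scheduler hint."""
    per_node = defaultdict(list)
    for f in job_files:
        per_node[catalogue[f]].append(f)
    return per_node  # the batch system would then send one job to each of these nodes

def process_on_node(lock_path, files):
    """Run the analysis while holding an exclusive lock so only one job reads the local disk."""
    with open(lock_path, "w") as lock:
        fcntl.flock(lock, fcntl.LOCK_EX)   # exclusive access to this node's data disk
        for f in files:
            print(f"analyzing {f}")
        fcntl.flock(lock, fcntl.LOCK_UN)

if __name__ == "__main__":
    for node, files in dispatch(["run1234.root", "run1235.root"]).items():
        print(f"would submit to {node}: {files}")
        process_on_node(f"/tmp/{node}.lock", files)   # on the worker node itself in practice
```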
Anguera, A; Barreiro, J M; Lara, J A; Lizcano, D
2016-01-01
One of the major challenges in the medical domain today is how to exploit the huge amount of data that this field generates. To do this, approaches are required that are capable of discovering knowledge that is useful for decision making in the medical field. Time series are data types that are common in the medical domain and require specialized analysis techniques and tools, especially if the information of interest to specialists is concentrated within particular time series regions, known as events. This research followed the steps specified by the so-called knowledge discovery in databases (KDD) process to discover knowledge from medical time series derived from stabilometric (396 series) and electroencephalographic (200) patient electronic health records (EHR). The view offered in the paper is based on the experience gathered as part of the VIIP project. Knowledge discovery in medical time series has a number of difficulties and implications that are highlighted by illustrating the application of several techniques that cover the entire KDD process through two case studies. This paper illustrates the application of different knowledge discovery techniques for the purposes of classification within the above domains. The accuracy of this application for the two classes considered in each case is 99.86% and 98.11% for epilepsy diagnosis in the electroencephalography (EEG) domain and 99.4% and 99.1% for early-age sports talent classification in the stabilometry domain. The KDD techniques achieve better results than other traditional neural network-based classification techniques.
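As a generic illustration of the event-based classification step (not the VIIP techniques themselves), the sketch below extracts a few simple statistics from a labeled event region of each series and trains an off-the-shelf classifier; scikit-learn is assumed as a stand-in library, and the toy data and feature set are hypothetical.

```python
# Illustrative event-region classification sketch (stand-in features, not the VIIP method).
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def event_features(series, start, end):
    """Summarize one event region with simple descriptive statistics."""
    region = np.asarray(series[start:end], dtype=float)
    return [region.mean(), region.std(), region.max() - region.min(), end - start]

rng = np.random.default_rng(0)
# Toy data: 40 synthetic series, each with one labeled event region between samples 100 and 200.
X = [event_features(rng.normal(loc=label, size=300), 100, 200) for label in (0, 1) for _ in range(20)]
y = [label for label in (0, 1) for _ in range(20)]

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
print(clf.score(X, y))  # training accuracy on the toy data
```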
Development of DKB ETL module in case of data conversion
NASA Astrophysics Data System (ADS)
Kaida, A. Y.; Golosova, M. V.; Grigorieva, M. A.; Gubin, M. Y.
2018-05-01
Modern scientific experiments produce huge volumes of data, which requires new approaches to data processing and storage. These data, as well as their processing and storage, are accompanied by a considerable amount of valuable additional information, called metadata, distributed over multiple information systems and repositories and having a complicated, heterogeneous structure. Gathering these metadata for experiments in the field of high energy nuclear physics (HENP) is a complex issue that requires looking for solutions outside the box. One of the tasks is to integrate metadata from different repositories into a central storage. During the integration process, metadata taken from the original source repositories go through several processing steps: metadata aggregation, transformation according to the current data model, and loading into the general storage in a standardized form. The Data Knowledge Base (DKB), an R&D project of the ATLAS experiment at the LHC, aims to provide fast and easy access to significant information about the LHC experiments for the scientific community. The data integration subsystem being developed for the DKB project can be represented as a number of particular pipelines, arranging the data flow from the data sources to the main DKB storage. The data transformation process, represented by a single pipeline, can be considered as a number of successive data transformation steps, where each step is implemented as an individual program module. This article outlines the specifics of the program modules used in the dataflow and describes one of the modules developed and integrated into the data integration subsystem of DKB.
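The pipeline-of-modules idea — each transformation step is an independent program whose output feeds the next — can be sketched as a chain of small functions. The step names (extract, transform, load), the record fields, and the in-memory "storage" below are hypothetical placeholders, not the actual DKB modules.

```python
# Sketch of an ETL pipeline as composable steps (hypothetical stages, not the DKB modules).
def extract(source):
    """Pull raw metadata records from a source repository."""
    return [{"dataset": name, "events": n} for name, n in source.items()]

def transform(records):
    """Map records onto the target data model (here: add a derived field)."""
    return [{**r, "size_category": "large" if r["events"] > 1e6 else "small"} for r in records]

def load(records, storage):
    """Write standardized records into the central storage."""
    storage.extend(records)
    return storage

pipeline = [extract, transform]          # ordered processing steps
storage = []
data = {"dataset_A": 2_500_000, "dataset_B": 400_000}   # placeholder source repository
for step in pipeline:
    data = step(data)
load(data, storage)
print(storage)
```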
Technology and outcomes assessment in lung transplantation.
Yusen, Roger D
2009-01-15
Lung transplantation offers the hope of prolonged survival and significant improvement in quality of life to patients that have advanced lung diseases. However, the medical literature lacks strong positive evidence and shows conflicting information regarding survival and quality of life outcomes related to lung transplantation. Decisions about the use of lung transplantation require an assessment of trade-offs: do the potential health and quality of life benefits outweigh the potential risks and harms? No amount of theoretical reasoning can resolve this question; empiric data are needed. Rational analyses of these trade-offs require valid measurements of the benefits and harms to the patients in all relevant domains that affect survival and quality of life. Lung transplant systems and registries mainly focus outcomes assessment on patient survival on the waiting list and after transplantation. Improved analytic approaches allow comparisons of the survival effects of lung transplantation versus continued waiting. Lung transplant entities do not routinely collect quality of life data. However, the medical community and the public want to know how lung transplantation affects quality of life. Given the huge stakes for the patients, the providers, and the healthcare systems, key stakeholders need to further support quality of life assessment in patients with advanced lung disease that enter into the lung transplant systems. Studies of lung transplantation and its related technologies should assess patients with tools that integrate both survival and quality of life information. Higher quality information obtained will lead to improved knowledge and more informed decision making.
Current challenges in genome annotation through structural biology and bioinformatics.
Furnham, Nicholas; de Beer, Tjaart A P; Thornton, Janet M
2012-10-01
With the huge volume of genomic sequences being generated by high-throughput sequencing projects, the requirement for providing accurate and detailed annotations of gene products has never been greater. It is proving to be a huge challenge for computational biologists to use as much information as possible from experimental data to provide annotations for genome data of unknown function. A central component of this process is the use of experimentally determined structures, which provide a means to detect homology that is not discernible from the sequence alone and permit the consequences of genomic variation to be realized at the molecular level. In particular, structures also form the basis of many bioinformatics methods for improving the detailed functional annotations of enzymes in combination with similarities in sequence and chemistry. Copyright © 2012. Published by Elsevier Ltd.
ERIC Educational Resources Information Center
Al-Maliky, Salam J. Bash
2012-01-01
Huge environmental and health crises, such as the use of Depleted Uranium (DU) munitions during the military activities against Iraq, and the required responses are among the fields in which Iraqi higher education institutions (HEIs) may have a crucial role. Similar international cases, such as Agent Orange (Vietnam), Three Mile Island (USA) and…
ERIC Educational Resources Information Center
Ward, Victoria Ann
2014-01-01
Attention deficit hyperactivity disorder (ADHD) diagnosis rates have increased significantly in recent times. A teacher's role is crucial in determining if a child will be referred for an ADHD assessment. Teachers' opinions and observations are also required for and play a huge role in the actual assessment process. For this reason, their…
Fighting for Electives: Lessons in Change
ERIC Educational Resources Information Center
Tempel, Melissa Bollow
2010-01-01
READ 180, a special class to help students read better, is just one of many boxed reading intervention programs popping up all over the United States and being used by districts to meet the requirements of No Child Left Behind (NCLB) at a huge cost to students. The Milwaukee Public School District calls it "a research-based reading…
ERIC Educational Resources Information Center
Kaufman, Roger
2010-01-01
With huge financial challenges being imposed on higher education, some react to crises to make changes and meet financial requirements. Changes are made that would be unthinkable without imposed demands. Two examples of universities that successfully responded to limited budgets to make major changes in organization, structure, and programs are…
Emission characteristics of volatile organic compounds from semiconductor manufacturing.
Chein, HungMin; Chen, Tzu Ming
2003-08-01
A huge amount of volatile organic compounds (VOCs) is produced and emitted with waste gases from semiconductor manufacturing processes, such as cleaning, etching, and developing. VOC emissions from semiconductor factories located at Science-Based Industrial Park, Hsin-chu, Taiwan, were measured and characterized in this study. A total of nine typical semiconductor fabricators (fabs) were monitored over a 12-month period (October 2000-September 2001). A flame ionization analyzer was employed to measure the VOC emission rate continuously in a real-time fashion. The amount of chemical use was adopted from the data that were reported to the Environmental Protection Bureau in Hsin-chu County as per the regulation of the Taiwan Environmental Protection Administration. The VOC emission factor, defined as the emission rate (kg/month) divided by the amount of chemical use (L/month), was determined to be 0.038 ± 0.016 kg/L. A linear regression equation is proposed to fit the data with the correlation coefficient (R²) = 0.863. The emission profiles of VOCs, which were drawn using the gas chromatograph/mass spectrometer analysis method, show that isopropyl alcohol is the dominant compound in most of the fabs.
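In compact form, the emission factor reported above is simply the ratio of emission rate to chemical use (this is a restatement of the definition given in the abstract; the example figure below is hypothetical):

\[
EF = \frac{E\ \mathrm{(kg/month)}}{C\ \mathrm{(L/month)}} \approx 0.038 \pm 0.016\ \mathrm{kg/L},
\qquad E \approx EF \cdot C .
\]

For instance, a fab using a hypothetical 10,000 L of chemicals per month would be expected to emit roughly \(0.038 \times 10{,}000 \approx 380\) kg of VOCs per month.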
Beer, Wood, and Welfare ‒ The Impact of Improved Stove Use Among Dolo-Beer Breweries
2015-01-01
Local beer breweries in Burkina Faso account for a considerable share of urban woodfuel demand. We assess the woodfuel savings caused by the adoption of improved brewing stoves by these micro-breweries and estimate the implied welfare effects on private households through the woodfuel market, as well as the environmental effect. We find substantial wood savings among the breweries, 36% to 38% if they fully switch to an improved stove. In absolute amounts, they save about 0.176 kg of fuelwood per litre of dolo brewed. These savings imply huge reductions in CO2 emissions and reduce the overall demand for woodfuel, which is predominantly used by the poorer strata for cooking purposes. We provide estimates for the price decrease that might result from this and show that the urban poor are likely to benefit. Thus, the intervention under study is an example of a green growth intervention with pro-poor welfare gains – something green growth strategies should look for. PMID:26244341
LoyalTracker: Visualizing Loyalty Dynamics in Search Engines.
Shi, Conglei; Wu, Yingcai; Liu, Shixia; Zhou, Hong; Qu, Huamin
2014-12-01
The huge amount of user log data collected by search engine providers creates new opportunities to understand user loyalty and defection behavior at an unprecedented scale. However, this also poses a great challenge to analyze the behavior and glean insights into the complex, large data. In this paper, we introduce LoyalTracker, a visual analytics system to track user loyalty and switching behavior towards multiple search engines from the vast amount of user log data. We propose a new interactive visualization technique (flow view) based on a flow metaphor, which conveys a proper visual summary of the dynamics of user loyalty of thousands of users over time. Two other visualization techniques, a density map and a word cloud, are integrated to enable analysts to gain further insights into the patterns identified by the flow view. Case studies and the interview with domain experts are conducted to demonstrate the usefulness of our technique in understanding user loyalty and switching behavior in search engines.
The application of exergy to human-designed systems
NASA Astrophysics Data System (ADS)
Hamilton, P.
2012-12-01
Exergy is the portion of the total energy of a system that is available for conversion to useful work. Exergy takes into account both the quantity and quality of energy. Heat is the inevitable product of using any form of high-quality energy such as electricity. Modern commercial buildings and industrial facilities use large amounts of electricity and so produce huge amounts of heat. This heat energy typically is treated as a waste product and discharged to the environment and then high-quality energy sources are consumed to satisfy low-quality energy heating and cooling needs. Tens of thousands of buildings and even whole communities could meet much of their heating and cooling needs through the capture and reuse of heat energy. Yet the application of exergy principles often faces resistance because it challenges conventions about how we design, construct and operate human-engineered systems. This session will review several exergy case studies and conclude with an audience discussion of how exergy principles may be both applied and highlighted in formal and informal education settings.
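For reference, the standard textbook expression for the flow (physical) exergy of a stream relative to a dead state at temperature \(T_0\) is (a general definition, not a formula taken from the session description):

\[
e_x = (h - h_0) - T_0\,(s - s_0),
\]

where \(h\) and \(s\) are the specific enthalpy and entropy of the stream and the subscript 0 denotes the environmental (dead) state. Low-temperature waste heat therefore carries little exergy even when it carries a large amount of energy, which is why matching low-quality heat to low-quality heating and cooling needs is attractive.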
Automatic Feature Extraction from Planetary Images
NASA Technical Reports Server (NTRS)
Troglio, Giulia; Le Moigne, Jacqueline; Benediktsson, Jon A.; Moser, Gabriele; Serpico, Sebastiano B.
2010-01-01
With the launch of several planetary missions in the last decade, a large number of planetary images have already been acquired and many more will be available for analysis in the coming years. The image data need to be analyzed, preferably by automatic processing techniques, because of the huge amount of data. Although many automatic feature extraction methods have been proposed and utilized for Earth remote sensing images, these methods are not always applicable to planetary data, which often present low contrast and uneven illumination characteristics. Different methods have already been presented for crater extraction from planetary images, but the detection of other types of planetary features has not been addressed yet. Here, we propose a new unsupervised method for the extraction of different features from the surface of the analyzed planet, based on the combination of several image processing techniques, including watershed segmentation and the generalized Hough transform. The method has many applications, among which is image registration, and it can be applied to arbitrary planetary images.
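A minimal, generic illustration of the watershed step (not the authors' full method, which also uses the generalized Hough transform) might look like the following, assuming scikit-image and NumPy are available; the random array stands in for a planetary image tile and the thresholds are arbitrary.

```python
# Watershed segmentation sketch for a low-contrast image (illustrative only).
import numpy as np
from skimage.filters import sobel
from skimage.segmentation import watershed

image = np.random.default_rng(0).random((128, 128))   # stand-in for a planetary image tile
elevation = sobel(image)                               # gradient magnitude as the elevation map

# Markers from conservative low/high intensity thresholds.
markers = np.zeros_like(image, dtype=int)
markers[image < 0.2] = 1
markers[image > 0.8] = 2

labels = watershed(elevation, markers)                 # region boundaries follow gradient ridges
print(f"{len(np.unique(labels))} labeled regions")
```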
Classification Algorithms for Big Data Analysis, a Map Reduce Approach
NASA Astrophysics Data System (ADS)
Ayma, V. A.; Ferreira, R. S.; Happ, P.; Oliveira, D.; Feitosa, R.; Costa, G.; Plaza, A.; Gamba, P.
2015-03-01
For many years, the scientific community has been concerned with how to increase the accuracy of different classification methods, and major achievements have been made so far. Besides this issue, the increasing amount of data that is being generated every day by remote sensors raises more challenges to be overcome. In this work, a tool within the scope of InterIMAGE Cloud Platform (ICP), which is an open-source, distributed framework for automatic image interpretation, is presented. The tool, named ICP: Data Mining Package, is able to perform supervised classification procedures on huge amounts of data, usually referred to as big data, on a distributed infrastructure using Hadoop MapReduce. The tool has four classification algorithms implemented, taken from WEKA's machine learning library, namely: Decision Trees, Naïve Bayes, Random Forest and Support Vector Machines (SVM). The results of an experimental analysis using an SVM classifier on data sets of different sizes for different cluster configurations demonstrate the potential of the tool, as well as aspects that affect its performance.
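The MapReduce pattern behind such a tool — map a trained classifier over data partitions, then reduce the partial results — can be illustrated with plain Python. This is a conceptual sketch, not the ICP: Data Mining Package or Hadoop itself; the threshold classifier is a hypothetical stand-in for the WEKA models.

```python
# Conceptual map/reduce classification sketch (not Hadoop; stand-in classifier).
from functools import reduce

def classify_partition(partition):
    """Map step: apply a (pretrained) classifier to one data partition."""
    classify = lambda x: 1 if x > 0.5 else 0          # stand-in decision rule
    return [(classify(x), 1) for x in partition]

def merge_counts(acc, pair):
    """Reduce step: accumulate per-class counts."""
    label, count = pair
    acc[label] = acc.get(label, 0) + count
    return acc

partitions = [[0.1, 0.7, 0.4], [0.9, 0.6], [0.2, 0.8, 0.3]]   # chunks of a large data set
mapped = (pair for part in map(classify_partition, partitions) for pair in part)
print(reduce(merge_counts, mapped, {}))               # e.g. {0: 4, 1: 4}
```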
Converting citrus wastes into value-added products: Economic and environmentally friendly approaches.
Sharma, Kavita; Mahato, Neelima; Cho, Moo Hwan; Lee, Yong Rok
2017-02-01
Citrus fruits, including oranges, grapefruits, lemons, limes, tangerines, and mandarins, are among the most widely cultivated fruits around the globe. Its production is increasing every year due to rising consumer demand. Citrus-processing industries generate huge amounts of wastes every year, and citrus peel waste alone accounts for almost 50% of the wet fruit mass. Citrus waste is of immense economic value as it contains an abundance of various flavonoids, carotenoids, dietary fiber, sugars, polyphenols, essential oils, and ascorbic acid, as well as considerable amounts of some trace elements. Citrus waste also contains high levels of sugars suitable for fermentation for bioethanol production. However, compounds such as D-limonene must be removed for efficient bioethanol production. The aim of the present article was to review the latest advances in various popular methods of extraction for obtaining value-added products from citrus waste/byproducts and their potential utility as a source of various functional compounds. Copyright © 2016 Elsevier Inc. All rights reserved.
Bio-inspired approach for intelligent unattended ground sensors
NASA Astrophysics Data System (ADS)
Hueber, Nicolas; Raymond, Pierre; Hennequin, Christophe; Pichler, Alexander; Perrot, Maxime; Voisin, Philippe; Moeglin, Jean-Pierre
2015-05-01
Improving the surveillance capacity over wide zones requires a set of smart battery-powered Unattended Ground Sensors capable of issuing an alarm to a decision-making center. Only high-level information has to be sent when a relevant suspicious situation occurs. In this paper we propose an innovative bio-inspired approach that mimics the human bi-modal vision mechanism and the parallel processing ability of the human brain. The designed prototype exploits two levels of analysis: a low-level panoramic motion analysis, the peripheral vision, and a high-level event-focused analysis, the foveal vision. By tracking moving objects and fusing multiple criteria (size, speed, trajectory, etc.), the peripheral vision module acts as a fast relevant event detector. The foveal vision module focuses on the detected events to extract more detailed features (texture, color, shape, etc.) in order to improve the recognition efficiency. The implemented recognition core is able to acquire human knowledge and to classify in real-time a huge amount of heterogeneous data thanks to its natively parallel hardware structure. This UGS prototype validates our system approach under laboratory tests. The peripheral analysis module demonstrates a low false alarm rate whereas the foveal vision correctly focuses on the detected events. A parallel FPGA implementation of the recognition core succeeds in fulfilling the embedded application requirements. These results are paving the way of future reconfigurable virtual field agents. By locally processing the data and sending only high-level information, their energy requirements and electromagnetic signature are optimized. Moreover, the embedded Artificial Intelligence core enables these bio-inspired systems to recognize and learn new significant events. By duplicating human expertise in potentially hazardous places, our miniature visual event detector will allow early warning and contribute to better human decision making.
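The two-level peripheral/foveal idea — a cheap whole-frame motion check that only triggers a more expensive analysis when something moves — can be sketched as below. This is a conceptual outline with synthetic frames and arbitrary thresholds, not the FPGA implementation described in the paper.

```python
# Peripheral (cheap motion gate) / foveal (focused analysis) sketch with synthetic frames.
import numpy as np

def peripheral_motion(prev, curr, threshold=10.0):
    """Low-cost whole-frame check: mean absolute frame difference."""
    return float(np.abs(curr.astype(float) - prev.astype(float)).mean()) > threshold

def foveal_analysis(frame):
    """Placeholder for the expensive, event-focused feature extraction."""
    y, x = np.unravel_index(np.argmax(frame), frame.shape)
    return {"brightest_pixel": (int(y), int(x))}

rng = np.random.default_rng(1)
prev = rng.integers(0, 50, size=(64, 64), dtype=np.uint8)
curr = prev.copy()
curr[16:48, 16:48] += 150            # a synthetic moving object

if peripheral_motion(prev, curr):
    print("event detected:", foveal_analysis(curr))   # foveal stage runs only on demand
else:
    print("no event; stay in low-power mode")
```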
Space Weather Research at the National Science Foundation
NASA Astrophysics Data System (ADS)
Moretto, T.
2015-12-01
There is growing recognition that the space environment can have substantial, deleterious impacts on society. Consequently, research enabling specification and forecasting of hazardous space effects has become of great importance and urgency. This research requires studying the entire Sun-Earth system to understand the coupling of regions all the way from the source of disturbances in the solar atmosphere to the Earth's upper atmosphere. The traditional, region-based structure of research programs in solar and space physics is ill suited to fully support the change in research directions that the problem of space weather dictates. On the observational side, dense, distributed networks of observations are required to capture the full large-scale dynamics of the space environment. However, the cost of implementing these is typically prohibitive, especially for measurements in space. Thus, by necessity, the implementation of such new capabilities needs to build on creative and unconventional solutions. A particularly powerful idea is the utilization of new developments in data engineering and informatics research (big data). These new technologies make it possible to build systems that can collect and process huge amounts of noisy and inaccurate data and extract from them useful information. The shift in emphasis towards system-level science for geospace also necessitates the development of large-scale and multi-scale models. The development of large-scale models capable of capturing the global dynamics of the Earth's space environment requires investment in research team efforts that go beyond what can typically be funded under the traditional grants programs. This calls for effective interdisciplinary collaboration and efficient leveraging of resources both nationally and internationally. This presentation will provide an overview of current and planned initiatives, programs, and activities at the National Science Foundation pertaining to space weather research.
Lee, Jonghyun; Yoon, Hongkyu; Kitanidis, Peter K.; ...
2016-06-09
Characterizing subsurface properties is crucial for reliable and cost-effective groundwater supply management and contaminant remediation. With recent advances in sensor technology, large volumes of hydro-geophysical and geochemical data can be obtained to achieve high-resolution images of subsurface properties. However, characterization with such a large amount of information requires prohibitive computational costs associated with "big data" processing and numerous large-scale numerical simulations. To tackle such difficulties, the Principal Component Geostatistical Approach (PCGA) has been proposed as a "Jacobian-free" inversion method that requires far fewer forward simulation runs per iteration than the number of unknown parameters and measurements needed in the traditional inversion methods. PCGA can be conveniently linked to any multi-physics simulation software with independent parallel executions. In our paper, we extend PCGA to handle a large number of measurements (e.g., 10^6 or more) by constructing a fast preconditioner whose computational cost scales linearly with the data size. For illustration, we characterize the heterogeneous hydraulic conductivity (K) distribution in a laboratory-scale 3-D sand box using about 6 million transient tracer concentration measurements obtained using magnetic resonance imaging. Since each individual observation has little information on the K distribution, the data were compressed by the zero-th temporal moment of the breakthrough curves, which is equivalent to the mean travel time under the experimental setting. Moreover, only about 2,000 forward simulations in total were required to obtain the best estimate with corresponding estimation uncertainty, and the estimated K field captured key patterns of the original packing design, showing the efficiency and effectiveness of the proposed method. This article is protected by copyright. All rights reserved.
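The data compression mentioned above relies on temporal moments of the breakthrough curves. In standard form (general definitions, not reproduced from the paper), the zero-th moment and the mean arrival time at a measurement location \(\mathbf{x}\) are

\[
m_0(\mathbf{x}) = \int_0^{\infty} c(\mathbf{x},t)\,dt,
\qquad
\bar{t}(\mathbf{x}) = \frac{\int_0^{\infty} t\,c(\mathbf{x},t)\,dt}{\int_0^{\infty} c(\mathbf{x},t)\,dt},
\]

so each location's full concentration time series collapses to one or two numbers that still constrain the travel time and hence the conductivity field; the abstract notes that, under this particular experimental setting, the zero-th moment carries information equivalent to the mean travel time.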
Future Long-Baseline Neutrino Facilities and Detectors
Diwan, Milind; Edgecock, Rob; Hasegawa, Takuya; ...
2013-01-01
We review the ongoing effort in the US, Japan, and Europe of the scientific community to study the location and the detector performance of the next-generation long-baseline neutrino facility. For many decades, research on the properties of neutrinos and the use of neutrinos to study the fundamental building blocks of matter has unveiled new, unexpected laws of nature. Results of neutrino experiments have triggered a tremendous amount of development in theory: theories beyond the standard model or at least extensions of it and development of the standard solar model and modeling of supernova explosions as well as the development of theories to explain the matter-antimatter asymmetry in the universe. Neutrino physics is one of the most dynamic and exciting fields of research in fundamental particle physics and astrophysics. The next-generation neutrino detector will address two aspects: fundamental properties of the neutrino like mass hierarchy, mixing angles, and the CP phase, and low-energy neutrino astronomy with solar, atmospheric, and supernova neutrinos. Such a new detector naturally allows for major improvements in the search for nucleon decay. A next-generation neutrino observatory needs a huge, megaton scale detector which in turn has to be installed in a new, international underground laboratory, capable of hosting such a huge detector.
NASA Astrophysics Data System (ADS)
Ruggles, Clive L. N.
The colossal pyramids of the pharaohs Khufu (Cheops), Khafre (Chephren), and Menkaure (Mycerinus) have attracted a huge amount of astronomical interest over the years, both scholarly and popular. Less attention is usually given to the broader context of structures on the Giza Plateau. One of the most notorious ideas connecting the Giza Plateau with astronomy is that the three large pyramids are laid out on the ground so as to reflect the appearance of the three stars of Orion's Belt in the sky. This idea is unsupportable for several reasons but has succeeded in generating huge public interest. Of much greater serious interest is the fact that the three main pyramids were oriented cardinally to extraordinary precision, which raises the questions of why this was important and how it was achieved. Another idea that has attracted serious attention but also some confusion is that the orientations of some narrow shafts within Khufu's pyramid might have been deliberately aligned upon particular stars. The overall layout of monuments on the plateau may certainly have been designed so as to emphasize certain solar phenomena, for symbolic and ideological reasons relating to a dominant sun cult. It is also possible that it formed part of a wider cosmological "master plan" extending to other pyramids and temples up to 20 km distant.
Kleinau, Gunnar; Kreuchwig, Annika; Worth, Catherine L; Krause, Gerd
2010-06-01
The collection, description and molecular analysis of naturally occurring (pathogenic) mutations are important for understanding the functional mechanisms and malfunctions of biological units such as proteins. Numerous databases collate a huge amount of functional data or descriptions of mutations, but tools to analyse the molecular effects of genetic variations are as yet poorly provided. The goal of this work was therefore to develop a translational web-application that facilitates the interactive linkage of functional and structural data and which helps improve our understanding of the molecular basis of naturally occurring gain- or loss- of function mutations. Here we focus on the human glycoprotein hormone receptors (GPHRs), for which a huge number of mutations are known to cause diseases. We describe new options for interactive data analyses within three-dimensional structures, which enable the assignment of molecular relationships between structure and function. Strikingly, as the functional data are converted into relational percentage values, the system allows the comparison and classification of data from different GPHR subtypes and different experimental approaches. Our new application has been incorporated into a freely available database and website for the GPHRs (http://www.ssfa-gphr.de), but the principle development would also be applicable to other macromolecules.
The Relevance of HLA Sequencing in Population Genetics Studies
Sanchez-Mazas, Alicia
2014-01-01
Next generation sequencing (NGS) is currently being adapted by different biotechnological platforms to the standard typing method for HLA polymorphism, the huge diversity of which makes this initiative particularly challenging. Boosting the molecular characterization of the HLA genes through efficient, rapid, and low-cost technologies is expected to amplify the success of tissue transplantation by enabling us to find donor-recipient matching for rare phenotypes. But the application of NGS technologies to the molecular mapping of the MHC region also anticipates essential changes in population genetic studies. Huge amounts of HLA sequence data will be available in the next years for different populations, with the potential to change our understanding of HLA variation in humans. In this review, we first explain how HLA sequencing allows a better assessment of the HLA diversity in human populations, taking also into account the methodological difficulties it introduces at the statistical level; secondly, we show how analyzing HLA sequence variation may improve our comprehension of population genetic relationships by facilitating the identification of demographic events that marked human evolution; finally, we discuss the interest of both HLA and genome-wide sequencing and genotyping in detecting functionally significant SNPs in the MHC region, the latter having also contributed to the makeup of the HLA molecular diversity observed today. PMID:25126587
NASA Astrophysics Data System (ADS)
Singh, Arvind K.; Sherry, Angela; Gray, Neil D.; Jones, Martin D.; Röling, Wilfred F. M.; Head, Ian M.
The industrial revolution has led to significant increases in the consumption of petroleum hydrocarbons. Concomitant with this increase, hydrocarbon pollution has become a global problem resulting from emissions related to operational use, releases during production, pipeline failures and tanker spills. Importantly, in addition to these anthropogenic sources of hydrocarbon pollution, natural seeps alone account for about 50% of total petroleum hydrocarbon releases in the aquatic environment (National Research Council, 2003). The annual input from natural seeps would form a layer of hydrocarbons 20 molecules thick on the sea surface globally if it remained un-degraded (Prince, 2005). By contrast with natural seeps, many oil spills, e.g. Sea Empress (Milford Haven, UK), Prestige (Galicia, Spain), EXXON Valdez (Prince William Sound, Alaska, USA), released huge amounts of oil (thousands to hundreds of thousand tonnes; Table 24.1) in a locally confined area over a short period of time with a huge acute impact on the marine environment. These incidents have attracted the attention of both the general public and the scientific community due to their great impact on coastal ecosystems. Although many petroleum hydrocarbons are toxic, they are degraded by microbial consortia naturally present in marine ecosystems.
The threats from oil spills: now, then, and in the future.
Jernelöv, Arne
2010-01-01
The ongoing oil spill from the blown-out well by the name of Macondo, drilled by the ill-fated rig Deepwater Horizon, has many features in common with another blowout in the Gulf of Mexico that happened three decades ago. Then the oil gushed out from the Ixtoc I well drilled by the Sedco 135-F semi-submersible rig. In the years between these catastrophes, the source and nature of oil spills have undergone large changes. Huge spills from tankers that ran aground or collided used to be what caught the headlines and caused large ecological damage. The number and size of such accidental spills have decreased significantly. Instead, spills from ageing, ill-maintained or sabotaged pipelines have increased, and places like Arctic Russia, the Niger Delta, and the northwestern Amazon have become sites of recurring oil pollution. As for blowouts, there is no clear trend with regard to the number of incidents or amounts of spilled oil, but deepwater blowouts are much harder to cap and thus tend to go on longer and result in the release of larger quantities of oil. Also, oil exploration and extraction is moving into ever-deeper water and into stormier and icier seas, increasing potential risks. The risk of recurring spills like the two huge Gulf of Mexico blowouts is ever present and must be reduced.
ERIC Educational Resources Information Center
Lawlor, John; Marshall, Kevin; Tangney, Brendan
2016-01-01
It is generally accepted that intrinsic student motivation is a critical requirement for effective learning but formal learning in school places a huge reliance on extrinsic motivation to focus the learner. This reliance on extrinsic motivation is driven by the pressure on formal schooling to "deliver to the test." The experience of the…
Impact of Free Primary Education in Kenya: A Case Study of Private Schools in Kibera
ERIC Educational Resources Information Center
Tooley, James; Dixon, Pauline; Stanfield, James
2008-01-01
Free primary education (FPE) is widely assumed to be required to ensure that the poor gain enrolment. After the introduction of FPE (from January 2003) in Kenyan schools, huge increases in enrolment were officially reported. However, our research, conducted 10 months after the introduction of FPE in and around the informal settlement of Kibera,…
Corporate Donors Can Make a Huge Difference
ERIC Educational Resources Information Center
Bennett, Drew A.
2009-01-01
It is time to educate corporate America on the need to finance higher education by using a need-based giving standard. Corporations need to realize that two-year colleges significantly affect their work force and economy. Only 25 percent of the jobs in the United States require a degree from a four-year college, yet up to 75 percent of the jobs…
ERIC Educational Resources Information Center
Pittman, Karen; Yohalem, Nicole; Wilson-Ahlstrom, Alicia; Ferber, Thaddeus
2003-01-01
High school is becoming the next frontier for after-school advocates. The conceptual and practical leaps from programming for elementary and middle school students to high school students are significant, with huge marketing challenges. Arguing persuasively for investments in this population requires revisiting almost every strategic decision,…
Teachers' Experiences of Technology-Based Teaching and Learning in the Foundation Phase
ERIC Educational Resources Information Center
Hannaway, D. M.; Steyn, M. G.
2017-01-01
This paper presents one aspect of a larger scale doctoral study, namely the teachers' experiences of technology-based teaching and learning in the Foundation Phase. Technology is a huge driver of change and South African education has to change regularly to meet the requirements set out by the Department of Education, including the development of…
Configurable e-commerce-oriented distributed seckill system with high availability
NASA Astrophysics Data System (ADS)
Zhu, Liye
2018-04-01
The rapid development of e-commerce prompted the birth of the seckill (flash sale) activity. Seckill activities greatly stimulate public shopping desire because of their significant attraction to customers. In a seckill activity, a limited number of products are sold at varying degrees of discount, which is a huge temptation for customers. The discounted products are usually sold out in seconds, which can be a huge challenge for e-commerce systems. In this case, a seckill system with high concurrency and high availability has very practical significance. This research, carried out in cooperation with Huijin Department Store, designs and implements a seckill system for an e-commerce platform. The seckill system supports high-concurrency network conditions and remains highly available in unexpected situations. In addition, due to the short life cycle of a seckill activity, the system is flexible to configure and scale, which means that system resources can be added or removed on demand. Finally, functional and performance tests of the whole system were carried out. The test results show that the system meets the functional and performance requirements of suppliers, administrators, and users.
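A core building block of such a system is an atomic inventory decrement that prevents overselling under high concurrency. The sketch below illustrates the common pattern with Redis assumed as the shared store; the key names, stock count, and client setup are hypothetical and this is not the system described in the paper.

```python
# Atomic stock decrement sketch for a flash-sale ("seckill") endpoint, assuming Redis.
import redis

r = redis.Redis(host="localhost", port=6379)
r.set("seckill:item42:stock", 100)          # hypothetical item with 100 discounted units

def try_buy(item_key):
    """Return True if a unit was reserved; DECR is atomic, so no two buyers get the last unit."""
    remaining = r.decr(item_key)
    if remaining < 0:
        r.incr(item_key)                    # roll back the over-decrement
        return False
    return True                             # order creation would then be enqueued asynchronously

if try_buy("seckill:item42:stock"):
    print("order accepted")
else:
    print("sold out")
```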
Astronomers Discover Most Distant Galaxy Showing Key Evidence For Furious Star Formation
NASA Astrophysics Data System (ADS)
2003-12-01
Astronomers have discovered a key signpost of rapid star formation in a galaxy 11 billion light-years from Earth, seen as it was when the Universe was only 20 percent of its current age. Using the National Science Foundation's Very Large Array (VLA) radio telescope, the scientists found a huge quantity of dense interstellar gas -- the environment required for active star formation -- at the greatest distance yet detected. A furious spawning of the equivalent of 1,000 Suns per year in a distant galaxy dubbed the Cloverleaf may be typical of galaxies in the early Universe, the scientists say. [Image: VLA image (green) of radio emission from HCN gas, superimposed on a Hubble Space Telescope image of the Cloverleaf galaxy; the four images of the Cloverleaf are the result of gravitational lensing. Credit: NRAO/AUI/NSF, STScI] "This is a rate of star formation more than 300 times greater than that in our own Milky Way and similar spiral galaxies, and our discovery may provide important information about the formation and evolution of galaxies throughout the Universe," said Philip Solomon, of Stony Brook University in New York. While the raw material for star formation has been found in galaxies at even greater distances, the Cloverleaf is by far the most distant galaxy showing this essential signature of star formation. That essential signature comes in the form of a specific frequency of radio waves emitted by molecules of the gas hydrogen cyanide (HCN). "If you see HCN, you are seeing gas with the high density required to form stars," said Paul Vanden Bout of the National Radio Astronomy Observatory (NRAO). Solomon and Vanden Bout worked with Chris Carilli of NRAO and Michel Guelin of the Institute for Millimeter Astronomy in France. They reported their results in the December 11 issue of the scientific journal Nature. In galaxies like the Milky Way, dense gas traced by HCN but composed mainly of hydrogen molecules is always associated with regions of active star formation. What is different about the Cloverleaf is the huge quantity of dense gas along with very powerful infrared radiation from the star formation. Ten billion times the mass of the Sun is contained in dense, star-forming gas clouds. "At the rate this galaxy is seen to be forming stars, that dense gas will be used up in only about 10 million years," Solomon said. In addition to giving astronomers a fascinating glimpse of a huge burst of star formation in the early Universe, the new information about the Cloverleaf helps answer a longstanding question about bright galaxies of that era. Many distant galaxies have supermassive black holes at their cores, and those black holes power "central engines" that produce bright emission. Astronomers have wondered specifically about those distant galaxies that emit large amounts of infrared light, galaxies like the Cloverleaf which has a black hole and central engine. "Is this bright infrared light caused by the black-hole-powered core of the galaxy or by a huge burst of star formation? That has been the question. Now we know that, in at least one case, much of the infrared light is produced by intense star formation," Carilli said. The rapid star formation, called a starburst, and the black hole are both generating the bright infrared light in the Cloverleaf. The starburst is a major event in the formation and evolution of this galaxy.
"This detection of HCN gives us a unique new window through which we can study star formation in the early Universe," Carilli said. The National Radio Astronomy Observatory is a facility of the National Science Foundation, operated under cooperative agreement by Associated Universities, Inc.
Sediment transport monitoring for sustainable hydropower development
NASA Astrophysics Data System (ADS)
Rüther, Nils; Guerrero, Massimo; Stokseth, Siri
2015-04-01
Due to the increasing demand for CO2-neutral energy, not only in Europe but worldwide, a relatively large number of new hydropower plants (HPPs) are being built. In addition, existing ones are being refurbished and renewed in order to run them more cost effectively. A major threat to HPPs is sediment arriving in suspension from the rivers upstream. The sediments settle in the reservoir, reduce the effective head and volume, and consequently shorten the lifetime of the reservoir. In addition, the fine sediments cause severe damage to turbines and HPP infrastructure. To estimate the amount of incoming suspended sediment, and thus to plan efficient countermeasures, it is essential to monitor the rivers within the catchment of the HPP for suspended sediments. This work is considerably time consuming, requires highly educated personnel, and is therefore expensive. Consequently, this study presents a method to measure suspended sediment concentrations and their grain size distribution with a dual-frequency acoustic Doppler current profiler (ADCP). This method is more cost effective and reliable than traditional measurement methods. With more detailed information about the sediments being transported in a river, a hydropower plant can be planned, built, and operated much more efficiently and sustainably. The two horizontal ADCPs are installed at a measurement cross section in the Devoll River in Albania. To verify the new method, the suspended load concentrations will also be monitored in the traditional way at the same cross section. It is planned to install turbidity measurement devices combined with automatic sampling devices, and to use an optical in situ measurement device (LISST-SL by Sequoia Inc.) to obtain detailed information on sediment concentration and grain sizes over the depth.
Predictive factors for intraoperative excessive bleeding in Graves' disease.
Yamanouchi, Kosho; Minami, Shigeki; Hayashida, Naomi; Sakimura, Chika; Kuroki, Tamotsu; Eguchi, Susumu
2015-01-01
In Graves' disease, because the thyroid tends to have extreme vascularity, the amount of intraoperative blood loss (AIOBL) becomes significant in some cases. We sought to elucidate the predictive factors of AIOBL. A total of 197 patients underwent thyroidectomy for Graves' disease between 2002 and 2012. We retrospectively evaluated clinical factors potentially related to AIOBL. The median period between disease onset and surgery was 16 months (range: 1-480 months). Conventional surgery was performed in 125 patients, whereas video-assisted surgery was performed in 72 patients. Subtotal and near-total/total thyroidectomies were performed in 137 patients and 60 patients, respectively. The median weight of the thyroid was 45 g (range: 7.3-480.0 g). Univariate analysis revealed that the strongest correlation of AIOBL was with the weight of the thyroid (p < 0.001). Additionally, AIOBL was correlated positively with the period between disease onset and surgery (p < 0.001) and negatively with preoperative free T4 (p < 0.01). Multivariate analysis showed that only the weight of the thyroid was independently correlated with AIOBL (p < 0.001). Four patients (2.0%) needed blood transfusion, including two requiring autotransfusion, and all of their thyroids weighed in excess of 200 g. The amount of drainage during the initial 6 hours and the number of days until drain removal were each positively correlated with AIOBL (p < 0.001). Occurrences of postoperative complications, such as recurrent laryngeal nerve palsy or hypoparathyroidism, and postoperative hospital stay were not correlated with AIOBL. A huge goiter is a predictive factor for excessive bleeding during surgery for Graves' disease, and preparation for blood transfusion should be considered in cases where the thyroid weighs more than 200 g. Copyright © 2014. Published by Elsevier Taiwan.
Credit Assignment in Multiple Goal Embodied Visuomotor Behavior
Rothkopf, Constantin A.; Ballard, Dana H.
2010-01-01
The intrinsic complexity of the brain can lead one to set aside issues related to its relationships with the body, but the field of embodied cognition emphasizes that understanding brain function at the system level requires one to address the role of the brain-body interface. It has only recently been appreciated that this interface performs huge amounts of computation that does not have to be repeated by the brain, and thus affords the brain great simplifications in its representations. In effect the brain's abstract states can refer to coded representations of the world created by the body. But even if the brain can communicate with the world through abstractions, the severe speed limitations in its neural circuitry mean that vast amounts of indexing must be performed during development so that appropriate behavioral responses can be rapidly accessed. One way this could happen would be if the brain used a decomposition whereby behavioral primitives could be quickly accessed and combined. This realization motivates our study of independent sensorimotor task solvers, which we call modules, in directing behavior. The issue we focus on herein is how an embodied agent can learn to calibrate such individual visuomotor modules while pursuing multiple goals. The biologically plausible standard for module programming is that of reinforcement given during exploration of the environment. However this formulation contains a substantial issue when sensorimotor modules are used in combination: The credit for their overall performance must be divided amongst them. We show that this problem can be solved and that diverse task combinations are beneficial in learning and not a complication, as usually assumed. Our simulations show that fast algorithms are available that allot credit correctly and are insensitive to measurement noise. PMID:21833235
Biointervention makes leather processing greener: an integrated cleansing and tanning system.
Thanikaivelan, Palanisamy; Rao, Jonnalagadda Raghava; Nair, Balachandran Unni; Ramasami, Thirumalachari
2003-06-01
The do-undo methods adopted in conventional leather processing generate huge amounts of pollutants. In other words, conventional methods employed in leather processing subject the skin/hide to wide variations in pH. Pretanning and tanning processes alone contribute more than 90% of the total pollution from leather processing. Included in this is a great deal of solid wastes such as lime and chrome sludge. In the approach described here, the hair and flesh removal as well as fiber opening have been achieved using biocatalysts at pH 8.0 for cow hides. This was followed by a pickle-free chrome tanning, which does not require a basification step. Hence, this tanning technique involves primarily three steps, namely, dehairing, fiber opening, and tanning. It has been found that the extent of hair removal, opening up of fiber bundles, and penetration and distribution of chromium are comparable to that produced by traditional methods. This has been substantiated through scanning electron microscopic, stratigraphic chrome distribution analysis, and softness measurements. Performance of the leathers is shown to be on par with conventionally processed leathers through physical and hand evaluation. Importantly, softness of the leathers is numerically proven to be comparable with that of control. The process also demonstrates reduction in chemical oxygen demand load by 80%, total solids load by 85%, and chromium load by 80% as compared to the conventional process, thereby leading toward zero discharge. The input-output audit shows that the biocatalytic three-step tanning process employs a very low amount of chemicals, thereby reducing the discharge by 90% as compared to the conventional multistep processing. Furthermore, it is also demonstrated that the process is technoeconomically viable.
Torres, Ednildo Andrade; Cerqueira, Gilberto S; Tiago, M Ferrer; Quintella, Cristina M; Raboni, Massimo; Torretta, Vincenzo; Urbini, Giordano
2013-12-01
In Brazil, and mainly in the State of Bahia, crude vegetable oils are widely used in the preparation of food. Street stalls, restaurants and canteens make great use of palm oil and soybean oil. There is also some use of castor oil, which is widely cultivated in the Sertão Region (within the State of Bahia) and widely applied in industry. This massive use in food preparation leads to a huge amount of waste oil of different types, which needs either to be properly disposed of or recovered. At the Laboratorio Energia e Gas-LEN (Energy & Gas lab.) of the Universidade Federal da Bahia, a cycle of experiments was carried out to evaluate the recovery of waste oils for biodiesel production. The experiments were carried out on a laboratory scale and in a semi-industrial pilot plant, using waste oils of different qualities. In the transesterification process, the waste vegetable oils were reacted with methanol in the presence of a basic catalyst, such as NaOH or KOH. The conversion rate settled at between 81% and 85% (by weight). The most suitable molar ratio of waste oil to alcohol was 1:6, and the amount of catalyst required was 0.5% of the weight of the incoming oil in the case of NaOH, and 1% in the case of KOH. The biodiesel produced was tested to determine the final product quality. The parameters analyzed were the acid value, kinematic viscosity, monoglycerides, diglycerides, triglycerides, free glycerine, total glycerine, and clarity; the conversion yield of the process was also evaluated. Copyright © 2013 Elsevier Ltd. All rights reserved.
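As a rough worked example of the reported 1:6 oil-to-methanol molar ratio (the oil molar mass used here is an assumed typical value for a waste vegetable oil triglyceride, not a figure from the study):

\[
n_{\mathrm{oil}} \approx \frac{1000\ \mathrm{g}}{\sim 875\ \mathrm{g/mol}} \approx 1.14\ \mathrm{mol},
\qquad
m_{\mathrm{MeOH}} = 6\,n_{\mathrm{oil}} \times 32\ \mathrm{g/mol} \approx 220\ \mathrm{g},
\]

i.e., roughly 0.22 kg of methanol per kilogram of waste oil, plus about 5 g of NaOH (0.5%) or 10 g of KOH (1%) as catalyst.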
The Diabetic foot: A global threat and a huge challenge for Greece
Papanas, N; Maltezos, E
2009-01-01
The diabetic foot continues to be a major cause of morbidity, posing a global threat. Substantial progress has been now accomplished in the treatment of foot lesions, but further improvement is required. Treatment options may be classified into established measures (revascularisation, casting and debridement) and new modalities. All therapeutic measures should be provided by specialised dedicated multidisciplinary foot clinics. In particular, the diabetic foot is a huge challenge for Greece. There is a dramatic need to increase the number of engaged foot care teams and their resources throughout the country. It is also desirable to continue education of both physicians and general diabetic population on the magnitude of the problem and on the suitable preventative measures. At the same time, more data on the prevalence and clinical manifestations of the diabetic foot in Greece should be carefully collected. Finally, additional research should investigate feasible ways of implementing current knowledge in everyday clinical practice. PMID:20011082
Weight optimization of ultra large space structures
NASA Technical Reports Server (NTRS)
Reinert, R. P.
1979-01-01
The paper describes the optimization of a solar power satellite structure for minimum mass and system cost. The solar power satellite is an ultra large low frequency and lightly damped space structure; derivation of its structural design requirements required accommodation of gravity gradient torques which impose primary loads, life up to 100 years in the rigorous geosynchronous orbit radiation environment, and prevention of continuous wave motion in a solar array blanket suspended from a huge, lightly damped structure subject to periodic excitations. The satellite structural design required a parametric study of structural configurations and consideration of the fabrication and assembly techniques, which resulted in a final structure which met all requirements at a structural mass fraction of 10%.
Phytoplankton bloom off Iceland
2014-08-13
A massive phytoplankton bloom stained the waters of the Atlantic Ocean north of Iceland with brilliant jewel tones in late summer, 2014. The Moderate Resolution Imaging Spectroradiometer (MODIS) aboard NASA's Aqua satellite captured this true-color image on August 2. Huge colonies of the floating, plant-like organisms create swirls of green, teal and turquoise and cover over 80% of the visible ocean off the northeast coast of Iceland. Marine phytoplankton require just the right amount of sunlight, dissolved nutrients and water temperatures which are not too hot, nor too cold to spark explosive reproduction and result in blooms which can cover hundreds of square kilometers. Phytoplankton form the base of the marine food chain, and are a rich food source for zooplankton, fish and other marine species. Some species, however, can deplete the water of oxygen and may become toxic to marine life. Credit: NASA/GSFC/Jeff Schmaltz/MODIS Land Rapid Response Team
n-type thermoelectric material Mg2Sn0.75Ge0.25 for high power generation
Liu, Weishu; Kim, Hee Seok; Chen, Shuo; Jie, Qing; Lv, Bing; Yao, Mengliang; Ren, Zhensong; Opeil, Cyril P.; Wilson, Stephen; Chu, Ching-Wu; Ren, Zhifeng
2015-01-01
Thermoelectric power generation is one of the most promising techniques to use the huge amount of waste heat and solar energy. Traditionally, a high thermoelectric figure of merit, ZT, has been the only parameter pursued for high conversion efficiency. Here, we emphasize that a high power factor (PF) is equally important for high power generation, in addition to high efficiency. A new n-type Mg2Sn-based material, Mg2Sn0.75Ge0.25, is a good example that meets the dual requirements of efficiency and output power. It was found that Mg2Sn0.75Ge0.25 has an average ZT of 0.9 and PF of 52 μW⋅cm⁻¹⋅K⁻² over the temperature range of 25–450 °C, a peak ZT of 1.4 at 450 °C, and a peak PF of 55 μW⋅cm⁻¹⋅K⁻² at 350 °C. By using the energy balance of the one-dimensional heat flow equation, leg efficiency and output power were calculated with Th = 400 °C and Tc = 50 °C to be 10.5% and 6.6 W⋅cm⁻², respectively, under a temperature gradient of 150 °C⋅mm⁻¹. PMID:25733845
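For reference, the two figures of merit discussed above are defined in the standard way (general definitions, not specific to this paper):

\[
PF = S^{2}\sigma, \qquad ZT = \frac{S^{2}\sigma}{\kappa}\,T,
\]

where \(S\) is the Seebeck coefficient, \(\sigma\) the electrical conductivity, \(\kappa\) the thermal conductivity, and \(T\) the absolute temperature. A material can therefore combine a moderate ZT with a high power factor when \(S^{2}\sigma\) is large even if \(\kappa\) is not especially low, which is the point the authors emphasize for output power.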
Yu, Peiqiang; Xin, Hangshu; Ban, Yajing; Zhang, Xuewei
2014-05-07
Recent advances in biofuel and bio-oil processing technology require huge supplies of energy feedstocks for processing. Very recently, new carinata seeds have been developed as energy feedstocks for biofuel and bio-oil production. The processing results in a large amount of coproduct, carinata meal. To date, there has been no systematic study of the interactive association between biopolymers and biofunctions in carinata seed as an energy feedstock for biofuel and bioethanol processing and in its processing coproduct (carinata meal). Molecular spectroscopy with synchrotron and globar sources is a rapid and noninvasive analytical technique able to investigate molecular structure conformation in relation to biopolymer functions and bioavailability. To date, however, these techniques have seldom been used in biofuel and bioethanol processing in other research laboratories. This paper aims to provide research progress and updates on applying molecular spectroscopy to the energy feedstock (carinata seed) and coproduct (carinata meal) from biofuel and bioethanol processing, and to show how these molecular techniques can be used to study the interactive association between biopolymers and biofunctions in the energy feedstocks and their coproducts before and after biodegradation.
QoS-aware integrated fiber-wireless standard compliant architecture based on XGPON and EDCA
NASA Astrophysics Data System (ADS)
Kaur, Ravneet; Srivastava, Anand
2018-01-01
The converged Fiber-Wireless (FiWi) broadband access network proves to be a promising candidate that is reliable, robust, cost-efficient, ubiquitous and capable of providing huge amounts of bandwidth. To meet the ever-increasing bandwidth requirements, it has become crucial to investigate the performance issues that arise with the deployment of the next-generation Passive Optical Network (PON) and its integration with various wireless technologies. Apart from providing high-speed internet access for mass use, this combined architecture aims to enable delivery of high-quality and effective e-services in different categories including health, education, finance, banking, agriculture and e-government. In this work, we present an integrated architecture of the 10-Gigabit-capable PON (XG-PON) and Enhanced Distributed Channel Access (EDCA) that combines the benefits of both technologies to meet the QoS demands of subscribers. Performance evaluation of the standards-compliant hybrid network is done using the discrete-event Network Simulator-3 (NS-3), and results are reported in terms of throughput, average delay, average packet loss rate and fairness index. Per-class throughput signifies the effectiveness of QoS distribution, whereas aggregate throughput indicates effective utilization of the wireless channel. To the best of our knowledge, this work has not been reported so far.
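The fairness index reported here is presumably Jain's index (the abstract does not name it, so this is an assumption). For per-station throughputs x_i it is J = (Σx_i)² / (n·Σx_i²), ranging from 1/n (one station takes everything) to 1 (perfectly even sharing). A minimal sketch:

```python
# Jain's fairness index for a set of per-station throughputs (assumed metric).
def jain_fairness(throughputs):
    n = len(throughputs)
    total = sum(throughputs)
    return total * total / (n * sum(x * x for x in throughputs))

print(jain_fairness([9.8, 10.1, 10.0, 9.9]))   # ~1.0: bandwidth shared evenly
print(jain_fairness([30.0, 5.0, 3.0, 2.0]))    # ~0.43: one station dominates
```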
PyGOLD: a python based API for docking based virtual screening workflow generation.
Patel, Hitesh; Brinkjost, Tobias; Koch, Oliver
2017-08-15
Molecular docking is one of the successful approaches in structure-based discovery and development of bioactive molecules in chemical biology and medicinal chemistry. Due to the huge amount of computational time that is still required, docking is often the last step in a virtual screening approach. Such screenings are set up as workflows spanning many steps, each aimed at a different filtering task. These workflows can be automated in large parts using Python-based toolkits, except for docking with the docking software GOLD. However, within an automated virtual screening workflow it is not feasible to use the GUI between every step to change the GOLD configuration file. Thus, a Python module called PyGOLD was developed to parse, edit and write the GOLD configuration file and to automate docking-based virtual screening workflows. The latest version of PyGOLD, its documentation and example scripts are available at: http://www.ccb.tu-dortmund.de/koch or http://www.agkoch.de. PyGOLD is implemented in Python and can be imported as a standard Python module without any further dependencies. oliver.koch@agkoch.de, oliver.koch@tu-dortmund.de. Supplementary data are available at Bioinformatics online. © The Author (2017). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com
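To illustrate the kind of automation such a module enables, the sketch below edits hypothetical "key = value" entries of a docking configuration file between workflow steps using plain Python. This is not the PyGOLD API; the file name and keys are made up for the example, and the configuration file is assumed to exist.

```python
# Plain-Python stand-in for configuration editing inside a screening workflow
# (hypothetical file and keys; PyGOLD itself exposes a proper parser/writer).
from pathlib import Path

def edit_config(path, updates):
    """Rewrite simple 'key = value' lines, leaving all other lines untouched."""
    lines = Path(path).read_text().splitlines()
    out = []
    for line in lines:
        key = line.split("=", 1)[0].strip() if "=" in line else None
        out.append(f"{key} = {updates[key]}" if key in updates else line)
    Path(path).write_text("\n".join(out) + "\n")

# Example: point each docking run at the ligand set surviving the previous filter.
for i, ligands in enumerate(["filtered_step1.mol2", "filtered_step2.mol2"], start=1):
    edit_config("docking.conf", {"ligand_file": ligands, "output_dir": f"run_{i}"})
    # ... launch the docking software with the edited configuration here ...
```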
Green survivability in Fiber-Wireless (FiWi) broadband access network
NASA Astrophysics Data System (ADS)
Liu, Yejun; Guo, Lei; Gong, Bo; Ma, Rui; Gong, Xiaoxue; Zhang, Lincong; Yang, Jiangzi
2012-03-01
The Fiber-Wireless (FiWi) broadband access network is a promising "last mile" access technology, because it integrates wireless and optical access technologies in terms of their respective merits, such as the high capacity and stable transmission of optical access and the easy deployment and flexibility of wireless access. Since FiWi is expected to carry a large amount of traffic, numerous traffic flows may be interrupted by the failure of network components. Thus, survivability in FiWi is a key issue for reliable and robust service. However, the redundant deployment of backup resources required for survivability usually causes huge energy consumption, which aggravates global warming and accelerates the coming energy crisis. Thus, the energy-saving issue should be considered in survivability design. In this paper, we focus on green survivability in FiWi, which is an innovative concept that, to the best of our knowledge, remains untouched in previous works. We first review and discuss some challenging issues concerning survivability and energy saving in FiWi, and then propose some instructive solutions for green survivability design. Our work will therefore provide technical references and research motivation for energy-efficient and survivable FiWi development in the future.
Bai, Zhiliang; Chen, Shili; Jia, Lecheng; Zeng, Zhoumo
2018-01-01
Embracing the fact that one can recover certain signals and images from far fewer measurements than traditional methods use, compressive sensing (CS) provides a solution to the huge data-collection burden in phased-array-based material characterization. This article describes how a CS framework can be utilized to effectively compress ultrasonic phased array images in the time and frequency domains. By projecting the image onto its Discrete Cosine Transform domain, a novel scheme was implemented to verify the potential of CS for data reduction, as well as to explore its reconstruction accuracy. The results from CIVA simulations indicate that both time- and frequency-domain CS can accurately reconstruct array images using fewer samples than the Nyquist theorem requires. For experimental verification on three types of artificial flaws, although a considerable data reduction can be achieved with defects clearly preserved, it is currently impossible to break the Nyquist limitation in the time domain. Fortunately, qualified recovery in the frequency domain does break it, which represents a real breakthrough for phased array image reconstruction. As a case study, the proposed CS procedure is applied to the inspection of an engine cylinder cavity containing different pit defects, and the results show that orthogonal matching pursuit (OMP)-based CS guarantees the performance required for real applications. PMID:29738452
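A minimal compressive-sensing sketch in the same spirit, using a synthetic DCT-sparse signal and a random measurement matrix rather than the paper's CIVA or experimental array data: far fewer random measurements than the signal length are taken, and OMP recovers the signal through its sparse DCT representation.

```python
# Synthetic CS demo: recover a DCT-sparse signal from m << n random measurements via OMP.
import numpy as np
from scipy.fft import idct
from sklearn.linear_model import OrthogonalMatchingPursuit

rng = np.random.default_rng(0)
n, m, k = 256, 64, 8                                 # signal length, measurements, sparsity

coeffs = np.zeros(n)                                 # k-sparse DCT coefficients
coeffs[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
Psi = idct(np.eye(n), axis=0, norm="ortho")          # DCT synthesis basis (columns)
x = Psi @ coeffs                                     # "time-domain" signal

Phi = rng.standard_normal((m, n)) / np.sqrt(m)       # random measurement matrix
y = Phi @ x                                          # compressed measurements

omp = OrthogonalMatchingPursuit(n_nonzero_coefs=k).fit(Phi @ Psi, y)
x_rec = Psi @ omp.coef_                              # back to the signal domain

print("relative reconstruction error:", np.linalg.norm(x - x_rec) / np.linalg.norm(x))
```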
Energy saving effect of desiccant ventilation system using Wakkanai siliceous shale
NASA Astrophysics Data System (ADS)
Nabeshima, Yuki; Togawa, Jun-ya; Nagano, Katsunori; Kazuyo, Tsuzuki
2017-10-01
The nuclear power station accident resulting from the Great East Japan Earthquake disaster has resulted in a constrained electricity supply. However, this Asian region has high temperatures and high humidity, so the dehumidification process requires a huge amount of energy. This is the reason for the increasing energy consumption in the residential and commercial sectors. Accordingly, a high-efficiency air-conditioning system needs to be developed. The desiccant ventilation system is effective in reducing energy consumption for the dehumidification process. Unlike a conventional air-conditioning system, it is capable of dehumidifying without dew condensation. We therefore focused on Wakkanai Siliceous Shale (WSS) as a desiccant material for developing a new desiccant ventilation system; it is a new type of low-priced, high-performance material. The aim of this study is to develop, by numerical calculation, a desiccant ventilation unit using a WSS rotor that can be regenerated at low temperature. The performance prediction of the desiccant unit indicates that it is possible to regenerate the WSS rotor at a low temperature of between 35 and 45 °C. In addition, we carried out actual measurements on the desiccant unit and the air-conditioning unit. This air-conditioning system was able to reduce input energy consumption by roughly 40%.
Food waste-to-energy conversion technologies: current status and future directions.
Pham, Thi Phuong Thuy; Kaushik, Rajni; Parshetti, Ganesh K; Mahmood, Russell; Balasubramanian, Rajasekhar
2015-04-01
Food waste represents a significant fraction of municipal solid waste. Proper management and recycling of huge volumes of food waste are required to reduce its environmental burden and to minimize risks to human health. Food waste is indeed an untapped resource with great potential for energy production. Utilization of food waste for energy conversion currently represents a challenge for various reasons. These include its inherently heterogeneous and variable composition, high moisture content and low calorific value, which constitute an impediment to the development of robust, large-scale, and efficient industrial processes. Although a considerable amount of research has been carried out on the conversion of food waste to renewable energy, there is a lack of comprehensive and systematic reviews of the published literature. The present review synthesizes the current knowledge on technologies for food-waste-to-energy conversion involving biological (e.g. anaerobic digestion and fermentation), thermal and thermochemical technologies (e.g. incineration, pyrolysis, gasification and hydrothermal oxidation). The competitive advantages of these technologies as well as the challenges associated with them are discussed. In addition, future directions for more effective utilization of food waste for renewable energy generation are suggested from an interdisciplinary perspective. Copyright © 2014 Elsevier Ltd. All rights reserved.
[Computing in medical practice].
Wechsler, Rudolf; Anção, Meide S; de Campos, Carlos José Reis; Sigulem, Daniel
2003-05-01
Currently, information technology is part of several aspects of our daily life. The objective of this paper is to analyze and discuss the use of information technology in both medical education and/or medical practice. Information was gathered through non-systematic bibliographic review, including articles, official regulations, book chapters and annals. Direct search and search of electronic databanks in Medline and Lilacs databases were also performed. This paper was structured in topics. First, there is a discussion on the electronic medical record. The following aspects are presented: history, functions, costs, benefits, ethical and legal issues, and positive and negative characteristics. Medical decision-support systems are also evaluated in view of the huge amount of information produced every year regarding healthcare. The impact of the Internet on the production and diffusion of knowledge is also analyzed. Telemedicine is assessed, since it presents new challenges to medical practice, and raises important ethical issues such as "virtual medical consultation." Finally, a practical experience of modernization of a pediatric outpatient center by the introduction of computers and telecommunication tools is described. Medical computing offers tools and instruments that support the administrative organization of medical visits, gather, store and process patient's data, generate diagnoses, provide therapeutical advice and access to information in order to improve medical knowledge and to make it available whenever and wherever adequate decision-making is required.
Stabilization of carbon dioxide and chromium slag via carbonation.
Wu, Xingxing; Yu, Binbin; Xu, Wei; Fan, Zheng; Wu, Zucheng; Zhang, Huimin
2017-08-01
As the main greenhouse gas, CO2 is considered a threat in the context of global warming. Many available technologies for reducing CO2 emissions concern CO2 separation from coal combustion and geological sequestration. However, cost-effective storage of CO2 has become a new challenge. Moreover, chromium pollution, the treatment of which requires huge energy consumption, has attracted widespread attention. This study aims to develop the sequestration of CO2 via chromium slag. A dynamic leaching experiment on chromium slag was designed to test the ability of chromium slag to adsorb CO2 and to release Cr(VI) for stabilization. The results showed that the accumulative amount of Cr(VI) released from the chromium slag was ca. 2.6 mg/g after 24 h of leaching. In addition, ca. 89 mg/g of CO2 was adsorbed using pure CO2 in the experiment at 12 h. Calcite is the only carbonate species in the post-carbonated slag, as analyzed by powder X-ray diffraction and thermal analysis. The approach demonstrates the feasibility of utilizing chromium slag and sequestering carbon dioxide at the same time at ordinary temperatures and pressures.
Motion analysis for duplicate frame removal in wireless capsule endoscope
NASA Astrophysics Data System (ADS)
Lee, Hyun-Gyu; Choi, Min-Kook; Lee, Sang-Chul
2011-03-01
Wireless capsule endoscopy (WCE) has been intensively researched recently due to its convenience for diagnosis and its extended detection coverage of some diseases. Typically, a full recording covering the entire human digestive system takes about 8 to 12 hours for a patient carrying a capsule endoscope and a portable image receiver/recorder unit, and produces 120,000 image frames on average. In spite of the benefits of close examination, WCE-based testing presents a barrier to quick diagnosis: a trained diagnostician must examine a huge number of images for close investigation, normally for over 2 hours. The main purpose of our work is to present a novel machine vision approach that reduces diagnosis time by automatically detecting duplicated recordings in the small intestine, caused by backward camera movement and typically containing redundant information. The developed technique could be integrated with a visualization tool that supports intelligent inspection methods, such as automatic play-speed control. Our experimental results show the high accuracy of the technique, detecting 989 duplicate image frames out of 10,000, equivalent to 9.9% data reduction, in a WCE video from a real human subject. With selected parameters, we achieved a correct detection ratio of 92.85% and a false detection ratio of 13.57%.
The Mexican Social Security counterreform: pensions for profit.
Laurell, A C
1999-01-01
The social security counterreform, initiated in 1997, forms part of the neoliberal reorganization of Mexican society. The reform implies a profound change in the guiding principles of social security, as the public model based on integrality, solidarity, and redistribution is replaced by a model based on private administration of funds and services, individualization of entitlement, and reduction of rights. Its economic purpose is to move social services and benefits into the direct sphere of private capital accumulation. Although these changes will involve the whole social security system--old-age and disability pensions, health care, child care, and workers' compensation--they are most immediately evident in the pension scheme. The pay-as-you-go scheme is being replaced by privately managed individual retirement accounts which especially favor the big financial groups. These groups are gaining control over huge amounts of capital, are authorized to charge a high commission, and run no financial risks. The privatization of the system requires decisive state intervention with a legal change and a sizable state subsidy (1 to 1.5 percent of GNP) over five decades. The supposed positive impact on economic growth and employment is uncertain. A review of the new law and of the estimates of future annuities reveals shrinking pension coverage and inadequate incomes from pensions.
Towards precision medicine: from quantitative imaging to radiomics
Acharya, U. Rajendra; Hagiwara, Yuki; Sudarshan, Vidya K.; Chan, Wai Yee; Ng, Kwan Hoong
2018-01-01
Radiology (imaging) and imaging-guided interventions, which provide multi-parametric morphologic and functional information, are playing an increasingly significant role in precision medicine. Radiologists are trained to understand imaging phenotypes, transcribe those observations to correlate them with underlying diseases, and characterize the images. However, in order to understand and characterize the molecular phenotype (to obtain genomic information) of solid heterogeneous tumours, advanced sequencing of those tissues using biopsy is required. Thus, radiologists image the tissues from various views and angles in order to capture the complete image phenotypes, thereby acquiring a huge amount of data. Deriving meaningful details from all these radiological data becomes challenging and raises big data issues. Therefore, interest in the application of radiomics has been growing in recent years, as it has the potential to provide significant interpretive and predictive information for decision support. Radiomics is a combination of conventional computer-aided diagnosis, deep learning methods, and human skills, and thus can be used for quantitative characterization of tumour phenotypes. This paper gives an overview of the radiomics workflow, the results of various radiomics-based studies conducted using various radiological images such as computed tomography (CT), magnetic resonance imaging (MRI), and positron-emission tomography (PET), the challenges we are facing, and the potential contribution of radiomics towards precision medicine. PMID:29308604
Phan, Philippe; Mezghani, Neila; Aubin, Carl-Éric; de Guise, Jacques A; Labelle, Hubert
2011-07-01
Adolescent idiopathic scoliosis (AIS) is a complex spinal deformity whose assessment and treatment present many challenges. Computer applications have been developed to assist clinicians. A literature review on computer applications used in AIS evaluation and treatment has been undertaken. The algorithms used, their accuracy and their clinical usability were analyzed. Computer applications have been used to create new classifications for AIS based on 2D and 3D features, assess scoliosis severity or risk of progression, and assist bracing and surgical treatment. It was found that classification accuracy could be improved using computer algorithms, that AIS patient follow-up and screening could be done using surface topography, thereby limiting radiation, and that bracing and surgical treatment could be optimized using simulations. Yet few computer applications are routinely used in clinics. With the development of 3D imaging and databases, huge amounts of clinical and geometrical data need to be taken into consideration when researching and managing AIS. Computer applications based on advanced algorithms will be able to handle tasks that could otherwise not be done, which may improve the management of AIS patients. Clinically oriented applications, and evidence that they can improve current care, will be required for their integration in the clinical setting.
GenoQuery: a new querying module for functional annotation in a genomic warehouse
Lemoine, Frédéric; Labedan, Bernard; Froidevaux, Christine
2008-01-01
Motivation: We have to cope with both a deluge of new genome sequences and a huge amount of data produced by high-throughput approaches used to exploit these genomic features. Crossing and comparing such heterogeneous and disparate data will help improving functional annotation of genomes. This requires designing elaborate integration systems such as warehouses for storing and querying these data. Results: We have designed a relational genomic warehouse with an original multi-layer architecture made of a databases layer and an entities layer. We describe a new querying module, GenoQuery, which is based on this architecture. We use the entities layer to define mixed queries. These mixed queries allow searching for instances of biological entities and their properties in the different databases, without specifying in which database they should be found. Accordingly, we further introduce the central notion of alternative queries. Such queries have the same meaning as the original mixed queries, while exploiting complementarities yielded by the various integrated databases of the warehouse. We explain how GenoQuery computes all the alternative queries of a given mixed query. We illustrate how useful this querying module is by means of a thorough example. Availability: http://www.lri.fr/~lemoine/GenoQuery/ Contact: chris@lri.fr, lemoine@lri.fr PMID:18586731
Biological evaluation of nanosilver incorporated cellulose pulp for hygiene products.
Kavitha Sankar, P C; Ramakrishnan, Reshmi; Rosemary, M J
2016-04-01
Cellulose pulp has a visible market share in personal hygiene products such as sanitary napkins and baby diapers. However, it offers a good surface for the growth of microorganisms. A huge amount of research is going into developing hygiene products that do not initiate microbial growth. The objective of the present work is to produce antibacterial cellulose pulp by depositing silver nanopowder on the cellulose fiber. The silver nanoparticles used were less than 100 nm in size and were characterised using transmission electron microscopy and X-ray powder diffraction studies. Antibacterial activity of the functionalized cellulose pulp was proved by the JIS L 1902 method. The in-vitro cytotoxicity, in-vivo vaginal irritation and intracutaneous reactivity studies were done with the silver-nanopowder-incorporated cellulose pulp with a view to introducing a new value-added product to the market. Cytotoxicity evaluation suggested that the silver nanoparticle incorporated cellulose pulp is non-cytotoxic. No irritation or skin sensitization was identified in animals tested with specific extracts prepared from the test material in the in-vivo experiments. The results indicated that the silver nanopowder incorporated cellulose pulp meets the requirements of the standard practices recommended for evaluating biological reactivity and has good biocompatibility, and hence can be classified as a safe hygiene product. Copyright © 2015 Elsevier B.V. All rights reserved.
Kundu, Anupam; Sabhapandit, Sanjib; Dhar, Abhishek
2011-03-01
We present an algorithm for finding the probabilities of rare events in nonequilibrium processes. The algorithm consists of evolving the system with a modified dynamics for which the required event occurs more frequently. By keeping track of the relative weight of phase-space trajectories generated by the modified and the original dynamics one can obtain the required probabilities. The algorithm is tested on two model systems of steady-state particle and heat transport where we find a huge improvement from direct simulation methods.
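The reweighting idea can be illustrated on a toy static example (estimating a Gaussian tail probability, rather than the authors' nonequilibrium particle or heat transport): sample from a modified distribution under which the rare event is common, and correct each sample by the ratio of the original to the modified probability density.

```python
# Toy illustration of trajectory reweighting (not the paper's nonequilibrium dynamics):
# estimate P(X > a) for a standard normal X by sampling from a shifted distribution
# where the event is frequent, then weighting each sample by the likelihood ratio.
import numpy as np
from scipy.stats import norm

a, shift, n = 5.0, 5.0, 100_000
rng = np.random.default_rng(1)

x = rng.normal(loc=shift, size=n)                  # "modified dynamics": event is common
weights = norm.pdf(x) / norm.pdf(x, loc=shift)     # relative weight of each sample
estimate = np.mean(weights * (x > a))

print(f"importance-sampling estimate: {estimate:.3e}")
print(f"exact tail probability:       {norm.sf(a):.3e}")
```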
GDRMS: a system for automatic extraction of the disease-centre relation
NASA Astrophysics Data System (ADS)
Yang, Ronggen; Zhang, Yue; Gong, Lejun
2012-01-01
With the rapid increase of biomedical literature, the deluge of new articles is leading to information overload. Extracting the available knowledge from the huge amount of biomedical literature has become a major challenge. GDRMS is developed as a tool that extracts relationships between diseases and genes, and between genes and genes, from the biomedical literature using text mining technology. It is a rule-based system which also provides disease-centred network visualization, constructs a disease-gene database, and offers a gene engine for understanding the function of a gene. The main focus of GDRMS is to provide the research community with a valuable opportunity to explore the relationship between disease and gene when studying the etiology of disease.
Mobile Clinical Decision Support Systems in Our Hands - Great Potential but also a Concern.
Masic, Izet; Begic, Edin
2016-01-01
Due to powerful computer resources as well as the availability of today's mobile devices, a special field of mobile systems for clinical decision support in medicine has developed. The benefits of these applications (systems) are: availability of the necessary hardware (mobile phones, tablets and phablets are widespread and can be purchased at a relatively affordable price), availability of mobile applications (free or for a "small" amount of money), and the fact that mobile applications are tailored for easy use and save clinicians' time in their daily work. These systems hold huge potential, and certainly great economic benefit, so this issue must be approached in a multidisciplinary manner.
High-speed wavelength-division multiplexing quantum key distribution system.
Yoshino, Ken-ichiro; Fujiwara, Mikio; Tanaka, Akihiro; Takahashi, Seigo; Nambu, Yoshihiro; Tomita, Akihisa; Miki, Shigehito; Yamashita, Taro; Wang, Zhen; Sasaki, Masahide; Tajima, Akio
2012-01-15
A high-speed quantum key distribution system was developed with the wavelength-division multiplexing (WDM) technique and dedicated key distillation hardware engines. Two interferometers for encoding and decoding are shared over eight wavelengths to reduce the system's size, cost, and control complexity. The key distillation engines can process a huge amount of data from the WDM channels by using a 1 Mbit block in real time. We demonstrated a three-channel WDM system that simultaneously uses avalanche photodiodes and superconducting single-photon detectors. We achieved 12 h continuous key generation with a secure key rate of 208 kilobits per second through a 45 km field fiber with 14.5 dB loss.
How to Select the most Relevant Roughness Parameters of a Surface: Methodology Research Strategy
NASA Astrophysics Data System (ADS)
Bobrovskij, I. N.
2018-01-01
This paper lays the foundations for a new methodology that addresses the conflict between the huge number of surface-texture parameters defined in new standards and the limited number of parameters that can actually be measured in practice, with the aim of reducing measurement complexity. At the moment, there is no single assessment of the importance of the parameters. Applying the presented methodology to the surfaces of aerospace-cluster components creates the necessary foundation to develop a scientific estimation of surface-texture parameters and to obtain material for investigators of the chosen technological procedure. The methods necessary for further work, the creation of a fundamental reserve, and the development of a scientific direction for assessing the significance of microgeometry parameters are selected.
Effect of hexagonal hillock on luminescence characteristic of multiple quantum wells structure
NASA Astrophysics Data System (ADS)
Du, Jinjuan; Xu, Shengrui; Li, Peixian; Zhang, Jincheng; Zhao, Ying; Peng, Ruoshi; Fan, Xiaomeng; Hao, Yue
2018-04-01
GaN-based ultraviolet multiple quantum well structures grown on a c-plane sapphire substrate by metal organic chemical vapor deposition showed a microstructure with a large number of huge hexagonal hillocks. The polarity of the sample was confirmed by etching with sodium hydroxide solution. The luminous intensity distribution of a typical hexagonal hillock was investigated by photoluminescence mapping, and the luminous intensity at hillock top regions was found to be 15 times higher than that of the regions around the hillocks. The reduction of dislocations, the weakening of the quantum-confined Stark effect caused by the semipolar plane, and the inclination of the sidewalls of the hexagonal hillock were responsible for the enhancement of luminous intensity.
[The application of metabonomics in modern studies of Chinese materia medica].
Chen, Hai-Bin; Zhou, Hong-Guang; Yu, Xiao-Yi
2012-06-01
Metabonomics, a newly developing discipline following genomics, transcriptomics, and proteomics, is an important constituent part of systems biology and is believed to be the final direction of systems biology. It can be directly applied to understanding physiological and biochemical states as a whole through the "metabolome profile". Therefore, it can provide a huge amount of information different from that originating from the other "omics". In the modernization of Chinese materia medica research, the application of metabonomics methods and technologies has broad potential for future development. In particular, it is of important theoretical significance and application value in holistic efficacy evaluation, active-ingredient studies, and safety research of Chinese materia medica.
Exoplanets: the quest for Earth twins.
Mayor, Michel; Udry, Stephane; Pepe, Francesco; Lovis, Christophe
2011-02-13
Today, more than 400 extra-solar planets have been discovered. They provide strong constraints on the structure and formation mechanisms of planetary systems. Despite this huge amount of data, we still have little information concerning the constraints for extra-terrestrial life, i.e. the frequency of Earth twins in the habitable zone and the distribution of their orbital eccentricities. On the other hand, these latter questions strongly excite general interest and trigger future searches for life in the Universe. The status of the extra-solar planets field--in particular with respect to very-low-mass planets--will be discussed and an outlook on the search for Earth twins will be given in this paper.
Arveti, Nagaraju; Reginald, S; Kumar, K Sunil; Harinath, V; Sreedhar, Y
2012-04-01
Termite mounds are abundant components of the Tummalapalle area of uranium mineralization in Cuddapah District, Andhra Pradesh, India. A systematic study has been carried out on the application of termite mound sampling to mineral exploration in this region. The distributions of the chemical elements Cu, Pb, Zn, Ni, Co, Cr, Li, Rb, Sr, Ba, and U were studied both in termite-mound soils and in adjacent surface soils. Uranium accumulations ranging from 10 to 36 ppm were noticed in seven termite mounds. A biogeochemical parameter called the "Biological Absorption Coefficient" of the termite mounds indicated that the termite-affected soils contained greater amounts of chemical elements than the adjacent soils.
Development of the prototype data management system of the solar H-alpha full disk observation
NASA Astrophysics Data System (ADS)
Wei, Ka-Ning; Zhao, Shi-Qing; Li, Qiong-Ying; Chen, Dong
2004-06-01
The Solar Chromospheric Telescope in Yunnan Observatory generates about 2 GB of FITS-format data per day. Such huge amounts of data are inconvenient to use directly; hence, data searching and sharing are important at present. Data searching, on-line browsing, remote access and download have been developed in a prototype data management system for the solar H-alpha full-disk observation, and improved with workflow technology. Based on the Windows XP operating system and the MySQL database management system, a prototype system using the browser/server model was developed with JAVA and JSP. Data compression, searching, browsing, authorized deletion, and real-time download have been achieved.
CMS tracker towards the HL-LHC
NASA Astrophysics Data System (ADS)
Alunni Solestizi, L.
2015-01-01
In view of the coming new LHC era (High Luminosity LHC), characterized by a jump forward in the precision frontier and in the event rate, all the CMS sub-detectors are developing and studying innovative strategies for triggering, pattern recognition, event timing, and so on. A crucial aspect will be the online event selection: a totally new paradigm is needed, given the huge number of events. In this picture the most granular and innermost sub-detector, the tracker, will play a decisive role. The phase-2 tracker will be involved in the L1 trigger and, taking advantage of both Associative Memories and FPGAs, it can ensure a trigger decision in due time and with satisfactory performance.
Constructing phylogenetic trees using interacting pathways.
Wan, Peng; Che, Dongsheng
2013-01-01
Phylogenetic trees are used to represent evolutionary relationships among biological species or organisms. The construction of phylogenetic trees is based on the similarities or differences of their physical or genetic features. Traditional approaches to constructing phylogenetic trees mainly focus on physical features. The recent advancement of high-throughput technologies has led to the accumulation of huge amounts of biological data, which in turn has changed the way biological studies are carried out in various aspects. In this paper, we report our approach to building phylogenetic trees using information on interacting pathways. We have applied hierarchical clustering to two domains of organisms: eukaryotes and prokaryotes. Our preliminary results show the effectiveness of using interacting pathways in revealing evolutionary relationships.
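A minimal sketch of the clustering step under assumed inputs (a hand-made binary organism-by-pathway matrix with hypothetical taxa; the paper builds its profiles from real interacting-pathway data): pairwise Jaccard distances between pathway profiles feed an average-linkage hierarchical clustering whose dendrogram is read as the tree.

```python
# Toy organism-by-pathway presence/absence matrix clustered into a tree.
import numpy as np
from scipy.spatial.distance import pdist
from scipy.cluster.hierarchy import linkage, dendrogram

organisms = ["org_A", "org_B", "org_C", "org_D"]      # hypothetical taxa
pathways = np.array([[1, 1, 0, 1, 0],                  # rows: organisms
                     [1, 1, 0, 1, 1],                  # cols: pathway present?
                     [0, 1, 1, 0, 1],
                     [0, 1, 1, 0, 0]])

dist = pdist(pathways, metric="jaccard")      # dissimilarity of pathway profiles
tree = linkage(dist, method="average")        # UPGMA-style hierarchical clustering

leaves = dendrogram(tree, labels=organisms, no_plot=True)["ivl"]
print("tree leaf order:", leaves)             # org_A/org_B pair off, as do org_C/org_D
```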
Thirty years of critical care medicine
2010-01-01
Critical care medicine is a relatively young but rapidly evolving specialty. On the occasion of the 30th International Symposium on Intensive Care and Emergency Medicine, we put together some thoughts from a few of the leaders in critical care who have been actively involved in this field over the years. Looking back over the last 30 years, we reflect on areas in which, despite large amounts of research and technological and scientific advances, no major therapeutic breakthroughs have been made. We then look at the process of care and realize that, here, huge progress has been made. Lastly, we suggest how critical care medicine will continue to evolve for the better over the next 30 years. PMID:20550727
A remote condition monitoring system for wind-turbine based DG systems
NASA Astrophysics Data System (ADS)
Ma, X.; Wang, G.; Cross, P.; Zhang, X.
2012-05-01
In this paper, a remote condition monitoring system is proposed, which fundamentally consists of real-time monitoring modules on the plant side, a remote support centre and the communications between them. The paper addresses some of the key issues related to the monitoring system, including i) the implementation and configuration of a VPN connection, ii) an effective database system able to handle huge amounts of monitoring data, and iii) efficient data mining techniques to convert raw data into useful information for plant assessment. The preliminary results have demonstrated that the proposed system is practically feasible and can be deployed to monitor emerging new energy generation systems.
[The health problems which can brougth by 3G cell phones to our country].
Enöz, Murat
2009-01-01
At present, we are being exposed to electromagnetic pollution which is steadily increasing in parallel with technological advancements and which is invisible and unnoticeable in the short run. Electromagnetic waves which were previously used for therapeutic reasons have recently been used uncontrollably in daily life. With the widespread use of third-generation (3G) cellular phones, electromagnetic pollution has multiplied and brought a huge number of health hazards to our country. In this article, electromagnetic pollution, which is a comprehensive topic, and the problems related to this kind of pollution, which is rapidly increasing due to the recent wide use of 3G cell phones, are summarized in the light of the literature.
Huge ascending aortic aneurysm with an intraluminal thrombus in an embolic event-free patient
Parato, Vito Maurizio; Pezzuoli, Franco; Labanti, Benedetto; Baboci, Arben
2015-01-01
We present a case of an 87-year-old male patient with a huge ascending aortic aneurysm, filled by a huge thrombus most probably due to previous dissection. This finding was detected by two-dimensional transthoracic echocardiography and contrast-enhanced computed tomography (CT) angiography scan. The patient refused surgical treatment and was medically treated. Despite the huge and mobile intraluminal thrombus, the patient remained embolic event-free up to 6 years later, and this makes the case unique. PMID:25838924
ERIC Educational Resources Information Center
Skilbeck, Malcolm; Connell, Helen
2004-01-01
During the next decade, the teaching profession in Australia will be transformed. Due mainly to age related retirements there will be a massive turnover and a huge influx of new entrants. At the same time, it can be expected that there will be more exacting requirements and expectations of teachers as new professional standards are set to meet the…
What Will They Learn? A Survey of Core Requirements at Our Nation's Colleges and Universities
ERIC Educational Resources Information Center
Bako, Tom; Kempson, Lauri; Lakemacher, Heather; Markley, Eric
2010-01-01
The crisis in higher education is about more than money. It is about what one has been paying for, paying for dearly. The public, even in these hard times, supports higher education with its tax dollars. And families make huge sacrifices to send their sons and daughters to college. They deserve in return higher education that provides real…
Little ice bodies, huge ice lands, and the up-going of the big water body
NASA Astrophysics Data System (ADS)
Ultee, E.; Bassis, J. N.
2017-12-01
Ice moving out of the huge ice lands causes the big water body to go up. That can cause bad things to happen in places close to the big water body - the land might even disappear! If that happens, people living close to the big water body might lose their homes. Knowing how much ice will come out of the huge ice lands, and when, can help the world plan for the up-going of the big water body. We study the huge ice land closest to us. All around the edge of that huge ice land, there are smaller ice bodies that control how much ice makes it into the big water body. Most ways of studying the huge ice land with computers struggle to tell the computer about those little ice bodies, but we have found a new way. We will talk about our way of studying little ice bodies and how their moving brings about up-going of the big water.
Avcioglu, Nermin Hande; Sahal, Gulcan; Bilkay, Isil Seyis
2016-01-01
Background: Microbial cells growing in biofilms play a huge role in the spread of antimicrobial resistance. In this study, biofilm formation by Klebsiella strains belonging to 3 different Klebsiella species (K. ornithinolytica, K. oxytoca and K. terrigena), the effect of co-occurrence on the amount of biofilm formed, and the anti-biofilm effects of Citrus limon and Zingiber officinale essential oils on biofilm formation by the highest biofilm-forming K. ornithinolytica, K. oxytoca and K. terrigena strains were determined. Materials and Methods: The anti-biofilm effects of Citrus limon and Zingiber officinale essential oils on biofilm formation by the highest biofilm-forming K. ornithinolytica, K. oxytoca and K. terrigena strains were investigated. Results: 57% of K. ornithinolytica strains and 50% of K. oxytoca strains were found to be Strong Biofilm Forming (SBF); there was no SBF strain among the K. terrigena species. In addition, urine and sperm were found to be the most frequent clinical materials for strong biofilm-forming K. ornithinolytica and K. oxytoca isolations, respectively (63% and 100%). Secondly, all K. ornithinolytica strains isolated from the surgical intensive care unit and all K. oxytoca strains isolated from urology service units were found to be SBF. Apart from this, although the amounts of biofilm formed by the co-occurrence of K. ornithinolytica - K. oxytoca and of K. oxytoca - K. terrigena were greater than the amounts of biofilm formed by each strain separately, the biofilm formed by the co-occurrence of K. ornithinolytica - K. terrigena was lower than that formed by K. ornithinolytica alone but higher than that formed by K. terrigena alone. Conclusion: The anti-biofilm effects of Citrus limonum and Zingiber officinale essential oils could be used against biofilm-related Klebsiella-acquired infections. PMID:28480361
Object segmentation using graph cuts and active contours in a pyramidal framework
NASA Astrophysics Data System (ADS)
Subudhi, Priyambada; Mukhopadhyay, Susanta
2018-03-01
Graph cuts and active contours are two very popular interactive object segmentation techniques in the field of computer vision and image processing. However, both approaches have their own well-known limitations. Graph cut methods perform efficiently and give globally optimal segmentation results for smaller images. For larger images, however, huge graphs need to be constructed, which not only take an unacceptable amount of memory but also greatly increase the time required for segmentation. On the other hand, in the case of active contours, initial contour selection plays an important role in the accuracy of the segmentation, so a proper selection of the initial contour may improve both the complexity and the accuracy of the result. In this paper, we combine these two approaches to overcome their above-mentioned drawbacks and develop a fast technique for object segmentation. We use a pyramidal framework and apply the mincut/maxflow algorithm on the lowest-resolution image with the least number of seed points possible, which is very fast due to the smaller size of the image. The obtained segmentation contour is then super-sampled and used as the initial contour for the next higher-resolution image. As this initial contour is very close to the actual contour, fewer iterations are required for the contour to converge. The process is repeated for all the higher-resolution images, and experimental results show that our approach is faster as well as more memory efficient than either graph cut or active contour segmentation alone.
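A coarse-to-fine sketch of the pyramidal idea, with an assumed stand-in: the interactive mincut/maxflow step at the coarsest level is replaced here by a hand-placed circle, since a seeded graph cut cannot be reproduced in a few lines. The contour obtained at each level is super-sampled and refined by an active contour at the next finer level.

```python
# Coarse-to-fine contour refinement over a Gaussian pyramid (illustrative stand-in).
import numpy as np
from skimage import data
from skimage.filters import gaussian
from skimage.segmentation import active_contour
from skimage.transform import pyramid_gaussian

image = data.coins().astype(float)
pyramid = list(pyramid_gaussian(image, max_layer=2, downscale=2))   # fine -> coarse

# Stand-in for the coarse-level mincut/maxflow result: a small circle around one coin.
t = np.linspace(0, 2 * np.pi, 100)
snake = np.column_stack([20 + 12 * np.sin(t), 20 + 12 * np.cos(t)])  # (row, col) coords

for level in reversed(range(len(pyramid))):          # start at the coarsest level
    img = gaussian(pyramid[level], 2)
    snake = active_contour(img, snake, alpha=0.015, beta=10, gamma=0.001)
    if level > 0:
        snake = snake * 2.0                          # super-sample for the finer level

print("contour points on the full-resolution image:", snake.shape)
```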
[Psychiatrist burnout or psychiatric assistance burnout?
Manna, Vincenzo; Dicuonzo, Francesca
2018-01-01
In recent years, mature industrial countries have been rapidly changing from production economies to service economies. In this new socio-economic context, particular attention has been paid to mental health problems in the workplace. The risk of burnout is significantly higher for certain occupations, in particular for health workers. Doctors, and psychiatrists in particular, quite frequently have to make quick decisions while dealing with a huge number of requests, which often require considerable assumption of responsibility. In Italy, the process of corporatization and regionalization of the National Health Service has oriented clinical practice in psychiatry towards the rationalization and optimization of available resources, to ensure appropriateness and fairness of care. The challenge that will soon be faced in health policy, with the progressive aging of the population, will be the growing burden of chronicity in a context of limited resources, which will necessarily require a managerial approach to structuring and delivering services. The management of change in psychiatric care in Italy today cannot be separated from a deep, motivating involvement (engagement) of professionals. In other words, in the effort to contain expenditure and rationalize care processes, it is desirable to shift from the burnout to the engagement of psychiatrists, investing economic and human resources in mental health services. In this review, through a selective search of the relevant 2010-2017 literature conducted on PubMed (key words: stress, burnout, psychiatry, mental health), information from original articles, reviews and book chapters on the presence of burnout syndrome among psychiatrists was analyzed and summarized. This article examines the concept of burnout, its causes, and the most appropriate preventive and therapeutic interventions applicable to psychiatrists.
NASA Astrophysics Data System (ADS)
Ramos, Antonio L. L.; Shao, Zhili; Holthe, Aleksander; Sandli, Mathias F.
2017-05-01
The introduction of System-on-Chip (SoC) technology has brought exciting new opportunities for the development of smart, low-cost embedded systems spanning a wide range of applications. Currently available SoC devices are capable of performing high-speed digital signal processing tasks in software while featuring relatively low development costs and reduced time-to-market. Unmanned aerial vehicles (UAVs) are an application example that has shown tremendous potential in an increasing number of scenarios, ranging from leisure to surveillance as well as search and rescue missions. Video capture from UAV platforms is a relatively straightforward task that requires almost no preprocessing. However, that does not apply to audio signals, especially in cases where the data are to be used to support real-time decision making. In fact, the enormous amount of acoustic interference from the surroundings, including the noise from the UAV's propellers, becomes a huge problem. This paper discusses a real-time implementation of the NLMS adaptive filtering algorithm applied to enhancing acoustic signals captured from UAV platforms. The model relies on a combination of acoustic sensors and a computationally inexpensive algorithm running on a digital signal processor. Given its simplicity, this solution can be incorporated into the main processing system of a UAV using SoC technology and run concurrently with other required tasks, such as flight control and communications. Simulations and real-time DSP-based implementations have shown significant signal enhancement by efficiently mitigating the interference from the noise generated by the UAV's propellers as well as from other external noise sources.
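A minimal NLMS noise-cancellation sketch under an assumed single-reference setup (not the authors' multi-sensor DSP implementation): a reference microphone near the propellers supplies the noise, and NLMS adapts FIR weights so that the filtered reference cancels the propeller component picked up by the primary microphone.

```python
# Normalized LMS adaptive noise canceller with a synthetic "propeller noise" demo.
import numpy as np

def nlms(primary, reference, n_taps=32, mu=0.5, eps=1e-6):
    """Return the error signal e = primary - w*reference, i.e. the enhanced signal."""
    w = np.zeros(n_taps)
    e = np.zeros(len(primary))
    for n in range(n_taps, len(primary)):
        x = reference[n - n_taps:n][::-1]        # most recent reference samples
        y = w @ x                                # current estimate of the noise
        e[n] = primary[n] - y                    # cancellation residual
        w += (mu / (eps + x @ x)) * e[n] * x     # normalized LMS weight update
    return e

rng = np.random.default_rng(0)
fs = 8000
t = np.arange(2 * fs) / fs
noise_ref = rng.standard_normal(len(t))                          # reference sensor
noise_at_primary = np.convolve(noise_ref, [0.6, 0.3, 0.1])[:len(t)]  # noise path
speech = 0.5 * np.sin(2 * np.pi * 440 * t)                       # stand-in "speech"
enhanced = nlms(speech + noise_at_primary, noise_ref)
print("residual noise power:", np.mean((enhanced[fs:] - speech[fs:]) ** 2))
```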
Comparison of Reconstruction and Control algorithms on the ESO end-to-end simulator OCTOPUS
NASA Astrophysics Data System (ADS)
Montilla, I.; Béchet, C.; Lelouarn, M.; Correia, C.; Tallon, M.; Reyes, M.; Thiébaut, É.
Extremely Large Telescopes are very challenging with regard to their Adaptive Optics requirements. Their diameters, the specifications demanded by the science for which they are being designed, and the planned use of Extreme Adaptive Optics systems imply a huge increase in the number of degrees of freedom of the deformable mirrors. It is necessary to study new reconstruction algorithms in order to implement real-time control in Adaptive Optics at the required speed. We have studied the performance, applied to the case of the European ELT, of three different algorithms: the matrix-vector multiplication (MVM) algorithm, considered as a reference; the Fractal Iterative Method (FrIM); and the Fourier Transform Reconstructor (FTR). The algorithms have been tested on ESO's OCTOPUS software, which simulates the atmosphere, the deformable mirror, the sensor and the closed-loop control. MVM is the default reconstruction and control method implemented in OCTOPUS, but it scales as O(N²) operations per loop, so it is not considered a fast algorithm for wavefront reconstruction and control on an Extremely Large Telescope. The two other methods are the fast algorithms studied in the E-ELT Design Study. Their performance, as well as their response in the presence of noise and under various atmospheric conditions, has been compared using a Single Conjugate Adaptive Optics configuration for a 42 m diameter ELT with a total of 5402 actuators. These comparisons made on a common simulator allow us to highlight the pros and cons of the various methods, and give us a better understanding of the type of reconstruction algorithm that an ELT demands.
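A minimal sketch of the reference MVM reconstructor at toy sizes (not the 5402-actuator E-ELT case, and with a random stand-in for the calibrated interaction matrix): actuator commands are obtained by applying a precomputed command matrix, the pseudo-inverse of the interaction matrix, to the wavefront-sensor slope vector, so the per-loop cost is a single O(N²) matrix-vector product.

```python
# Matrix-vector-multiplication wavefront reconstruction with synthetic data.
import numpy as np

rng = np.random.default_rng(0)
n_slopes, n_actuators = 800, 400                      # toy system dimensions

D = rng.standard_normal((n_slopes, n_actuators))      # stand-in interaction matrix
R = np.linalg.pinv(D)                                 # command matrix, computed offline

true_commands = rng.standard_normal(n_actuators)
slopes = D @ true_commands + 0.01 * rng.standard_normal(n_slopes)   # noisy WFS data

commands = R @ slopes                                 # one MVM per control-loop step
print("reconstruction error:", np.linalg.norm(commands - true_commands))
```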
Hyperspectral image compressing using wavelet-based method
NASA Astrophysics Data System (ADS)
Yu, Hui; Zhang, Zhi-jie; Lei, Bo; Wang, Chen-sheng
2017-10-01
Hyperspectral imaging sensors can acquire images in hundreds of continuous narrow spectral bands, so each object present in the image can be identified from its spectral response. However, this kind of imaging produces a huge amount of data, which requires transmission, processing, and storage resources for both airborne and spaceborne imaging. Due to the high volume of hyperspectral image data, the exploration of compression strategies has received a lot of attention in recent years, and compression of hyperspectral data cubes is an effective solution to these problems. Lossless compression of hyperspectral data usually results in a low compression ratio, which may not meet the available resources; on the other hand, lossy compression may give the desired ratio, but with a significant degradation of the object identification performance of the hyperspectral data. Moreover, most hyperspectral data compression techniques exploit the similarities in the spectral dimension, which requires band reordering or regrouping to make use of the spectral redundancy. In this paper, we explore the spectral cross-correlation between different bands and propose an adaptive band selection method to obtain the spectral bands that contain most of the information of the acquired hyperspectral data cube. The proposed method mainly consists of three steps. First, the algorithm decomposes the original hyperspectral imagery into a series of subspaces based on the correlation matrix of the hyperspectral images between different bands. Then the wavelet-based algorithm is applied to each subspace. Finally, the PCA method is applied to the wavelet coefficients to produce the chosen number of components. The performance of the proposed method was tested using the ISODATA classification method.
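A sketch of the three-step scheme under strong simplifications (a synthetic cube and a naive threshold on adjacent-band correlation; the paper's adaptive band selection and coefficient handling are more elaborate):

```python
# Band grouping -> 2-D wavelet transform -> PCA on the wavelet-coefficient vectors.
import numpy as np
import pywt
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
cube = rng.standard_normal((64, 64, 100))              # rows x cols x bands (synthetic)

# Step 1: band-to-band correlation matrix -> split bands into contiguous subspaces.
bands = cube.reshape(-1, cube.shape[2])                # pixels x bands
adjacent = np.diag(np.corrcoef(bands, rowvar=False), k=1)
breaks = np.where(adjacent < 0.9)[0] + 1               # cut where correlation drops
subspaces = np.split(np.arange(cube.shape[2]), breaks)

# Step 2: 2-D wavelet decomposition of a representative image of each subspace.
features = []
for group in subspaces:
    rep = cube[:, :, group].mean(axis=2)
    arr, _ = pywt.coeffs_to_array(pywt.wavedec2(rep, "db4", level=2))
    features.append(arr.ravel())

# Step 3: PCA over the coefficient vectors keeps the chosen number of components.
pca = PCA(n_components=min(5, len(features)))
components = pca.fit_transform(np.vstack(features))
print("subspaces:", len(subspaces), "compressed representation:", components.shape)
```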
Extracting biomedical events from pairs of text entities
2015-01-01
Background: Huge amounts of electronic biomedical documents, such as molecular biology reports or genomic papers, are generated daily. Nowadays, these documents are mainly available in the form of unstructured free texts, which require heavy processing for their registration into organized databases. This organization is instrumental for information retrieval, enabling the advanced queries of researchers and practitioners in biology, medicine, and related fields to be answered. Hence, the massive data flow calls for efficient automatic methods of text-mining that extract high-level information, such as biomedical events, from biomedical text. The usual computational tools of Natural Language Processing cannot be readily applied to extract these biomedical events, due to the peculiarities of the domain. Indeed, biomedical documents contain highly domain-specific jargon and syntax. These documents also describe distinctive dependencies, making text-mining in molecular biology a specific discipline. Results: We address biomedical event extraction as the classification of pairs of text entities into the classes corresponding to event types. The candidate pairs of text entities are recursively provided to a multiclass classifier relying on Support Vector Machines. This recursive process extracts events involving other events as arguments. Compared to joint models based on Markov Random Fields, our model simplifies inference and hence requires shorter training and prediction times along with lower memory capacity. Compared to usual pipeline approaches, our model passes over a complex intermediate problem, while making a more extensive usage of sophisticated joint features between text entities. Our method focuses on the core event extraction of the Genia task of the BioNLP challenges, yielding the best result reported so far on the 2013 edition. PMID:26201478
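A toy sketch of the pair-classification idea with hand-made features and labels (the paper uses rich joint features, a recursive treatment of nested events, and real BioNLP annotations):

```python
# Classify (trigger, argument) candidate pairs into event types with a multiclass SVM.
from sklearn.svm import LinearSVC
from sklearn.feature_extraction import DictVectorizer

# Each candidate pair gets a feature dict and an event-type label; "None" means
# the pair does not form an event. All examples below are invented for illustration.
pairs = [
    {"trigger": "expression", "arg_type": "Protein", "dep_path": "nmod"},
    {"trigger": "regulates",  "arg_type": "Protein", "dep_path": "dobj"},
    {"trigger": "binds",      "arg_type": "Protein", "dep_path": "nsubj"},
    {"trigger": "cells",      "arg_type": "Protein", "dep_path": "amod"},
]
labels = ["Gene_expression", "Regulation", "Binding", "None"]

vec = DictVectorizer()
X = vec.fit_transform(pairs)
clf = LinearSVC().fit(X, labels)                 # one-vs-rest multiclass SVM

new_pair = {"trigger": "expression", "arg_type": "Protein", "dep_path": "dobj"}
print(clf.predict(vec.transform([new_pair])))
```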
Fast generation of computer-generated hologram by graphics processing unit
NASA Astrophysics Data System (ADS)
Matsuda, Sho; Fujii, Tomohiko; Yamaguchi, Takeshi; Yoshikawa, Hiroshi
2009-02-01
A cylindrical hologram is well known to be viewable over 360 degrees. This hologram depends on high pixel resolution, so a Computer-Generated Cylindrical Hologram (CGCH) requires a huge amount of calculation. In our previous research, we used a look-up table method for fast calculation on an Intel Pentium 4 at 2.8 GHz. It took 480 hours to calculate a high-resolution CGCH (504,000 x 63,000 pixels, with an average of 27,000 object points). To improve the quality of the CGCH reconstructed image, the fringe pattern requires higher spatial frequency and resolution; therefore, to increase the calculation speed, we have to change the calculation method. In this paper, to reduce the calculation time of a CGCH (912,000 x 108,000 pixels), we employ a Graphics Processing Unit (GPU). It took 4,406 hours to calculate the high-resolution CGCH on a Xeon 3.4 GHz. Since a GPU has many streaming processors and a parallel processing structure, it works as a high-performance parallel processor. In addition, a GPU gives maximum performance on 2-dimensional data and streaming data. Recently, GPUs can also be utilized for general-purpose computation (GPGPU). For example, NVIDIA's GeForce 7 series became a programmable processor with the Cg programming language, and the next GeForce 8 series has CUDA as a software development kit made by NVIDIA. Theoretically, the calculation ability of the GPU is announced as 500 GFLOPS. From the experimental results, we have achieved a calculation 47 times faster than our previous work which used a CPU. Therefore, the CGCH can be generated in 95 hours, and the total time to calculate and print the CGCH is 110 hours.
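For orientation, a minimal point-source fringe computation for a small planar hologram patch is sketched below (toy size, CPU/NumPy only; the paper computes a 912,000 x 108,000-pixel cylindrical hologram on a GPU, and the wavelength, pixel pitch and object points here are assumed for illustration). The fringe is the recorded intensity of the interference between the object field, a superposition of spherical waves from the object points, and a tilted plane reference wave.

```python
# Point-source CGH fringe for a tiny planar patch (illustrative parameters).
import numpy as np

wavelength = 633e-9                       # assumed He-Ne wavelength [m]
k = 2 * np.pi / wavelength
pitch = 1e-6                              # assumed hologram pixel pitch [m]

ny, nx = 512, 512
ys, xs = np.meshgrid((np.arange(ny) - ny / 2) * pitch,
                     (np.arange(nx) - nx / 2) * pitch, indexing="ij")

points = [(0.0, 0.0, 0.05, 1.0), (1e-4, -5e-5, 0.06, 0.8)]   # (x, y, z, amplitude)

field = np.zeros((ny, nx), dtype=complex)
for x0, y0, z0, a in points:
    r = np.sqrt((xs - x0) ** 2 + (ys - y0) ** 2 + z0 ** 2)
    field += a * np.exp(1j * k * r) / r                      # spherical wave per point

reference = np.exp(1j * k * xs * np.sin(np.deg2rad(1.0)))    # tilted plane reference
fringe = np.abs(field + reference) ** 2                      # recorded intensity pattern
print("fringe pattern:", fringe.shape)
```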
Medical management of the consequences of the Fukushima nuclear power plant incident.
Hachiya, Misao; Tominaga, Takako; Tatsuzaki, Hideo; Akashi, Makoto
2014-02-01
A huge earthquake struck the northeast coast of the main island of Japan on March 11, 2011, triggering a tsunami with 14-15 meter-high waves hitting the area. The earthquake was followed by numerous sustained aftershocks. The earthquake affected the nuclear power plant (NPP) in Fukushima prefecture, resulting in large amounts of radioactive materials being released into the environment. The major nuclides released on land were ¹³¹I, ¹³⁴Cs, and ¹³⁷Cs. Therefore, almost 170,000 people had to be evacuated or stay indoors. Besides the NPP and the telecommunications system, the earthquake also affected infrastructures such as the supplies of water and electricity as well as the radiation monitoring system. The local hospital system was dysfunctional; hospitals designated as radiation-emergency facilities were not able to function because of damage from the earthquake and tsunami, and some of them were located within a 20 km radius of the NPP, the designated evacuation zone. Local fire department personnel were also asked to evacuate. Furthermore, the affected hospitals had not established their evacuation plans at that time. We have learned from this "combined disaster" that the potential for damage to lifelines as well as the monitoring systems for radiation in case of an earthquake requires our intense focus and vigilance, and that hospitals need comprehensive plans for evacuation, including patients requiring life support equipment during and after a nuclear disaster. There is an urgent need for a "combined disaster" strategy, and this should be emphasized in current disaster planning and response. © 2013 Wiley Periodicals, Inc.
NASA Astrophysics Data System (ADS)
Befort, Daniel J.; Kruschke, Tim; Leckebusch, Gregor C.
2017-04-01
Tropical cyclones over East Asia have huge socio-economic impacts due to their strong wind fields and large rainfall amounts. The most severe events in particular are associated with huge economic losses; e.g., Typhoon Herb in 1996 is related to overall losses exceeding 5 billion USD (Munich Re, 2016). In this study, an objective tracking algorithm is applied to JRA55 reanalysis data from 1979 to 2014 over the Western North Pacific. For this purpose, a purely wind-based algorithm, formerly used to identify extra-tropical wind storms, has been further developed. The algorithm is based on the exceedance of the local 98th percentile to define strong wind fields in gridded climate data. To be detected as a tropical cyclone candidate, the following criteria must be fulfilled: 1) the wind storm must exist for at least eight 6-hourly time steps and 2) the wind field must exceed a minimum size of 130,000 km2 at each time step. The use of wind information is motivated by the focus on damage-related events; however, a pre-selection based on the affected region is necessary to remove events of extra-tropical nature. Using IBTrACS Best Tracks for validation, it is found that about 62% of all detected tropical cyclone events in the JRA55 reanalysis can be matched to an observed best track. As expected, the fraction of matched tracks increases with the wind intensity of the event, with a hit rate of about 98% for Violent Typhoons, above 90% for Very Strong Typhoons and about 75% for Typhoons. Overall these results are encouraging, as the parameters used to detect tropical cyclones in JRA55 (e.g., minimum area) are also suitable for detecting TCs in most CMIP5 simulations and will thus allow estimates of potential future changes.
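The two selection criteria quoted above (persistence over at least eight 6-hourly steps and a minimum wind-field size at every step) can be sketched as follows; the data structure and constant names are assumptions for illustration, not the authors' implementation.

```python
from dataclasses import dataclass
from typing import List

# Illustrative sketch of the event-selection criteria described above
# (>= 8 six-hourly steps, wind-field area >= 130,000 km^2 at every step).
MIN_STEPS = 8                 # eight 6-hourly time steps (two days)
MIN_AREA_KM2 = 130_000.0      # minimum wind-field size per time step

@dataclass
class WindStormTrack:
    # one entry per 6-hourly time step: area (km^2) of grid cells where the
    # wind speed exceeds the local 98th percentile
    areas_km2: List[float]

def is_tropical_cyclone_candidate(track: WindStormTrack) -> bool:
    long_enough = len(track.areas_km2) >= MIN_STEPS
    large_enough = all(a >= MIN_AREA_KM2 for a in track.areas_km2)
    return long_enough and large_enough

# Example: a 10-step event that always exceeds the size threshold qualifies.
print(is_tropical_cyclone_candidate(WindStormTrack([150_000.0] * 10)))  # True
```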
Van Regenmortel, Marc H. V.
2018-01-01
Hypotheses and theories are essential constituents of the scientific method. Many vaccinologists are unaware that the problems they try to solve are mostly inverse problems that consist in imagining what could bring about a desired outcome. An inverse problem starts with the result and tries to guess what are the multiple causes that could have produced it. Compared to the usual direct scientific problems that start with the causes and derive or calculate the results using deductive reasoning and known mechanisms, solving an inverse problem uses a less reliable inductive approach and requires the development of a theoretical model that may have different solutions or none at all. Unsuccessful attempts to solve inverse problems in HIV vaccinology by reductionist methods, systems biology and structure-based reverse vaccinology are described. The popular strategy known as rational vaccine design is unable to solve the multiple inverse problems faced by HIV vaccine developers. The term “rational” is derived from “rational drug design” which uses the 3D structure of a biological target for designing molecules that will selectively bind to it and inhibit its biological activity. In vaccine design, however, the word “rational” simply means that the investigator is concentrating on parts of the system for which molecular information is available. The economist and Nobel laureate Herbert Simon introduced the concept of “bounded rationality” to explain why the complexity of the world economic system makes it impossible, for instance, to predict an event like the financial crash of 2007–2008. Humans always operate under unavoidable constraints such as insufficient information, a limited capacity to process huge amounts of data and a limited amount of time available to reach a decision. Such limitations always prevent us from achieving the complete understanding and optimization of a complex system that would be needed to achieve a truly rational design process. This is why the complexity of the human immune system prevents us from rationally designing an HIV vaccine by solving inverse problems. PMID:29387066
CO2 mineral sequestration in oil-shale wastes from Estonian power production.
Uibu, Mai; Uus, Mati; Kuusik, Rein
2009-02-01
In the Republic of Estonia, local low-grade carbonaceous fossil fuel--Estonian oil-shale--is used as a primary energy source. Combustion of oil-shale is characterized by a high specific carbon emission factor (CEF). In Estonia, the power sector is the largest CO2 emitter and is also a source of huge amounts of waste ash. Oil-shale has been burned by pulverized firing (PF) since 1959 and in circulating fluidized-bed combustors (CFBCs) since 2004-2005. Depending on the combustion technology, the ash contains a total of up to 30% free Ca-Mg oxides. In consequence, some amount of emitted CO2 is bound by alkaline transportation water and by the ash during hydraulic transportation and open-air deposition. The goal of this study was to investigate the possibility of improving the extent of CO2 capture using additional chemical and technological means, in particular the treatment of aqueous ash suspensions with model flue gases containing 10-15% CO2. The results indicated that both types of ash (PF and CFBC) could be used as sorbents for CO2 mineral sequestration. The amount of CO2 captured averaged 60-65% of the carbonaceous CO2 and 10-11% of the total CO2 emissions.
NASA Astrophysics Data System (ADS)
Kaya, N.; Iwashita, M.; Nakasuka, S.; Summerer, L.; Mankins, J.
2004-12-01
Construction technology for huge structures is essential for future space development, including the Solar Power Satellite (SPS). The SPS needs huge antennas to transmit the generated electric power to the ground, and such huge antennas have many useful applications in space as well as on the ground, for example telecommunication for cellular phones, radars for remote sensing, navigation and observation. Parabolic antennas have mostly been used as space antennas. However, it is very difficult for a larger parabolic antenna to maintain the accuracy of the reflectors and the beam control, because the reflector surfaces are mechanically supported and controlled. A huge space antenna with flexible and ultra-light structures is therefore necessary for future applications, and an active phased-array antenna is more suitable and promising for such a huge flexible antenna than a parabolic one. We propose to apply the Furoshiki satellite [1] with robots to the construction of huge structures. After a web the size of the huge antenna is deployed by the Furoshiki satellite, all of the antenna elements crawl on the web with their own legs toward their allocated locations. We are verifying the deployment concept of the Furoshiki satellite using a sounding rocket, with robots crawling on the deployed web. The robots are being developed internationally by NASA, ESA and Kobe University. The paper describes the concept of the crawling robot developed by Kobe University as well as the plan of the rocket experiment.
ERIC Educational Resources Information Center
Eldridge, Chris
2016-01-01
Medieval history is on the rise. Among the many recent reforms in the history curriculum is a requirement for medieval themes at GCSE and across the country the new linear A-level offers fresh opportunities for teachers to look beyond the traditional diet of Tudors and modern history. The huge divide between us and the medieval mind can make the…
Enduring Partner Capacity: African Civil Affairs
2012-05-17
conflicts in Sierra Leone, Liberia, Nigeria and Ivory Coast all created political refugees. In 2011, the political and electoral crisis in the Ivory ... "Ivory Coast Refugees Question Security of Returning Home," http://reliefweb.int/node/447842 (accessed November 14, 2011). ... There is an established ... no disruptive deviant behavior in Botswana, as it also has a huge poaching problem requiring regular military action to control. In 2010, Botswana
ERIC Educational Resources Information Center
Katz, Adrienne
2016-01-01
The internet and mobile devices play a huge role in teenagers' home and school life, and it is becoming more and more important to effectively address e-safety in secondary schools. This practical book provides guidance on how to teach and promote e-safety and tackle cyberbullying with real-life examples from schools of what works and what schools…
NASA Astrophysics Data System (ADS)
Avolio, G.; D'Ascanio, M.; Lehmann-Miotto, G.; Soloviev, I.
2017-10-01
The Trigger and Data Acquisition (TDAQ) system of the ATLAS detector at the Large Hadron Collider at CERN is composed of a large number of distributed hardware and software components (about 3000 computers and more than 25000 applications) which, in a coordinated manner, provide the data-taking functionality of the overall system. During data taking runs, a huge flow of operational data is produced in order to constantly monitor the system and allow proper detection of anomalies or misbehaviours. In the ATLAS trigger and data acquisition system, operational data are archived and made available to applications by the P-BEAST (Persistent Back-End for the Atlas Information System of TDAQ) service, implementing a custom time-series database. The possibility to efficiently visualize both realtime and historical operational data is a great asset facilitating both online identification of problems and post-mortem analysis. This paper will present a web-based solution developed to achieve such a goal: the solution leverages the flexibility of the P-BEAST archiver to retrieve data, and exploits the versatility of the Grafana dashboard builder to offer a very rich user experience. Additionally, particular attention will be given to the way some technical challenges (like the efficient visualization of a huge amount of data and the integration of the P-BEAST data source in Grafana) have been faced and solved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, Uisung; Han, Jeongwoo; Wang, Michael
The amount of municipal solid waste (MSW) generated in the United States was estimated at 254 million wet tons in 2013, and around half of that generated waste was landfilled. There is a huge potential in recovering energy from that waste, since around 60% of landfilled material is biomass-derived waste that has high energy content. In addition, diverting waste for fuel production avoids huge fugitive emissions from landfills, especially uncontrolled CH4 emissions, which are the third largest anthropogenic CH4 source in the United States. Lifecycle analysis (LCA) is typically used to evaluate the environmental impact of alternative fuel production pathways. LCA of transportation fuels is called well-to-wheels (WTW) and covers all stages of the fuel production pathways, from feedstock recovery (well) to vehicle operation (wheels). In this study, the Greenhouse Gases, Regulated Emissions, and Energy Use in Transportation (GREET®) model developed by Argonne National Laboratory is used to evaluate WTW greenhouse gas (GHG) emissions and fossil fuel consumption of waste-derived fuels. Two waste-to-energy (WTE) pathways have been evaluated: one for compressed natural gas (CNG) production using food waste via anaerobic digestion, and the other for ethanol production from yard trimmings via fermentation processes. Because the fuel production pathways displace current waste management practices (i.e., landfilling waste), we use a marginal approach that considers only the differences in emissions between the counterfactual case and the alternative fuel production case.
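The marginal accounting described in the last sentence can be illustrated with a short sketch: only the difference between the landfilling counterfactual and the fuel pathway is attributed to the fuel. All numbers below are placeholder assumptions, not GREET results.

```python
# Minimal sketch of the "marginal" well-to-wheels accounting described above.
def marginal_wtw_ghg(counterfactual_gco2e: float, fuel_pathway_gco2e: float,
                     fuel_mj: float) -> float:
    """Return marginal GHG intensity in g CO2e per MJ of fuel produced."""
    return (fuel_pathway_gco2e - counterfactual_gco2e) / fuel_mj

# Hypothetical example: landfilling 1 t of food waste emits 450 kg CO2e
# (fugitive CH4 included); digesting it to CNG emits 120 kg CO2e and yields
# 3,000 MJ of fuel.
print(marginal_wtw_ghg(450_000.0, 120_000.0, 3_000.0))  # -110.0 g CO2e/MJ (a credit)
```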
Beam On Target (BOT) Produces Gamma Ray Burst (GRB) Fireballs and Afterglows
NASA Astrophysics Data System (ADS)
Greyber, H. D.
1997-12-01
Unlike the myriads of ad hoc models that have been offered to explain GRB, the BOT process is simply the very common process used worldwide in accelerator laboratories to produce gamma rays. The Strong Magnetic Field (SMF) model postulates an extremely intense, highly relativistic current ring formed during the original gravitational collapse of a distant galaxy when the plasma cloud was permeated by a primordial magnetic field. GRB occur when solid matter (asteroid, white dwarf, neutron star, planet) falls rapidly through the Storage Ring beam producing a very strongly collimated electromagnetic shower, and a huge amount of matter from the target, in the form of a giant, hot, expanding plasma cloud, or "Fireball," is blown off. BOT satisfies all the "severe constraints imposed on the source of this burst --" concluded by the CGRO team (Sommer et al, Astrophys. J. 422 L63 (1994)) for the huge intense burst GRB930131, whereas neutron star merger models are "difficult to reconcile." BOT expects the lowest energy gamma photons to arrive very slightly later than higher energy photons due to the time for the shower to penetrate the target. The millisecond spikes in bursts are due to the slender filaments of current that make up the Storage Ring beam. Delayed photons can be explained by a broken target "rock." See H. Greyber in the book "Compton Gamma Ray Observatory," AIP Conf. Proc. 280, 569 (1993).
Adaptive platform for fluorescence microscopy-based high-content screening
NASA Astrophysics Data System (ADS)
Geisbauer, Matthias; Röder, Thorsten; Chen, Yang; Knoll, Alois; Uhl, Rainer
2010-04-01
Fluorescence microscopy has become a widely used tool for the study of medically relevant intra- and intercellular processes. Extracting meaningful information out of a bulk of acquired images is usually performed during a separate post-processing task. Thus, capturing raw data results in an unnecessarily huge number of images, whereas usually only a few images really show the particular information that is searched for. Here we propose a novel automated high-content microscope system, which enables experiments to be carried out with only a minimum of human interaction. It provides a huge speed increase for cell biology research and its applications compared to widely used workflows. Our fluorescence microscopy system can automatically execute application-dependent data processing algorithms during the actual experiment. They are used for image contrast enhancement, cell segmentation and/or cell property evaluation. Information retrieved on the fly is used to reduce data and concomitantly control the experiment process in real time. Resulting in a closed loop of perception and action, the system greatly decreases the amount of stored data on one hand and increases the relative amount of valuable data content on the other. We demonstrate our approach by addressing the problem of automatically finding cells with a particular combination of labeled receptors and then selectively stimulating them with antagonists or agonists. The results are then compared against the results of traditional, static systems.
Multisource Data Integration in Remote Sensing
NASA Technical Reports Server (NTRS)
Tilton, James C. (Editor)
1991-01-01
Papers presented at the workshop on Multisource Data Integration in Remote Sensing are compiled. The full text of these papers is included. New instruments and new sensors are discussed that can provide us with a large variety of new views of the real world. This huge amount of data has to be combined and integrated in a (computer) model of this world. Multiple sources may give complementary views of the world - consistent observations from different (and independent) data sources support each other and increase their credibility, while contradictions may be caused by noise, errors during processing, or misinterpretations, and can be identified as such. As a consequence, integration results are very reliable and represent a valid source of information for any geographical information system.
Sentiment analysis in twitter data using data analytic techniques for predictive modelling
NASA Astrophysics Data System (ADS)
Razia Sulthana, A.; Jaithunbi, A. K.; Sai Ramesh, L.
2018-04-01
Sentiment analysis refers to the natural language processing task of determining whether a piece of text contains subjective information and what kind of subjective information it expresses. The subjective information represents the attitude behind the text: positive, negative or neutral. Automatically understanding the opinions behind user-generated content is of great concern. We analyse a huge amount of tweets, treated as big data, and thereby classify the polarity of words, sentences or entire documents. We use linear regression for modelling the relationship between a scalar dependent variable Y and one or more explanatory variables (or independent variables) denoted X. We conduct a series of experiments to test the performance of the system.
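A minimal sketch of the modelling step described above, regressing a scalar sentiment score Y on bag-of-words features X with scikit-learn; the toy tweets and polarity labels are invented placeholders, not the paper's Twitter data or pipeline.

```python
# Linear regression of a scalar polarity score Y on word-count features X.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LinearRegression

tweets = ["love this phone", "worst service ever", "not bad at all", "great great great"]
scores = [1.0, -1.0, 0.3, 1.0]                 # assumed polarity labels in [-1, 1]

vectorizer = CountVectorizer()
X = vectorizer.fit_transform(tweets)           # explanatory variables (word counts)
model = LinearRegression().fit(X, scores)      # scalar dependent variable Y

print(model.predict(vectorizer.transform(["this phone is not bad"])))
```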
Document Clustering Approach for Meta Search Engine
NASA Astrophysics Data System (ADS)
Kumar, Naresh, Dr.
2017-08-01
The size of the WWW is growing exponentially with every change in technology. This results in a huge amount of information with long lists of URLs. It is not possible to visit each page manually. If page ranking algorithms are used properly, the user's search space can be restricted to a few pages of the returned results. However, the available literature shows that no single search system can provide qualitative results across all domains. This paper provides a solution to this problem by introducing a new meta search engine that determines the relevance of each web page to the query and clusters the results accordingly. The proposed approach reduces the user's effort and improves the quality of results and the performance of the meta search engine.
Post-crisis analysis of an ineffective tsunami alert: the 2010 earthquake in Maule, Chile.
Soulé, Bastien
2014-04-01
Considering its huge magnitude and its location in a densely populated area of Chile, the Maule seism of 27 February 2010 caused a low number of victims. However, the post-seismic tsunamis were particularly devastating that day; surprisingly, no full alert was issued at the national, regional or local level. This earthquake and the associated tsunamis are of interest in the context of natural hazards management as well as crisis management planning. Instead of focusing exclusively on the event itself, this article places emphasis on the process, systems and long-term approach that led the tsunami alert mechanism to be ineffectual. Notably, this perspective reveals interrelated forerunner signs of vulnerability. © 2014 The Author(s). Disasters © Overseas Development Institute, 2014.
Potentials of Web 2.0 for Diabetes Education of Adolescent Patients
NASA Astrophysics Data System (ADS)
Shabestari, Omid; Roudsari, Abdul
Diabetes is a very common chronic disease which produces complications in almost all body organs and consumes a huge amount of the health budget. Although education has proved to be useful in diabetes management, there is a great need to improve the availability of these courses for the increasing number of diabetic patients. E-learning can facilitate this service, but the current education system should be tailored towards e-learning standards. Amongst diabetic patients, adolescents, as computer natives, are suggested as the best target for e-learning diabetes education. With regard to its features, Web 2.0 can be a very good technology on which to build a framework for diabetes education and the consequent evaluation of this education.
Retrieval Algorithms for Road Surface Modelling Using Laser-Based Mobile Mapping.
Jaakkola, Anttoni; Hyyppä, Juha; Hyyppä, Hannu; Kukko, Antero
2008-09-01
Automated processing of the data provided by a laser-based mobile mapping system will be a necessity due to the huge amount of data produced. In the future, vehicle-based laser scanning, here called mobile mapping, should see considerable use for road environment modelling. Since the scanning geometry and point density differ from airborne laser scanning, new algorithms are needed for information extraction. In this paper, we propose automatic methods for classifying road marking and kerbstone points and for modelling the road surface as a triangulated irregular network. On the basis of experimental tests, the mean classification accuracies obtained using the automatic methods for lines, zebra crossings and kerbstones were 80.6%, 92.3% and 79.7%, respectively.
NASA Astrophysics Data System (ADS)
Depalo, Rosanna;
2018-01-01
A precise knowledge of the cross section of nuclear fusion reactions is a crucial ingredient in understanding stellar evolution and nucleosynthesis. At stellar temperatures, fusion cross sections are extremely small and difficult to measure. Measuring nuclear cross sections at astrophysical energies is a challenge that triggered a huge amount of experimental work. A breakthrough in this direction was the first operation of an underground accelerator at the Laboratory for Underground Nuclear Astrophysics (LUNA) in Gran Sasso, Italy. The 1400 meters of rocks above the laboratory act as a natural shield against cosmic radiation, suppressing the background by orders of magnitude. The latest results achieved at LUNA are discussed, with special emphasis on the 22Ne(p,γ)23Na reaction. Future perspectives of the LUNA experiment are also illustrated.
Huang, Xiaoxi; Zhang, Tao; Asefa, Tewodros
2017-07-01
A simple, new synthetic method that produces hollow, mesoporous carbon microparticles, each with a single hole on its surface, is reported. The synthesis involves unique templates, which are composed of gaseous bubbles and colloidal silica, and poly(furfuryl alcohol) as a carbon precursor. The conditions that give these morphologically unique carbon microparticles are investigated, and the mechanisms that result in their unique structures are proposed. Notably, the amount of colloidal silica and the type of polymer are found to hugely dictate whether or not the synthesis results in hollow asymmetrical microparticles, each with a single hole. The potential application of the particles as self-propelled micromotors is demonstrated. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
The astrobiological case for our cosmic ancestry
NASA Astrophysics Data System (ADS)
Wickramasinghe, Chandra
2010-04-01
With steadily mounting evidence that points to a cosmic origin of terrestrial life, a cultural barrier prevails against admitting that such a connection exists. Astronomy continues to reveal the presence of organic molecules and organic dust on a huge cosmic scale, amounting to a third of interstellar carbon tied up in this form. Just as the overwhelming bulk of organics on Earth stored over geological timescales are derived from the degradation of living cells, so it seems likely that interstellar organics in large measure also derive from biology. As we enter a new decade - the year 2010 - a clear pronouncement of our likely alien ancestry and of the existence of extraterrestrial life on a cosmic scale would seem to be overdue.
Lv, Guoping; Che, Chengchuan; Li, Li; Xu, Shujing; Guan, Wanyi; Zhao, Baohua; Ju, Jiansong
2017-07-06
The traditional CaCO3-based fermentation process generates a huge amount of insoluble CaSO4 waste. To solve this problem, we have developed an efficient and green D-lactic acid fermentation process using ammonia as the neutralizer. A D-lactic acid titre of 106.7 g/L and a yield of 0.89 g per g of consumed sugar were obtained by Sporolactobacillus inulinus CASD, with a high optical purity of 99.7%, by adding 100 mg/L betaine in a simple batch fermentation process. The addition of betaine was experimentally shown to protect the cells at high ammonium ion concentrations, increase the specific activity of D-lactate dehydrogenase and thus promote the production of D-lactic acid.
A new centrality measure for identifying influential nodes in social networks
NASA Astrophysics Data System (ADS)
Rhouma, Delel; Ben Romdhane, Lotfi
2018-04-01
The identification of central nodes is a key problem in the field of social network analysis. Centrality is a measure that accounts for the popularity or visibility of an actor within a network. In order to capture this concept, various measures, either simple or more elaborate, have been developed. Nevertheless, many "traditional" measures are not designed to be applicable to huge data. This paper sets out a new node centrality index suitable for large social networks. It uses the number of neighbors of a node and the connections between them to characterize a "pivot" node in the graph. We present experimental results on real data sets which show the efficiency of our proposal.
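A toy version of a neighbourhood-based "pivot" score can be sketched with networkx. The exact formula below (number of neighbours plus edges among them) is an illustrative assumption, not the index defined in the paper.

```python
# Sketch of a centrality built from a node's neighbours and the connections
# between those neighbours; the scoring rule is assumed for illustration.
import networkx as nx

def pivot_score(g: nx.Graph, node) -> float:
    neighbours = list(g.neighbors(node))
    edges_among = g.subgraph(neighbours).number_of_edges()
    return len(neighbours) + edges_among

g = nx.karate_club_graph()
ranked = sorted(g.nodes, key=lambda n: pivot_score(g, n), reverse=True)
print(ranked[:5])   # the five most "pivotal" nodes under this toy score
```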
Combination of visual and symbolic knowledge: A survey in anatomy.
Banerjee, Imon; Patané, Giuseppe; Spagnuolo, Michela
2017-01-01
In medicine, anatomy is considered the most discussed field and has produced a huge amount of knowledge, which is heterogeneous and covers aspects that are mostly independent in nature. Visual and symbolic modalities are mainly adopted for exemplifying knowledge about human anatomy and are crucial for the evolution of computational anatomy. In particular, a tight integration of visual and symbolic modalities is beneficial to support knowledge-driven methods for biomedical investigation. In this paper, we review previous work on the presentation and sharing of anatomical knowledge, and the development of advanced methods for computational anatomy, also focusing on the key research challenges for harmonizing symbolic knowledge and spatial 3D data. Copyright © 2016 Elsevier Ltd. All rights reserved.
Moghaddam, Ramin; Badredine, Hala
2006-01-01
The Iranian Social Security Organization (ISSO) aims to enable the sharing of health-related information in a secure environment, by means of reliable data available at the right time, in order to improve the health of insured people throughout the country. ISSO has contracts with around 7,000 pharmacies across the country in order to deliver seamless services to 30 million insured people. Managing this huge number of prescriptions on a scientific basis, while considering the financial issue of the rising cost of medicaments, certainly requires sophisticated business process re-engineering using ICT; this work is due to be completed in the ISSO in the next few months. PMID:17238655
NASA Astrophysics Data System (ADS)
Faizrahnemoon, Mahsa; Schlote, Arieh; Maggi, Lorenzo; Crisostomi, Emanuele; Shorten, Robert
2015-11-01
This paper describes a Markov-chain-based approach to modelling multi-modal transportation networks. An advantage of the model is the ability to accommodate complex dynamics and handle huge amounts of data. The transition matrix of the Markov chain is built and the model is validated using the data extracted from a traffic simulator. A realistic test-case using multi-modal data from the city of London is given to further support the ability of the proposed methodology to handle big quantities of data. Then, we use the Markov chain as a control tool to improve the overall efficiency of a transportation network, and some practical examples are described to illustrate the potentials of the approach.
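The core construction, estimating a row-stochastic transition matrix from observed trips and reading off its stationary distribution, can be sketched as follows; the three-zone counts are invented placeholders, not the London or simulator data used in the paper.

```python
# Build a Markov-chain transition matrix from observed trips between zones
# and compute its stationary distribution.
import numpy as np

# trips[i, j] = observed number of trips from zone i to zone j (assumed counts)
trips = np.array([[120.,  30.,  50.],
                  [ 40., 200.,  60.],
                  [ 70.,  20., 110.]])

P = trips / trips.sum(axis=1, keepdims=True)     # row-stochastic transition matrix

# Stationary distribution: eigenvector of P^T for the eigenvalue closest to 1.
eigvals, eigvecs = np.linalg.eig(P.T)
pi = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
pi /= pi.sum()
print(pi)                                        # long-run share of travellers per zone
```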
NASA Astrophysics Data System (ADS)
Shahiri, Amirah Mohamed; Husain, Wahidah; Rashid, Nur'Aini Abd
2017-10-01
Huge amounts of data in educational datasets may cause problems in producing quality data. Recently, data mining approaches have been increasingly used by educational data mining researchers for analyzing data patterns. However, many research studies have concentrated on selecting suitable learning algorithms instead of performing a feature selection process. As a result, these data suffer from computational complexity and require a longer computation time for classification. The main objective of this research is to provide an overview of the feature selection techniques that have been used to identify the most significant features. This research then proposes a framework to improve the quality of students' datasets. The proposed framework uses filter- and wrapper-based techniques to support the prediction process in a future study.
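The two feature-selection families mentioned above can be illustrated with scikit-learn: a filter method (univariate chi-squared ranking) and a wrapper method (recursive feature elimination around a classifier). The synthetic data stand in for a students' dataset; nothing here reproduces the proposed framework.

```python
# Filter vs wrapper feature selection on synthetic data.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, chi2, RFE
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=200, n_features=20, n_informative=5, random_state=0)
X = np.abs(X)                                  # chi2 requires non-negative features

filter_idx = SelectKBest(chi2, k=5).fit(X, y).get_support(indices=True)
wrapper_idx = RFE(LogisticRegression(max_iter=1000),
                  n_features_to_select=5).fit(X, y).get_support(indices=True)

print("filter picks:", filter_idx)
print("wrapper picks:", wrapper_idx)
```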
A new ImageJ plug-in "ActogramJ" for chronobiological analyses.
Schmid, Benjamin; Helfrich-Förster, Charlotte; Yoshii, Taishi
2011-10-01
While the rapid development of personal computers and high-throughput recording systems for circadian rhythms allow chronobiologists to produce huge amounts of data, the software to analyze them often lags behind. Here, we announce newly developed chronobiology software that is easy to use, compatible with many different systems, and freely available. Our system can perform the most frequently used analyses: actogram drawing, periodogram analysis, and waveform analysis. The software is distributed as a pure Java plug-in for ImageJ and so works on the 3 main operating systems: Linux, Macintosh, and Windows. We believe that this free software raises the speed of data analyses and makes studying chronobiology accessible to newcomers. © 2011 The Author(s)
NASA Astrophysics Data System (ADS)
Yamada, Susumu; Kitamura, Akihiro; Kurikami, Hiroshi; Machida, Masahiko
2015-04-01
The Fukushima Daiichi Nuclear Power Plant (FDNPP) accident in March 2011 released significant quantities of radionuclides into the atmosphere. The most significant nuclides are the radioactive cesium isotopes. Therefore, the movement of cesium is one of the critical issues for environmental assessment. Since cesium is strongly sorbed by soil particles, cesium transport can be regarded as sediment transport, which is mainly driven by aquatic systems such as rivers and lakes. In this research, our target is the sediment transport in the Ogaki dam reservoir, which is located about 16 km northwest of FDNPP. The reservoir is one of the principal irrigation dam reservoirs in Fukushima Prefecture, and its upstream river basin was heavily contaminated by radioactivity. We simulate the sediment transport in the reservoir using the 2-D river simulation code Nays2D, originally developed by Shimizu et al. (the latest version of Nays2D is available as a code included in iRIC (http://i-ric.org/en/), a river flow and riverbed variation analysis software package). In general, a 2-D simulation code requires a huge amount of calculation time; therefore, we parallelize the code and execute it on a parallel computer. We examine the relationship between the behavior of the sediment transport and the height of the reservoir exit. The simulation results show that almost all of the sand entering the reservoir deposits close to the entrance for any height of the exit. The amount of silt deposited within the reservoir increases slightly as the height of the exit is raised, whereas the amount of clay increases dramatically; in particular, more than half of the clay is deposited if the exit is sufficiently high. These results demonstrate that the water level of the reservoir has a strong influence on the amount of clay discharged from the reservoir. As a result, we conclude that tuning the water level is a possible means of controlling recontamination downstream.
Rodríguez-Salgado, Isabel; Paradelo-Pérez, Marcos; Pérez-Rodríguez, Paula; Cutillas-Barreiro, Laura; Fernández-Calviño, David; Nóvoa-Muñoz, Juan Carlos; Arias-Estévez, Manuel
2014-01-01
In spite of its world-wide economic relevance, wine production generates a huge amount of waste that threatens the environment. A batch experiment was designed to assess the effect of amending an agricultural soil with two winery wastes (perlite and bentonite wastes) on the immobilization of cyprodinil. Waste additions (0, 10, 20, 40, and 80 Mg ha(-1)) and different incubation times of the soil-waste mixtures (1, 30, and 120 days) were tested. The addition of wastes improved the soil's ability to immobilize cyprodinil, which was significantly correlated with the total C content in the soil-waste mixtures. Longer incubation times decreased cyprodinil sorption, possibly due to the mineralization of organic matter but also as a consequence of the high pH values reached after bentonite waste addition (up to 10.0). Cyprodinil desorption increased as the amount of waste added to the soil and the incubation time increased. The use of these winery wastes contributes to a more sustainable agriculture by preventing fungicide mobilization to groundwater.
(GaIn)(NAs) growth using di-tertiary-butyl-arsano-amine (DTBAA)
NASA Astrophysics Data System (ADS)
Sterzer, E.; Ringler, B.; Nattermann, L.; Beyer, A.; von Hänisch, C.; Stolz, W.; Volz, K.
2017-06-01
III/V semiconductors containing small amounts of nitrogen (N) are very interesting for a variety of optoelectronic applications. Unfortunately, the conventionally used N precursor 1,1-dimethylhydrazine (UDMHy) has an extremely low N incorporation efficiency in GaAs when grown using metal organic vapor phase epitaxy. Alloying Ga(NAs) with indium (In) even leads to an exponential reduction of N incorporation. The huge amount of UDMHy in turn drastically changes the growth conditions. Furthermore, the application of this material is still hampered by the large carbon incorporation, most probably originating from the metal organic precursors. Hence, novel precursors for dilute nitride growth are needed. This paper presents (GaIn)(NAs) growth studies with the novel precursor di-tertiary-butyl-arsano-amine in combination with tri-ethyl-gallium and tri-methyl-indium. We show an extremely high N incorporation efficiency in the In-containing (GaIn)(NAs). The (GaIn)(NAs) samples investigated in this study have been examined using high-resolution X-ray diffraction, room-temperature photoluminescence and atomic force microscopy measurements as well as secondary ion mass spectrometry.
Possibility of Coal Combustion Product Conditioning
NASA Astrophysics Data System (ADS)
Błaszczyński, Tomasz Z.; Król, Maciej R.
2018-03-01
This paper focuses on the properties of materials known as green binders. They can be used to produce aluminium-silicate concretes and binders, also known as geopolymers. Compared with ordinary cements, these new ecological binders offer the possibility of reducing the amount of the main greenhouse gas emitted to the atmosphere by a factor of 3 to as much as 10, depending on the type of substrate used to produce the new green material. The main ecological route to obtaining the new materials is to use products that are already available, created during coal combustion and steel smelting; most of them are already used in many branches of industry, mainly civil engineering, chemistry and agriculture. The research reported here was based on fly ash from lignite combustion, which is less commonly used in civil engineering. The materials were examined in order to verify the possibility of obtaining hardened mortars as a function of the factors governing the geopolymerization process, namely temperature, the amount of reaction reagent and the duration of heat treatment. After systematizing the matrices with respect to the basic parameters affecting the strength of the hardened mortars, the influence of the fly ash treatment on increasing the strength was tested.
Koch, Ina; Schueler, Markus; Heiner, Monika
2005-01-01
To understand biochemical processes caused by, e. g., mutations or deletions in the genome, the knowledge of possible alternative paths between two arbitrary chemical compounds is of increasing interest for biotechnology, pharmacology, medicine, and drug design. With the steadily increasing amount of data from high-throughput experiments new biochemical networks can be constructed and existing ones can be extended, which results in many large metabolic, signal transduction, and gene regulatory networks. The search for alternative paths within these complex and large networks can provide a huge amount of solutions, which can not be handled manually. Moreover, not all of the alternative paths are generally of interest. Therefore, we have developed and implemented a method, which allows us to define constraints to reduce the set of all structurally possible paths to the truly interesting path set. The paper describes the search algorithm and the constraints definition language. We give examples for path searches using this dedicated special language for a Petri net model of the sucrose-to-starch breakdown in the potato tuber.
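The idea of constraining the set of structurally possible paths can be sketched on a toy reaction graph; the network, the "forbidden metabolite" constraint and the use of networkx are illustrative assumptions, not the authors' Petri-net algorithm or constraint definition language.

```python
# Constrained path search between two compounds in a toy metabolic network.
import networkx as nx

g = nx.DiGraph()
g.add_edges_from([
    ("sucrose", "fructose"), ("sucrose", "glucose"),
    ("glucose", "G6P"), ("fructose", "F6P"),
    ("F6P", "G6P"), ("G6P", "G1P"), ("G1P", "starch"),
])

def constrained_paths(graph, source, target, forbidden=frozenset()):
    for path in nx.all_simple_paths(graph, source, target):
        if not forbidden.intersection(path):      # apply the user-defined constraint
            yield path

for p in constrained_paths(g, "sucrose", "starch", forbidden={"F6P"}):
    print(" -> ".join(p))                         # only the route avoiding F6P remains
```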
Koch, Ina; Schüler, Markus; Heiner, Monika
2011-01-01
To understand biochemical processes caused by, e.g., mutations or deletions in the genome, the knowledge of possible alternative paths between two arbitrary chemical compounds is of increasing interest for biotechnology, pharmacology, medicine, and drug design. With the steadily increasing amount of data from high-throughput experiments new biochemical networks can be constructed and existing ones can be extended, which results in many large metabolic, signal transduction, and gene regulatory networks. The search for alternative paths within these complex and large networks can provide a huge amount of solutions, which can not be handled manually. Moreover, not all of the alternative paths are generally of interest. Therefore, we have developed and implemented a method, which allows us to define constraints to reduce the set of all structurally possible paths to the truly interesting path set. The paper describes the search algorithm and the constraints definition language. We give examples for path searches using this dedicated special language for a Petri net model of the sucrose-to-starch breakdown in the potato tuber. http://sanaga.tfh-berlin.de/~stepp/
Silva, Luiziana Ferreira; Taciro, Marilda Keico; Raicher, Gil; Piccoli, Rosane Aparecida Moniz; Mendonça, Thatiane Teixeira; Lopes, Mateus Schreiner Garcez; Gomez, José Gregório Cabrera
2014-11-01
Polyhydroxyalkanoates (PHA) are biodegradable and biocompatible bacterial thermoplastic polymers that can be obtained from renewable resources. The high impact of the carbon source on the final cost of this polymer has been one of the major limiting factors for PHA production, and agricultural residues, mainly lignocellulosic materials, have gained attention as a way to overcome this problem. In Brazil, production of 2nd generation ethanol from the glucose fraction derived from sugarcane bagasse hydrolysate has been studied. The huge amounts of remaining xylose will create an opportunity for the development of other bioprocesses, generating new products to be introduced into a biorefinery model. Although PHA production from sucrose integrated with a 1G ethanol and sugar mill has been proposed in the past, the integration of the 2G ethanol process in the context of a biorefinery will provide enormous amounts of xylose, which could be applied to produce PHA, establishing a second generation of PHA production processes. These aspects and perspectives are presented in this article. Copyright © 2014 Elsevier B.V. All rights reserved.
Binary video codec for data reduction in wireless visual sensor networks
NASA Astrophysics Data System (ADS)
Khursheed, Khursheed; Ahmad, Naeem; Imran, Muhammad; O'Nils, Mattias
2013-02-01
Wireless Visual Sensor Networks (WVSNs) are formed by deploying many Visual Sensor Nodes (VSNs) in the field. Typical applications of WVSNs include environmental monitoring, health care, industrial process monitoring, stadium/airport monitoring for security reasons and many more. In outdoor applications of WVSNs the energy budget is limited to batteries, and frequent battery replacement is usually not desirable, so the processing as well as the communication energy consumption of the VSN needs to be optimized in such a way that the network remains functional for a longer duration. The images captured by a VSN contain a huge amount of data and require efficient computational resources for processing the images and a wide communication bandwidth for the transmission of the results. Image processing algorithms must be designed and developed in such a way that they are computationally less complex and provide a high compression rate. For some applications of WVSNs, the captured images can be segmented into bi-level images, and bi-level image coding methods will then efficiently reduce the information amount in these segmented images. However, the compression rate of bi-level image coding methods is limited by the underlying compression algorithm. Hence there is a need to design other intelligent and efficient algorithms which are computationally less complex and provide a better compression rate than bi-level image coding methods. Change coding is one such algorithm, which is computationally less complex (requiring only exclusive-OR operations) and provides better compression efficiency compared to image coding, but it is effective only for applications with slight changes between adjacent frames of the video. The detection and coding of Regions of Interest (ROIs) in the change frame efficiently reduce the information amount in the change frame. But if the number of objects in the change frames exceeds a certain level, the compression efficiency of both change coding and ROI coding becomes worse than that of image coding. This paper explores the compression efficiency of the Binary Video Codec (BVC) for data reduction in WVSNs. We propose to implement all three compression techniques, i.e. image coding, change coding and ROI coding, at the VSN and then select the smallest bit stream among the results of the three compression techniques. In this way the compression performance of the BVC never becomes worse than that of image coding. We conclude that the compression efficiency of BVC is always better than that of change coding and is always better than or equal to that of ROI coding and image coding.
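The BVC selection rule, encode with every technique and keep the smallest bit stream, can be sketched as follows. A toy run-length coder stands in for the real bi-level codec, and ROI coding is omitted for brevity, so the sizes are only indicative.

```python
import numpy as np

def encode_bilevel(frame):
    """Toy bi-level coder: run-length encode the flattened 0/1 frame."""
    flat = frame.ravel()
    runs, prev, count = [], int(flat[0]), 0
    for v in flat:
        if int(v) == prev and count < 255:
            count += 1
        else:
            runs += [prev, count]
            prev, count = int(v), 1
    runs += [prev, count]
    return bytes(runs)

def bvc_encode(frame, prev_frame):
    candidates = {
        "image":  encode_bilevel(frame),
        "change": encode_bilevel(np.bitwise_xor(frame, prev_frame)),  # change coding
    }
    mode = min(candidates, key=lambda k: len(candidates[k]))          # smallest wins
    return mode, candidates[mode]

prev = np.zeros((16, 16), dtype=np.uint8)
prev[2:10, 2:10] = 1                    # a large static object in the scene
curr = prev.copy()
curr[2, 2] = 0                          # only one pixel changes between frames
print(bvc_encode(curr, prev)[0])        # "change": change coding wins for small changes
```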
Black hole firewalls require huge energy of measurement
NASA Astrophysics Data System (ADS)
Hotta, Masahiro; Matsumoto, Jiro; Funo, Ken
2014-06-01
The unitary moving mirror model is one of the best quantum systems for checking the reasoning of the original firewall paradox of Almheiri et al. [J. High Energy Phys. 02 (2013) 062] in quantum black holes. Though the late-time part of radiations emitted from the mirror is fully entangled with the early part, no firewall exists with a deadly, huge average energy flux in this model. This is because the high-energy entanglement structure of the discretized systems in almost maximally entangled states is modified so as to yield the correct description of low-energy effective field theory. Furthermore, the strong subadditivity paradox of firewalls is resolved using nonlocality of general one-particle states and zero-point fluctuation entanglement. Due to the Reeh-Schlieder theorem in quantum field theory, another firewall paradox is inevitably raised with quantum remote measurements in the model. We resolve this paradox from the viewpoint of the energy cost of measurements. No firewall appears, as long as the energy for the measurement is much smaller than the ultraviolet cutoff scale.
Taniguchi, Yoshiki; Takahashi, Tsuyoshi; Nakajima, Kiyokazu; Higashi, Shigeyoshi; Tanaka, Koji; Miyazaki, Yasuhiro; Makino, Tomoki; Kurokawa, Yukinori; Yamasaki, Makoto; Takiguchi, Shuji; Mori, Masaki; Doki, Yuichiro
2017-12-01
Epiphrenic esophageal diverticulum is a rare condition that is often associated with a concomitant esophageal motor disorder. Some patients have the chief complaints of swallowing difficulty and gastroesophageal reflux; traditionally, such diverticula have been resected via right thoracotomy. Here, we describe a case with huge multiple epiphrenic diverticula with motility disorder, which were successfully resected using a video-assisted thoracic and laparoscopic procedure. A 63-year-old man was admitted due to dysphagia, heartburn, and vomiting. An esophagogram demonstrated an S-shaped lower esophagus with multiple epiphrenic diverticula (75 × 55 mm and 30 × 30 mm) and obstruction by the lower esophageal sphincter (LES). Esophageal manometry showed normal peristaltic contractions in the esophageal body, whereas the LES pressure was high (98.6 mmHg). The pressure vector volume of the LES was 23,972 mmHg²·cm. Based on these findings, we diagnosed huge multiple epiphrenic diverticula with a hypertensive lower esophageal sphincter and judged that resection might be required. We performed lower esophagectomy with gastric conduit reconstruction using a video-assisted thoracic and hand-assisted laparoscopic procedure. The postoperative course was uneventful, and the esophagogram demonstrated good passage, with no leakage, stenosis, or diverticula. The most common causes of mid-esophageal and epiphrenic diverticula are motility disorders of the esophageal body; appropriate treatment should be considered based on the morphological and motility findings.
Huge maternal hydronephrosis: a rare complication in pregnancy.
Peng, Hsiu-Huei; Wang, Chin-Jung; Yen, Chih-Feng; Chou, Chien-Chung; Lee, Chyi-Long
2003-06-10
A huge maternal hydronephrosis is uncommon in pregnancy and might be mistaken as a pelvic mass. A 21-year-old primigravida was noted at 25th week of gestation to have a visible bulging mass on her left flank. The mass was originally mistaken as a large ovarian cyst but later proved to be a huge hydronephrosis. Retrograde insertion of ureteroscope and a ureteric stent failed, so we performed repeated ultrasound-guided needle aspiration to decompress the huge hydronephrosis, which enabled the patient to proceed to a successful term vaginal delivery. Nephrectomy was performed after delivery and proved the diagnosis of congenital ureteropelvic junction obstruction.
2000-03-01
languages yet still be able to access the legacy relational databases that businesses have huge investments in. JDBC is a low-level API designed for ... consider the return on investment. The system requirements, discussed in Chapter II, are the main source of input to developing the relational ... 1996. Inprise, Gatekeeper Guide, Inprise Corporation, 1999. Kroenke, D., Database Processing Fundamentals, Design, and Implementation, Sixth Edition
An Assessment of Brazil’s Economic and Energy Problems
1988-04-01
regions of Brazil have similar problems. The huge Campo Cerrado region covering 500 million acres, or an area equal to 12 of the midwestern states ... OF BRAZIL'S ECONOMIC AND ENERGY PROBLEMS by Keith D. Hawkins, Lieutenant Colonel, USAF. A RESEARCH REPORT SUBMITTED TO THE FACULTY IN FULFILLMENT OF ... THE RESEARCH REQUIREMENT. Research Advisor: Lieutenant Colonel George M. Lauderbuagh. MAXWELL AIR FORCE BASE, ALABAMA, April 1988
Immersive Learning Simulations in Aircraft Maintenance Training
2010-02-15
do not have a chance to use in normal, daily activities. Like training, video games are a huge business. The video game industry recorded over ... 18 billion in sales last year. What if you could combine the engaging aspects of video gaming with the requirements of a training program? You ... interaction." In other words, a video game that trains. This definition of ILS will be used throughout this paper, since discussing serious games
Molecular dynamics simulations through GPU video games technologies
Loukatou, Styliani; Papageorgiou, Louis; Fakourelis, Paraskevas; Filntisi, Arianna; Polychronidou, Eleftheria; Bassis, Ioannis; Megalooikonomou, Vasileios; Makałowski, Wojciech; Vlachakis, Dimitrios; Kossida, Sophia
2016-01-01
Bioinformatics is the scientific field that focuses on the application of computer technology to the management of biological information. Over the years, bioinformatics applications have been used to store, process and integrate biological and genetic information, using a wide range of methodologies. One of the most de novo techniques used to understand the physical movements of atoms and molecules is molecular dynamics (MD). MD is an in silico method to simulate the physical motions of atoms and molecules under certain conditions. This has become a state strategic technique and now plays a key role in many areas of exact sciences, such as chemistry, biology, physics and medicine. Due to their complexity, MD calculations could require enormous amounts of computer memory and time and therefore their execution has been a big problem. Despite the huge computational cost, molecular dynamics have been implemented using traditional computers with a central processing unit (CPU). A graphics processing unit (GPU) computing technology was first designed with the goal to improve video games, by rapidly creating and displaying images in a frame buffer such as screens. The hybrid GPU-CPU implementation, combined with parallel computing is a novel technology to perform a wide range of calculations. GPUs have been proposed and used to accelerate many scientific computations including MD simulations. Herein, we describe the new methodologies developed initially as video games and how they are now applied in MD simulations. PMID:27525251
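What an MD step actually computes can be sketched in a few lines: velocity-Verlet integration of Lennard-Jones particles in reduced units. GPU engines parallelize essentially this force loop over thousands of cores; the parameters below are illustrative and not tied to any package discussed in the paper.

```python
import numpy as np

def lj_forces(pos, eps=1.0, sigma=1.0):
    d = pos[:, None, :] - pos[None, :, :]            # pairwise displacement vectors
    r2 = (d ** 2).sum(-1) + np.eye(len(pos))         # pad diagonal to avoid divide-by-zero
    inv6 = (sigma ** 2 / r2) ** 3
    f_mag = 24 * eps * (2 * inv6 ** 2 - inv6) / r2   # Lennard-Jones force magnitude / r
    f_mag[np.eye(len(pos), dtype=bool)] = 0.0
    return (f_mag[:, :, None] * d).sum(axis=1)

# 27 particles on a small lattice, reduced units, zero initial velocities.
pos = 1.5 * np.array([[i, j, k] for i in range(3)
                      for j in range(3) for k in range(3)], dtype=float)
vel = np.zeros_like(pos)
dt, mass = 0.001, 1.0

f = lj_forces(pos)
for _ in range(100):                                 # velocity-Verlet time stepping
    pos += vel * dt + 0.5 * f / mass * dt ** 2
    f_new = lj_forces(pos)
    vel += 0.5 * (f + f_new) / mass * dt
    f = f_new

print(pos.mean(axis=0))                              # centre of mass after 100 steps
```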
Virtual Global Magnetic Observatory - Concept and Implementation
NASA Astrophysics Data System (ADS)
Papitashvili, V.; Clauer, R.; Petrov, V.; Saxena, A.
2002-12-01
The existing World Data Centers (WDCs) continue to serve the worldwide scientific community excellently by providing free access to a huge number of global geophysical databases. Various institutions at different geographic locations house these Centers, organized mainly by scientific discipline. However, populating the Centers requires mandatory or voluntary submission of locally collected data. Recently, many digital geomagnetic datasets have been placed on the World Wide Web, and some of these sets have not even been submitted to any data center. This has created an urgent need for more sophisticated search engines capable of identifying geomagnetic data on the Web and then retrieving a certain amount of data for scientific analysis. In this study, we formulate a concept of the virtual global magnetic observatory (VGMO) that currently uses a pre-set list of Web-based geomagnetic data holders (including the WDCs) for retrieving a requested case-study interval. Saving the retrieved data locally over multiple requests, a VGMO user begins to build his/her own data sub-center, which does not need to search the Web if a newly requested interval falls within the span of the earlier retrieved data. At the same time, this self-populated sub-center becomes available to other VGMO users down the chain of requests. Some aspects of Web "crawling" that help to identify newly "webbed" digital geomagnetic data are also considered.
Zhao, Yu; Liu, Yide; Lai, Ivan K W; Zhang, Hongfeng; Zhang, Yi
2016-03-18
As one of the latest revolutions in networking technology, social networks allow users to keep connected and exchange information. Driven by rapid wireless technology development and the diffusion of mobile devices, social networks have experienced a tremendous change based on mobile sensor computing. More and more mobile sensor network applications have appeared, together with a huge number of users. Therefore, an in-depth discussion of the human-computer interaction (HCI) issues of mobile sensor computing is required. The target of this study is to extend the discussion of HCI by examining the relationships between users' compound attitudes (i.e., affective and cognitive attitudes), engagement and electronic word-of-mouth (eWOM) behaviors in the context of mobile sensor computing. A conceptual model is developed, based on which 313 valid questionnaires are collected. The research discusses the level of impact on the eWOM of mobile sensor computing by considering user-technology issues, including compound attitudes and engagement, which can inform further discussion of the HCI of mobile sensor computing in future studies. Besides, we find that user engagement plays a mediating role between users' compound attitudes and eWOM. The research results can also help the mobile sensor computing industry to develop effective strategies and build strong consumer user-product (brand) relationships.
A comparative study of sorption of chromium (III) onto chitin and chitosan
NASA Astrophysics Data System (ADS)
Singh, Pooja; Nagendran, R.
2016-06-01
Heavy metals have always been among the most hazardous components in the wastewater of industries such as electroplating, automobile manufacturing, mining facilities and fertilizer manufacturers. Treatment of heavy-metal-laden wastewater requires expensive operational and maintenance systems. Food processing industries create a huge amount of shell waste, which is sold to poultry farms in powdered form, but the quantity thus used is still not comparable to the leftover waste. The shell contains chitin, which acts as an adsorbent for heavy metals and can be used to treat heavy-metal wastewater. This paper presents a study on the use of chitin and its processed product, chitosan, to remove chromium. A shake-flask experiment was conducted to compare the adsorptive capacity of chitin and chitosan for chromium removal from a simulated solution, and isotherm studies were carried out. The studies showed that chitosan was a better adsorbent than chitin. Both chitin and chitosan gave the best adsorption results at pH 3. Chitin exhibited a maximum chromium removal of 49.98% in 20 min, whereas chitosan showed 50% removal efficiency at a contact time of 20 min, indicating a higher adsorptive capacity for chromium than chitin. The Langmuir and Freundlich isotherm studies showed very good adsorption capacity and monolayer interaction, with regression coefficients of 0.973 for chitosan and 0.915 for chitin. The regression coefficients for the Freundlich isotherm were 0.894 and 0.831 for chitosan and chitin, respectively.
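The isotherm fits reported above can be reproduced in outline with a non-linear least-squares sketch; the equilibrium concentrations and uptakes below are invented placeholders, not the chromium data from this study.

```python
# Fit Langmuir and Freundlich isotherms to batch sorption data and report R^2.
import numpy as np
from scipy.optimize import curve_fit

Ce = np.array([2.0, 5.0, 10.0, 20.0, 40.0])     # equilibrium concentration (mg/L), assumed
qe = np.array([1.8, 3.5, 5.2, 6.8, 7.9])        # uptake (mg/g), assumed

def langmuir(c, qmax, kl):
    return qmax * kl * c / (1.0 + kl * c)

def freundlich(c, kf, n):
    return kf * c ** (1.0 / n)

(qmax, kl), _ = curve_fit(langmuir, Ce, qe, p0=[10.0, 0.1])
(kf, n), _ = curve_fit(freundlich, Ce, qe, p0=[1.0, 2.0])

for name, model, params in [("Langmuir", langmuir, (qmax, kl)),
                            ("Freundlich", freundlich, (kf, n))]:
    ss_res = np.sum((qe - model(Ce, *params)) ** 2)
    ss_tot = np.sum((qe - qe.mean()) ** 2)
    print(name, "R^2 =", 1.0 - ss_res / ss_tot)
```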
Pyro-electrification of polymer membranes for cell patterning
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rega, R.; Gennari, O.; Mecozzia, L.
2016-05-18
In recent years, much attention has been devoted to the possibility of charging polymer-based materials, due to their potential for developing large-scale and inexpensive flexible thin-film technology. The availability of localized electrostatic fields is of great interest for a huge number of applications, such as the distribution of biomolecules and cells from the liquid phase. Here we report a voltage-free pyro-electrification (PE) process able to induce permanent dipoles into polymer layers; the lithium niobate (LN) crystal is the key component that plays the multi-purpose role of sustaining, heating and poling the polymer layer, which is then easily peeled off in order to obtain a free-standing charged membrane. The results show a fascinating application for living cell patterning. It is well known that cell behaviour is affected by the chemical and topographical cues of the substrate. In fact, polymers such as polystyrene (PS) and poly(methyl methacrylate) (PMMA) are naturally cytophobic and require specific functionalization treatments in order to promote cell adhesion. With the proposed technique, it is possible to obtain spontaneous organization and driven growth of SH-SY5Y cells that is dictated solely by the nature of the charged polymer surface, opening in this way the innovative possibility of manipulating and transferring biological samples on a free-standing polymer layer [1].
Evaluation of NoSQL databases for DIRAC monitoring and beyond
NASA Astrophysics Data System (ADS)
Mathe, Z.; Casajus Ramo, A.; Stagni, F.; Tomassetti, L.
2015-12-01
Nowadays, many database systems are available but they may not be optimized for storing time series data. Monitoring DIRAC jobs would be better done using a database optimised for storing time series data. So far it was done using a MySQL database, which is not well suited for such an application. Therefore alternatives have been investigated. Choosing an appropriate database for storing huge amounts of time series data is not trivial as one must take into account different aspects such as manageability, scalability and extensibility. We compared the performance of Elasticsearch, OpenTSDB (based on HBase) and InfluxDB NoSQL databases, using the same set of machines and the same data. We also evaluated the effort required for maintaining them. Using the LHCb Workload Management System (WMS), based on DIRAC as a use case we set up a new monitoring system, in parallel with the current MySQL system, and we stored the same data into the databases under test. We evaluated Grafana (for OpenTSDB) and Kibana (for ElasticSearch) metrics and graph editors for creating dashboards, in order to have a clear picture on the usability of each candidate. In this paper we present the results of this study and the performance of the selected technology. We also give an outlook of other potential applications of NoSQL databases within the DIRAC project.
Advances in Mössbauer data analysis
NASA Astrophysics Data System (ADS)
de Souza, Paulo A.
1998-08-01
Since the first publication by Rudolf Mössbauer, the Mössbauer community has generated a huge amount of data in several fields of human knowledge. Interlaboratory measurements of the same substance may result in minor differences in the Mössbauer parameters (MP) of isomer shift, quadrupole splitting and internal magnetic field. Therefore, a conventional data bank of published MP will be of limited help in the identification of substances. A data-bank search for exact values cannot differentiate Mössbauer parameters that lie within the experimental errors (e.g., IS = 0.22 mm/s from IS = 0.23 mm/s), although physically both values may be considered the same. An artificial neural network (ANN) is able to identify a substance and its crystalline structure from measured MP, and slight variations of the parameters do not represent an obstacle for the ANN identification. A barrier to the popularization of Mössbauer spectroscopy as an analytical technique is the absence of fully automated equipment, since the analysis of a Mössbauer spectrum is normally time-consuming and requires a specialist. In this work, the fitting of a Mössbauer spectrum was completely automated through the use of genetic algorithms and fuzzy logic. Both software and hardware systems were implemented, resulting in a fully automated Mössbauer data analysis system. The developed system will be presented.
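The authors' fitting system is not available here, but the general idea of fitting a Mössbauer doublet with an evolutionary algorithm can be sketched as follows, using SciPy's differential evolution in place of their genetic algorithm; the synthetic spectrum, line shape and parameter bounds are all invented for illustration:

```python
import numpy as np
from scipy.optimize import differential_evolution

v = np.linspace(-4, 4, 512)  # velocity axis, mm/s (synthetic)

def doublet(v, base, depth, centre, split, width):
    """Transmission spectrum: baseline minus two Lorentzian dips (a quadrupole doublet)."""
    lor = lambda v0: depth / (1 + ((v - v0) / width) ** 2)
    return base - lor(centre - split / 2) - lor(centre + split / 2)

rng = np.random.default_rng(1)
true = (1.0, 0.12, 0.22, 1.8, 0.15)        # baseline, depth, isomer shift, splitting, HWHM
data = doublet(v, *true) + rng.normal(0, 0.004, v.size)

def chi2(p):
    # objective minimised by the evolutionary search
    return np.sum((data - doublet(v, *p)) ** 2)

bounds = [(0.9, 1.1), (0.01, 0.5), (-1, 1), (0.5, 3), (0.05, 0.5)]
result = differential_evolution(chi2, bounds, seed=1)
print("fitted parameters:", np.round(result.x, 3))
```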
Automation on the generation of genome-scale metabolic models.
Reyes, R; Gamermann, D; Montagud, A; Fuente, D; Triana, J; Urchueguía, J F; de Córdoba, P Fernández
2012-12-01
Nowadays, the reconstruction of genome-scale metabolic models is a non-automated and interactive process based on decision making. This lengthy process usually requires a full year of one person's work in order to satisfactorily collect, analyze, and validate the list of all metabolic reactions present in a specific organism. To write this list, one has to go manually through a huge amount of genomic, metabolomic, and physiological information. Currently, there is no optimal algorithm that allows one to go through all this information automatically and to generate the models while taking into account the probabilistic criteria of uniqueness and completeness that a biologist would consider. This work presents the automation of a methodology for the reconstruction of genome-scale metabolic models for any organism. The methodology is the automated version of the steps implemented manually for the reconstruction of the genome-scale metabolic model of a photosynthetic organism, Synechocystis sp. PCC6803. The steps of the reconstruction are implemented in a computational platform (COPABI) that generates the models from the probabilistic algorithms that have been developed. To validate the robustness of the developed algorithm, the metabolic models of several organisms generated by the platform have been studied together with published models that have been manually curated. Network properties of the models, such as connectivity and average shortest path length, have been compared and analyzed.
Towards a New Generation of Time-Series Visualization Tools in the ESA Heliophysics Science Archives
NASA Astrophysics Data System (ADS)
Perez, H.; Martinez, B.; Cook, J. P.; Herment, D.; Fernandez, M.; De Teodoro, P.; Arnaud, M.; Middleton, H. R.; Osuna, P.; Arviset, C.
2017-12-01
During the last decades, a varied set of Heliophysics missions have allowed the scientific community to gain a better knowledge of the solar atmosphere and activity. The remote-sensing images of missions such as SOHO have paved the ground for helio-based spatial data visualization software such as JHelioviewer/Helioviewer. On the other hand, the huge amount of in-situ measurements provided by other missions such as Cluster provides a wide base for plot visualization software whose reach is still far from being fully exploited. The Heliophysics Science Archives within the ESAC Science Data Center (ESDC) already provide a first generation of tools for time-series visualization focusing on each mission's needs: visualization of quicklook plots, cross-calibration time series, pre-generated/on-demand multi-plot stacks (Cluster), basic plot zoom in/out options (Ulysses) and easy navigation through the plots in time (Ulysses, Cluster, ISS-Solaces). However, the needs evolve, and the scientists involved in new missions require multi-variable plotting, heat-map stacks, interactive synchronization and axis-variable selection, among other improvements. The new Heliophysics archives (such as Solar Orbiter) and the evolution of existing ones (Cluster) intend to address these new challenges. This paper provides an overview of the different approaches for visualizing time series followed within the ESA Heliophysics Archives and their foreseen evolution.
NASA Astrophysics Data System (ADS)
Tahtah, Reda; Bouchoucha, Ali; Abid, Cherifa; Kadja, Mahfoud; Benkafada, Fouzia
2017-02-01
The sun provides the earth with huge amounts of energy that can be exploited in various forms. One way to exploit it is a parabolic trough solar concentrator integrated with a thermal storage tank, which we have already built and which is the main subject of this study. Such a study requires special attention to the effect of the fluid parameters, in addition to the thermal performance of the system. To do this, we studied the thermal behavior of the concentrator during the summer period, chosen for its stable illumination (clear sky). Before starting the test, the flow circuit was checked and the storage tank was completely filled with fluid; measurements started in the morning with the concentrator directed towards the sun until sunset, and we recorded the variation of different temperatures such as Tin, Tout, Tsur, Tfluid and Tamb. We compared the temperature evolution of water and thermal oil in order to determine which gives the better thermal behavior and to assess the importance of the specific heat of each fluid. The results show that using water inside the receiver gives better performance than using oil: the oil temperature rises more rapidly than that of water, but the water takes a longer time to cool down, which helps in the storage of heat.
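The role of specific heat mentioned above can be made concrete with the standard sensible-heat relation, using typical property values that are assumed here rather than taken from the paper (c_p of water about 4.18 kJ/(kg·K), common thermal oils roughly 1.8 to 2.5 kJ/(kg·K)):

```latex
Q = m\, c_p\, \Delta T
\qquad\Longrightarrow\qquad
\left.\frac{Q_{\mathrm{water}}}{Q_{\mathrm{oil}}}\right|_{m,\,\Delta T\ \mathrm{fixed}}
 = \frac{c_{p,\mathrm{water}}}{c_{p,\mathrm{oil}}}
 \approx \frac{4.18}{2.0} \approx 2
```

So, for the same mass and temperature change, water stores roughly twice the heat, while the lower heat capacity of the oil is consistent with its faster temperature rise reported above.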
Donard, O F X; Bruneau, F; Moldovan, M; Garraud, H; Epov, V N; Boust, D
2007-03-28
Among the transuranic elements present in the environment, plutonium isotopes are mainly attached to particles, and they are therefore of great interest for the study and modelling of particle transport in the marine environment. Except in the close vicinity of industrial sources, plutonium concentration in marine sediments is very low (from 10⁻⁴ ng kg⁻¹ for ²⁴¹Pu to 10 ng kg⁻¹ for ²³⁹Pu), and therefore the measurement of ²³⁸Pu, ²³⁹Pu, ²⁴⁰Pu, ²⁴¹Pu and ²⁴²Pu in sediments at such concentration levels requires the use of very sensitive techniques. Moreover, the sediment matrix contains huge amounts of mineral species, uranium and organic substances that must be removed before the determination of plutonium isotopes. Hence, an efficient sample preparation step is necessary prior to analysis. Within this work, a chemical procedure for the extraction, purification and pre-concentration of plutonium from marine sediments prior to sector-field inductively coupled plasma mass spectrometry (SF-ICP-MS) analysis has been optimized. The analytical method developed yields a pre-concentrated solution of plutonium from which ²³⁸U and ²⁴¹Am have been removed, and which is suitable for the direct and simultaneous measurement of ²³⁹Pu, ²⁴⁰Pu, ²⁴¹Pu and ²⁴²Pu by SF-ICP-MS.
Zhao, Yu; Liu, Yide; Lai, Ivan K. W.; Zhang, Hongfeng; Zhang, Yi
2016-01-01
As one of the latest revolutions in networking technology, social networks allow users to keep connected and exchange information. Driven by the rapid development of wireless technology and the diffusion of mobile devices, social networks have experienced a tremendous change based on mobile sensor computing. More and more mobile sensor network applications have appeared with the emergence of a huge number of users. Therefore, an in-depth discussion of the human-computer interaction (HCI) issues of mobile sensor computing is required. The target of this study is to extend the discussion on HCI by examining the relationships between users' compound attitudes (i.e., affective and cognitive attitudes), engagement and electronic word-of-mouth (eWOM) behaviors in the context of mobile sensor computing. A conceptual model is developed, based on which 313 valid questionnaires are collected. The research discusses the level of impact on the eWOM of mobile sensor computing by considering user-technology issues, including compound attitude and engagement, which can inform further discussions of the HCI of mobile sensor computing. In addition, we find that user engagement plays a mediating role between users' compound attitudes and eWOM. The research results can also help the mobile sensor computing industry to develop effective strategies and build strong user-product (brand) relationships. PMID:26999155
Study to the current protection of personal data in the educational sector in Indonesia
NASA Astrophysics Data System (ADS)
Rosmaini, E.; Kusumasari, T. F.; Lubis, M.; Lubis, A. R.
2018-03-01
This study examines how legal experts interpret UU ITE to protect personal data based on privacy principles, using content analysis. This act is important in order to govern the collection, use, transfer, disclosure and storage of personal data for profit or other commercial purposes. By recognizing both the right of individuals to privacy and the need of organizations to utilize customer data, the Act, which was amended by Parliament on 27 October 2016, has a critical role as a protection guideline in Indonesia. Increasingly, with the use of advanced technology, data protection has become one of the main issues in various sectors, especially the educational sector. Educational institutions require large amounts of personal data to run their business processes supporting learning, teaching, research and administration. This involves a wide range of personal data from institutions, agencies, colleges, lecturers, students and parents, which might include sensitive and confidential data such as historical, health, financial, academic and experience background. Underestimating and ignoring these issues can lead to disasters such as blackmail, stalking, bullying or improper use of personal data. In aggregate, they might cause huge losses to the institution, either financial or in terms of trust. Thus, this study analyses the privacy principles of UU ITE through 21 coders with legal expertise to obtain a better understanding of the appropriate approach to implementing privacy policy in the educational sector.
Parallel computing method for simulating hydrological processesof large rivers under climate change
NASA Astrophysics Data System (ADS)
Wang, H.; Chen, Y.
2016-12-01
Climate change is one of the most notorious global environmental problems in the world. It has altered the temporal and spatial distribution of watershed hydrological processes, especially in the world's large rivers. Watershed hydrological process simulation based on physically based distributed hydrological models can give better results than lumped models. However, such simulation involves a large amount of calculation, especially for large rivers, and thus needs huge computing resources that may not be steadily available to researchers, or only at high expense; this has seriously restricted research and application. Current parallel methods mostly parallelize the computation in the space and time dimensions: based on a distributed hydrological model, they process the natural features in order, grid by grid (unit or sub-basin), from upstream to downstream. This article proposes a high-performance computing method for hydrological process simulation with a high speedup ratio and parallel efficiency. It combines the temporal and spatial runoff characteristics of the distributed hydrological model with methods adopting distributed data storage, an in-memory database, distributed computing, and parallel computing based on computing power units. The method has strong adaptability and extensibility, which means it can make full use of the available computing and storage resources under the condition of limited computing resources, and the computing efficiency improves linearly with the increase of computing resources. This method can satisfy the parallel computing requirements of hydrological process simulation in small, medium and large rivers.
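The platform described in the abstract is not reproduced here, but the core scheduling idea (processing sub-basins of the same topological level in parallel while preserving the upstream-to-downstream order) can be sketched with standard Python multiprocessing; the drainage network and the "runoff" computation below are invented placeholders:

```python
from concurrent.futures import ProcessPoolExecutor

# Hypothetical drainage network: each sub-basin lists its upstream neighbours.
UPSTREAM = {
    "A": [], "B": [], "C": ["A", "B"],   # A and B drain into C
    "D": [], "E": ["C", "D"],            # C and D drain into E (outlet)
}

def compute_runoff(basin, inflow):
    """Placeholder for the real hydrological computation of one sub-basin."""
    local = 10.0                          # pretend local runoff generated in this basin
    return basin, inflow + local

def topological_levels(upstream):
    """Group sub-basins into levels; every basin appears after all of its upstream ones."""
    done, levels = set(), []
    while len(done) < len(upstream):
        level = [b for b, ups in upstream.items()
                 if b not in done and all(u in done for u in ups)]
        levels.append(level)
        done.update(level)
    return levels

if __name__ == "__main__":
    outflow = {}
    with ProcessPoolExecutor() as pool:
        for level in topological_levels(UPSTREAM):
            inflows = [sum(outflow[u] for u in UPSTREAM[b]) for b in level]
            # All basins in one level are independent, so they run in parallel.
            for basin, q in pool.map(compute_runoff, level, inflows):
                outflow[basin] = q
    print(outflow)   # {'A': 10.0, 'B': 10.0, 'D': 10.0, 'C': 30.0, 'E': 50.0}
```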
Requirements for a multifunctional code architecture
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tiihonen, O.; Juslin, K.
1997-07-01
The present paper studies a set of requirements for a multifunctional simulation software architecture in the light of experiences gained in developing and using the APROS simulation environment. The huge steps taken in the development of computer hardware and software during the last ten years are changing the status of traditional nuclear safety analysis software. The affordable computing power on the safety analyst's desk by far exceeds the possibilities offered to him/her ten years ago. At the same time, the features of everyday office software tend to set standards for the way the input data and calculational results are managed.
Lin, Jyh-Miin; Patterson, Andrew J; Chang, Hing-Chiu; Gillard, Jonathan H; Graves, Martin J
2015-10-01
To propose a new reduced field-of-view (rFOV) strategy for iterative reconstructions in a clinical environment. Iterative reconstructions can incorporate regularization terms to improve the image quality of periodically rotated overlapping parallel lines with enhanced reconstruction (PROPELLER) MRI. However, the large amount of calculation required for full-FOV iterative reconstructions has posed a huge computational challenge for clinical usage. By subdividing the entire problem into smaller rFOVs, the iterative reconstruction can be accelerated on a desktop with a single graphics processing unit (GPU). This rFOV strategy divides the iterative reconstruction into blocks, based on the block-diagonal dominant structure. A near real-time reconstruction system was developed for the clinical MR unit, and parallel computing was implemented using an object-oriented model. In addition, the Toeplitz method was implemented on the GPU to reduce the time required for full interpolation. Using the data acquired from the PROPELLER MRI, the reconstructed images were then saved in the Digital Imaging and Communications in Medicine (DICOM) format. The proposed rFOV reconstruction reduced the gridding time by 97%, and the total iteration time was 3 s even with multiple processes running. A phantom study showed that the structural similarity index for rFOV reconstruction was statistically superior to conventional density compensation (p < 0.001). An in vivo study validated the increased signal-to-noise ratio, which is over four times higher than with density compensation. The image sharpness index was improved using the regularized reconstruction implemented. The rFOV strategy permits near real-time iterative reconstruction to improve the image quality of PROPELLER images. Substantial improvements in image quality metrics were validated in the experiments. The concept of rFOV reconstruction may potentially be applied to other kinds of iterative reconstructions to shorten the reconstruction duration.
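The MRI reconstruction itself is beyond a short example, but the structural trick exploited above, namely that a (near) block-diagonal system can be solved block by block instead of all at once, can be illustrated with a toy least-squares problem (the sizes and data here are arbitrary and have nothing to do with PROPELLER):

```python
import numpy as np

rng = np.random.default_rng(0)

# Four independent "rFOV" blocks of a block-diagonal system A x = y.
blocks = [rng.standard_normal((64, 32)) for _ in range(4)]
x_true = [rng.standard_normal(32) for _ in range(4)]
y_obs  = [A @ x for A, x in zip(blocks, x_true)]

# Solving each block separately recovers the full solution at a fraction of
# the cost of assembling and solving the full (256 x 128) system.
x_hat = [np.linalg.lstsq(A, y, rcond=None)[0] for A, y in zip(blocks, y_obs)]

err = max(np.max(np.abs(xh - xt)) for xh, xt in zip(x_hat, x_true))
print(f"max reconstruction error across blocks: {err:.2e}")
```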
Monitoring of small laboratory animal experiments by a designated web-based database.
Frenzel, T; Grohmann, C; Schumacher, U; Krüll, A
2015-10-01
Multiple-parametric small animal experiments require, by their very nature, a sufficient number of animals, which may need to be large to obtain statistically significant results.(1) For this reason, database-related systems are required to collect the experimental data as well as to support the later (re-)analysis of the information gained during the experiments. In particular, the monitoring of animal welfare is simplified by the inclusion of warning signals (for instance, loss in body weight >20%). Digital patient charts have been developed for human patients but are usually not able to fulfill the specific needs of animal experimentation. To address this problem, a unique web-based monitoring system using standard MySQL, PHP, and nginx has been created. PHP was used to create the HTML-based user interface and outputs in a variety of proprietary file formats, namely portable document format (PDF) or spreadsheet files. This article demonstrates its fundamental features and the easy and secure access it offers to the data from any place using a web browser. This information will help other researchers create their own individual databases in a similar way. The use of QR codes plays an important role in the stress-free use of the database. We demonstrate a way to easily identify all animals, samples and data collected during the experiments. Specific ways to record animal irradiations and chemotherapy applications are shown. This new analysis tool allows the effective and detailed analysis of huge amounts of data collected through small animal experiments. It supports proper statistical evaluation of the data and provides excellent retrievable data storage. © The Author(s) 2015.
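The system described above is PHP/MySQL-based and is not reproduced here; the sketch below only illustrates, in Python, the kind of welfare warning rule the abstract mentions (body-weight loss greater than 20% relative to a baseline). The animal records and the threshold handling are invented for the example:

```python
WEIGHT_LOSS_LIMIT = 0.20  # a 20 % loss relative to baseline triggers a warning

# Hypothetical records: (animal id, baseline weight in g, latest weight in g)
records = [
    ("M-001", 25.0, 24.1),
    ("M-002", 26.3, 20.4),
    ("M-003", 24.8, 19.7),
]

def weight_warnings(records, limit=WEIGHT_LOSS_LIMIT):
    """Yield animals whose relative weight loss exceeds the configured limit."""
    for animal_id, baseline, current in records:
        loss = (baseline - current) / baseline
        if loss > limit:
            yield animal_id, round(loss * 100, 1)

for animal_id, loss_pct in weight_warnings(records):
    print(f"WARNING: {animal_id} lost {loss_pct}% of baseline body weight")
```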
Heo, Gwanghee; Son, Byungjik; Kim, Chunggil; Jeon, Seunggon; Jeon, Joonryong
2018-05-09
A disaster-preventive structural health monitoring (SHM) system needs to be equipped with the following abilities: first, it should be able to simultaneously measure diverse types of data (e.g., displacement, velocity, acceleration, strain, load, temperature, humidity, etc.) for accurate diagnosis. Second, it also requires a standalone power supply to guarantee its immediate response in a crisis (e.g., sudden interruption of normal AC power in disaster situations). Furthermore, it should be capable of prompt processing and real-time wireless communication of a huge amount of data. Therefore, this study is aimed at developing a wireless unified-maintenance system (WUMS) that satisfies all the requirements for a disaster-preventive SHM system for civil structures. The WUMS is designed to measure diverse types of structural responses in real time based on wireless communication, allowing users to selectively use the WiFi RF band, and finally working in standalone mode by means of field-programmable gate array (FPGA) technology. To verify its performance, the following tests were performed: (i) a test to see how far communication is possible in the open field, (ii) a test on a shaker to see how accurate the responses are, and (iii) a modal test on a bridge to see how exactly the real-time dynamic responses characterize the structure. The test results proved that the WUMS was able to secure stable communication up to nearly 800 m away, acquiring wireless responses in real time accurately when compared to the displacement and acceleration responses acquired through wired communication. The analysis of dynamic characteristics also showed that the real-time wireless acceleration responses satisfactorily represented the dynamic properties of the structure. Therefore, the WUMS is proven valid as an SHM system, and its outstanding performance is also demonstrated.
Panikashvili, David; Shi, Jian Xin; Schreiber, Lukas; Aharoni, Asaph
2009-12-01
The cuticle covering every plant aerial organ is largely made of cutin that consists of fatty acids, glycerol, and aromatic monomers. Despite the huge importance of the cuticle to plant development and fitness, our knowledge regarding the assembly of the cutin polymer and its integration in the complete cuticle structure is limited. Cutin composition implies the action of acyltransferase-type enzymes that mediate polymer construction through ester bond formation. Here, we show that a member of the BAHD family of acyltransferases (DEFECTIVE IN CUTICULAR RIDGES [DCR]) is required for incorporation of the most abundant monomer into the polymeric structure of the Arabidopsis (Arabidopsis thaliana) flower cutin. DCR-deficient plants display phenotypes that are typically associated with a defective cuticle, including altered epidermal cell differentiation and postgenital organ fusion. Moreover, levels of the major cutin monomer in flowers, 9(10),16-dihydroxy-hexadecanoic acid, decreased to an almost undetectable amount in the mutants. Interestingly, dcr mutants exhibit changes in the decoration of petal conical cells and mucilage extrusion in the seed coat, both phenotypes formerly not associated with cutin polymer assembly. Excessive root branching displayed by dcr mutants and the DCR expression pattern in roots pointed to the function of DCR belowground, in shaping root architecture by influencing lateral root emergence and growth. In addition, the dcr mutants were more susceptible to salinity, osmotic, and water deprivation stress conditions. Finally, the analysis of DCR protein localization suggested that cutin polymerization, possibly the oligomerization step, is partially carried out in the cytoplasmic space. Therefore, this study extends our knowledge regarding the functionality of the cuticular layer and the formation of its major constituent the polymer cutin.
NASA Astrophysics Data System (ADS)
O'Connor, Sean M.; Lynch, Jerome P.; Gilbert, Anna C.
2013-04-01
Wireless sensors have emerged to offer low-cost sensing with impressive functionality (e.g., data acquisition, computing, and communication) and modular installations. Such advantages enable higher nodal densities than tethered systems, resulting in increased spatial resolution of the monitoring system. However, high nodal density comes at a cost, as huge amounts of data are generated, weighing heavily on power sources, transmission bandwidth, and data management requirements, often making data compression necessary. The traditional compression paradigm consists of high-rate (>Nyquist) uniform sampling and storage of the entire target signal followed by some desired compression scheme prior to transmission. The recently proposed compressed sensing (CS) framework combines the acquisition and compression stages, thus removing the need for storage and operation on the full target signal prior to transmission. The effectiveness of the CS approach hinges on the presence of a sparse representation of the target signal in a known basis, similarly exploited by several compressive sensing applications today (e.g., imaging, MRI). Field implementations of CS schemes in wireless SHM systems have been challenging due to the lack of commercially available sensing units capable of sampling methods (e.g., random) consistent with the compressed sensing framework, often moving the evaluation of CS techniques to simulation and post-processing. The research presented here describes the implementation of a CS sampling scheme on the Narada wireless sensing node and the energy efficiencies observed in the deployed sensors. Of interest in this study is the compressibility of acceleration response signals collected from a multi-girder steel-concrete composite bridge. The study shows the benefit of CS in reducing data requirements while ensuring that data analysis on the compressed data remains accurate.
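The compressed sensing framework referred to here has a standard mathematical form, restated below for orientation (this is the generic textbook formulation, not the specific scheme deployed on the Narada nodes): the node acquires M random measurements of an N-sample signal, with M much smaller than N, and the full signal is recovered off-board by sparse optimization.

```latex
y = \Phi x \in \mathbb{R}^{M}, \qquad x = \Psi s \ \ (s \ \text{sparse}), \qquad M \ll N,
\qquad
\hat{s} = \arg\min_{s}\ \lVert s\rVert_{1}
\ \ \text{s.t.}\ \ \lVert y - \Phi\Psi s\rVert_{2} \le \varepsilon,
\qquad \hat{x} = \Psi\hat{s}
```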
In vitro production of M. × piperita not containing pulegone and menthofuran.
Bertoli, Alessandra; Leonardi, Michele; Krzyzanowska, Justyna; Oleszek, Wieslaw; Pistelli, Luisa
2012-01-01
The essential oils (EOs) and static headspaces (HSs) of in vitro plantlets and callus of Mentha x piperita were characterized by GC-MS analysis. Leaves were used as explants to induce the in vitro plant material. The EO yields of the in vitro biomass were much lower (0.1% v/w) than those of the parent plants (2% v/w). Many typical mint volatiles were emitted by the in vitro material, but the callus and in vitro plantlet EOs were characterized by the lack of both pulegone and menthofuran. This is an important difference between in vitro and in vivo plant material, as huge amounts of pulegone and menthofuran may jeopardise the safety of mint essential oil. Regarding the other characteristic volatiles, menthone was present in reduced amounts (2%) in the in vitro plantlets and was not detected in the callus, even though it represented the main constituent of the stem and leaf EOs obtained from the cultivated mint (26% leaves; 33% stems). The M. piperita callus was characterized by menthol (9%) and menthone (2%), while the in vitro plantlet EO showed lower amounts of both these compounds in favour of piperitenone oxide (45%). Therefore, the established callus and in vitro plantlets showed peculiar aromatic profiles characterized by the lack of pulegone and menthofuran, which have to be monitored in mint oil for their toxicity.
NASA Astrophysics Data System (ADS)
Gao, Chuanyu; Liu, Hanxiang; Cong, Jinxin; Han, Dongxue; Zhao, Winston; Lin, Qianxin; Wang, Guoping
2018-05-01
Black carbon (BC), the byproduct of incomplete combustion of fossil fuels and biomass, can be stored in soil for a long time and can potentially archive changes in natural and human activities. Increasing amounts of BC have been produced by human activities during the past 150 years and have influenced global climate change and the carbon cycle. Identifying historical BC sources is important for knowing how past human activities influenced BC and BC transport processes in the atmosphere. In this study, PAH components and δ13C-BC in a peatland in the Sanjiang Plain were used to identify and verify regional BC sources during the last 150 years. Results showed that the environment-unfriendly industry that developed at the end of the 1950s produced a great amount of BC and contributed the most BC in this period. In the other periods before the 1990s, however, BC in the Sanjiang Plain was mainly produced by incomplete biomass burning; in particular, slash-and-burn clearing of pastures and forests during the regional reclamation periods between the 1960s and 1980s produced a huge amount of biomass-burning BC, which was then deposited into the surrounding ecosystems. With regional reclamation decreasing and environment-friendly industry developing, the proportion of BC emitted and deposited from transportation sources increased, and transportation became an important BC source in the Sanjiang Plain after the 1990s.
Matzenbacher, Cristina Araujo; Garcia, Ana Letícia Hilario; Dos Santos, Marcela Silva; Nicolau, Caroline Cardoso; Premoli, Suziane; Corrêa, Dione Silva; de Souza, Claudia Telles; Niekraszewicz, Liana; Dias, Johnny Ferraz; Delgado, Tânia Valéria; Kalkreuth, Wolfgang; Grivicich, Ivana; da Silva, Juliana
2017-02-15
Coal mining and combustion generate huge amounts of bottom and fly ash and are major causes of environmental pollution and health hazards due to the release of polycyclic aromatic hydrocarbons (PAHs) and heavy metals. The Candiota coalfield in Rio Grande do Sul is one of the largest open-cast coal mines in Brazil. The aim of this study was to evaluate the genotoxic and mutagenic effects of coal, bottom ash and fly ash samples from Candiota with the comet assay (alkaline and modified versions) and the micronucleus test using the lung fibroblast cell line V79. Qualitative and quantitative analyses of PAHs and inorganic elements were carried out by High Performance Liquid Chromatography (HPLC) and by Particle-Induced X-ray Emission (PIXE) techniques, respectively. The samples demonstrated genotoxic and mutagenic effects. The comet assay modified with the formamidopyrimidine DNA glycosylase (FPG) enzyme showed damage related to oxidative stress mechanisms. The amount of PAHs was highest in fly ash, followed by pulverized coal. The amount of inorganic elements was highest in fly ash, followed by bottom ash. It is concluded that the samples induce DNA damage by mechanisms that include oxidative stress, due to their complex composition, and that protective measures have to be taken regarding occupational and environmental hazards. Copyright © 2016 Elsevier B.V. All rights reserved.
Hemangiopericytoma of Greater Omentum Presenting as a Huge Abdominal Lump
Chatterjee, Damodar; Sarkar, Pradip; Sengupta, Niladri; Singh, W. Gopimohan
2008-01-01
Hemangiopericytoma is a rare neoplasm that can occur in any part of the human body, but it rarely develops in the greater omentum. We report a case of a patient who presented with a huge abdominal lump. At laparotomy, a huge vascular tumor, which was observed originating from the greater omentum, was resected. Histopathology investigation revealed this tumor as a benign hemangiopericytoma with a malignant potential. PMID:19568508
Improving transmission efficiency of large sequence alignment/map (SAM) files.
Sakib, Muhammad Nazmus; Tang, Jijun; Zheng, W Jim; Huang, Chin-Tser
2011-01-01
Research in bioinformatics primarily involves collection and analysis of a large volume of genomic data. Naturally, it demands efficient storage and transfer of this huge amount of data. In recent years, some research has been done to find efficient compression algorithms to reduce the size of various sequencing data. One way to improve the transmission time of large files is to apply a maximum lossless compression on them. In this paper, we present SAMZIP, a specialized encoding scheme, for sequence alignment data in SAM (Sequence Alignment/Map) format, which improves the compression ratio of existing compression tools available. In order to achieve this, we exploit the prior knowledge of the file format and specifications. Our experimental results show that our encoding scheme improves compression ratio, thereby reducing overall transmission time significantly.
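SAMZIP itself is not available here, but the general idea it builds on, namely exploiting knowledge of the tab-separated SAM layout so that similar fields are compressed together, can be sketched as follows; the comparison against naive whole-file compression uses bz2 and a made-up fragment of SAM-like text:

```python
import bz2

# A tiny made-up SAM-like fragment (real files have millions of alignment lines).
sam_text = "\n".join(
    f"read{i}\t0\tchr1\t{1000 + 7 * i}\t60\t50M\t*\t0\t0\t" + "ACGT" * 12 + "AC\t" + "I" * 50
    for i in range(2000)
)

# Naive approach: compress the file as one opaque byte stream.
naive = len(bz2.compress(sam_text.encode()))

# Format-aware approach: split each alignment line into its tab-separated fields
# and compress each column as its own stream, so similar values sit together.
columns = list(zip(*(line.split("\t") for line in sam_text.splitlines())))
column_wise = sum(len(bz2.compress("\n".join(col).encode())) for col in columns)

print(f"whole-file bz2:  {naive} bytes")
print(f"column-wise bz2: {column_wise} bytes")
```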
The data acquisition and reduction challenge at the Large Hadron Collider.
Cittolin, Sergio
2012-02-28
The Large Hadron Collider detectors are technological marvels which resemble, in functionality, three-dimensional digital cameras with 100 Mpixels, capable of observing proton-proton (pp) collisions at the crossing rate of 40 MHz. Data handling limitations at the recording end imply the selection of only one pp event out of each 10^5. The readout and processing of this huge amount of information, along with the selection of the best approximately 200 events every second, is carried out by a trigger and data acquisition system, supplemented by a sophisticated control and monitor system. This paper presents an overview of the challenges that the development of these systems has presented over the past 15 years. It concludes with a short historical perspective, some lessons learnt and a few thoughts on the future.
Building an Ontology-driven Database for Clinical Immune Research
Ma, Jingming
2006-01-01
Clinical research on the immune response usually generates a huge amount of biomedical testing data over a certain period of time. User-friendly data management systems based on relational databases will help immunologists/clinicians to fully manage these data. On the other hand, the same biological assays, such as ELISPOT and flow cytometric assays, are involved in immunological experiments regardless of the study purpose. The reuse of biological knowledge is one of the driving forces behind ontology-driven data management. Therefore, an ontology-driven database will help to handle different clinical immune research studies and help immunologists/clinicians easily understand each other's immunological data. We discuss an outline for building an ontology-driven data management system for clinical immune research (ODMim). PMID:17238637
Financial Time-series Analysis: a Brief Overview
NASA Astrophysics Data System (ADS)
Chakraborti, A.; Patriarca, M.; Santhanam, M. S.
Prices of commodities or assets produce what is called a time series. Different kinds of financial time series have been recorded and studied for decades. Nowadays, all transactions on a financial market are recorded, leading to a huge amount of data available, either freely on the Internet or commercially. Financial time-series analysis is of great interest to practitioners as well as to theoreticians for making inferences and predictions. Furthermore, the stochastic uncertainties inherent in financial time series and the theory needed to deal with them make the subject especially interesting not only to economists, but also to statisticians and physicists [1]. While it would be a formidable task to make an exhaustive review of the topic, with this review we try to give a flavor of some of its aspects.
Optical research of biomaterials of Sorbulak
NASA Astrophysics Data System (ADS)
Esyrev, O. V.; Kupchishin, A. A.; Kupchishin, A. I.; Voronova, N. A.
2016-02-01
Within the framework of optical research, it was established that unpolluted samples of sedge stems show structuring of the material, whereas contaminated and irradiated samples show a blurring of the structure. Sampling of sedges and rushes for the research was carried out in areas near the first dam of Sorbulak. For comparison, samples of the same materials were taken far away from populated areas. Irradiation was carried out with high-energy electrons with an energy of 2 MeV and an integral dose of 3·10^5 Gy. Irradiation leads to a more pronounced structuring of the material. There is a significant difference in the structural elements (epidermis, vascular bundles, parenchymal cells, etc.). Dark spots and bands associated with the presence of huge amounts of heavy metals are traced against the background of a green matrix.
Visual Analytics for MOOC Data.
Qu, Huamin; Chen, Qing
2015-01-01
With the rise of massive open online courses (MOOCs), tens of millions of learners can now enroll in more than 1,000 courses via MOOC platforms such as Coursera and edX. As a result, a huge amount of data has been collected. Compared with traditional education records, the data from MOOCs has much finer granularity and also contains new pieces of information. It is the first time in history that such comprehensive data related to learning behavior has become available for analysis. What roles can visual analytics play in this MOOC movement? The authors survey the current practice and argue that MOOCs provide an opportunity for visualization researchers and that visual analytics systems for MOOCs can benefit a range of end users such as course instructors, education researchers, students, university administrators, and MOOC providers.
Pose tracking for augmented reality applications in outdoor archaeological sites
NASA Astrophysics Data System (ADS)
Younes, Georges; Asmar, Daniel; Elhajj, Imad; Al-Harithy, Howayda
2017-01-01
In recent years, agencies around the world have invested huge amounts of effort toward digitizing many aspects of the world's cultural heritage. Of particular importance is the digitization of outdoor archaeological sites. In the spirit of valorization of this digital information, many groups have developed virtual or augmented reality (AR) computer applications themed around a particular archaeological object. The problem of pose tracking in outdoor AR applications is addressed. Different positional systems are analyzed, resulting in the selection of a monocular camera-based user tracker. The limitations that challenge this technique from map generation, scale, anchoring, to lighting conditions are analyzed and systematically addressed. Finally, as a case study, our pose tracking system is implemented within an AR experience in the Byblos Roman theater in Lebanon.
Modeling gene regulatory networks: A network simplification algorithm
NASA Astrophysics Data System (ADS)
Ferreira, Luiz Henrique O.; de Castro, Maria Clicia S.; da Silva, Fabricio A. B.
2016-12-01
Boolean networks have been used for some time to model Gene Regulatory Networks (GRNs), which describe cell functions. Those models can help biologists to make predictions and prognoses, and even to devise specialized treatments, when some disturbance of the GRN leads to a disease condition. However, the amount of information related to a GRN can be huge, making the task of inferring its Boolean network representation quite a challenge. The method shown here takes into account information about the interactome to build a network, where each node represents a protein, and uses the entropy of each node as a key to reduce the size of the network, allowing the subsequent inference process to focus only on the main protein hubs, the ones with the most potential to interfere with overall network behavior.
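The interactome-based pipeline of the paper is not reproduced here; the fragment below only illustrates the entropy criterion itself, scoring each node by the Shannon entropy of its observed binary state profile and keeping the most informative nodes, on an invented state matrix with hypothetical protein names:

```python
import math

# Hypothetical binary state profiles: node -> on/off states observed across 8 samples.
states = {
    "P53":   [0, 1, 1, 0, 1, 0, 1, 0],   # varies a lot -> high entropy
    "MDM2":  [1, 1, 1, 0, 1, 1, 0, 1],
    "ACTB":  [1, 1, 1, 1, 1, 1, 1, 1],   # constant -> zero entropy
    "GAPDH": [0, 0, 0, 0, 0, 0, 0, 1],
}

def shannon_entropy(profile):
    """Entropy (in bits) of a binary on/off profile."""
    p1 = sum(profile) / len(profile)
    h = 0.0
    for p in (p1, 1 - p1):
        if p > 0:
            h -= p * math.log2(p)
    return h

scores = {node: shannon_entropy(prof) for node, prof in states.items()}
# Keep only the nodes whose entropy exceeds an arbitrary cut-off (0.5 bit here).
keep = [node for node, h in sorted(scores.items(), key=lambda kv: -kv[1]) if h > 0.5]
print(scores)
print("nodes kept for network inference:", keep)
```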
Mice prefer draught-free housing.
Krohn, T C; Hansen, A K
2010-10-01
An increasing number of rodents are housed in individually ventilated cage (IVC) systems, as these seem to be very effective for the protection of animals against infections, as well as protecting the staff against allergens. For the IVC systems to be properly ventilated, a huge amount of air has to be blown into the cage, which may cause a draught at animal level inside the cage. The aim of the present study was to evaluate the preferences of mice for differing levels of air speeds and air changes inside the cage. It has been concluded that mice do react to draughts, whereas they do not seem to be affected by a high number of air changes delivered without draught, which underlines the importance of applying draught-free IVC systems for mice.
Turkish meteor surveillance systems and network: Impact craters and meteorites database
NASA Astrophysics Data System (ADS)
Unsalan, O.; Ozel, M. E.; Derman, I. E.; Terzioglu, Z.; Kaygisiz, E.; Temel, T.; Topoyan, D.; Solmaz, A.; Yilmaz Kocahan, O.; Esenoglu, H. H.; Emrahoglu, N.; Yilmaz, A.; Yalcinkaya, B. O.
2014-07-01
In our project, we aim to construct the Turkish Meteor Surveillance Systems and Network in Turkey. For this goal, video observation systems from SonotaCo (Japan) were chosen. Meteors are going to be observed with the specific cameras, their orbits will be calculated by the software from SonotaCo, and the fall/impact sites will be examined in field trips. The collected meteorites will be investigated by IR-Raman spectroscopic techniques and SEM-EDX analyses in order to set up a database. On the other hand, according to our Prime Ministry Ottoman Archives, there is a huge amount of reports of falls from past centuries. In order to treat these data properly, it is obvious that processing systems should be constructed and developed.
A Haptic-Enhanced System for Molecular Sensing
NASA Astrophysics Data System (ADS)
Comai, Sara; Mazza, Davide
The science of haptics has received enormous attention in the last decade. One of the major application trends of haptics technology is data visualization and training. In this paper, we present a haptically enhanced system for the manipulation and tactile exploration of molecules. The geometrical models of molecules are extracted either from theoretical or empirical data using file formats widely adopted in the chemical and biological fields. The addition of information computed with computational chemistry tools allows users to feel the interaction forces between an explored molecule and a charge associated with the haptic device, and to visualize a huge amount of numerical data in a more comprehensible way. The developed tool can be used either for teaching or research purposes due to its high reliance on both theoretical and experimental data.
A Binary Array Asynchronous Sorting Algorithm with Using Petri Nets
NASA Astrophysics Data System (ADS)
Voevoda, A. A.; Romannikov, D. O.
2017-01-01
Nowadays, the tasks of speeding up computations and/or optimizing them are topical. Among the approaches to solving these tasks, this paper considers a method that applies parallelization and asynchronization to a sorting algorithm. Sorting methods are elementary methods used in a huge number of different applications. In the paper, we offer a method of array sorting based on dividing the array into a set of independent adjacent pairs of numbers and comparing them in parallel and asynchronously. This distinguishes the offered method from traditional sorting algorithms (such as quicksort, merge sort, insertion sort and others). The algorithm is implemented with the use of Petri nets, as the most suitable tool for describing asynchronous systems.
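The Petri-net implementation is not reproduced here, but the pair-based scheme described above is close in spirit to odd-even transposition sorting, where all adjacent pairs of one parity are mutually independent and could therefore be compared concurrently; the sequential sketch below only marks where that concurrency would apply:

```python
def odd_even_transposition_sort(a):
    """Sort a list by repeatedly comparing disjoint adjacent pairs.

    Within each phase the pairs (i, i+1) are independent of one another,
    so a parallel/asynchronous implementation could compare them all at once.
    """
    a = list(a)
    n = len(a)
    for phase in range(n):
        start = phase % 2                    # alternate between even and odd pairs
        for i in range(start, n - 1, 2):     # these comparisons are mutually independent
            if a[i] > a[i + 1]:
                a[i], a[i + 1] = a[i + 1], a[i]
    return a

print(odd_even_transposition_sort([5, 3, 8, 1, 9, 2, 7]))   # [1, 2, 3, 5, 7, 8, 9]
```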
Saini, Jitendra Kumar; Saini, Reetu; Tewari, Lakshmi
2015-08-01
Production of liquid biofuels, such as bioethanol, has been advocated as a sustainable option to tackle the problems associated with rising crude oil prices, global warming and diminishing petroleum reserves. Second-generation bioethanol is produced from lignocellulosic feedstock by its saccharification, followed by microbial fermentation and product recovery. Agricultural residues generated as wastes during or after the processing of agricultural crops are one such renewable and lignocellulose-rich biomass resource available in huge amounts for bioethanol production. These agricultural residues are converted to bioethanol in several steps, which are described here. This review covers the various steps involved in the production of second-generation bioethanol. Mechanisms and recent advances in pretreatment, cellulase production and second-generation ethanol production processes are described.
Adaptive distributed outlier detection for WSNs.
De Paola, Alessandra; Gaglio, Salvatore; Lo Re, Giuseppe; Milazzo, Fabrizio; Ortolani, Marco
2015-05-01
The paradigm of pervasive computing is gaining more and more attention nowadays, thanks to the possibility of obtaining precise and continuous monitoring. Ease of deployment and adaptivity are typically implemented by adopting autonomous and cooperative sensory devices; however, for such systems to be of any practical use, reliability and fault tolerance must be guaranteed, for instance by detecting corrupted readings amidst the huge amount of gathered sensory data. This paper proposes an adaptive distributed Bayesian approach for detecting outliers in data collected by a wireless sensor network; our algorithm aims at optimizing classification accuracy, time complexity and communication complexity, and also considering externally imposed constraints on such conflicting goals. The performed experimental evaluation showed that our approach is able to improve the considered metrics for latency and energy consumption, with limited impact on classification accuracy.
NASA Astrophysics Data System (ADS)
Vidmachenko, A. P.
2018-05-01
Water consists of the two most common chemical elements in the universe: hydrogen and oxygen. In studies of the solar system and other planetary systems, water has been found on planets, on their satellites, in cometary nuclei, in asteroids, and on dwarf planets such as Ceres and Pluto. Water also occurs in giant molecular clouds in interstellar space, in the material of protoplanetary disks, and in the atmospheres of exoplanets. In addition, water can exist in liquid form beneath the surface. Most of the satellites of the giant planets also contain a huge amount of water ice. Some satellites of Saturn and Jupiter even give evidence of the presence of oceans under their surfaces. These include, for example, Enceladus, Titan and Dione at Saturn, and Europa, Ganymede and Callisto at Jupiter; the satellite of Neptune, Triton, can also be included here.
2011-02-16
…year of 1944 (the year of highest Manhattan Project expenditures). [Footnote: Richard G. Hewlett and Oscar E. Anderson, Jr., The New World: A History of the…] …technological developments in the biological sciences may provide…[39] A case in point is the Manhattan Project undertaken by the United States to produce the first atomic weapon. A huge national effort was required to create the first atomic weapon in…
Quantum Strategies: Proposal to Experimentally Test a Quantum Economics Protocol
2009-04-09
fact that this algorithm requires only bipartite entangled states, which makes it feasible to implement, and a key focus of a larger program in quantum…passes through what is effectively a huge Mach-Zehnder fiber interferometer bounded by the Sagnac loop and PPBS1- is affected by this time-varying…strategy, no matter what the other players do. As we noted above, this means that there is no (classical) correlated equilibrium other than the Nash
BelleII@home: Integrate volunteer computing resources into DIRAC in a secure way
NASA Astrophysics Data System (ADS)
Wu, Wenjing; Hara, Takanori; Miyake, Hideki; Ueda, Ikuo; Kan, Wenxiao; Urquijo, Phillip
2017-10-01
The exploitation of volunteer computing resources has become a popular practice in the HEP computing community because of the huge amount of potential computing power it provides. In recent HEP experiments, grid middleware has been used to organize the services and the resources; however, it relies heavily on X.509 authentication, which is contradictory to the untrusted nature of volunteer computing resources. Therefore, one big challenge in utilizing volunteer computing resources is how to integrate them into the grid middleware in a secure way. The DIRAC interware, which is commonly used as the major component of the grid computing infrastructure for several HEP experiments, poses an even bigger challenge to this paradox, as its pilot is more closely coupled with operations requiring X.509 authentication compared to the pilot implementations in its peer grid interware. The Belle II experiment is a B-factory experiment at KEK, and it uses DIRAC for its distributed computing. In the BelleII@home project, in order to integrate volunteer computing resources into the Belle II distributed computing platform in a secure way, we adopted a new approach which detaches the payload execution from the Belle II DIRAC pilot (a customized pilot pulling and processing jobs from the Belle II distributed computing platform), so that the payload can run on volunteer computers without requiring any X.509 authentication. In this approach we developed a gateway service running on a trusted server which handles all the operations requiring X.509 authentication. So far, we have developed and deployed the prototype of BelleII@home and tested its full workflow, which proves the feasibility of this approach. This approach can also be applied to HPC systems whose worker nodes do not have outbound connectivity to interact with the DIRAC system in general.
34 CFR Appendix C to Part 379 - Calculating Required Matching Amount
Code of Federal Regulations, 2013 CFR
2013-07-01
Title 34, Education, Vol. 2 (revised as of 2013-07-01). Education Regulations of the Offices of the Department of Education (Continued), Office of… Appendix C to Part 379 - Calculating Required Matching Amount. 1. The method for calculating the…
34 CFR Appendix C to Part 379 - Calculating Required Matching Amount
Code of Federal Regulations, 2014 CFR
2014-07-01
Title 34, Education, Vol. 2 (revised as of 2014-07-01). Education Regulations of the Offices of the Department of Education (Continued), Office of… Appendix C to Part 379 - Calculating Required Matching Amount. 1. The method for calculating the…
Byurakan Cosmogony Concept in the Light of Modern Observational Data: Why We Need to Recall it?
NASA Astrophysics Data System (ADS)
Harutyunian, H. A.
2017-07-01
Some physically possible consequences of the interaction between baryonic matter and dark energy are considered. We argue that modern cosmogony and cosmology, based on the hypothesis of Kant and Laplace and its further modifications, are not adequate to the growing body of observational data available today. A thought experiment is conducted in the framework of generally accepted physical concepts and laws to study the most prominent consequences of interactions between various types of substances and the dark energy carrier. Such experiments allow one to arrive at the conclusion that, owing to continuous exchanges of energy between atomic nuclei and the bearer of dark energy, the binding energy of nuclei should decrease and their mass should increase over time. This process can be considered as growth of the total mass of the Universe at the expense of dark energy. One would then be able to explain the long-standing paradox of why the Universe did not collapse immediately after the mass formation event at the very beginning of the Universe's formation. On the other hand, this way of thinking leads to a physical picture of the Universe in which huge amounts of embryonic baryons possessing negligible masses can exist in the interiors of large cosmic objects, to be transformed into ordinary baryonic matter of vast mass in the future. As a result, clumps of matter of huge mass can be ejected from the cores of such objects.
Hua, Xing; Liu, Shao-Jie; Lu, Lin; Li, Chao-Xia; Yu, Li-Na
2012-08-01
To study the clinicopathological characteristics and diagnosis of true hermaphroditism complicated with seminoma, we retrospectively analyzed the clinicopathological data of a case of true hermaphroditism complicated with seminoma and reviewed the related literature. The patient was a 42-year-old male, admitted for bilateral lower back pain and discomfort. CT showed a huge mass in the lower middle abdomen. Gross pathological examination revealed a mass of uterine tissue, 7 cm x 2 cm x 6 cm in size, with bilateral oviducts and ovarian tissue. There was a cryptorchid testis (4.0 cm x 2.5 cm x 1.5 cm) on the left and a huge tumor (22 cm x 9 cm x 6 cm) on the right of the uterine tissue. The tumor was completely encapsulated, with some testicular tissue. Microscopically, the tumor tissue was arranged in nests or sheets divided and surrounded by fibrous tissue. The tumor cells were large, with abundant and transparent cytoplasm, deeply stained nuclei, coarse granular chromatin, visible mitoses, and infiltration of a small number of lymphocytes in the stroma. The karyotype was 46,XX. Immunohistochemistry showed that PLAP and CD117 were positive, while AFP, vimentin, EMA, S100, CK-LMW, desmin, CD34 and CD30 were negative, and Ki-67 was 20% positive. A small amount of residual normal testicular tissue was seen in the tumor tissue. True hermaphroditism complicated with seminoma is rare. Histopathological analysis combined with immunohistochemical detection is of great value for its diagnosis and differential diagnosis.
NASA Astrophysics Data System (ADS)
Sun, Qizhen; Li, Xiaolei; Zhang, Manliang; Liu, Qi; Liu, Hai; Liu, Deming
2013-12-01
Fiber optic sensor networks are the development trend of fiber sensor technologies and industries. In this paper, I will discuss recent research progress on high-capacity fiber sensor networks with hybrid multiplexing techniques and their applications in the fields of security monitoring, environment monitoring, Smart eHome, etc. Firstly, I will present the architecture of the hybrid multiplexing sensor passive optical network (HSPON) and the key technologies for integrated access and intelligent management of massive fiber sensor units. Two typical hybrid WDM/TDM fiber sensor networks for perimeter intrusion monitoring and cultural relics security are introduced. Secondly, we propose the concept of the "Microstructure-Optical X Domain Reflector (M-OXDR)" for fiber sensor network expansion. By fabricating smart microstructures with multidimensional encoding capability and low insertion loss along the fiber, a fiber sensor network of simple structure and huge capacity (more than one thousand nodes) could be achieved. Assisted by WDM/TDM and WDM/FDM decoding methods, respectively, we built verification systems for long-haul and real-time temperature sensing. Finally, I will show a high-capacity and flexible fiber sensor network with IPv6-protocol-based hybrid fiber/wireless access. By developing fiber optic sensors with an embedded IPv6 protocol conversion module and an IPv6 router, a huge number of fiber optic sensor nodes can be uniquely addressed. Meanwhile, various kinds of sensing information can be integrated and accessed through the Next Generation Internet.
Huang, Tongtong; Anselme, Karine; Sarrailh, Segolene; Ponche, Arnaud
2016-01-30
The purpose of this study is to evaluate the potential of a simple high-performance liquid chromatography (HPLC) setup for the quantification of adsorbed proteins on various types of planar substrates with limited area (<3 cm²). Protein quantification was investigated with a liquid chromatography chain equipped with a size-exclusion column or a reversed-phase column. The method was validated according to the guidelines of the International Conference on Harmonization of Technical Requirements for Registration of Pharmaceuticals for Human Use (ICH), and all the results obtained by HPLC were reliable. In simple adsorption tests in contact with hydrophilic (glass) and hydrophobic (polydimethylsiloxane, PDMS) surfaces, adsorption kinetics were determined and the amounts of adsorbed bovine serum albumin, myoglobin and lysozyme were obtained: as expected, for each protein the amount adsorbed at the plateau on glass (between 0.15 μg/cm² and 0.4 μg/cm²) is lower than on the hydrophobic PDMS surfaces (between 0.45 μg/cm² and 0.8 μg/cm²). These results were consistent with bicinchoninic acid protein determination. According to the ICH guidelines, both reversed-phase and size-exclusion HPLC can be validated for the quantification of adsorbed protein. However, we consider the size-exclusion approach more interesting in this field because additional information can be obtained for aggregative proteins. Indeed, the monomer, dimer and oligomer of bovine serum albumin (BSA) were observed in the chromatogram. On increasing the temperature, we found a decrease in the peak intensity of bovine serum albumin as well as in the fraction of dimer and oligomer after contact with the PDMS and glass surfaces. As the surface can act as a denaturing parameter, this information can have a huge impact on the elucidation of the interfacial behavior of proteins, and in particular on aggregation processes in pharmaceutical applications. Copyright © 2015 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Gitari, M. W.; Akinyemi, S. A.; Thobakgale, R.; Ngoejana, P. C.; Ramugondo, L.; Matidza, M.; Mhlongo, S. E.; Dacosta, F. A.; Nemapate, N.
2018-01-01
The mining industry in South Africa generates huge amounts of mine waste, including tailings, waste rock and spoils. The tailings materials are dumped in surface impoundments that become sources of hazards to the environment and the surrounding communities. The main environmental hazards posed by these tailings facilities are associated with their chemical constituents. Exposure to chemical constituents can occur through windblown dust, erosion into surface water bodies, inhalation by human beings and animals, and through bioaccumulation and biomagnification by plants. Numerous un-rehabilitated tailings dumps exist in the Limpopo province of South Africa. The communities found around these mines are constantly exposed to the environmental hazards posed by these tailings facilities. Development of a cost-effective technology that can beneficially utilize these tailings can reduce the environmental hazards and benefit the communities. This paper presents an initial evaluation of the copper and gold mine tailings in Limpopo, South Africa, with a view to assessing their suitability for conversion into beneficial geopolymeric materials. Copper tailings leachates had alkaline pH (7.34-8.49) while the gold tailings had acidic pH. XRD confirmed the presence of aluminosilicate minerals. Geochemical fractionation indicates that the majority of the major and trace species are present in the residual fraction. A significant amount of Ca, Cu and K was available in the mobile fraction and is expected to be released when the tailings contact aqueous solutions. Results from XRF indicate that the tailings are rich in SiO2, Al2O3 and CaO, which are the main ingredients in the geopolymerization process. The SiO2/Al2O3 ratios indicate that the tailings would require blending with an Al2O3-rich feedstock to develop maximum strength. Moreover, the tailings have particle sizes in the range of fine sand, which indicates potential application as aggregates in conventional brick manufacture.
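The blending requirement mentioned above amounts to a simple mass balance on the oxide contents. The following back-of-the-envelope sketch shows the calculation with hypothetical oxide contents and a hypothetical target ratio; the numbers are not taken from the study.

```python
# Back-of-the-envelope sketch (hypothetical numbers): how much Al2O3-rich
# feedstock must be blended with tailings to reach a target SiO2/Al2O3 mass
# ratio for geopolymerisation. Oxide contents and target ratio are
# illustrative assumptions only.
def blend_fraction(sio2_t, al2o3_t, sio2_f, al2o3_f, target_ratio):
    """Mass fraction x of feedstock such that
    (sio2_t*(1-x) + sio2_f*x) / (al2o3_t*(1-x) + al2o3_f*x) = target_ratio."""
    num = sio2_t - target_ratio * al2o3_t
    den = num - (sio2_f - target_ratio * al2o3_f)
    return num / den

# Tailings: 65 wt% SiO2, 8 wt% Al2O3; feedstock (e.g. a metakaolin-like
# material): 52 wt% SiO2, 43 wt% Al2O3; target SiO2/Al2O3 ratio of 3.5.
x = blend_fraction(65.0, 8.0, 52.0, 43.0, target_ratio=3.5)
print(f"blend ~ {x*100:.1f} wt% Al2O3-rich feedstock")
```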
Using Unplanned Fires to Help Suppressing Future Large Fires in Mediterranean Forests
Regos, Adrián; Aquilué, Núria; Retana, Javier; De Cáceres, Miquel; Brotons, Lluís
2014-01-01
Despite the huge resources invested in fire suppression, the impact of wildfires has considerably increased across the Mediterranean region since the second half of the 20th century. Modulating fire suppression efforts in mild weather conditions is an appealing but hotly-debated strategy to use unplanned fires and associated fuel reduction to create opportunities for suppression of large fires in future adverse weather conditions. Using a spatially-explicit fire–succession model developed for Catalonia (Spain), we assessed this opportunistic policy by using two fire suppression strategies that reproduce how firefighters in extreme weather conditions exploit previous fire scars as firefighting opportunities. We designed scenarios by combining different levels of fire suppression efficiency and climatic severity for a 50-year period (2000–2050). An opportunistic fire suppression policy induced large-scale changes in fire regimes and decreased the area burnt under extreme climate conditions, but only accounted for up to 18–22% of the area to be burnt in reference scenarios. The area suppressed in adverse years tended to increase in scenarios with increasing amounts of area burnt during years dominated by mild weather. Climate change had counterintuitive effects on opportunistic fire suppression strategies. Climate warming increased the incidence of large fires under uncontrolled conditions but also indirectly increased opportunities for enhanced fire suppression. Therefore, to shift fire suppression opportunities from adverse to mild years, we would require a disproportionately large amount of area burnt in mild years. We conclude that the strategic planning of fire suppression resources has the potential to become an important cost-effective fuel-reduction strategy at large spatial scale. We do however suggest that this strategy should probably be accompanied by other fuel-reduction treatments applied at broad scales if large-scale changes in fire regimes are to be achieved, especially in the wider context of climate change. PMID:24727853
45 CFR 160.404 - Amount of a civil money penalty.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 45 Public Welfare 1 2011-10-01 2011-10-01 false Amount of a civil money penalty. 160.404 Section... RELATED REQUIREMENTS GENERAL ADMINISTRATIVE REQUIREMENTS Imposition of Civil Money Penalties § 160.404 Amount of a civil money penalty. (a) The amount of a civil money penalty will be determined in accordance...
30 CFR 243.8 - When will MMS suspend my obligation to comply with an order?
Code of Federal Regulations, 2010 CFR
2010-07-01
...), and: (1) If the amount under appeal is less than $10,000 or does not require payment of a specified... the amount under appeal is less than $1,000 or does not require payment, MMS will suspend your... paying any demanded amount or complying with any other requirement pending appeal. However, voluntarily...
NASA Astrophysics Data System (ADS)
Garapati, N.; Randolph, J.; Saar, M. O.
2013-12-01
CO2-Plume Geothermal (CPG) involves injection of CO2 as a working fluid to extract heat from naturally highly permeable sedimentary basins. The injected CO2 forms a large subsurface CO2 plume that absorbs heat from the geothermal reservoir and eventually buoyantly rises to the surface. The heat density of sedimentary basins is typically relatively low. However, this drawback is likely counteracted by the large accessible volume of natural reservoirs compared to artificial, hydrofractured, and thus small-scale, reservoirs. Furthermore, supercritical CO2 has a large mobility (inverse kinematic viscosity) and expansibility compared to water, resulting in the formation of a strong thermosiphon that eliminates parasitic pumping power requirements and significantly increases electricity production efficiency. Simultaneously, the life span of the geothermal power plant can be increased by operating the CPG system such that it depletes the geothermal reservoir heat slowly. Because the produced CO2 is reinjected into the ground together with the main CO2 sequestration stream coming from a CO2 emitter, all of the CO2 is ultimately geologically sequestered, resulting in a CO2-sequestering geothermal power plant with a negative carbon footprint. Conventional geothermal processes require pumping huge amounts of water to propagate fractures in the reservoir; the CPG process eliminates this requirement and conserves water resources. Here, we present results for the performance of a CPG system as a function of various geologic properties of multilayered systems, including permeability anisotropy, rock thermal conductivity, geothermal gradient, reservoir depth and initial native brine salinity, as well as the spacing between the injection and production wells. The model consists of a 50 m thick, radially symmetric grid with semi-analytic heat exchange and no fluid flow at the top and bottom boundaries and no fluid and heat flow at the lateral boundaries. We design Plackett-Burman experiments resulting in 16 simulations for the seven parameters investigated. The reservoir is divided into 3-, 4-, or 5-layer systems with log-normal permeability distributions. We consider 10 sets of values for each case, resulting in a total of 16 x 3 x 10 = 480 simulations. We analyze the performance of the system to maximize the amount of heat energy extracted, minimize reservoir temperature depletion and maximize the CO2 concentration in the produced fluid. Achieving the latter objective reduces power system problems, as Welch and Boyle (GRC Trans. 2009) found that the CO2 concentration should be >94% in the systems they investigated.
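A 16-run, two-level screening design for seven factors of the Plackett-Burman type can be constructed from a Hadamard matrix. The sketch below shows this construction; the abstract names six of the parameters explicitly, so the seventh factor ("reservoir permeability") and the column assignment are assumptions for illustration, not the authors' actual design.

```python
# Sketch: building a 16-run, two-level screening design for seven factors
# from a Hadamard matrix (a Plackett-Burman-type design). The factor list is
# taken from the abstract where possible; "reservoir permeability" as the
# seventh factor and the column assignment are illustrative assumptions.
import numpy as np
from scipy.linalg import hadamard

factors = ["reservoir permeability", "permeability anisotropy",
           "rock thermal conductivity", "geothermal gradient",
           "reservoir depth", "brine salinity", "well spacing"]

H = hadamard(16)                   # 16 x 16 matrix of +1/-1 entries
design = H[:, 1:1 + len(factors)]  # drop the all-ones column, keep 7 columns

print(design.shape)                # (16, 7): 16 runs, 7 factors at levels -1/+1
for run, levels in enumerate(design, start=1):
    print(run, dict(zip(factors, levels)))
```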
Analysis of post-earthquake landslide activity and geo-environmental effects
NASA Astrophysics Data System (ADS)
Tang, Chenxiao; van Westen, Cees; Jetten, Victor
2014-05-01
Large earthquakes can cause huge losses to human society due to ground shaking, fault rupture and the high density of co-seismic landslides that can be triggered in mountainous areas. In areas that have been affected by such large earthquakes, the threat of landslides continues after the earthquake, as the co-seismic landslides may be reactivated by high-intensity rainfall events. Huge amounts of landslide material remain on the slopes after an earthquake, leading to a high frequency of landslides and debris flows that threaten lives and create great difficulties for post-seismic reconstruction in the earthquake-hit regions. Without critical information such as the frequency and magnitude of landslides after a major earthquake, reconstruction planning and hazard mitigation works are difficult. The area hit by the Mw 7.9 Wenchuan earthquake in 2008, Sichuan province, China, shows some typical examples of poor reconstruction planning due to lack of information: huge debris flows destroyed several reconstructed settlements. This research aims to analyze the decay in post-seismic landslide activity in areas that have been hit by a major earthquake. The area hit by the 2008 Wenchuan earthquake is taken as the study area. The study will analyze the factors that control post-earthquake landslide activity through the quantification of landslide volume changes as well as through numerical simulation of their initiation process, to obtain a better understanding of the potential threat of post-earthquake landslides as a basis for mitigation planning. The research will make use of high-resolution stereo satellite images, UAV and Terrestrial Laser Scanning (TLS) surveys to obtain multi-temporal DEMs to monitor the change of loose sediments and post-seismic landslide activity. A debris flow initiation model that incorporates the volume of source materials, vegetation re-growth, and the intensity-duration of the triggering precipitation, and that evaluates different initiation mechanisms such as erosion and landslide reactivation, will be developed. The initiation model will be integrated with a run-out model to simulate the dynamic process of post-earthquake debris flows in the study area for a future period and to predict the decay of landslide activity in the future.
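Intensity-duration (I-D) triggering thresholds of the power-law form I = a * D^(-b) are a common way to encode the rainfall criterion mentioned above. The sketch below illustrates such a threshold check; the coefficients are placeholders, not the calibrated values of the initiation model described in the abstract.

```python
# Minimal sketch of a rainfall intensity-duration (I-D) triggering threshold
# of the common power-law form I = a * D**(-b). The coefficients are
# placeholders for illustration, not calibrated model values.
def exceeds_threshold(intensity_mm_h: float, duration_h: float,
                      a: float = 15.0, b: float = 0.4) -> bool:
    """Return True if a rainfall event lies above the I-D threshold."""
    return intensity_mm_h > a * duration_h ** (-b)

# Example: a 6-hour storm at 10 mm/h versus a 1-hour burst at 10 mm/h.
for duration, intensity in [(6.0, 10.0), (1.0, 10.0)]:
    flag = exceeds_threshold(intensity, duration)
    print(f"D={duration} h, I={intensity} mm/h -> triggering: {flag}")
```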
Newtonian self-gravitating system in a relativistic huge void universe model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nishikawa, Ryusuke; Nakao, Ken-ichi; Yoo, Chul-Moon, E-mail: ryusuke@sci.osaka-cu.ac.jp, E-mail: knakao@sci.osaka-cu.ac.jp, E-mail: yoo@gravity.phys.nagoya-u.ac.jp
We consider a test of the Copernican Principle through observations of the large-scale structures, and for this purpose we study the self-gravitating system in a relativistic huge void universe model which does not invoke the Copernican Principle. If we focus on the weakly self-gravitating and slowly evolving system whose spatial extent is much smaller than the scale of the cosmological horizon in the homogeneous and isotropic background universe model, the cosmological Newtonian approximation is available. Also in the huge void universe model, the same kind of approximation as the cosmological Newtonian approximation is available for the analysis of the perturbations contained in a region whose spatial size is much smaller than the scale of the huge void: the effects of the huge void are taken into account in a perturbative manner by using the Fermi-normal coordinates. By using this approximation, we derive the equations of motion for the weakly self-gravitating perturbations whose elements have relative velocities much smaller than the speed of light, and show that the derived equations can be significantly different from those in the homogeneous and isotropic universe model, due to the anisotropic volume expansion in the huge void. We linearize the derived equations of motion and solve them. The solutions show that the behaviors of linear density perturbations are very different from those in the homogeneous and isotropic universe model.
Long term volcanic hazard analysis in the Canary Islands
NASA Astrophysics Data System (ADS)
Becerril, L.; Galindo, I.; Laín, L.; Llorente, M.; Mancebo, M. J.
2009-04-01
Historic volcanism in Spain is restricted to the Canary Islands, a volcanic archipelago formed by seven volcanic islands. Several historic eruptions have been registered in the last five hundred years. However, despite the huge number of citizens and tourists in the archipelago, only a few volcanic hazard studies have been carried out. These studies are mainly focused on developing hazard maps for the islands of Lanzarote and Tenerife, especially for land-use planning. The main handicap for these studies in the Canary Islands is the lack of well-reported historical eruptions, as well as the lack of geochronological, geochemical and structural data. In recent years, the use of Geographical Information Systems (GIS) and the improvement of volcanic process modelling have provided important tools for volcanic hazard assessment. Although these sophisticated programs are very useful, they need to be fed with a huge amount of data that sometimes, as in the case of the Canary Islands, are not available. For this reason, the Spanish Geological Survey (IGME) is developing a complete geo-referenced database for long-term volcanic hazard analysis in the Canary Islands. The Canarian Volcanic Hazard Database (HADA) is based on a GIS that helps to organize and manage volcanic information efficiently. HADA includes the following groups of information: (1) 1:25,000 scale geologic maps, (2) 1:25,000 topographic maps, (3) geochronologic data, (4) geochemical data, (5) structural information, (6) climatic data. Data must pass a quality control before they are included in the database. New data are easily integrated into the database. With the HADA database the IGME has started a systematic organization of the existing data. In the near future, the IGME will generate new information to be included in HADA, such as volcanological maps of the islands, structural information, geochronological data and other information for long-term volcanic hazard analysis. HADA will provide quality information to map volcanic hazards and to run more reliable volcanic hazard models; in addition, it aims to become a sharing system, improving communication between researchers, reducing redundant work and serving as the reference for geological research in the Canary Islands.
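To illustrate how such geo-referenced groups of information could be organized relationally, the following is a deliberately simplified sketch; the table and column names are hypothetical and do not reflect the actual HADA schema.

```python
# Simplified, hypothetical sketch of how geo-referenced hazard records could
# be organised relationally; table and column names are illustrative only
# and are not the actual HADA schema.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE geologic_unit (
    unit_id      INTEGER PRIMARY KEY,
    map_sheet    TEXT,          -- 1:25,000 geologic map sheet
    lithology    TEXT,
    geometry_wkt TEXT           -- polygon in well-known text
);
CREATE TABLE geochronology (
    sample_id  INTEGER PRIMARY KEY,
    unit_id    INTEGER REFERENCES geologic_unit(unit_id),
    method     TEXT,            -- e.g. K/Ar, 14C
    age_ka     REAL,
    age_err_ka REAL
);
""")
conn.execute("INSERT INTO geologic_unit VALUES (1, 'Lanzarote-NE', 'basalt', "
             "'POLYGON((0 0,1 0,1 1,0 0))')")
conn.execute("INSERT INTO geochronology VALUES (1, 1, 'K/Ar', 21.0, 3.0)")
print(conn.execute("SELECT lithology, age_ka FROM geochronology "
                   "JOIN geologic_unit USING(unit_id)").fetchall())
```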
NASA Astrophysics Data System (ADS)
Liotta, Marcello; D'Alessandro, Walter
2016-04-01
At Mt. Etna the presence of a persistent volcanic plume provides large amounts of volcanogenic elements to the bulk deposition along its flanks. The volcanic plume consists of solid particles, acidic droplets and gaseous species. After H2O and CO2, S, Cl and F represent the most abundant volatile elements emitted as gaseous species from the craters. During rain events, acidic gases interact rapidly with droplets, lowering the pH of the rain. This process favors the dissolution and dissociation of the most acidic gases. Under these conditions, the chemical weathering of volcanic rocks and ashes is promoted by the acid rain during its infiltration. Subsequently, during groundwater circulation, chemical weathering of volcanic rocks is also driven by the huge amount of deep magmatic carbon dioxide (CO2) coming up through the volcanic edifice and dissolving in the water. These two weathering steps occur under very different conditions. The former occurs in a highly acidic environment (pH < 4), where the reaction rates depend strongly on the pH, while the latter usually occurs under slightly acidic conditions, since the pH has already been neutralized by interaction with volcanic rocks. The high content of chlorine is mainly derived from interactions between the plume and rainwater, while the total alkalinity can be completely ascribed to the dissociation of carbonic acid (H2CO3) after the hydration of CO2. The relative contributions of plume-derived elements/weathering and CO2-driven weathering have been computed for each element. In addition, the comparison between the chemical compositions of the bulk deposition and of the groundwater provides a new understanding of the mobility of volatile elements. Other processes such as ion exchange, iddingsite formation, and carbonate precipitation can also play roles, but only to minor extents. The proposed approach has revealed that the persistent plume strongly affects the chemical composition of groundwater at Mt. Etna, and probably also at other volcanoes characterized by huge open-conduit degassing activity.
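The statement that total alkalinity can be ascribed to carbonic acid dissociation corresponds to the standard first dissociation equilibrium. The worked example below evaluates it with the textbook constant K1 ~ 10^-6.35 at 25 °C; the dissolved CO2 concentration and pH are assumptions for illustration, not measured Etna values.

```python
# Worked example (illustrative numbers): bicarbonate alkalinity from the
# dissociation of carbonic acid, [HCO3-] = K1 * [H2CO3*] / [H+], using the
# standard first dissociation constant at 25 °C. The dissolved CO2
# concentration and pH are assumptions, not measured Etna values.
K1 = 10 ** -6.35          # mol/L, first dissociation constant of H2CO3*

def bicarbonate(h2co3_mol_l: float, ph: float) -> float:
    h_plus = 10 ** -ph
    return K1 * h2co3_mol_l / h_plus

# e.g. 1 mmol/L dissolved CO2 at pH 6.5
print(f"[HCO3-] ~ {bicarbonate(1e-3, 6.5)*1e3:.2f} mmol/L")
```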
Molecular Nanotechnology and Space Settlement
NASA Technical Reports Server (NTRS)
Globus, Al; Saini, Subhash (Technical Monitor)
1998-01-01
Atomically precise manipulation of matter is becoming increasingly common in laboratories around the world. As this control moves into aerospace systems, huge improvements in computers, high-strength materials, and other systems are expected. For example, studies suggest that it may be possible to build: 10^18 MIPS computers, 10^15 bytes/cm^2 write-once memory, $153-412/kg-of-cargo single-stage-to-orbit launch vehicles, and active materials which sense their environment and react intelligently. All of NASA's enterprises should benefit significantly from molecular nanotechnology. Although the time may be measured in decades and the precise path to molecular nanotechnology is unclear, all paths (diamondoid, fullerene, self-assembly, biomolecular, etc.) will require very substantial computation. This talk will discuss fullerene nanotechnology and early work on hypothetical active materials consisting of large numbers of identical machines. The speaker will also discuss aerospace applications, particularly missions leading to widespread space settlement (e.g., small near-Earth object retrieval). It is interesting to note that control of the tiny - individual atoms and molecules - may lead to colonization of the huge - first the solar system, then the galaxy.
Plutonium: Advancing our Understanding to Support Sustainable Nuclear Fuel Cycles
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lines, Amanda M.; Adami, Susan R.; Casella, Amanda
With global energy needs increasing, real energy solutions are needed to meet demand now. Fossil fuels are not an ideal candidate to meet these needs because of their negative impact on the environment. Renewables such as wind and solar have huge potential, but still need major technological advancements (particularly in the area of battery storage) before they can effectively meet growing world needs. The best option for meeting large energy needs without a large carbon footprint is nuclear energy. Of course, nuclear energy can face a fair amount of opposition and concern. However, through modern engineering and science many of these concerns can now be addressed. Many safety concerns can be met by engineering advancements, but perhaps the biggest area of concern is what to do with the used nuclear fuel after it is removed from the reactor. Currently the United States (and several other countries) utilize an open fuel cycle, meaning fuel is used only once and then discarded. It should be noted that fuel coming out of a reactor has utilized approximately 1% of the total energy that could be produced by the uranium in the fuel rod. The answer here is to close the fuel cycle and recycle the nuclear materials. By reprocessing used nuclear fuel, all the U can be repurposed without requiring disposal. The various fission products can be removed and either discarded (a hugely reduced waste volume) or, more reasonably, utilized in specialty reactors to produce more energy or needed research/medical isotopes. While reprocessing technology is currently advanced enough to meet energy needs, research to improve and better understand these techniques is still needed. Better understanding the behavior of fission products is one important area of research. Despite being discovered over 75 years ago, plutonium is still an exciting element to study because of the complex solution chemistry it exhibits. In aqueous solutions Pu can exist simultaneously in multiple oxidation states, including 3+, 4+, and 6+. It also readily forms a variety of metal-ligand complexes depending on solution pH and available ligands. Understanding the behavior of Pu in solution remains an important area of research today, with relevance to developing sustainable nuclear fuel cycles, minimizing its impact on the environment, and detecting and preventing the spread of nuclear weapons technology.
Chng, Chu’Er; Sofer, Zdenek; Pumera, Martin; Bonanni, Alessandra
2016-01-01
There is a huge interest in doped graphene and how doping can tune the material properties for specific applications. It was recently demonstrated that doping can have a different influence on the electrochemical detection of electroactive probes, depending on the analysed probe, on the structural characteristics of the graphene materials and on the type and amount of heteroatom used for the doping. In this work we investigated the effect of doping on graphene materials used as platforms for the detection of catechin, a standard probe which is commonly used for the measurement of polyphenols in food and beverages. To this aim we compared undoped graphene with boron-doped graphene and nitrogen-doped graphene platforms for the electrochemical detection of standard catechin oxidation. Finally, the material providing the best electrochemical performance was employed for the analysis of real samples. We found that the undoped graphene, possessing a lower amount of oxygen functionalities, a higher density of defects and a larger electroactive surface area, provided the best electroanalytical performance for the determination of catechin in commercial beer samples. Our findings are important for the development of novel graphene platforms for the electrochemical assessment of food quality. PMID:26861507
NASA Astrophysics Data System (ADS)
Xiaoyang, Zhong; Hong, Ren; Jingxin, Gao
2018-03-01
With the gradual maturity of the real estate market in China, urban housing prices are better able to reflect changes in market demand, and the commodity nature of commercial housing has become more and more obvious. Many Chinese scholars have conducted extensive research on the factors that affect urban commercial housing prices, and the number of related research papers has increased rapidly. These research results provide a valuable basis for addressing the problem of urban housing price changes in China. However, due to the huge amount of literature, a vast amount of information remains buried in libraries and cannot be fully utilized. Text mining technology has received wide attention and development in the humanities and social sciences in recent years, but using text mining to identify the factors that influence urban commercial housing prices is still relatively rare. In this paper, the research results of existing scholars were mined with a text mining algorithm based on a support vector machine, in order to make fuller use of current research results and to provide a reference for stabilizing housing prices.
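As a minimal sketch of the kind of SVM-based text mining this describes, the pipeline below combines TF-IDF features with a linear SVM to tag short texts by price-influencing factor; the tiny corpus and the factor labels are invented placeholders, not the authors' data or pipeline.

```python
# Minimal sketch (not the authors' pipeline): a TF-IDF + linear SVM text
# classifier of the kind used to mine housing-price literature. The corpus
# and factor labels are invented placeholders.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

docs = [
    "land supply constraints push up residential land prices",
    "mortgage interest rate cuts increase housing demand",
    "migration into the city raises demand for apartments",
    "local government land finance affects land auction prices",
]
labels = ["land", "finance", "demand", "land"]

model = make_pipeline(TfidfVectorizer(), LinearSVC())
model.fit(docs, labels)

print(model.predict(["rising interest rates cool the housing market"]))
```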
Sampling model of government travel vouchers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hu, P.S.; Wright, T.
1987-02-01
A pilot survey was designed and executed to better understand the structure of the universe of all government travel vouchers. Thirteen civilian and military sites were selected for the pilot survey. A total of 3916 travel vouchers with attached tickets were sampled. During the course of the pilot survey, it became clear that the compounding problems of the relative rarity of expired, unused tickets and the enormously large universe were too much of an obstacle to overcome in sampling the entire universe (including the US Air Force, US Army, US Navy, US Marines, other Department of Defense offices, and civil) in the first year. The universe was then narrowed to the US Air Force and US Army, which have the two largest government travel expenditures. Based on the results of the pilot survey, ORNL recommends a stratified two-stage cluster sampling model. With probability 0.90, a sample of 78 sites will be needed to estimate the per-airline amounts to within $50,000 of the true values. This sampling model allows one to estimate the total dollar amounts of expired, unused tickets for individual airlines.
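The recommended design amounts to drawing sites (clusters) within strata and then subsampling vouchers within each sampled site. The sketch below illustrates that selection mechanically; the strata, site counts and sample sizes are hypothetical, not the design parameters of the ORNL study.

```python
# Hedged sketch of a stratified two-stage cluster sample: sites (clusters)
# are drawn within strata, then vouchers are subsampled within each sampled
# site. Strata, site counts and sample sizes are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
strata = {"air_force": 120, "army": 150}   # number of sites per stratum (made up)
sites_per_stratum = 39                      # two strata x 39 ~ 78 sites overall
vouchers_per_site = 50

sample = {}
for stratum, n_sites in strata.items():
    chosen_sites = rng.choice(n_sites, size=sites_per_stratum, replace=False)
    # second stage: voucher indices drawn within each chosen site
    sample[stratum] = {int(s): rng.choice(2000, size=vouchers_per_site, replace=False)
                       for s in chosen_sites}

print({k: len(v) for k, v in sample.items()})   # 39 sampled sites per stratum
```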
Geological applications of machine learning on hyperspectral remote sensing data
NASA Astrophysics Data System (ADS)
Tse, C. H.; Li, Yi-liang; Lam, Edmund Y.
2015-02-01
The CRISM imaging spectrometer orbiting Mars has been producing a vast amount of data in the visible to infrared wavelengths in the form of hyperspectral data cubes. These data, compared with those obtained from previous remote sensing techniques, yield an unprecedented level of spectral resolution in addition to an ever-increasing level of spatial information. A major challenge brought about by the data is the burden of processing and interpreting these datasets and extracting the relevant information from them. This research aims to approach the challenge by exploring machine learning methods, especially unsupervised learning, to achieve cluster density estimation and classification, and ultimately to devise an efficient means of identifying minerals. A set of software tools has been constructed in Python to access and experiment with CRISM hyperspectral cubes selected from two specific Mars locations. A machine learning pipeline is proposed and unsupervised learning methods were applied to pre-processed datasets. The resulting data clusters are compared with the published ASTER spectral library and browse data products from the Planetary Data System (PDS). The results demonstrate that this approach is capable of processing the huge amount of hyperspectral data and potentially providing guidance to scientists for more detailed studies.
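A common unsupervised step on such cubes is to flatten the spatial dimensions and cluster the per-pixel spectra. The sketch below shows this with k-means; the synthetic cube and the number of clusters are placeholders, not CRISM data or the study's actual pipeline.

```python
# Illustrative sketch of unsupervised clustering on a hyperspectral cube:
# reshape (rows, cols, bands) into a pixel-by-band matrix and cluster the
# spectra with k-means. The synthetic cube and k value are placeholders.
import numpy as np
from sklearn.cluster import KMeans

rows, cols, bands = 64, 64, 100
cube = np.random.rand(rows, cols, bands)      # stand-in for a CRISM cube

pixels = cube.reshape(-1, bands)              # (rows*cols, bands)
labels = KMeans(n_clusters=6, n_init=10, random_state=0).fit_predict(pixels)
cluster_map = labels.reshape(rows, cols)      # per-pixel cluster assignment

print(cluster_map.shape, np.unique(cluster_map))
```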
Optimizing a Query by Transformation and Expansion.
Glocker, Katrin; Knurr, Alexander; Dieter, Julia; Dominick, Friederike; Forche, Melanie; Koch, Christian; Pascoe Pérez, Analie; Roth, Benjamin; Ückert, Frank
2017-01-01
In the biomedical sector, not only is the amount of information produced and uploaded to the web enormous, but so is the number of sources where these data can be found. Clinicians and researchers spend huge amounts of time trying to access this information and to filter out the most important answers to a given question. As the formulation of these queries is crucial, automated query expansion is an effective tool to optimize a query and receive the best possible results. In this paper we introduce the concept of a workflow for optimizing queries in the medical and biological sector using a series of tools for query expansion and transformation. After the definition of attributes by the user, the query string is compared to previous queries in order to add semantically co-occurring terms to the query. Additionally, the query is enlarged by the inclusion of synonyms. The translation into database-specific ontologies ensures the optimal query formulation for the chosen database(s). As this process can be performed in various databases at once, the results are ranked and normalized in order to achieve a comparable list of answers to a question.
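The synonym-expansion step of such a workflow can be sketched very simply: the query is OR-combined with synonyms looked up in a dictionary before being sent to a database. The synonym table below is a toy example, not the ontology mapping the paper describes.

```python
# Hedged sketch of the synonym-expansion step of such a workflow: a query is
# enlarged with synonyms from a small dictionary before being sent to a
# database. The synonym table is a toy example, not the paper's ontology.
SYNONYMS = {
    "myocardial infarction": ["heart attack", "MI"],
    "neoplasm": ["tumor", "cancer"],
}

def expand_query(query: str) -> str:
    terms = [query]
    for concept, syns in SYNONYMS.items():
        if concept in query.lower():
            terms.extend(syns)
    # OR-combine the original query with its synonyms
    return " OR ".join(f'"{t}"' for t in terms)

print(expand_query("myocardial infarction treatment"))
```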
Event-Based User Classification in Weibo Media
Guo, Liang; Wang, Wendong; Cheng, Shiduan; Que, Xirong
2014-01-01
Weibo media, known as real-time microblogging services, has attracted massive attention and support from social network users. The Weibo platform offers an opportunity for people to access information and significantly changes the way people acquire and disseminate information. Meanwhile, it enables people to respond to social events in a more convenient way. Much of the information in Weibo media is related to specific events. Users who post different content and exhibit different behaviors or attitudes may contribute differently to a specific event. Therefore, automatically classifying the large number of uncategorized social circles generated in Weibo media from the perspective of events is a promising task. Under these circumstances, in order to effectively organize and manage the huge numbers of users, and thereby their content, we address the task of user classification with a more granular, event-based approach in this paper. By analyzing real data collected from Sina Weibo, we investigate the Weibo properties and utilize both content information and social network information to classify the numerous users into four primary groups: celebrities, organizations/media accounts, grassroots stars, and ordinary individuals. The experimental results show that our method identifies the user categories accurately. PMID:25133235
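As a sketch of the classification step, the example below combines simple content and network features and trains a classifier to separate the four groups named in the abstract; the feature set and toy data are assumptions for illustration, not the features engineered in the paper.

```python
# Sketch of an event-based user classification step: combine simple content
# and network features and train a classifier for the four groups named in
# the abstract. The feature set and toy data are illustrative assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# features per user: [followers, following, posts_on_event, avg_reposts]
X = np.array([
    [5_000_000,   300,  40, 1200],   # celebrity-like profile
    [2_000_000,   150, 120, 3000],   # media/organization-like profile
    [80_000,     2000,  60,  150],   # grassroots-star-like profile
    [300,         400,   5,    1],   # ordinary individual
] * 5, dtype=float)
y = ["celebrity", "organization/media", "grassroots star", "ordinary"] * 5

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
print(clf.predict([[1_500_000, 200, 80, 2500]]))
```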
The GMOS cyber(e)-infrastructure: advanced services for supporting science and policy.
Cinnirella, S; D'Amore, F; Bencardino, M; Sprovieri, F; Pirrone, N
2014-03-01
The need for coordinated, systematized and catalogued databases on mercury in the environment is of paramount importance, as improved information can help assess the effectiveness of measures established to phase out and ban mercury. Long-term monitoring sites have been established in a number of regions and countries for the measurement of mercury in ambient air and wet deposition. Long-term measurements of mercury concentration in biota have also produced a huge amount of information, but such initiatives are far from forming a global, systematic and interoperable approach. To address these weaknesses, the on-going Global Mercury Observation System (GMOS) project ( www.gmos.eu ) established a coordinated global observation system for mercury and retrieved historical data ( www.gmos.eu/sdi ). To manage such a large amount of information, a technological infrastructure was planned. This high-performance back-end resource, associated with sophisticated client applications, enables data storage, computing services, telecommunications networks and all services necessary to support the activity. This paper reports the architecture definition of the GMOS Cyber(e)-Infrastructure and the services developed to support science and policy, including the United Nations Environment Programme. It finally describes new possibilities in data analysis and data management through client applications.
NASA Astrophysics Data System (ADS)
Dittmar, N.; Haberstroh, Ch.; Hesse, U.; Krzyzowski, M.
2016-10-01
In part one of this publication, experimental results for a single-channel transfer line used at liquid helium (LHe) decant stations are presented. The transfer of LHe into mobile dewars is an unavoidable process, since the places of storage and usage are generally located apart from each other. The experimental results have shown that appreciable amounts of LHe evaporate due to heat leak and pressure drop. The cold helium gas generated in this way has to be collected and reliquefied, demanding a huge amount of electrical energy. Although this transfer process is common in cryogenic laboratories, no existing code could be found to model it. Therefore, a thermohydraulic model has been developed to model the LHe flow at operating conditions using published heat transfer and pressure drop correlations. This paper covers the basic equations used to calculate heat transfer and pressure drop, as well as the validation of the thermohydraulic code and its application within the optimisation process. The final transfer line design features reduced heat leak and pressure drop values, based on a combined measurement and modelling campaign in the range of 0.112 < p_in < 0.148 MPa, 190 < G < 450 kg/(m^2 s), and 0.04 < x_out < 0.12.
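To give a feel for the frictional pressure-drop part of such a model, the sketch below uses the Darcy-Weisbach relation with a Blasius friction factor; the correlation choice and the fluid properties are illustrative assumptions, not the published correlations or helium property data used by the authors.

```python
# Hedged sketch of a single-phase frictional pressure-drop estimate of the
# kind used in such thermohydraulic models: Darcy-Weisbach with the Blasius
# friction factor f = 0.3164 * Re**-0.25. Correlation choice and fluid
# properties are illustrative assumptions.
def pressure_drop(G, L, d, rho, mu):
    """G: mass flux kg/(m^2 s), L: length m, d: inner diameter m,
    rho: density kg/m^3, mu: dynamic viscosity Pa s. Returns Pa."""
    Re = G * d / mu
    f = 0.3164 * Re ** -0.25          # Blasius, smooth tube, turbulent flow
    return f * (L / d) * G ** 2 / (2.0 * rho)

# rough liquid-helium-like properties (order of magnitude only)
dp = pressure_drop(G=300.0, L=5.0, d=0.01, rho=125.0, mu=3.3e-6)
print(f"frictional pressure drop ~ {dp/1e3:.1f} kPa")
```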
Sulfur Isotopes in Gas-rich Impact-Melt Glasses in Shergottites
NASA Technical Reports Server (NTRS)
Rao, M. N.; Hoppe, P.; Sutton, S. R.; Nyquist, Laurence E.; Huth, J.
2010-01-01
Large impact-melt glasses in some shergottites contain huge amounts of Martian atmospheric gases and are known as gas-rich impact-melt (GRIM) glasses. By studying the neutron-induced isotopic deficits and excesses in Sm-149 and Sm-150 resulting from the Sm-149 (n,gamma) Sm-150 reaction, and the Kr-80 excesses produced by the Br-79 (n,gamma) Kr-80 reaction, in the GRIM glasses using mass-spectrometric techniques, it was shown that these glasses in the shergottites EET79001 and Shergotty contain regolith materials irradiated by a thermal neutron fluence of approximately 10^15 n/cm^2 near the Martian surface. It was also shown that these glasses contain varying amounts of sulfates and sulfides, based on the release patterns of SO2 (sulfate) and H2S (sulfide) obtained using stepwise-heating mass-spectrometric techniques. Furthermore, EMPA and FE-SEM studies of the basaltic-shergottite GRIM glasses EET79001, LithB (,507 & ,69), Shergotty (DBS I & II) and Zagami (,992 & ,994) showed a positive correlation between FeO and "SO3" (sulfide + sulfate), whereas those belonging to the olivine-phyric shergottite EET79001, LithA (,506 & ,77) showed a positive correlation between CaO/Al2O3 and "SO3".
Learning to rank-based gene summary extraction.
Shang, Yue; Hao, Huihui; Wu, Jiajin; Lin, Hongfei
2014-01-01
In recent years, the biomedical literature has been growing rapidly. These articles provide a large amount of information about proteins, genes and their interactions. Reading such a huge amount of literature to gain knowledge about a gene is a tedious task for researchers. As a result, it is important for biomedical researchers to quickly gain an understanding of a query concept by integrating its relevant resources. In the task of gene summary generation, we regard automatic summarization as a ranking problem and apply learning to rank to solve it automatically. This paper uses three features as the basis for sentence selection: gene ontology relevance, topic relevance and TextRank. From there, we obtain the feature weight vector using the learning-to-rank algorithm, predict the scores of candidate summary sentences, and select the top-ranked sentences to generate the summary. ROUGE (a toolkit for the automatic evaluation of summaries) was used to evaluate the summarization results, and the experiments showed that our method outperforms the baseline techniques. According to the experimental results, the combination of the three features improves summarization performance. The application of learning to rank can facilitate the further expansion of features for measuring the significance of sentences.
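The scoring step of such a ranking-based summarizer reduces to a dot product between a learned weight vector and each sentence's feature vector, followed by a top-k selection. The sketch below illustrates this; the weights and feature values are made-up placeholders, not the model learned in the paper.

```python
# Sketch of the scoring step of a ranking-based summarizer: each candidate
# sentence is represented by the three features named in the abstract and
# scored with a learned weight vector; the top sentences form the summary.
# Weights and feature values are made-up placeholders.
import numpy as np

# columns: [gene-ontology relevance, topic relevance, TextRank score]
features = np.array([
    [0.8, 0.6, 0.30],   # sentence 0
    [0.2, 0.9, 0.10],   # sentence 1
    [0.5, 0.4, 0.45],   # sentence 2
    [0.1, 0.2, 0.05],   # sentence 3
])
weights = np.array([0.5, 0.3, 0.2])   # placeholder learned weight vector

scores = features @ weights
top_k = np.argsort(scores)[::-1][:2]  # keep the two highest-scoring sentences
print("summary sentences:", top_k.tolist(), "scores:", scores[top_k].round(3))
```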
Connecting slow earthquakes to huge earthquakes.
Obara, Kazushige; Kato, Aitaro
2016-07-15
Slow earthquakes are characterized by a wide spectrum of fault slip behaviors and seismic radiation patterns that differ from those of traditional earthquakes. However, slow earthquakes and huge megathrust earthquakes can have common slip mechanisms and are located in neighboring regions of the seismogenic zone. The frequent occurrence of slow earthquakes may help to reveal the physics underlying megathrust events as useful analogs. Slow earthquakes may function as stress meters because of their high sensitivity to stress changes in the seismogenic zone. Episodic stress transfer to megathrust source faults leads to an increased probability of triggering huge earthquakes if the adjacent locked region is critically loaded. Careful and precise monitoring of slow earthquakes may provide new information on the likelihood of impending huge earthquakes. Copyright © 2016, American Association for the Advancement of Science.
Colossal dielectric constant up to gigahertz at room temperature
NASA Astrophysics Data System (ADS)
Krohns, S.; Lunkenheimer, P.; Kant, Ch.; Pronin, A. V.; Brom, H. B.; Nugroho, A. A.; Diantoro, M.; Loidl, A.
2009-03-01
The applicability of recently discovered materials with extremely high ("colossal") dielectric constants, required for future electronics, suffers from the fact that their dielectric constant ɛ' is huge only in a limited frequency range below about 1 MHz. In the present report, we show that the dielectric properties of a charge-ordered nickelate, La15/8Sr1/8NiO4, surpass those of other materials. In particular, ɛ' retains its colossal magnitude of >10 000 well into the gigahertz range.